Abstract
Sextech is currently experiencing a golden age, promising technological innovation to improve sexual health and well-being. However, the privacy and security vulnerabilities of smart sex toys have been the subject of media attention. Dating apps, menstrual trackers and sex toy companies have paid millions in compensation for non-consensually collecting or sharing intimate data. In this article, we share findings from a research workshop with prospective sextech industry professionals about how they approach data governance. The conversations reveal disconnections between the emancipatory, collective and rights-based possibilities offered by feminist and queer tech cultures, broader public interest in data commons and the technosolutionist narratives of start-up cultures. We conclude that there is a need for collaborations between industry, community and researchers to develop approaches to governance that reorganise, redistribute and decentralise the data economy of sextech.
‘Don’t Get Your Valentine an Internet-Connected Sex Toy’ warns a headline in Wired, proclaiming ‘It’s all sexy fun and games until someone hacks a WiFi-enabled butt plug’ (Dreyfuss, 2019). ‘High-tech sex toys are a living nightmare with big dreams’, declares an article in Mashable, where the author asserts that high-tech sex toys turned their love life into ‘a Kafka-esque labyrinth of horny troubleshooting’ (Joho, n.d.). The privacy and security vulnerabilities of smart sex toys have been the subject of dedicated media attention in recent years, cautioning users about how their intimate data may be shared. Sexual wellness is currently experiencing a golden age, estimated to be a US$122bn industry by 2024 and promising technological innovation that can improve sexual health and well-being (Cookney, 2020). Forbes reports that ‘sex toy sales are buzzing with social distancing from COVID-19’ (Lee, 2020), citing various industry reports that sales increased significantly during state-enforced lockdown periods.
And yet emerging research has identified multiple privacy and security vulnerabilities in sextech devices that potentially leave users open to stigma, blackmail, prosecution and remote sexual assault (Giusto and Pastorino, 2021; Wynn et al., 2017). To support users navigating their way through a marketplace of interactive gadgets, devices and apps, Mozilla even released a special Valentine’s Day section of its ‘Privacy Not Included’ guide, ranking smart vibrators on a scale from ‘Not Creepy’ to ‘Super Creepy’ (Mozilla, 2019). A white paper on Sex in the Digital Era: How secure are smart sex toys? produced by IT security software service ESET reports that
[t]hough many experts have devoted time to identifying and reporting security flaws within this industry, with every passing year these devices incorporate an ever wider range of features . . . Each time their code is re-engineered, some vulnerabilities are corrected, new vulnerabilities may be created, and many more remain unchanged in the updated versions. (Giusto and Pastorino, 2021: 2)
What is unclear is how prospective sextech entrepreneurs conceptualise and approach intimate data governance. While data issues are identified in existing sextech industry discussions, the topic tends to be presented in terms of the need to guard against data leaks, in order to protect the privacy, rights and security of individual data subjects (including both tech users and content producers) (Sex Tech Space, 2020). There has been less explicit discussion of more collective approaches to data justice or data sovereignty within sextech development and production. One notable exception to this rule is the Security issue of Sex Tech Space, an online magazine (founded by Alison Falk) that seeks to facilitate collaboration between technologists and marginalised communities, in order to foster more expansive and less stigmatising tech policies and practices (Falk, 2019; Sex Tech Space, 2020). Falk, however, is a minority voice in an industry landscape that primarily views data as a source of market insights.
In this article, we reflect on findings from a research workshop with prospective sextech entrepreneurs which sought to understand how they approach the governance of intimate data. In an aspirational industry that attracts careerists from a variety of backgrounds, sextech entrepreneurs are immersed in a start-up culture that foregrounds business, marketing and branding. However, we find that prospective sextech professionals draw upon their previous experiences as data subjects to inform their professional attitudes towards data governance and are eager for information and guidance about best practice. Their liminal status reveals tensions between a rights-based approach, the emancipatory possibilities offered by queer and feminist tech cultures, broader public interest in data commons and the technosolutionist narratives of start-up cultures. We conclude that there is a need – and an appetite – for collaborations between industry, community and researchers to develop ethical approaches to governance that work to reorganise, redistribute and decentralise the data economy of sextech.
‘The next blue ocean’: sex tech start-up culture
In the back cover blurb for her 2019 book Sextech Revolution: The future of sexual wellness, Andrea Barrica, founder of online adult sex education video-clip platform O.School, argues that ‘sexual wellness is the next blue ocean for tech entrepreneurs and investors alike’. Barrica traces the lineage of the contemporary sextech industry from early US feminist and queer sex-shops in the 1970s and 1980s (as comprehensively documented by Comella, 2017) and the rise of online sex chat and information websites (such as Scarleteen) in the 1990s (Scarleteen, 2020). She attributes the coining of the term ‘sextech’ to advertising and tech consultant Cindy Gallop (Barrica, 2019: 17) and her 2009 TED talk featuring her curated amateur sex video platform, MakeLoveNotPorn.
According to Barrica’s (2019: 18) definition (which reflects commonsense or vernacular usages of the term), sextech is a broad church nomenclature that can be applied to dating apps, lubricants, vibrators or sex robots. Noting that within the commercial space of technology and innovation, the categories of ‘sexual wellness’ and sextech are often conflated, Barrica (2019: 19) distinguishes between the two by observing that while both categories feature ‘products and services that focus on pleasure rather than just reproduction’, the term sexual wellness encompasses ‘all companies serving people’s sexual needs, including medical, pharmaceutical, healthcare, mental health, media, and other innovations, technologically based or not’. Additionally, the adjacent field of femtech focuses on ‘innovation that supports and improves female health by way of software, products, pharmaceuticals, and technology’ (Capriccio, 2021: 6).
Both tech cultures and broader labour markets have co-evolved since the late 1990s, normalising practices of what Gina Neff (2012) terms ‘venture labour’ – or the individual assumption of the risks associated with unstable economies and workplaces. As Neff (2012: 157) explains, start-up cultures encourage tech founders and employees alike to internalise and personalise market risks and systemic challenges – indeed an individualised responsibility for both risk and reward is now an essential attribute for most workers in this space. Established (and would-be) sextech and femtech founders display fluency in digital branding and combined self/product promotion. These attributes are deployed at conferences, tech conventions and across social media platforms, as vehicles for connection with both potential funders and tech consumers (Greene, 2021; Marwick, 2015).
Additionally, as Dan Greene observes in his 2021 study of digital ‘bootstrapping’ within US public service organisations, as government-funded public education and welfare continue to decline, there has been an increasing tendency in both popular discourse and public policy to frame tech innovation as a solution for social inequality and injustice. This approach (which Greene dubs ‘the access doctrine’) is evident in Barrica’s (2019: 72) assertion that ‘support for private sector [sextech] innovators’ via venture capital is an essential antidote to insufficient school-based sexual and reproductive health education, and underfunded public health systems. Unlike public institutions like schools or public health organisations, successful start-ups need to demonstrate a capacity to pivot – or reframe core business strategies, mission and identity to continuously enhance market expansion and profits (Greene, 2021). Given that profit is an essential aspect of market success, it is unsurprising that ethical issues associated with the collection, aggregation, circulation and storage of user data have arisen for many sextech companies.
Data governance and sextech markets
Across both sextech and femtech industries, data have been positioned as a tool for knowledge and user empowerment (Flore and Pienaar, 2020). Tereza Hendl and Bianca Jansky (2022: 35) write that such technologies offer users the opportunity to ‘better understand their bodies; be in control of their bodies; and take ownership of their reproductive health’. Indeed, on an individual level, built-in sensors and real-time data can make devices more responsive and personalised, which can help people to track their sexual experiences and understand their bodies. However, Hendl and Jansky (2022: 50) conclude that the empowerment narrative is ‘contradictory, empirically unsubstantiated and gender oppressive . . . sell[ing] users a mediated, selective and inaccurate notion of knowledge’. There are often major discrepancies in the amount of data collected by apps compared with the information shared with users.
Dating apps, menstruation tracking apps and sex toy companies have been ordered to pay millions in compensation for non-consensually collecting intimate data or sharing their users’ personal information with third parties (e.g. We-Vibe, Flo and Grindr) (Privacy International, 2019). Such cases demonstrate the value of intimate data in an age of surveillance capitalism (Zuboff, 2019) and raise questions about how such data are being elicited, monetised and analysed. Rather than contributing to what Vinsel (2021) calls ‘criti-hype’, a trend that inflates the potential dystopian risks of big tech, or positioning ‘ethical AI’ design as something straightforward, universal and scalable, we examine how individuals at the beginning of their entrepreneurial careers transition from being data subjects to becoming data brokers, to show how this influences their approaches to data governance.
Methods and approach: sextech goes to school
As noted above, sextech and femtech are rapidly emerging fields in the ‘start-up’ domain. To better understand the ways that ‘industry newcomers’ currently understand industry data practices, and how academics, industry and community might best collaborate to develop meaningful standards for data privacy, governance and ethics, we conducted a 3-hour virtual professional development and knowledge exchange workshop in August 2021 with 15 people seeking to enter the sextech industry. The research was conducted in partnership with Sextech School, an established private training academy for sextech start-ups and would-be professionals, situated between Australia and the United States and coordinated online, founded by industry podcaster and speaker, Bryony Cole.
Cole is the presenter and CEO of the podcast Future of Sex, and Sextech School’s founder. She has a professional background in communication and strategy – primarily within the tech industry – and an education in commerce and sexual health. Cole has run a series of successful sextech hackathons in Australia, Singapore and the United States that build on the popularity of her podcast. Sextech School was launched in 2018, initially as a partnership between Future of Sex and tech consultancy The Disruptors Handbook. The school aims to teach newcomers to sextech (including absolute beginners) how to position their ideas in order to attract partners for ‘future development and investment’ (Future of Sex, 2018). A 2018 press release described Sextech School as a ‘pre-accelerator’ for those who lack tech industry connections and experience:
The teams that compete at our hackathons are always keen to take their solutions to the next level, but sometimes they just don’t know where to start. What they need is [a] program that can help convert their ideas into viable businesses. (Future of Sex, 2018)
Our workshop with Sextech School comprised the first stage of a two-pronged project on the ethical governance of sextech. The second stage was a Public Interest Sex Tech Hackathon held in February 2022, which built upon insights from the workshop to investigate how stakeholders from LGBTQ+, disability, HIV positive, sex work and data activist communities understand inclusive sex technologies for public benefit (Stardust et al., 2022). Research participants were recruited by Sextech School from their August 2021 cohort. While we did not have ethical permission to collect detailed demographic information (and the sample was too small to be generalisable), Sextech School enrols students from global locations including Australia, the United Kingdom, Europe, North America, Asia and South America. Of those attending the 6-week online training course, the predominant demographic is women-identified people in the 30–40 age bracket. The curriculum includes four key modules: Industry Knowledge, Brand Building, Community Engagement and Business Models.
In the spirit of reciprocity, the research workshop combined knowledge exchange and data-elicitation activities, drawing on participatory methodologies previously deployed by K.A. The authors’ approach was informed by our positionality as researchers with both insider and collaborative experiences of creating digital content within the sex industry (Z.S.), sexual health promotion (K.A.) and multi-media production (J.K.). In the first part of the workshop, the research team offered short presentations about our past research on smart devices, feminist data governance, algorithmic profiling and user experiences of sextech. This was followed by a facilitated conversation among participants inviting them to speculate on how sextech professionals might apply both theoretical frameworks and practical approaches to their own design and data governance models. Participants were offered the option of contributing to a recorded Zoom discussion and/or typing directly into a shared Google doc.
While acknowledging the limitations of a one-off pilot conversation (as opposed to an extended research collaboration), we approached this workshop with dual aims. On one level we sought insights from prospective sextech entrepreneurs regarding the ways they currently understand the politics and economics of intimate data. On another, we had observed anecdotally that while research collaboration and sharing takes place in many sextech conventions and similar spaces, in most cases the focus tends to be biomedical or sexological, as opposed to socio-technical. Consequently, we hoped to explore the potential for ongoing collaborations drawing on approaches advanced by members of the Design Justice Network’s ‘Principles at Work’ group (Ruiz and DeCou, 2022; Spitzberg et al., 2020). That is, we sought to collaborate with sextech entrepreneurs (as opposed to purely offering critique), in order to both learn from them, and explore the potential for ethical and accountable sextech design, research and practices in the ‘grey areas’ of start-ups and professionalised tech workplaces.
We sought to investigate if and how sexual technologies can be governed at scale in ways that prioritise public interest benefit and feminist data ethics. The discussion was divided into five key topic areas: current approaches to existing sextech on the market, ethical data governance frameworks, enablers and barriers to operationalisation, equitable distribution of benefits of sextech, and interface design. Workshop transcripts and Google doc content were shared with participants for review, editing and feedback. They were then aggregated and thematically coded by all three authors through an iterative series of conversations. We interpreted interview data in dialogue with extant literature, via a process of abductive analysis (Tavory and Timmermans, 2014). That is, we combined inductive and deductive methods to undertake what Timmermans and Tavory (2022) term ‘focused coding’, seeking out themes we might expect to see (e.g. data security and design ethics) while remaining open to novelty and ‘surprises’ in the interview transcripts. Key themes identified via this process included sex and big data; data sensitivity; data ethics; data rights; data transparency, capability and capacity; platform infrastructure; data and consent; data integrity and security; data capture; data as a market insight; data brokerage and monetisation; data beneficiaries and design justice. Finally, a draft of this article was shared with industry partner Bryony Cole for the purposes of accuracy and sense-checking before submission for publication. No changes were suggested.
‘I’ve been dabbling’: participant visions, aspirations and pathways to sextech
Sextech is promoted as a gendered career aspiration and attractive career switch for professionals. The recruitment materials for Sextech School (as archived on Cole’s LinkedIn page) prioritise creativity, ‘mindset’ and passion over tech industry knowledge and experience. This framing of sextech as an inclusive space with low barriers to entry is echoed by Alison Falk’s introduction to the first Sex Tech Space magazine which states ‘if you own tech and use it on the reg, congrats! Consider yourself a technologist’ (Sex Tech Space, 2020). Consistent with this, the predominant pathway to Sextech School involved a personal catalyst or experience that sparked interest and drew an individual to the field. In a start-up culture where curiosity and enthusiasm are positioned as sufficient qualifications for entry, participants occupied a liminal space, as both outsiders/newcomers and insiders/proto-entrepreneurs.
Participants referenced interest in developing a variety of sextech, from platforms and apps to virtual experiences and physical objects. Some participants arrived at sextech following professional histories in sexual health, pleasure, education and business. For example, one participant described, ‘I am a sex educator and have recently just moved into the sextech space launching an online cohort-based sex education platform for adults’. For such participants, although they held personal and professional knowledge of sex education and health, their technical skills and data literacy were still emergent. Some had an entry level of familiarity and described having ‘been dabbling in the sextech industry for a few years’ while others reported being ‘completely new to the sextech space’. A few participants had arrived at the school from the tech industry, looking to apply their skills to sex. One data scientist had come to sextech with a pre-existing idea to develop: ‘I have a rough prototype for a web app which allows users to anonymise media content (basically creates an entirely new face, replacing theirs in videos). I thought this could be applied to an adult content platform’.
These divergent pathways meant that participants were cognisant of the need for more diverse designers and engineers to be involved in conceptualising new sextech, including people with diverse lived experiences and stakes in the technologies themselves. Participants were struck by our discussion of the gendered scripts and aesthetics programmed into sextech interfaces, such as the design of gynoids like Realbotix’s Harmony (Strengers and Kennedy, 2021), with one describing them as ‘very het, and white, and monogamous’. Some expressed excitement about the increasing range of gender-inclusive sextech that was emerging, moving beyond gender and anatomical binaries and contouring to a wide variety of erogenous zones:
We’re starting to see more toys in particular, designed to be gender neutral, which is fantastic. But I think there’s still a long way to go and many more potential avenues to make products and services more inclusive.
This investment in inclusion-by-design is echoed by stakeholders across the tech industry, beyond the field of sextech. Critical technology scholars such as Sasha Costanza-Chock (2020) have called not simply for inclusion but for an approach that centres on ‘design justice’. As one of our participants reflected, creating technologies that are accessible, relevant and useful requires more than simply an ‘add and stir’ approach, but changes to the structures and processes of operation: ‘I think the design process would need to shift also to account for the collaboration required to be inclusive’. During their Sextech School course, participants had the opportunity to hear from the founders of Bump’n, a sex toy company designed for people with diverse mobility and access needs: ‘I think they’re a perfect example of when you have people from different lived experiences designing toys or any product, you increase the usability of that product for people who aren’t just white and able-bodied’.
When we asked about inclusive interface design, participants generally agreed on the need to develop sextech that supports the needs of marginalised communities who are most impacted by tech design, deployment and data governance – what Afsaneh Rigot (2022) calls ‘design from the margins’. However, they did not offer explicit detail on how this could be achieved or how these intentions might translate into actionable practices. This could be due to the early stage in their sextech journey or because they did not have experience in developing community-driven, community-owned and community-led technologies. The appreciation of design justice also sat in tension with more market-driven approaches to data elicitation and use that participants later espoused.
‘A pulse check on the industry’: approaching intimate data as a market insight
Drawing from their adjacent business endeavours and preliminary sextech initiatives, participants unsurprisingly saw data as providing a valuable market insight. Participants spoke about data generally as a floating signifier for a range of information, including front-end data (as elicited and expressed by users) and back-end data (such as IP addresses, geolocation and metadata). Data were seen as having potential use for multiple stakeholders: developers, society and users, and many described a positive role for big data in improving products, user experiences and health and well-being interventions. One participant, who was preparing their adult sex education platform, approached data as a tool for evidence-based design. They used data to understand and tailor sex education content to meet the needs of potential students, reporting, ‘I’ve . . . used data to get figures on people’s past sex education, and their current sexual activities and the trends that change with age, gender and orientation’.
Participants were not necessarily under the illusion that the data elicited from users was always a true or realistic record or representation. In our workshop, participants observed that individuals’ sexual identities and experiences are complex and cannot always be quantified, verified or boxed into drop-down menus or text-boxes. Describing social media and dating app profiles as highly ‘curated’, one participant suggested ‘the data mined from user habits or location is . . . superficial’. Nevertheless, participants overwhelmingly emphasised the benefits of collecting data in order to create more personalised and relevant user experiences, across various sizes and scales of business. For example, a sex worker participant reflected on her experience gathering data from new clients in order to tailor her sessions:
I specifically want to know about that person’s entire sexual behaviour. So, when I craft a profile and I go and meet them in real time or I meet them in virtual, I don’t waste time doing acts that they’re not interested in. I get to the good stuff.
Some saw a productive role for data in revealing broader patterns and trends at a local, national and international level. A participant who was launching their own online e-commerce pleasure store remarked that ‘patterns and trends are important, as are personal/intimate data points’. Another, who had been involved in capturing data on sex stores in Australia and conducting market-based research on sextech trends, reflected that ‘Data is vital in understanding and sense checking what is going on – a pulse check on the industry at large’. Data therefore were seen to serve different stakeholders.
One of the key tensions in sextech is the discrepancy between how the benefits from sextech data are weighted and distributed. As Paro Mishra and Yogita Suresh (2021: 598) identify in femtech, although apps and devices ‘may enhance user capacity for self-knowledge’ they also ‘create possibilities for unprecedented levels of reproductive surveillance’. While menstrual tracking apps elicit large quantities of user data about cravings, health and habits, users predominantly receive data about their fertility window. Michele Estrin Gilman writes, ‘The technology industry has recognized that periods are profitable. However, these profits do not flow to menstruators; rather, they enrich private businesses’ (Gilman, 2021: 101). Indeed, Alnoor Bhimani (2020: n.p.) refers to this as ‘blood money’.
During the workshop we discussed who sextech data serves. Participants identified multiple beneficiaries:
Let’s say there’s multiple people benefitting . . . The users of the products or services are hopefully benefitting from the products and services they’ve purchased. The companies, through growth and revenue, are benefitting. You’d hope that society at large is benefitting from a greater understanding of sexual wellbeing and an understanding of sexual diversity.
While this is a generous understanding of how benefits are (or ought to be) distributed, it is unlikely that the value of these benefits is equivalent. Data are used by corporations for company growth, monetisation and expansion. One participant described working for a healthcare technology company with multiple brands focused on sexual health and wellness. As part of the advertising and creative services team, their role involved ‘understanding the efficiency of data and how it interrelates between the four brands . . . and also how they grow into new brands and new sectors’. They described the company as being ‘very performance marketing driven. So, data and performance marketing is a big part of their growth strategy’. Where data form a core part of business growth, there may be a discrepancy in their value between the user and the company.
The monetary value of sextech data also influences the kinds of businesses and investors that arrive in this space. As Anita Gurumurthy and Nandini Chami (2022: 8) write, ‘The knowledge agenda on women’s health – based as it is in a growing market for big data and AI – is now set by Big Tech and Big Pharma’. Mishra and Suresh (2021: 599) note that in the global femtech industry, ‘most of its start-ups are either North American or European’ and that the market push ‘for greater penetration of Femtech into South Asia and other developing countries’ positions its products ‘as solutions for controlling the population in the global South and their “savage” tendencies to reproduce, lack of awareness about reproductive health; unsafe abortions and so on . . .’ (Mishra and Suresh, 2021: 599–600).
As a global business academy that trains students in marketing, branding and engagement, Sextech School provides a curriculum that encourages students to become sextech entrepreneurs, with a focus on attracting investors, launching a new career, finding profitable industry sectors and market success. The website invites participants to ‘learn from top sextech innovators to grow your business or leap into a new career’ (Sextech School, 2022). Within this framework, however, the workshop participants were mindful of their own struggles (and disappointments) with market-based approaches to data collection and use.
‘The consequences are much more severe’: understanding accountability, responsibility and user trust
Although participants were ambivalent or pessimistic about the sextech industry’s ability to self-regulate, they were similarly dubious about state regulation. In transitioning from their position as data subjects to data brokers, sex tech entrepreneurs are required to navigate complex legal obligations. These regulatory frameworks were confounding for some participants. Depending on where their business is in operation, sextech businesses could be subject to potentially hundreds of different data privacy laws around the world, at national, regional and provincial levels. The focus of such data protection legislation usually involves restrictions on collection and use of data and requirements for reporting, notification, transparency and consent. Many of these laws have extra-territorial application, meaning that they can apply to companies registered outside the country where the laws are enacted.
The participants experienced regulation as a minefield of potentially contradictory laws. Some participants had open questions about what regulatory or privacy standards would apply to their products and devices. They expressed a desire to comply but felt overwhelmed about how to do so. The confusion over which standards apply for businesses operating across jurisdictions was seen as a barrier to even engaging:
I know that lots of toys are designed all around the world, and whose governance sits around that and how that applies to me in my home in Melbourne [Australia] is still really unclear for me. So, it tends to be that I tap out all together at this stage.
Other participants wanted further clarity about the legal frameworks around data use, such as whether they were required to put in place expiration dates and sunset clauses for the destruction of data. This is unsurprising, given that an investment in understanding this complex web of information is crucial for sextech businesses, especially as they are dealing in sexual data that are often afforded a higher level of privacy protection where they are considered ‘sensitive information’ relating to health, sexual orientation, sex life or sexual practices.
Participants were cognisant of the risks of collecting sensitive data in their upcoming business ventures and the potential for privacy breaches. One participant reflected that sextech is not unique in this space, but rather provides a useful case study for broader issues around data governance that many businesses are grappling with:
I think current data governance practices are problematic in general, not specific to sextech. Companies (especially start-ups) are pretty fast and loose with data today. However, while data management behaviour might be similarly poor across most companies, the consequences of mismanaging/leaking user data are potentially much more severe for sextech companies.
These consequences are particularly salient given that sextech is largely targeted at women and sexual minorities, groups who ‘have long been highly visible to technologies of surveillance’ (Shephard, 2020: 5). This picture is further complicated by the problematic ways that queer data are collected and used (Guyan, 2022). The Entrepreneurs Handbook presents ‘lack of trust’ as a key issue affecting the sextech industry. In response, the Handbook suggests engendering respectability from users through marketing: ‘it’s about creating a credible, professional, and authentic image’ (Bosch, 2021). To this end, some concerns about data misuse among our participants were driven by reputational concerns. One participant pointed out that sensitive data breaches impacted not only users but contributed to compounding stigma that affects the sextech industry more generally:
I worry that bad data practice will reflect badly on the industry – there is already so much stigma and misassumptions about the use of sex and tech and worry that one bad example of data misuse could impact on how all sextech companies are viewed by the public.
As upcoming sextech entrepreneurs, our participants deliberated over how best to balance user privacy and data collection. For those looking to make use of data, collection practices were of utmost concern. As one participant reflected, ‘I do want to be reassured that the data available is accurate and gained ethically and responsibly’. Others were considering how to collect just enough data to allow them to enhance and evaluate their service while still maintaining user trust. This was often a work-in-progress, and participants had more questions than answers. The participant who wanted to establish their own e-commerce pleasure store reflected:
Presently I am interested in understanding how to best create safe space and how data privacy plays a role in that. What data is enough? Where and how is it stored? What is the bare minimum information/data is required to create a safe experience and also continue to enhance service and benefit customer/client long term?
The participant who was creating their own prototype for online anonymity was particularly concerned with data storage. They proposed an innovative platform using synthetic or manipulated media content whereby adult performers would be able to digitally create new faces (and new virtual identities) to protect their own anonymity:
Data governance would be a huge issue for me – if users are using my platform to anonymise themselves, it would be imperative that there are no data leaks.
The same participant reflected that platforms sometimes have an ad hoc approach to data storage whereby privacy issues are an afterthought rather than a design decision at the point of development:
I briefly had a play with the Tinder API years ago for curiosity’s sake (I’m a web dev/data scientist). I found something pretty interesting/poor, which was they didn’t properly fuzz ages. So they were effectively returning all users’ exact date of birth. Not malicious on their part, just incompetent.
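The ‘fuzzing’ the participant describes – returning a coarsened value rather than the raw date of birth – can be sketched in a few lines. This is a minimal, hypothetical illustration in Python; the function and field names are ours, not those of any actual dating app API:

```python
from datetime import date

def fuzzed_age(date_of_birth: date, today: date) -> int:
    """Return only the age in whole years, never the raw date of birth."""
    years = today.year - date_of_birth.year
    # Subtract one if this year's birthday has not yet occurred.
    if (today.month, today.day) < (date_of_birth.month, date_of_birth.day):
        years -= 1
    return years

# A privacy-preserving response exposes only the coarse value.
profile = {"display_name": "example_user",
           "age": fuzzed_age(date(1990, 6, 15), date(2024, 3, 1))}
```

The design point is that the coarsening happens server-side, before the response is serialised: a client that never receives the exact date of birth cannot leak it.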
Participants deliberated over how data could be best managed to ensure data integrity and security, to prevent it from being leaked or hacked, and what the best approaches are to encryption, de-identification and anonymisation, in a context where data can be triangulated. They also shared accounts of where their own professional data had not been handled well:
So, essentially, I had all this email history and it’s a huge amount and I had a [personal assistant] corrupt my data and also hijack two of my key Gmail accounts and I had about two dozen Gmail accounts . . . I’m terrified that I will look incompetent that I didn’t manage all these people’s trust.
Participants reflected that businesses often start collecting data organically or ad hoc before they develop systematic approaches to dealing with it. One participant described this as ‘just capturing everything for the sake of it and then working out what they can do with it later’. It was generally believed that data governance and ethics was rarely built into the design phase of sextech businesses. Participants suggested that where subsequent data capture is unintentional or inadvertent, and sextech companies haven’t been thoughtful about the extent of their collection or its potential uses, there is a tendency towards over-capture – a practice one described as ‘greedy’:
I think companies unknowingly capture more than they need to and individuals divulge more than they are prepared to. I also believe that companies once they have data and work out what they want to do with it should disclose and be transparent to their users.
While less extractive models of data collection are being developed, trialled and successfully utilised around the world, these were not raised by participants. Data donation models, data co-operatives, data trusts and data stewards, who owe a fiduciary obligation to use data in the best interests of users, offer more collective benefits than protections based on individual privacy rights (Burgess et al., 2022). However, some participants expressed doubts regarding the likelihood of large sextech companies building robust consent or ethical data practices into their business models. As one remarked, ‘I’m filled with a bit of sadness because . . . in the age of data monetisation . . . one of their major income streams is selling data to third parties’. Another reflected, ‘I think collecting data ethically when driven solely by commercial intent is unlikely’.
Participants also grappled with the issue of convenience versus privacy in sextech design. One said, ‘we often use a social sign-in because it’s so much easier and then you don’t ever have any other engagement around how your data’s being used. It’s like click and forget’. Another reflected,
[W]ith dating apps, how you sign-in using Facebook, and it never occurred to me that data on how I’m using it could be sold to advertisers and then they’ve got my Facebook profile. I just thought, well, this is easiest, fastest way for us to sign-in. I didn’t really think through how that all works.
Some participants reflected that this was the first time they had engaged with issues of data governance from the position of a prospective business owner.
‘You can pay me for some of my cake’: balancing privacy, convenience and consent
Where they were not familiar with best practice data governance, participants turned to their own experiences as social media users as a reference point, especially regarding consent. Participants critiqued the simple tick box ‘Click: I agree’ manner in which privacy policies and terms of service were presented to users. They described this process as disingenuous because users are expected to consent to all terms without any genuine options to qualify consent or opt out of specific uses. Some were unsure about their rights as users: ‘I just want to know, if I don’t agree, will the product still work? It’s never clear so I always assume I have to consent to everything’.
Some data protection legislation sets out specific instructions for how consent is to be collected in order to be valid – for example, through positive actions (e.g. ticking a box), through choice of options (yes/no, agree/disagree, allow/block), avoiding bundling consent into the Terms and Conditions and avoiding the use of nudges (whereby the user has to agree as a condition of accessing the service) (Mehrnezhad and Almeida, 2021: n.p.). Despite this, many sextech and femtech businesses position themselves as ‘wellness’ (rather than health) businesses outside the purview of such legislation, and subsequently ‘privacy and data security practices are left to the discretion of the company’ (Rosas, 2019: 325). For example, in their study on fertility apps, Mehrnezhad and Almeida (2021) found that 40% of apps studied did not present any privacy-related content to users, making it difficult for them to obtain informed consent.
Participants drew upon discourses of sexual consent and power dynamics in order to understand consent in relation to personal data. One described how sextech businesses could learn from offline discourse on sexual consent, ‘trying to take the best practices from real-life consent sex negotiation into how we negotiate with our data’. Participants described consent as something that ought to be explicit, and payment as a means to recognise the value of that data rather than simply expecting users will give up their sexual data for free:
So, I’m of the view that if you concede any consent, you are basically undermining all of your power . . . You must ask loudly and claim it back . . . It’s ‘I consent. I say yes, I’m informed like this’ . . . I feel like we should hog it like a cake, and this is my cake, and you can’t have any of my cake, no way. But you can pay me for some of my cake.
There was general consensus among our participants that consent around data collection ought to be dynamic – as people’s situations change, consent ought to be able to be withdrawn, and companies ought to check in with users to see if current, future or new uses of data are still acceptable. As one described it, ‘I don’t want to be served with sex toy adverts on my Instagram every single time I log on anymore, I’ve changed my mind’. Another participant, who worked as an escort, described consent as a dynamic process. As part of her sextech project she wanted to renegotiate consent with her clients to use their data for new purposes, but was concerned about the sensitivity of re-contacting past clients:
I’m now collating all of it and . . . I want to re-establish consent and I’ve really lost a lot of sleep over contacting these people through email . . . Because I’m very concerned that, especially when it comes to commercial sex, that when people’s relationship status change, that they can be a bit more lackadaisical about access to their emails.
Given that businesses are incentivised by profit, some participants called for individuals to be compensated for the use of their personal data, one calling for ‘kick-backs’ where companies are ‘profiting off data . . . mining from my usage of products’:
If I consent to a company using my data for profitable activity or research and development, I would like to see how and why that benefits me. How are they using those profits? Lining CEO’s pockets, investing back into production, donating to social causes? If I can see this, I then should be able to opt in or opt out of that based on my thoughts around their decisions.
Speaking from their positions as tech users, participants were concerned with their rights to access and delete their own data. While various jurisdictions now impose requirements on organisations to release personal data (with deadlines to hand over any user data they are processing), participants noted that information about how to retrieve one’s own data was not always readily available: ‘I think that companies and business hold the power at this point and a shift is required to balance the playing field’. In these moments, the participants positioned themselves as data subjects rather than as prospective data brokers:
I would like to be able to easily request and see what data a company actually holds on me . . . I would also like to know how to easily request that my data is deleted.
These calls for compensation and data rights not only recognise the significant disparity in the value of data between user and company but provide an opportunity for users to control and monetise their own data. We note, of course, that such an approach is still an individualist one, which, as Gilman (2021: 100) writes, ‘envisions an atomized person pursuing their own self-interest in a competitive marketplace’. Certainly, an individual start-up founder’s (or researcher’s) commitment to ethical practice does not change the overarching economy and infrastructure through which data (and knowledge) is extracted, shared and owned.
Sextech provides an apt case study for discussions about data feminism, which is fundamentally concerned with power: ‘about who has it and who doesn’t, and about how those differentials of power can be challenged and changed using data’ (D’Ignazio and Klein, 2020: 19). Participants defaulted to popular narratives around consent that relied on consumer choice rather than the cultivation of more democratic data cultures or business models. They proposed various mechanisms for how automation might be used to facilitate more robust consent practices and help users better understand privacy policies. In doing so they reflected a techno-utopian approach that positions automation as an efficient means to achieve ‘ethical AI’, which is in turn seen as something easy to implement and scale.
A recurring theme that emerged in our discussion was how to support informed consent, given the tension between offering users in-depth information about privacy while also offering information that was accessible, readable and easy to make decisions about. Our participants suggested that automation could be useful in offering summaries of privacy policies for users – for example, using natural language processing (NLP) to flag and identify common problematic clauses in contracts or privacy policies and alert users:
From an AI perspective . . . I suppose you could do something like use NLP to best summarise terms of use/privacy policies etc., but I’d be wary of anything AI that’s overly complicated.
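The kind of clause-flagging the participants imagined can be gestured at with a deliberately simple sketch. The patterns and category labels below are hypothetical examples of our own; a production system would more plausibly use a trained classifier or a large language model rather than keyword matching:

```python
import re

# Hypothetical patterns for clauses that commonly concern users.
RISK_PATTERNS = {
    "third-party sharing": r"\b(share|disclose|sell)\b.{0,80}\bthird.part(y|ies)\b",
    "indefinite retention": r"\bretain\b.{0,80}\b(indefinitely|as long as)\b",
    "changes without notice": r"\bchange\b.{0,80}\bwithout (prior )?notice\b",
}

def flag_clauses(policy_text: str) -> list[str]:
    """Return the names of risk categories whose patterns match the policy."""
    matched = []
    for label, pattern in RISK_PATTERNS.items():
        if re.search(pattern, policy_text, flags=re.IGNORECASE | re.DOTALL):
            matched.append(label)
    return matched
```

Even this toy version illustrates the trade-off the participants raised: a summary tool makes policies more legible, but its flags are only as good as the categories its builders chose to encode.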
Automation could similarly be used to build tailored contracts that respond to individual’s preferences and needs. The same participant suggested that blockchain could provide an interesting potential means to ‘manage’ consent:
I am a data scientist and I do a fair bit of AI, but what jumped out to me, was the potential for maybe blockchain being used for consent management . . . You have a ledger for every piece of consent you’ve given and hopefully that matches up . . .
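The ‘ledger for every piece of consent’ this participant envisions does not require a full blockchain; its core property – an append-only record where tampering with past entries is detectable – can be illustrated with hash chaining. This is a minimal sketch under our own assumptions (class and field names are hypothetical), not a description of any existing consent-management product:

```python
import hashlib
import json
from datetime import datetime, timezone

class ConsentLedger:
    """Append-only log of consent events; each entry includes a hash of the
    previous one, so retroactive edits to earlier entries are detectable."""

    def __init__(self):
        self.entries = []

    def record(self, user_id: str, purpose: str, granted: bool) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "user": user_id,
            "purpose": purpose,
            "granted": granted,
            "time": datetime.now(timezone.utc).isoformat(),
            "prev": prev_hash,
        }
        # Hash the canonical JSON form of the entry body.
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        """Check that every entry's hash chain still 'matches up'."""
        prev = "0" * 64
        for entry in self.entries:
            if entry["prev"] != prev:
                return False
            unhashed = {k: v for k, v in entry.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(unhashed, sort_keys=True).encode()
            ).hexdigest()
            if entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True
```

Recording a grant and a later withdrawal as separate entries also captures the dynamic, revocable consent that participants called for: the withdrawal does not overwrite the original grant but sits alongside it in the record.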
Some participants suggested that automation could also be useful in facilitating dynamic consent. One suggested that automation could be used to help users navigate opt-in or consent to specific uses of their data, in addition to prompting users to reconsider or renegotiate consent over time:
I like the idea, if your data’s being used for a new thing that that company is excited about, [they say] this is what’s happening, this is why it serves you and our community, here’s why we’re using it . . . And I think that would be appreciated.
Consent was viewed as an interaction, discussion and an iterative process rather than simply a transaction:
I wonder if, with the way at the moment, so much of this process is a really passive act, you just click a button and then it’s done. I wonder if there’s an ability, in particular as AI gets more sophisticated for that to be a discussion.
Overall, participants viewed consent through what might be termed a feminist lens, informed in part by their experiences as sexuality educators or sex workers, or by personal and political beliefs. Some responded to interview prompts around automation of sextech by drawing on technical knowledge and experiences (i.e. knowledge of programming). However, the majority drew on more everyday user experience of mundane technologies and platforms (i.e. Gmail and Facebook) to speculate on how automation decision-making should (or should not) be deployed in sextech.
Conclusion: opportunities and barriers for feminist data ethics
As an industry, sextech itself is both unexceptional and unique. It is unexceptional in the sense that it faces similar issues to other tech industries: accountability demands are being issued by aggrieved users, investigative media, ethical hackers, privacy organisations, researchers and regulatory bodies. These stakeholders are raising parallel questions about the value of data, its brokerage and monetisation, and who profits and benefits from technological entrepreneurship. Sextech, however, is unique in the sense that it invites individuals to enter the industry through unconventional pathways. While it encourages people from minority populations to participate, the industry simultaneously positions data from those communities as an untapped mine or ‘next blue ocean’ for investment, exploitation and business potential. Sexual stigma marks the industry, posing barriers to accessing finance, hosting, advertising and markets. However, stigma also forms part of sextech’s appeal, forming a central part of the industry’s narrative and acting as a platform for media, branding and recruitment.
Our study suggests that there is an appetite and interest among upcoming sextech professionals for greater literacy, awareness and skills in ethical data governance. We found that participants at the Sextech School drew from their experiences and frustrations as users (who expected their own data to be protected), reflected on their own sense of responsibility in handling sensitive data (aspiring to do it ethically), expressed desire for more information on data protection (how to comply with regulatory requirements) and sought improved accountability in industry standards and practices (looking for guidance on alternative approaches). It is unclear yet how the backgrounds of this particular group will influence their futures as sextech founders, especially as they comprised a cohort who were being schooled in marketing, branding and fundraising, but not necessarily in data governance. As such, we conclude that prospective sextech entrepreneurs would benefit from and be receptive to specific training in design justice, public interest technologies and feminist data ethics.
Techno-solutionist approaches position scale as the goal of start-ups and automation as the method. This was consistent among the prospective sextech entrepreneurs we spoke to, who framed data accountability through a lens of individual rights, privacy and consent. Accordingly, we found that because start-up cultures involve a bootstrap narrative and empire-building fantasy, they do not necessarily recognise the kinds of resources, social mobility or media power of start-up entrepreneurs, nor encourage them to devolve their own political power, redistribute their wealth or prioritise collective ownership. As a result, when generated in a start-up model, sextech may be limited in its capacity to practise feminist data ethics (and by corollary, in how it responds to systems of oppression or contributes to sexual justice). By contrast, conversations about data sovereignty and collective justice were more salient in the second phase of this project – the public interest sextech hackathon where a panel of community activists issued political provocations for data futures. Instead of emphasising commercialisation and scalability (as start-ups do), those participants engaged in a more distinct focus on local, co-operative approaches and political outcomes, including opportunities to use sextech for counter-surveillance (Stardust et al., 2022).
While industry-research sextech partnerships do exist, many of the current collaborations are focused upon collecting and analysing data for therapeutic or sexological insights rather than examining the complexities of data justice. Given the increasing public scrutiny around data sharing in femtech and sextech industries, and the high stakes of intimate data, our findings suggest that there are productive possibilities for future collaborations between sextech industry, feminist researchers and affected communities. If feminist data ethics is concerned with examining how data flows, ‘decentralising data power’, and ensuring ‘control of the Internet architecture is wrested back from the privatized platforms’ (Gurumurthy and Chami, 2022: 10), then individualist approaches that are solely concerned with consent, privacy, notice and transparency will not be sufficient towards this end. Our research demonstrates that the existence of non-linear pathways to sextech and the liminal space that prospective sextech entrepreneurs occupy (transitioning from data subjects to data brokers) provides a ripe opportunity for re-thinking industry practice, including how sextech could function more equitably and collectively than comparable tech start-ups. The challenge for the upcoming generation of sextech workers will be to develop alternative governance mechanisms and business structures that operate not simply as corporate ethics-washing but in ways that devolve power, decentralise data, uplift marginalised communities, and collaboratively produce and equitably distribute knowledge.
Disclosure
We confirm that all authors have agreed to the submission and the article is not currently being considered for publication by any other print or electronic journal.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship and/or publication of this article: This research was conducted by the ARC Centre of Excellence for Automated Decision-Making and Society and funded fully by the Australian Government through the Australian Research Council (grant number CE200100005).
