Abstract
AI assistants such as Siri and Alexa have historically been coded as female; even ELIZA, the first chatbot and an early AI therapist, was presented as a woman. Notably, male- and female-sounding voice assistants respond differently to the same prompts, in line with gendered social expectations and stereotypes. This raises ethical concerns, as care work has historically been associated with women and has been a basis of oppression in the capitalism–patriarchy nexus. Simultaneously, the rise of artificial intelligence (AI) sex robots, chatbots and AI-generated pornographic content is driving the increased commodification of sex, higher rates of addiction and the further reproduction of notions of non-consensual sexual activity.
In this article, we conduct an impact-oriented analysis of how AI applications affect cisgender women (trans women face distinct challenges from AI, such as inaccurate facial recognition) through the processes of othering and objectification. We demonstrate the negative impacts of AI technologies on women by examining their condition in the domains of sex and care work. We then place these trends in the larger context of ethics and political polarisation between the sexes, as seen in the recent gender ideology gap across nations. Lastly, we aim to predict the future of AI-regulated gender relations and the possibility of AI neutrality by undertaking a socio-technical approach to AI policy.
