Abstract
AI-powered “undressing” apps generate sexually explicit deepfakes from authentic images without consent, representing a profound escalation in image-based sexual abuse. This study analyzes 29 such apps, examining their technical affordances, marketing strategies, revenue models, and privacy policies. Findings reveal that these platforms facilitate and encourage the creation of nonconsensual intimate images (NCII), normalize women's objectification, and contribute to a culture in which women's privacy and autonomy are undermined. By framing deepfake abuse as a form of gender-based violence, this research underscores the urgent need for regulatory interventions to mitigate NCII-related harms and protect victims from exploitation in digital spaces.
