Abstract
Over the past year, many young creators who use the Chinese-owned social networking platform TikTok have claimed that its underlying algorithm surveils and suppresses the reach of content by Black, brown, fat, queer, and disabled creators. Despite these algorithmic biases, marginalized creators have continued to find new and ingenious ways to not only create but also successfully share anti-racist, anti-misogynistic, LGBTQIA+-supportive, and body-positive content on the platform. Engaging this tension, this essay uses visual content analysis and critical technocultural discourse analysis to examine the innovative ways marginalized creators employ TikTok's various medium and technological affordances to evade algorithmic surveillance and oppression. Building on Simone Browne's concept of dark sousveillance, I theorize these practices as acts of digital dark sousveillance, defined within the essay as the use of digital tools to enact surveillance subversion, obfuscation, and inversion while operating within systems of racializing surveillance.