Abstract
Through a survey amongst 100 users who post pole dancing content, this study evaluates the efficacy of Instagram's Account Status tools for the transparency and appealability of shadowbanning. Instagram's 2023 Account Status update appeared, on paper, to be a revolution for shadowbanned users: finally, after having had to guess since 2019 whether they were shadowbanned, users could check if their content and profile violated Instagram's newly published Recommendation Guidelines, making them ineligible for recommendations on the Explore page and to non-followers. However, this paper's findings show that pole dancers – a demographic greatly affected by algorithmic precarity in the creator economy, and crucial in providing some of the first examples of shadowbanning – found Account Status tools to be ineffective and discriminatory. Users surveyed found that the update merely informs them that violations have been detected, falling short of educating them about improving posting behaviour and of providing significant redress mechanisms. As such, this paper finds Account Status to be an exercise in performative transparency: a corporate box-ticking exercise whose disclosures around governance decisions serve only to dodge accusations of opacity without meaningfully communicating about governance.
Introduction
Through a survey amongst 100 users who post pole dancing content, this study evaluates the efficacy of Instagram's Account Status tools for the transparency and appealability of shadowbanning. A light censorship technique whereby Instagram hides content and accounts from, or avoids recommending them to, its Explore page, limiting the visibility of posts and profiles (Are, 2021a; Cotter, 2021), shadowbanning is one of platforms’ most controversial content moderation techniques: users are often not notified of its deployment, leaving them to wonder whether their lack of visibility is due to their content's quality or to algorithmic demotion (Are, 2021a; Cotter, 2021). This form of content moderation is becoming far-reaching, largely affecting users who post nudity and/or sex-related content, but increasingly also political content, particularly within the realms of activism related to the pro-Palestinian cause (Abokhodair et al., 2024). And although research has found that many users who believe they are shadowbanned are cisgender Republican men (Nicholas, 2022), the reality seems to be that women and marginalised communities are the main targets of this mysteriously applied content moderation action (Haimson et al., 2021).
Instagram's Account Status appeared, on paper, to be revolutionary for transparency over, and user understanding of, platform governance. Finally, after having had to guess since 2019 whether they were shadowbanned, users could check if they had violated Instagram's newly published Recommendation Guidelines, making them ineligible for recommendations to non-followers (Instagram, n.d.) – or if, as Instagram often suggested, they were blaming the platform for their own boring posts (Cotter, 2021). Meta, Instagram's parent company, implemented these notifications in 2023 via Account Status, a section in the app's settings, supposedly in response to demands for further clarity and transparency about content moderation (Gerken, 2022) and to mounting user conjecture and conspiracy theories (Nicholas, 2022; Savolainen, 2022).
This paper focuses on users who post pole dancing content – as a hobby or professionally as instructors, performers, strippers, photographers, brands, etc. – because, as a section of the creator economy straddling fitness, performance art and sex work, pole dancing has been crucial to the identification and understanding of the platform's approach to the algorithmic prioritisation of content, providing the first proven case of Instagram admitting to tampering with algorithmic recommendations (Leybold and Nadegger, 2023).
It was following digital protests, and direct apologies from Instagram to pole dancers, that users gained more knowledge about shadowbans (Leybold and Nadegger, 2023). Shadowbanning became an example of how the Meta-owned app's largely algorithmic moderation and its Community Guidelines inhibit user expression and largely operate without context, shrinking digital spaces for sensuality and sexuality (Are and Paasonen, 2021; Blunt et al., 2020) amidst pressure from conservative interest groups and attempts to protect the company's public image with governments and advertisers (Griffin, 2023; Nolan-Brown, 2022). Given pole dancing's initial relevance to the understanding of shadowbanning, it is the ideal case study with which to evaluate Instagram's Account Status tool. This paper approaches this evaluation with qualitative survey data, analysed via thematic analysis and centred on pole dancing users’ experience. In doing so, it contributes to platform governance literature by shining a light on a previously under-researched aspect of platforms’ reactions to public perceptions of their governance through an emblematic community, who provide a starting point to examine overlapping content such as education, performance, art, news and activism (e.g. see Haimson et al., 2021).
This paper engages in this evaluation from the position of academic activism and an approach rooted in sex work studies, meaning it attempts to be enabling to those with less power (Macioti and Garofalo Geymonat, 2016). Since pole dancing is an art and a sport originating from the sex industry (Are, 2021b), and since pole dancers may be adjacent to and/or involved in sex work, researching as an academic activist means following academic rigour, but producing knowledge in service of sex workers and other sex work-adjacent vulnerable users (Connelly and Sanders, 2020) – in this case by highlighting power imbalances between platforms and users and zeroing in on platform governance practices to elicit their change and improvement.
Shadowbanning on Instagram: from deliberate opacity to Account Status
Transparency is crucial for democratic systems, as it allows citizens to ‘make more informed decisions about how to govern themselves’ when deciding which services to use or products to purchase (Kosack and Fung, 2014: 83). In platform governance, or the processes of moderation exercised by platforms and state regulation of platform companies (Gorwa, 2019), transparency ‘can unlock access to much-needed information to develop evidence-based and informed public policies’ incentivising companies to become more ‘visible’ to the public (De Castris, 2024: 13). Yet, platforms are opaque about the running of their companies, their workers’ working conditions and the making, development and governance of algorithms (Mirghaderi et al., 2023).
Shadowbanning – a user-generated term that Instagram avoid using, but that has become shorthand for tampering with a post or a profile's visibility since its first use on the Something Awful forum in the 2000s (Are, 2021a) – has always been shrouded in vagueness and surrounded by conspiracy theories (Savolainen, 2022), often fed by users’ responses to platform opacity itself. Dubbed a policy targeting ‘vaguely inappropriate content’ by TechCrunch (Constine, 2019), Instagram's shadowban has often relied on haphazard, convoluted or confused public-facing policy descriptions.
This practice illustrates the information and power imbalances between platform companies and wider society. Enacted by algorithms, or codified step-by-step processes that solve problems and perform tasks (Bishop, 2019), content moderation is essentially an example of governance by technology and software – a governance that is largely private, aimed at profit maximisation and the avoidance of scrutiny to pursue said profit maximisation (Just and Latzer, 2017). User and advertiser buy-in continues to be central to the accruing of profit, meaning that platforms will attempt to dodge accusations of wrongdoing and have been known to create governance rules that are akin to advertising standards – for example, leaks of internal memos found that Instagram drafted nudity-related Community Guidelines based on lingerie brand Victoria's Secret advertising guidelines (Salty, 2019). Therefore, concerned about appearing biased or unaccountable, social media companies tend to strategically deny or remain circumspect about shadowbanning at the expense of users (Gillespie, 2022), whose experiences are denied or minimised while cases of mismoderation continue to go unacknowledged (Divon et al., 2025). And while visibility reduction may allow platforms to further investigate potentially problematic content without outright removal, it also serves to avoid public accountability, shifting all responsibility of governance onto users (Gillespie, 2022). As a result, particularly those who work through platforms such as content creators and other gig workers find themselves performing ‘extra work’, or ‘encompassing additional cognitive, social, and emotional work that is intertwined with regular tasks’ to avoid algorithmic detection (Bucher et al., 2021: 45).
Shadowbans ‘reflect networks of oppression and marginalisation that circulate in offline cultural discourses’ (Rauchberg, 2022: 197). This became clear in Instagram's obvious change in moderation after the 2018 approval of FOSTA/SESTA, one of the main drivers of the online censorship of nudity central to this study's topic (Blunt et al., 2020; Bronstein et al., 2021). The Allow States and Victims to Fight Online Trafficking Act of 2017 (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA), exceptions to Section 230 of the United States Telecommunications Act, made internet companies liable for facilitating sex trafficking (a crime) and sex work (a job), triggering snowballing censorship of bodies and of sexual and LGBTQIA+ expression as platforms applied machine learning to swathes of content to moderate at scale, attempting to shield themselves from civil liability under the joint bill (Bronstein, 2021; Haimson et al., 2021). Crucially, FOSTA/SESTA saw platforms implement a priority – the removal of content that potentially violated this exception to Section 230 – and extend it worldwide, applying US policies globally irrespective of jurisdiction (Are, 2021a), in line with whistle-blower testimonies arguing that platforms tend to enforce the laws of the countries where they are headquartered more strongly (Wynn-Williams, 2025). Embedded with internal priorities, ‘brand safe’ governance – or the protection of a company's public image with advertisers and profit in mind – and responses to legislation (Salty, 2019; Griffin, 2023), platform governance is therefore not always clear and transparent to users, owing to platforms’ attempts to appease various stakeholders without attracting too much scrutiny (Divon et al., 2025).
The priority created by FOSTA/SESTA, at the time one of the few laws threatening platforms’ freedom from intermediary liability, makes communities that are adjacent to the content it targets a valuable case study. Following a direct apology to pole dancers and communications with the platform, it became clear that Instagram's shadowban does disproportionately target nude, sexual and LGBTQIA+ content and profiles, as well as whole hashtag groups (Are, 2019; Leybold and Nadegger, 2023).
Given this, knowing one is shadowbanned and being able to appeal said shadowban has the potential to mitigate the distress, frustration and confusion that follow diminished visibility (and potential income loss). Situated in Account Status, appeals for shadowbanning are not dissimilar to the appeals for violations of Community Guidelines that trigger full content removal. In creating a specific profile section to monitor and appeal decisions made over user content, Instagram created a potentially unique affordance, or opportunity a technology can offer for action (Norman, 1999). Indeed, while at present on platforms like TikTok the user needs to monitor each individual post's analytics to check whether it can be recommended to others, creating several administrative hoops to jump through, having one single space in which to contest moderation decisions has the potential to reduce user admin and improve transparency about one's content's performance.
However, while repeated violations of Community Guidelines on Instagram may result in harsher punishment such as de-platforming, violating Recommendation Guidelines results in perennial invisibility.
Recommendation Guidelines govern what content Instagram recommend to people, helping them ‘avoid making recommendations that could be low-quality, objectionable or sensitive’, or ‘inappropriate for younger viewers’ (Instagram, n.d.). These are stricter than Community Guidelines because they show users content they have not chosen to follow (Instagram, n.d.).
‘[C]ontent that may be sexually explicit or suggestive, such as pictures of people in see-through clothing’ is non-recommendable according to the platform (Instagram, n.d.). The may here is crucial, almost as if Instagram were trying to write ‘It's up to us to decide’ into binding policy that regulates content worldwide. This way, the platform turns value judgements into policy by labelling a loosely defined stream of content as a risk that should not be adjacent to its brand or to the brands that want to advertise or be present on its app. Thus, Instagram disguise brand safety as online safety (Griffin, 2023), aligning with notions of risk rooted in US financial services and in societal stigma against sex rather than in harm reduction (Beebe, 2022). Indeed, as Gillett et al. (2022: 2) write, safety is not merely ‘the absence of harm, danger, risk, or injury’ – it is also ‘the cultivation of social and cultural inclusion, access, and justice in addition to the presence of transparency and processes of accountability’.
The version of online safety constructed through Instagram's Recommendation Guidelines is therefore not based on cultivating transparent, accountable and inclusive digital spaces but on identifying and labelling supposed bad actors and enacting censorship and surveillance through technosolutionist approaches that allow for little nuance (Gillett et al., 2022). A blend of responses to media panics, call-outs about lack of transparency and attempts at brand safety means that the guidelines reflect the ‘vaguely inappropriate’ label Constine (2019) gave to shadowbanning: they show a general dislike for content and users that are not undesirable enough to be removed, but that Instagram nonetheless would rather distance itself from.
Accounts that post non-recommendable content become non-recommendable themselves, making the term a re-brand for ‘shadowbanned’ (Gerken, 2022). Non-recommendable accounts may have recently violated Community Guidelines, shared content Instagram do not recommend, posted vaccine-related or general misinformation, engaged in purchasing likes, been banned from running ads on Meta-owned platforms, been associated with organisations Instagram deem to be linked to offline violence, or may ‘discuss or depict suicide and self-harm in the account name, username, profile photo or bio (with the exception of accounts focused on providing support, raising awareness and recovery)’ – and, of course, post nudity (Instagram, n.d.).
Debates over the visibility of nudity, pornography, bodies and sex work are of course nothing new: following high-profile free speech trials in twentieth-century USA, where the influence of evangelical groups has often attempted to remove any mention of sex from communications (Connelly and Sanders, 2020; Nolan-Brown, 2022), they have long been a battleground where freedom of speech and safety are the opposing factions. But while transparency and accountability are staples of the criminal justice system, Instagram's Recommendation Guidelines are enforced by a private company and exhibit a governance approach influenced by the typically North American, puritan values of the Silicon Valley where they are produced (Paasonen et al., 2019; Roberts, 2019). They also betray the neoliberal business models blending safety governance with advertising strategies that are at the centre of platforms’ appeal to the masses (Stardust, 2024). These parameters are inevitably written into Instagram's algorithms, which reflect this mentality.
Similarly to the over-enforcement of the de-platforming of nude and sexual content (Are and Briggs, 2023), Instagram often avoid recommending content ‘in error’, as was the case with the incorrect demotion of pole dancing hashtags that triggered the 2019 apology to pole dancers (Are, 2019). For Register et al. (2024: 23: 5), such ‘errors’ are also the result of ‘normative cultural values reproduced by platforms’, meaning the conflation of sexual content with harm could be the result of conservatively governed technologies working correctly. This argument seems reflected in Meta's effort to increasingly ditch human moderation in favour of an AI-powered and often fact-checking-free governance (Allyn and Bond, 2025), despite previous research finding that the over-enforcement of censorship of content containing nudity and sexual expression is often caused by a snowballing of machine learning decisions onto more and more adjacent content (Are, 2021a; Are and Briggs, 2023).
It is therefore crucial to understand where Account Status sits in this new, potentially more transparent reality.
Research on performativity has so far focused on performative allyship by influencers, who may profess support for a marginalised group in times of crisis to show their brand aligns with popular causes, without sacrificing their social or economic capital to challenge the system they benefit from (Wellman, 2022). However, there is scope to examine this performativity in relation to platform policy changes, which for Marchal et al. (2024) often follow bad press. For instance, even though Meta changed its breast cupping and grabbing policies following a campaign by Black plus-size model Nyome Nicholas-Williams, this was later found to create ‘absurd’ new norms about breast squeezing, detailed in leaked internal memos, that could trigger further censorship (Gilbert, 2020).
Since increased transparency has not been found to always lead to more efficient governance (Kosack and Fung, 2014), and given that institutions and corporations alike have become efficient and selective in the information they present in transparency reports and notifications (Parsons, 2019), it is crucial to evaluate the effectiveness of Instagram's Account Status as a transparency and education tool for content governance, keeping the platform accountable to users and society. While Koja et al. (2025) have briefly touched upon Account Status in the experiences of the creators they interviewed, finding that users across different communities resorted to self-censorship in order to evade detection in Account Status, the tool warrants deeper investigation, given that no other studies at present seem to have examined it.
Study background and research questions
Originating in the sex industry and blending extreme sports with sensuality, pole dancing throws social media platforms’ governance processes and algorithms into disarray.
Pole dancing has cemented itself as part of the creator economy, a form of work powered by social media platforms where users strike brand partnerships or advertise their services, facilitating new kinds of cultural production by creating flexible, aspirational options for labour based on entrepreneurship (Are and Briggs, 2023). Many pole dancers are now self-employed or engage in side hustles, promoting polewear brands, fitness programmes, tutorials, performances and studios (Are, 2021a, 2021b), and are greatly affected by the invisibility brought by shadowbanning (Are, 2021a, 2021b; Cotter, 2021). Pole dancing content – not a sex act per se, but a sensual, diverse performance and sport originating from the sex industry, constituting both work and self-expression – is therefore the prime case study for understanding the effectiveness of Instagram's new transparency tools. Pole dancers have already been at the forefront of algorithmic experimentation, using what Delmonaco et al. (2024: 154: 22) have called ‘collaborative algorithm investigation’ or ‘collective theory building to attempt to define and prove shadowbanning’ as a form of resistance, spreading folk theories about this governance technique and relying on each other to test, confirm or deny such theories. These theories, which Bishop (2019) called ‘algorithmic gossip’ in the case of beauty vloggers, are a direct consequence of platforms conveying their rules, and the effects of their enforcement, poorly and opaquely (Caplan and Gillespie, 2020).
The idea for this study took shape while observing my own experiences of using Account Status appeals at the end of January 2023 as a pole dancing content creator on Instagram.
I am a White, bisexual, cisgender, thin woman who has experienced shadowbanning via the @bloggeronpole account, which has over 31,400 Instagram followers at the time of writing. I use Instagram to share performances and training posts and to sell classes or advertise products – not as my main source of income, but as a side income which has a sizeable impact on my finances. Instagram is also crucial to making my research more accessible to my audience, and it is partly thanks to the profile I have built that I have been part of several knowledge exchanges and meetings with Meta policy workers, although these connections have not seemed to improve my content's moderation.
I began seeing an orange triangle with an exclamation mark inside (now an orange circle, after a change in layout) in my Account Status at the end of January 2023, when initially three and then five of my most recent pole dance videos – always a blend of performances or choreographies wearing heels and skimpier clothing with fitness-style tricks, reflecting my love for our discipline's versatility – appeared in my Account Status as Recommendation Guidelines violations. I was not notified of this by Instagram, but by several connections who found their content was not getting any traction and went on an admin-intensive deep dive into Instagram's recently changed profile settings. When my reach tanked even more than usual, I followed in their footsteps and, behold: I was ‘non-recommendable’ (Figures 1 and 2).

Figure 1. Non-recommendable.

Figure 2. Recommendable in 2023.
Next to the videos, a sentence stated my content ‘may be’ in violation of Recommendation Guidelines for nudity and sexual activity, which has now changed to ‘This decision is based on our guidelines’.
At the time, I noted with confusion that upon appealing one video I could not appeal the others. I was subsequently told by my Meta contacts that once one post is appealed, the whole account is reviewed – allegedly to empower users to contest decisions and, crucially, to reduce their labour in appealing.
And a laborious process it is. Previous research has already argued that platform companies thrive off exploiting users’ unpaid labour, an exploitation which is hidden ‘behind the fun of connecting with and meeting other users’ (Fuchs and Sevignani, 2013: 288). Often, this unpaid labour presents a higher emotional toll on women and femme-presenting users, who must intimately engage with audiences towards visibility and earnings in ways that ultimately add advertising value to social media companies themselves by feeding preferences, tastes and concerns to the algorithm (Arcy, 2016). For sex workers and users whose content features nudity, this digital labour exacerbates offline inequalities for the sheer amount of attention users must pay to navigate platform governance compared to other accounts (Gorissen, 2024) and often requires ‘extra work’ (Bucher et al., 2021) towards self-protection and appealing removals (Stegeman et al., 2024; Are, 2024).
In my experience, after going ‘green’ again – that is, having all my violations successfully appealed, making my profile recommendable and my handle searchable in the Instagram search bar without typing my username in full – all it took was posting one more piece of content, often a video, for all the previously appealed posts to be flagged by the algorithm again. This landed me back in what my pole dance coach Josh Taylor dubbed ‘sexy jail’, now shorthand for what I define as the state of punishment for posting content that Instagram deems suggestive enough to be rendered invisible. Ironically, ‘sexy jail’ is a colloquial term amongst pole dancers and sex workers on Instagram, but it reflects the platform's carceral, punitive and often irreversible governance (Gillett et al., 2022), which is why it was deemed worthy of definition above (Figures 3 and 4).

Figure 3. Appeals.

Figure 4. A mix of content detected by Meta.
The result of this appeal process – ‘sexy jail’ – is in stark contrast with the intuitive assumptions I made as a user. Since appeals are a key democratic procedural mechanism determining fairness and accountability in situations of offline and online governance (Are, 2024), since Instagram deemed Account Status worthy of a whole space in its settings, and since they made a point of announcing it almost as a ‘cure’ for invisibility (Gerken, 2022), I had assumed and imagined it would work. In short, in Nagy and Neff's (2015: 1) words, I had imagined an affordance, building up ‘expectations for technology that are not fully realized in conscious, rational knowledge but are nonetheless concretized or materialized’ because of what that technology promised to do. But this imagination could not be further from the design and enforcement of Account Status in my experience, which felt either as though the designers had imagined this affordance without the user in mind, or as though Account Status was meant to keep me busy, distracted and appealing into a void.
I wanted to see if this disconnect between the affordance itself and what users imagine it could do extended to other pole dancers.
To this end, this paper aims to answer the following research questions:
How does Instagram's Account Status affect pole dancing content creators’ perceptions of transparency on the platform?
How effective are Instagram's Account Status appeals for pole dancing content creators?
How has the introduction of Account Status notifications for shadowbanning affected pole dancing content creators?
Methodology
Data for this study were collected via a qualitative survey, which I shared with my Instagram network after securing ethical approval from my institution. I chose this method for its ease of recruitment via social media and of completion, and for the anonymity and opportunity for self-expression it provided.
Characterised by open-ended questions centred on a particular topic and presented in a fixed order to all participants, open surveys produce rich accounts of participants’ experiences, allowing them to answer in their own words instead of adhering to language set by multiple-choice answers (Braun et al., 2021). They ask participants few but generative questions and are often appropriate for representing stigmatised or niche communities’ nuanced experiences (Braun et al., 2021), making them a suitable method for this study.
Questions asked participants about their experiences with appealing shadowbanning via Account Status, about how this affected their practice and about whether Account Status improved their experience on the platform (see Appendix).
Consistent with research finding that sex workers, users with a disability, People of Colour (POC) and plus-size users tend to be moderated more harshly by platforms (Blunt et al., 2020; Haimson et al., 2021; Rauchberg, 2022), the survey concluded with demographic questions about ethnic background, sexuality, gender identity, location and age, and asked participants whether they had a disability or identified as plus size.
I recruited participants via my Instagram stories and the link in my Instagram bio and by posting three Instagram reels in a 3-month data collection period, at the rate of one short-form video per month shared from May to July 2024. During recruitment, I purposefully used the term ‘pole dancing content creators’ to include pole hobbyists, instructors, performers, photographers, brands, studio owners, strippers and sex workers, as I was interested in the moderation of the content and also whether this differed according to the user's stated or perceived identity.
All reels were posted as a collaboration with Pole Junkie, the polewear brand I am a brand ambassador for, whose account has over 120,000 Instagram followers around the world. I asked my followers to help circulate the reels and the Qualtrics survey link, which I also posted in a series of pole dance groups on Facebook. Ironically, the sound on one of the recruitment reels was muted by Instagram in North America due to copyright restrictions, meaning the first recruitment push may have received fewer interactions from US-based users, who could not hear the video.
Analysis
The survey received 100 full responses. Most participants (69%) were between the ages of 25 and 34 years old, followed by 35–44-year-olds (20%) and a small number of 18–24-year-olds and 45–55-year-olds. Most of them were White (77%), 6% identified as Mixed, 6% as Asian, 5% as Black, with smaller numbers of Latine or Arab participants. Participants were largely based in Europe (61%), followed by North and Central America (28%) and Australia (4%), with the rest choosing the ‘Prefer not to say’ option regarding location. Most of them identified as heterosexual (43%), followed by 25% bisexual and 8% pansexual participants, with lesbian or queer participants below 5%. Moreover, 93% of participants were cisgender women, with 5% identifying as non-binary and 1% as a trans man. Only 7% of participants identified as plus size, and 9% shared that they had a disability. Most participants seemed to engage with pole dancing as hobbyists, content creators or instructors rather than as strippers or performers.
Given the aforementioned studies of the demographics most affected by censorship (Haimson et al., 2021), these findings – largely reflecting the experiences of White, heterosexual, cisgender women in the Global North – may be a snapshot of Pole Junkie's following and a result of the necessary partnership with them, since my own experiences of shadowbanning greatly limit my recruitment opportunities. Nonetheless, although this sample does not reflect the diversity of the pole dance industry, the responses below show that even White, heterosexual, cisgender women with little or no experience of sex work seem to face heightened censorship when pole dancing on Instagram.
Survey responses were analysed via thematic analysis (TA), a method which allows researchers to identify, analyse and report themes or patterns within data (Braun and Clarke, 2019). Thanks to its minimal data organisation, TA allows researchers to describe data in rich detail, staying true to participants’ experiences and providing insights into their realities, particularly when researching stigmatised and excluded communities.
Themes are the result of researchers’ theoretical assumptions, their analysis and the data, and they often reflect the research or data collection questions (Braun and Clarke, 2019). In line with this paper's feminist, academic activist and sex worker-centric approach, aimed at enabling those with less power and acting in service of users (Connelly and Sanders, 2020; Macioti and Garofalo Geymonat, 2016), the themes picked up in the analysis focused on perceptions of justice and on participants’ perceived impact of Account Status on their lives. They were selected through an initial scanning of the survey responses based on the survey questions. Data were coded inductively and hosted in a spreadsheet, with the most relevant quotes for each theme presented as answer excerpts.
The themes are as follows:
1. Performative transparency
2. Justice and usability challenges
3. Account Status notifications’ financial, creative and emotional impact on pole dancing content creators
Performative transparency
Most pole dancing content creators (65%) found that Account Status notifications of the alleged violation of Recommendation Guidelines leading to shadowbanning did not improve their experience on Instagram. Although 25% of users argued that notifications may have improved their experience, and the remaining 10% found the situation had definitely improved, an overall majority of pole dancers did not find Instagram's attempts at transparency helpful. Most users claimed to be happy to at least know they were shadowbanned, citing previous feelings of gaslighting associated with having to reverse engineer governance following Instagram's denials (Cotter, 2021; Divon et al., 2025). Still, they found Account Status tools to be largely unhelpful due to the vagueness of the reasons Instagram gave to explain Recommendation Guidelines violations.
‘The whole button feels like an appeasement strategy’, argued P6, in line with my experience of an unfriendly user interface, the feeling that the appeal may not have gone through and the reappearing of the same content as a violation upon posting something new. She added: ‘It feels like a way to say, “See? We tried but this is your fault and not ours”’.
Participants found it helpful to be able to check the reasons behind low reach, but Account Status was still ‘not enough to explain the reason why such action will be done against that specific published content’, argued P15, echoing various participants’ opinion that vagueness and arbitrary decisions were often connected to Meta's puritan world view and to its attempt to govern an ever-growing user population with the same rules (e.g. Paasonen et al., 2019).
Despite Account Status, several users still found Instagram's application of shadowbanning ‘a blur’ and ‘arbitrary’, with P10 comparing it to the infamous ‘I’ll know it when I see it’ comment by Supreme Court Justice Potter Stewart during the 1964 US obscenity trials. She added that she found Account Status ‘inadequate’ and ‘ridiculous’: ‘This is all up to the perspective and interpretation of an algorithm and perhaps a human who may have a vastly different life experience from you […]. Algorithms are written by humans, humans can be racist, misogynistic, sexist, ableist’. Echoing this view, P11 added that: ‘Knowing that they are arbitrarily removing content or punishing creators based on, let's face it, religion, is ridiculous!’.
Participants argued that far from improving their perception of Instagram, Account Status, its shadowbanning detection and appeals left them feeling gaslit instead. ‘Being in the dark was bad’, wrote P11, ‘but knowing that all of our hypotheses were correct and we were just being gaslit sucks’.
In the attempt to drag more information out of Meta and understand how to improve their visibility, several participants bought a subscription to Meta Verified, the paid-for Blue Check mark through which Meta promised better, quicker customer service and – although this was later retracted – more engagement (Iovine, 2023). This experience made them feel scammed, cementing their perception of Instagram's gaslighting, with P7 saying Meta Verified ‘was extremely unhelpful and the conversations went around in circles over and over again’ and another participant adding:

I have the Meta Verified blue tick and the help desk just repeat back that you’ve broken guidelines but do not clarify which ones or offer any help – it must be a robot response. (P2)
The above experiences mirror the gaslighting reported by Divon et al.'s (2025) participants when dealing with both Instagram's automated tools and its human workers, and hint at participants’ awareness of Meta's attempt to fully automate content moderation (Allyn and Bond, 2025).
To test this myself, I took up an Instagram prompt encouraging creators to get in touch with the Meta ‘Media Expert’ team to improve engagement, and had a call with them in May 2024 – an option offered to me because my account had been given a Blue Check mark for notability, rather than through paying for Meta Verified. Their expert complimented me on my content and on my posting frequency, but when I asked them about my non-recommendable status, they suggested refraining from posting pole dance content altogether – despite it being one of the key elements of my content and of my profile, and part of my username. The call ended with them agreeing that Meta's content moderation was not clear or fair, and without them being able to provide any advice to lessen the censorship I was facing.
Overall, participants felt that the ability to check one's Recommendation Guidelines violations within Account Status – and even direct interaction with Meta workers – did nothing beyond lip service: shadowbanning and its disproportionate application on content showing nudity and on pole dancers remained, while Account Status was perceived as a PR stunt. Some participants went as far as arguing that these attempts were nothing but a strategy to discourage them from posting altogether.
Participants’ belief that Meta showed contempt towards pole dancing content was not mitigated by Account Status – in fact, it only served to prove their point that they had previously been gaslit about the moderation they faced. This was exemplified by the many experiments they ran. P15, for instance, a professional communication manager with over a decade's experience managing different social media accounts, tracked their performance through Instagram analytics from 2017 and 2018 (pre-FOSTA/SESTA) to 2019 and beyond (post-FOSTA/SESTA) – similar to my own experiments (Are, 2021a) – finding severe reductions in engagement, with likes dropping by the tens of thousands.
The fact that Account Status notifications now confirm the beliefs and allegations users made through their own experiments after years of Instagram's denials added to the feeling that Meta's transparency was merely performative, as a means to placate users and avoid scrutiny.
Although it will be more extensively defined later, this performative transparency was perceived as a corporate exercise in self-serving disclosure in times of scrutiny, set to help Meta more than its users – especially because the Sisyphean appeal process performed by pole dancers led them to invest in (and essentially be scammed by) Meta Verified. These experiences generated feelings of injustice amongst participants, addressed in the next theme.
Justice and usability challenges
The injustice felt by pole dancing content creators was compounded by the ineffectiveness of the tools to appeal and reverse governance decisions. Indeed, when asked whether being able to appeal Recommendation Guidelines violations improved pole dancing creators’ experiences on Instagram, a striking 72% said no. While 22% said maybe, only 6% said yes. Participants’ stories paint a strikingly inefficient picture of the platform's appeal mechanism for shadowbanning, with P2 stating that: ‘Appealing takes weeks and then the same old videos get reflagged that have previously been flagged and approved’.
Several participants described similar games of whack-a-mole with Account Status:

IT DOESN’T WORK. Nothing happens when you appeal. NOTHING. The only thing that worked for me was archiving the content, and then activating the content again later when Instagram has passed me…. One day it banned a post – I archived it – and the next day it just found another random of my posts to ban. I was “hunted” for over a month where it kept banning random posts. (P1)

Every time a piece of content is flagged, it shows in the Account Status area but the appeal button is not available to click. It then becomes available about 5 days later but the appeal usually does not get resolved (no response at all), leaving me shadowbanned. There is also no record of appeal submission, and no way to check status or contact anyone. (P3)
Participants argued Instagram's introduction of Account Status did not sufficiently support or empower them to exit the shadows. P13, for instance, said having ‘to manually check in my Account Status to find out’ is not a ‘true notification’, as the onus remains on the user to check, keep track of violations and constantly appeal them – a process they viewed as an unjust burden.
Consistent with the previous theme, participants highlighted that the only attempt at transparency in the appeals process was the communication that a guideline had been violated. No log or record of appeals and of specific reasons behind decisions was shared by Instagram.
Account Status appeals were also found to be difficult to access: the notifications themselves were not notifications per se but a section at the bottom of one's Settings that participants struggled to find, and the appeals were glitchy, consistent with previous de-platforming research finding that appealing is ‘often unreliable or may not even appear as an option for some users’ (Register et al., 2024: 23: 5; Are, 2024). The system always put the onus on the user to find and to appeal violating content, contrary to Meta’s policy view that the tools were to ‘empower’ creators.
Instead, the platform appeared to detect violations with no rhyme or reason, failing to inform users about whether appeals were resolved and why, contributing to the perception of a governance process that is invisible and unjust. While P12 found appeals ‘unfair and elitist’, P14 called out Account Status’ inconsistency:

The waiting times and fact that it flags old videos and reflags constantly is a waste of time and energy. There is no support for flagged videos and no guideline for how long it will take to appeal. I’ve waited less than 24 hours and more than 2 weeks!! (P14)
The consequence of this lacking transparency and of this apparently random, snowballing detection of violating content was more work on participants’ plates: all of them claimed to have experimented with different posting and Instagram biography formats, as well as with appealing and archiving processes, to beat the shadowban, to no avail – in direct opposition to Meta's statements to me about Account Status’ empowering potential.

At first I would appeal and it usually would be overturned so myself and the content would go back to being visible. About 5 months ago the appeals process started taking longer and longer to be reviewed, as well as the platform deciding my content was still inappropriate even if it was the tamest videos with most of my skin covered. […] Currently I just appeal the videos I really want to have on my feed and if they aren’t important I will delete them. I change my bio every other week just to get my bio approved again if it's flagged. (P7)
All participants claimed to be appealing multiple times, sometimes daily, with no luck, as the videos ‘just keep coming up’ (P9). Added to the existing load and time needed to build up a profile to work or express themselves, this felt like unpaid and fruitless labour. Many connected it with an increasingly automated governance, which they saw as linked to recent lay-offs at Meta making their experiences worse and appeals lengthier (Allyn and Bond, 2025; Scanlon, 2024). Others linked it to a perceived crusade Instagram seems to be waging against pole dancing: some found that their status was reviewed only after removing links to their pole dance studios or tutorials, their pole dancing profile image or mentions of ‘pole’ in their bio and username. For others, even a complete tabula rasa of mentions of pole dance did not result in any change. This inefficient, unjust, confusing and arbitrary governance had several impacts on pole dancing content creators’ earnings, creativity, community and practice, addressed in the final theme.
Account Status notifications’ financial, creative and emotional impact on pole dancing content creators
The above issues with Account Status had several effects on participants’ pole dancing practice and businesses. Participants felt slut-shamed by Instagram, leading some of them to change their pole practice, their offline dancing behaviour or even their confidence and perceptions of self, with P24 going as far as saying that, on the back of censorship, they felt ‘wrong’ for showing skin even in a pole class. Their quotes convey their frustration, the shaming they felt and the disproportionate waste of time and resources in the face of censorship.

A lot of stuff I just won’t make, even though it makes me happy, because I don’t want my accounts in trouble. I notice pole content I make with less clothes does way worse. (P22)

Means you regulate yourself when training/Choreo etc cause you want it to be viewable on insta. Many times I’ve had to take something down or change a routine that I’m proud of etc cause it won't allow me to share. It regulates the creative process and self-expression and can be frustrating. Many times I've thought about deleting Instagram because of it. (P23)

I work super hard and probably spend some money (Pole/aerial gear and clothes are not cheap!) creating content, for it just to be completely wiped into oblivion. And I do a lot of pole sport with shorts too not only heels stuff, but they don’t like that either! It seems like if there's a vertical shiny metal on frame they’re just like “nope”…. (P8)
This self-censorship triggered another powerful feeling: loss. Those with less performing experience felt they would have to ‘work twice as hard’ to put themselves out there because of censorship (P16), or even that promoting themselves as teachers and performers was now impossible (P17). Crucially, the loss participants felt was connected to feeling that censorship negated the achievements in the art and the confidence they had worked hard to master, as well as their attempts to build a community and to learn from others.

Pole has given me so much confidence and helped so much with my body image – it's gutting that IG censors us when we’re feeling so good about ourselves and our bodies are doing incredible things. It leaves you feeling so deflated, they’re videos, combos and achievements I’m super proud of and want to put out to the pole world. It makes me think what's the point if it's so hard so early on? And it's a shame because it's so important as polers we track our progress through video - and to connect with other polers in the community! (P20)

Shadowbanning prohibits not only my own content not being seen by other polers but I also miss on other polers shadowbanned content. I am missing a whole suite of creativity and videos that I could use to inspire me as a pole dancer and help me grow and develop my skills. (P18)
The sense of loss experienced by participants encompassed several layers: it was experienced either as mourning of pre-shadowbanning times, by users who had been active before censorship, or as missed opportunities, by users aware of others’ experiences before it.
An initial layer of this loss was represented by the tangible financial element of losing work opportunities and therefore income, a significant issue in itself which, however, betrayed a deeper loss: the mourning of the dream of becoming a self-employed creator, performer or instructor, something participants felt had to be mediated by Instagram and that was strictly connected to the visibility provided by the platform.
Even more concerningly, this nostalgia for the pre-shadowban era and this sense of loss translated into offline loss of enjoyment for the practice of pole due to censorship's implicit message: that Instagram did not value pole dancing skills, growth and artistic talent. The community and collective creativity aspect of this feeling of loss was also striking, and a characteristic of many responses: users were not only demoralised because their own content was censored – they were negatively affected by not seeing people who used to inspire them and felt worse off in their practice and pole dancing development as a result.
Still, some users remained defiant in the face of shadowbanning, claiming they refused ‘to bow to the dictatorship’ (P25) and that even if their appeals were rejected, they refused to edit their content and conform (P26).
This defiance can be seen as a reaction to what users highlighted as Meta's paternalistic, puritan and overly strict governance of content, art and bodies. Pole dancers viewed this governance as patronising, one-sided and lacking nuance, enacted by a platform that wants to be the arbiter of taste (e.g. Gillespie, 2010) without the necessary knowledge, and that coded women's bodies as sexual by default (Are and Paasonen, 2021; Paasonen et al., 2019): ‘It only wants one specific type of art’, P12 said. ‘It's like that teacher in art school that wouldn’t let you explore your own style and wanted you to just copy some famous academy approved painting’.
Overall, pole dancing content creators felt unsupported by Instagram not just personally, but as an industry and as a community: they discussed censorship not just on a personal level, but on a group level, to show how it affected their bonds with and inspiration from other pole dancers. They viewed the presence of a pole as the common denominator behind shadowbanning, given that other bikini content by celebrities and/or users whose aesthetics are not adjacent to or deriving from sex work seems to still be visible on Instagram (Are and Paasonen, 2021; Paasonen et al., 2019). As such, they found Account Status and Instagram censorship to have a devastating emotional and financial effect on their practice, community and earnings.
Discussion
Participants’ experiences highlight novel reactions to new affordances’ attempt at transparency, and they showcase a previously under-researched sense of loss triggered by platform governance. This loss is not merely the loss of visibility or engagement, but a sense of ‘future loss’: a loss for what may have come if Instagram had not gone back on their word to allow people the freedom, flexibility and opportunity of becoming a content creator even when they came from usually overlooked backgrounds (Glatt, 2022), and a staggering loss of artistic inspiration arising from one's community.
Participants attempted to mitigate this loss by self-censoring, by over-researching the context in which censorship arises and by performing emotionally taxing ‘extra work’ (Bucher et al., 2021) to evade detection, but often to no avail. Instead, they poured their time, energy, training and creativity into spaces that seem inherently hostile to their community, and that ultimately seem to exist to generate profit for Meta itself (Arcy, 2016; Fuchs and Sevignani, 2013).
Already explored in research surrounding content creators’ hopes and fears about artificial intelligence (Are et al., 2025), this loss contrasts creative, emotional and physical engagement with learning and artistic disciplines with the unforgiving power of Big Tech.
Despite having ridden the wave that framed them as democratising and empowering spaces affording previously marginalised and excluded demographics a place in art and entertainment (Duffy and Meisner, 2022; Glatt, 2022), platforms are the new unaccountable arbiters not just of taste (Gillespie, 2010), but of opportunity. The experiences of gaslighting that participants reported upon hearing about and then using Account Status arise not only from previous moderation and communications with platforms (e.g. Are, 2021a; Are and Paasonen, 2021; Divon et al., 2025) but also from the lack of clarity in the tool itself.
In short, Account Status may be making governance somewhat more visible to the public (DeCastris, 2024), but it fails to correctly inform users about how to improve their behaviour (Gorwa, 2019) or to make the platform more accountable (Kosack and Fung, 2014), therefore providing a case study in a platform governance that is, once again, rooted in brand safety. As increasing tech regulation valuing transparency (e.g. Gorwa, 2019; Griffin, 2023) and negative media coverage (Marchal et al., 2024) pressure platforms to engage in public relations efforts that respond to moral panics and tangible issues alike through changes in policy that are swiftly communicated through their Newsroom sections (Gillett et al., 2022), being seen to respond to users’ and governments’ concerns becomes currency to show compliance and continue accruing power.
However, the introduction of Account Status, and even of apparently better, paid-for customer service options such as Meta Verified did not empower users or improve transparency about governance: it made them feel scammed and gaslit instead of making Meta appear more transparent. Account Status therefore appears merely performative and is a relevant case study in the additional labour users must perform to navigate Instagram as a social connection, self-expression and work space.
Limitations
Besides the aforementioned recruitment limitations, this paper presents another, striking limitation due to its sample: while all participants mentioned the mental and physical health benefits of working out and expressing themselves through pole dancing, citing how this allowed them to heal from feelings of shame regarding their sexuality and physical appearance, most of them skirted around mentions of sex work, and did not link pole dancing to its stripping origins when referring to potential causes of the censorship they faced. They independently raised witch hunts and censorship related to art without addressing the elephant in the room: pole dancing is censored largely because of its history. As such, reflections on art with a sexual dimension, or indeed on the fact that art and extreme sports with a sexual element can still be impressive and relevant in modern society, are missing from their accounts – perhaps because they do not train in studios that employ sex workers or centre pole's stripping origins, or because they prefer not having to answer their loved ones’ or colleagues’ questions, which may complicate their engagement with a practice with such a rich history.
Nonetheless, given platforms’ moderation of pole dance as a sexual practice no matter what, and given this paper's approach surrounding the enablement of those with less power, it is worth adding that decontextualising this practice from its origins and sanitising it does not only bend to Big Tech's power – it does a disservice to pole dancing altogether, while simultaneously not improving matters of censorship.
Pole dancing is an art precisely because it pushes the boundaries of acceptability through a hypersexuality also demonstrated through physical prowess and sensual, theatrical narration. It is an interdisciplinary, multidimensional art that perfectly exemplifies the failure of platform governance and machine learning when nuance is required – in a similar vein to the way in which pole dancers themselves, or indeed their audience, wish to detach this art from its origins to be able to discuss its value.
The origins of pole dancing manifest in Meta's contradictory moderation: the company insists on wanting to empower users by merely giving them a platform and through its new Account Status appeals, while simultaneously patronising them, judging them, slut-shaming them and, as participants argued, deciding which art is worthy of exposure. But art can be sexual, and the debate and learning that sexual art can provoke have social and creative benefits (Are and Paasonen, 2021), meaning that recognising pole dance's history when discussing censorship is crucial towards tackling this issue.
Conclusion
Through a qualitative survey amongst 100 users, this paper provided a first step in evaluating Instagram's attempt at transparency in shadowbanning via the Account Status violations monitoring and appeals function, as well as a novel example of platforms’ impact not just on creators’ wellbeing and earnings, but on their enjoyment and learning of artistic practices too.
Pole dancers’ experiences with Account Status show that Instagram's mechanism to detect, notify and allow appeals of potentially non-recommendable content appears inefficient at best, negligent at worst. The labour reduction and user empowerment Meta promised through it seems either poorly thought-through, or a cleverly designed illusion of empowerment: a strategy based on performative transparency to avoid further scrutiny, cash in on further earnings and consolidate power, akin to platforms’ already documented use of gaslighting as a communications strategy (Divon et al., 2025). This brings me to define it as follows: performative transparency is a self-serving corporate box-ticking exercise engaging with a form of disclosure surrounding content moderation that only serves the purpose of dodging opacity accusations without meaningfully engaging in communications about governance. It does not inform users and the public about the issues presented by their content and/or platform processes.
Akin to what previous research (e.g. Divon et al., 2025; Register et al., 2024) has identified as platforms’ deliberate strategies to navigate their power through opacity aimed at protecting business secrets and reputation, Account Status did not provide insights into violations or repeated appealed content detections, or constitute an efficient, timely and effective means of appealing.
In its current shape, therefore, Account Status merely informs users that violations have been detected – a significant step, given that Instagram previously denied that shadowbanning ever happened (Cotter, 2021) – but falls short of educating them about doing better, and of providing significant redress mechanisms. This is where its inherent performativity lies: like some of the influencers it hosts (Wellman, 2022), Instagram's care for and desire to empower users is a PR exercise in showing alignment with their concerns, without significant social and economic investment in improving their conditions through meaningful transparency.
Account Status and its appeals were not empowering – they were, instead, discriminatory, engaging in slut-shaming and in a restrictive, judgemental governance of art and expression that greatly clashed with users’ perspective on the expertise, hard work and time and resource investment that pole dancing required. In fact, Account Status felt so disempowering that it affected users’ offline lives, creativity and enjoyment of the practice of pole dance. And if pole hobbyists were so greatly affected by its emotional and financial impacts, the reader can imagine how devastating an impact Account Status may have on sex workers, who have always been the main target of content moderation since FOSTA/SESTA (Blunt et al., 2020; Bronstein, 2021).
Users’ now-acute awareness of sexy jail – the state of punishment for posting content that Instagram deems suggestive enough to be rendered invisible – as a weapon used to limit the growth and visibility of pole dancing content created an implicit knowledge of pre-censorship days amongst users, and triggered a two-fold sense of loss: a loss of hope over participants’ own prospects as performers, instructors and pole dancing creators, whereby censorship made pole dancers feel as if the dream they see their more established peers achieve is not within reach; and a loss of creativity and inspiration triggered by their community's invisibility, which started on Instagram – before seen as a generative and inspiring platform – and which bled into their training and choreography creation. Once a source of inspiration, Instagram is now a necessary evil and a censor for those who make pole content and who felt patronised, hidden and under-appreciated by the company.
The experiences reported in this paper provide a snapshot of how the awareness of (and hopelessness over) a governance technique can greatly affect offline creative practices and communities. Although some users remained defiant in fighting the algorithm and still perform as their unadulterated selves, this paper shows that design and enforcement can have significant impacts on behaviours and creative practices.
Pole dance's governance and the issues highlighted in this paper can be of interest to adjacent communities (e.g. those posting nudity, education or activism, as already discussed by Abokhodair et al., 2024; Haimson et al., 2021; and Kojah et al., 2025) and to regulators and governments pushing platforms towards more transparency, as they offer a snapshot of Instagram's ineffective, performative and opaque moderation.
For starters, further regulatory investigation is needed to understand whether Meta Verified constitutes false advertising. Secondly, given that appeals are at the centre of accountability measures demanded in several global tech regulation attempts (Are, 2024), regulators and governments must create accessible avenues for users to pursue outside-Meta appeals, to also create further knowledge about Meta's internal appeal processes. Ideally, these investigations and efforts would push platforms to hire more human moderators acting as points of contact with users, rather than pursuing an AI-first strategy (Allyn and Bond, 2025).
Lastly, on a platform level, the apparent absence of internal bookmarking of already successfully appealed content is anything but efficient. Platforms should instead provide users and moderators with a record of their appealed content and ensure that algorithms and human workforces alike learn from previously actioned appeals.
To save Instagram the awkwardness of having to decide whether their moderation of pole dancing means algorithms are working correctly in their blanket bans or over-moderating, Meta should provide additional definitions of ‘sexualised dancing’ in their Community Guidelines and improve personalised notifications of shadowbanning. This is because users’ distrust towards Instagram following its previous denials of censorship means mere boilerplate communications of violations are simply not enough to truly appear transparent in situations which require nuance, context and user trust.
Scrutinising platforms through an activist researcher approach therefore means breaking through these performative attempts and demanding that transparency tools serve users and not private companies alone. Future research can do so by further examining Account Status’ transparency and efficacy in light of Instagram's (2024) move to limit political content, particularly in the moderation of contested political topics such as Palestinian freedom (Abokhodair et al., 2024).
Supplemental Material
Supplemental material for this article (sj-docx-1-pns-10.1177_29768624261426793) – ‘Sexy jail’ and performative transparency: Evaluating shadowbanning notifications and appeals in Instagram's Account Status amongst pole dancing content creators, by Carolina Are, Platforms & Society – is available online.
Footnotes
Ethical committee
This project (Project ID 7285) received Ethical Approval from Northumbria University on 17 May 2024.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by Research Councils UK, Engineering and Physical Sciences Research Council, EP/T022582/1.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data availability statement
The data that support the findings of this study are available on request from the corresponding author. Further information is available at: Are, Carolina (2025). Account Status qualitative survey amongst pole dancing content creators. Northumbria University. Dataset. https://doi.org/10.25398/rd.northumbria.28382021.v1.
