Abstract
New trends in online extremism are currently unsettling the typical classifications used to assess violent threats to democratic societies. While extremism is usually perceived to be a matter of extreme ideologies and methods, social media enables and shapes distinct hybridization processes by which conspiracy beliefs, personal grievances, and various ad hoc convictions are combined with ideology fragments, consequently producing new extremist narratives. However, research into hybridized extremism has not yet accounted for the specific role of digital platforms and social media. This article develops the concept of hybridized prefatory extremism (HYPE) spaces to account for these recent changes and offers a heuristic framework for future studies to pinpoint the participatory engagement of digital publics in co-creating hybrid forms of extremism that may evolve into violent extremism. Based on five quantitative and five qualitative datasets collected through digital ethnography, the article identifies three domains that shape HYPE spaces: (1) actors, (2) practices, and (3) content. Through these three domains, we are able to show how emerging processes of hybridization of extremism involve not only a hybridity of content, but also a hybridity of technologies, aesthetics, and participation. The conceptualization of HYPE spaces allows researchers and practitioners to carry out further empirical studies to elucidate the distinct role of social media in current trends of extremism and to identify and monitor potential risks of hostile and violent action in online spaces.
Introduction
Recent years have seen growing public and scholarly interest in online extremism and the spread of conspiracy theories on social media (see Sutton & Douglas, 2020). Research into violent extremism, terrorism, and mass shootings emphasizes how perpetrators are increasingly driven by an amalgamation of beliefs, ideologies, and grievances, as indicated by content uploaded on their social media profiles (Kupper et al., 2023). This trend of amalgamation is labeled by the Danish Security and Intelligence Service (DSIS, 2023) as hybridization, while commentators and scholars of violent extremism have used terms such as “DIY ideology” (Stickings, 2023), “salad bar extremism” (Meier, 2023), and “composite violent extremism” (Gartenstein-Ross et al., 2023).
Much of this research, which is typically rooted in disciplines like security studies (Svensson & Nilsson, 2022), psychology (Obaidi et al., 2022), and forensic linguistics (Kupper et al., 2024), recognizes social media as a place where violent extremists share their grievances and find communities. Through social media, extremist communication reaches larger mainstream audiences than ever before (Bryant, 2020). Although media studies have previously dealt with the emergence of new extremist narratives on social media (Massanari, 2015; Phillips & Milner, 2021), the hybridization trend needs further research. This article situates current developments in online extremism within the hybridization framework and investigates the distinct role of social media as both technology and culture in transforming the nature of extremist expression.
Responding to Gartenstein-Ross et al.’s (2023, p. 20) call to study extremism in the context of “the transformation of the information environment through digital connectivity and social media,” the article develops a heuristic theoretical framework around the notion of hybridized prefatory extremism, or HYPE, spaces. The “prefatory” in HYPE spaces refers to the risk that these spaces may evolve into promoting, or even leading to, violent extremism. The goal is to bring existing literature on extremism and the recent emphasis on hybridization processes into close conversation with social media research. Through digital ethnography, we have collected 10 sets of data from different social media platforms, which inform this theoretical contribution to the fields of social media and extremism research.
In conceptualizing HYPE spaces, three domains, we suggest, are key to understanding digital cultures that may precede and inform hostile and violent extremism. The three domains are (1) actors, such as (semi-)public figures, internet celebrities, and influencers; (2) practices, such as those associated with self-help or fan communities and established digital trends; and (3) content, such as videos, memes, and the appropriation of entertainment genres, popular culture, and current events.
First, we outline our approach to conceptualizing the HYPE framework, followed by a discussion of the current ideological fragmentation as it relates to extremism, conspiracism, and hybridization while focusing on the distinct role of social media in enabling their current configuration. We then present the core characteristics of the three aforementioned HYPE domains anchored in our empirical findings, to reflect on the types of hybridity involved in the hybridization trend before outlining the definition of HYPE spaces.
Conceptualizing HYPE spaces: methodology
In addition to hybridization, a range of terms have been used to describe how extremists combine previously unrelated beliefs, ideologies, and grievances. The terms include “DIY ideology” (Stickings, 2023), “salad bar extremism” (Meier, 2023), “fringe fluidity” (Gartenstein-Ross & Blackman, 2022), “cross-ideological mobilization” (Davey & Ebner, 2017), “fused extremism,” and “composite violent extremism” (both in Gartenstein-Ross et al., 2023). While terminologies differ, scholars agree that a type of extremist violence based in an “amalgamation of disparate beliefs” (Gartenstein-Ross et al., 2023, p. 1) is on the rise. Although there is some truth to Sutton and Douglas’ (2020) claim (p. 118) that the evident “profusion of constructs, measures, hypotheses, and theoretical perspectives has outpaced efforts to prune or critique them,” our goal in introducing spaces of hybridized prefatory extremism, HYPE, as a distinct term is to synthesize some of these cross-disciplinary perspectives while simultaneously calling attention to the communicative context of social media.
Our approach follows an abductive style of reasoning, moving back and forth between existing knowledge and empirical observations with the aim to offer a heuristic framework that is particularly alert to the mediated nature of hybridization processes. The conceptualization is informed by comprehensive readings and analyses of four types of material: (1) research literature, (2) security reports and threat assessments, (3) news content and online sources, and (4) observed social media practices from both quantitative and qualitative datasets.
First, our review of academic literature across various fields (e.g. extremism studies, terrorism studies, conspiracy, and misinformation studies) enabled us to establish an overview of existing accounts of contemporary extremism. We discovered a theoretical gap in the literature regarding the specific role of social media in shaping new manifestations of extremism. While most extremism research both acknowledges the influence of social media in individuals’ trajectories of radicalization and analyzes extremist social media content, less attention is paid to theorizing the complex interplay between media logics, networked community practices, online cultures, platform vernaculars, and communicative affordances in generating distinct hybridized social spaces.
Second, we investigated contemporary discourses in threat assessments, research reports, and policy recommendations by practitioners in law enforcement, intelligence services, and nongovernment organizations (NGOs) working to minimize the consequences of extremism. Aside from the Danish Intelligence threat assessment addressed above, reports published by the Institute for Strategic Dialogue (ISD) and the Global Network on Extremism and Technology (GNET) offered useful empirical insights into experienced practitioners’ ongoing work of classifying contemporary extremism and types of violent attacks. This enabled a more comprehensive understanding of the practical implications of the theoretical gap and allowed us to identify the need for deeper insights into digital environments and practices that precede violent attacks.
Third, based on an ethnographic routine of “catching up” (Postill & Pink, 2012) on Danish and international news media’s coverage of various cases of extremism, conspiracy theories, and misinformation, we were able to connect the hybridization trend to specific empirical evidence. Several cases related to our field of interest were covered in the mainstream media during our research period (March 2023 to January 2025), including influencer-driven communities promoting a Pizzagate-like conspiracy theory to systematically harass a Danish TV host (Jungersen, 2023), right-wing terrorist recruitment in online communities (Kloch, 2023), and hybrid communities of Putin supporters, anti-vaxxers, and QAnon followers (Staghøj, 2023). In addition, we used the basic digital ethnographic method of searching for information as any internet user would, to stay as true as possible to the user experience of participants in HYPE spaces (Hine, 2017).
Fourth, we engaged in non-participant observation on social media sites, such as Facebook, Instagram, X, YouTube, and Reddit to familiarize ourselves with the narratives and practices related to some of the cases mentioned in the news (see de Wildt & Aupers, 2023). We observed how users participate in conspiratorial or hate-based communities on social media to better understand “how technologies and material infrastructures become players in social relations” (Pink et al., 2016, p. 45). Using the digital method of repurposing various platform features to do research, we explored and consumed content, followed recommendations, hashtags, and scrolled through comments to identify connected social media accounts and communities.
For example, we observed the spread of a QAnon-related conspiracy theory about actor Tom Hanks, suggesting he is part of a satanist, pedophile American elite, in the comments sections on three YouTube videos (n = 62 comments). We followed the topic to Reddit, where we collected data from three subreddits about the same conspiracy theory (n = 12,457 Reddit comments). In other instances, the quantitative dataset was collected first. On Instagram, we observed how users posted clusters of hashtags underneath anti-government memes that were not related to the post itself. We identified six highly active hashtags and collected a network of 1166 hashtags in which two hashtags are connected only if they co-occur in at least 100 posts. Following this quantitative study, we collected 91 memes from the six starter hashtags. Other datasets are exclusively qualitative, such as the conspiracy theory humorously labeled #KateGate on X (n = 133 comments/posts), or quantitative, such as the YouTube network of 30 anti-government channels (n = 1106 videos). Thus, our conceptualization of HYPE spaces is based on five quantitative and five qualitative datasets investigating new trends in extremism on social media (see Table 1).
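To make the logic of such a thresholded co-occurrence network concrete, the following is a minimal, hypothetical Python sketch (our illustration; the original data collection relied on platform features rather than any particular tooling). The `min_cooccurrence` parameter mirrors the 100-post cutoff used for the Instagram network:

```python
from collections import Counter
from itertools import combinations

def build_hashtag_network(posts, min_cooccurrence=100):
    """Link two hashtags when they appear together on at least
    `min_cooccurrence` posts; returns the node set and weighted edges."""
    edge_counts = Counter()
    for hashtags in posts:
        # every unordered pair of tags on one post counts as one co-occurrence
        for pair in combinations(sorted(set(hashtags)), 2):
            edge_counts[pair] += 1
    # keep only edges that meet the co-occurrence threshold
    edges = {pair: n for pair, n in edge_counts.items() if n >= min_cooccurrence}
    nodes = {tag for pair in edges for tag in pair}
    return nodes, edges
```

Applied to posts harvested from the six starter hashtags, a procedure of this kind yields a graph comparable to the 1166-node network described above, with heavily co-tagged conspiracy hashtags forming dense clusters.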
Table 1. Overview of Datasets.
The scope of the following conceptualization does not allow for deep insight into the individual datasets. Nonetheless, our collective analysis of these 10 datasets reveals significant patterns, which are synthesized to inform the HYPE framework developed in the upcoming pages.
Processes of hybridization on social media platforms
Extremism is often used as a shorthand for political extremism (Cassam, 2021). Inherent to this perspective is a positional understanding of the phenomenon, that is, the notion that political ideas and practices are situated somewhere between the extreme left and the extreme right on an ideological spectrum. While ideology is a key component of extremism, such a spectrum cannot account for contemporary forms of hybridized extremism because it fails to capture ideas, ways of thinking, aesthetics, and communicative practices that cut across typical political divides.
For example, one of the key features of the hybridization trend is the proliferation and increased centrality of conspiracy theories. Although “general belief in conspiracy theories is strongest at either extreme of the political spectrum,” conspiratorial beliefs are not reserved for any particular ideology (Sutton & Douglas, 2020, p. 119, emphasis added). Rather, contemporary extremist narratives are based on an amalgamation of ideology fragments, conspiratorial beliefs, and ad hoc convictions. The compound nature of online extremism requires researchers and practitioners to consider a wide range of content that may not easily lend itself to traditional classifications by positional ideology. The workings of online culture have “eroded the boundaries between and within ideologies (and, indeed, between ideologies and entertainment, fan-culture, self-help and other genres of public communication)” (Finlayson, 2021, p. 17). This means that various new content forms, genres, and aesthetics have entered the mix of communication which may be perceived as potentially extremist.
This is also pointed out in the terrorist threat assessment by the DSIS (2023, p. 8) in which hybridization is defined as “a process in which the mixing of different ideologies, world views or communities leads to the creation of new extremist narratives and modus operandi among individuals, groups or communities.” The report describes how these new extremist narratives “are built on established ideological and religious ideas, but are supplemented with perceived enemies, conspiracy theories and methods that are not necessarily aligned with these ideas” (DSIS, 2023, p. 8). In the current threat landscape, it appears that ideology boundaries are becoming increasingly blurred, while the link between extremism and conspiracism is taking a more central role.
The basic linkage between extremism and conspiracism is by no means a new discovery. According to Cassam (2021, p. 110), “extremists tend to be conspiracy theorists.” Conspiracy theories and associated types of misinformation are often used by extremist groups to discredit—or even legitimize violence against—their out-group opposition (Obaidi et al., 2022). This emphasizes how sociality, for example, community dynamics and groupthink (Malcolm, 2023), is a key dimension of (online) extremism. Cassam (2021) distinguishes between what one believes and how one believes. This distinction is particularly useful in cases of hybridized extremism where correspondence between belief and method is lacking, a key challenge also highlighted in the intelligence report cited above. For instance, one may hold views that are positioned far away from the political mainstream while also being willing to compromise on these views or unwilling to impose them on others. This is the case for “armchair extremists,” whose extremism is purely theoretical (Cassam, 2021, p. 137). Extremist online spaces are presumably populated by a majority of armchair extremists, since only a small minority takes the decisive step from posting extreme opinions to actually carrying out violent attacks. Moreover, social media extremists are in fact likely to be purely ironic or performative, rather than theoretical or ideological. One explanation for this is that hybridized forms of extremism online are based on platform-specific communicative logics and influenced by certain online vernaculars, which often have humor as an integrated component. On one hand, extremist content appears to spread because algorithms recognize the attention-generating effect of controversial utterances and transgressive behavior (Bryant, 2020). On the other hand, users cultivate transgression through non-conformist media practices, which are at once serious and perpetually ironic (Greene, 2019).
This points to a fluidity of both ideological reference points and forms of expression used to convey extremist messages on social media. Here, individuals and groups of ideological extremists, so-called anti-authority extremists, and militant Islamists inspire each other in terms of aesthetics, methods, propaganda strategies, modes of aggression, and a distinct fascination with violence (DSIS, 2023). This development mirrors an earlier shift in extremist networks associated with terrorist movements of the late 1990s where the “emergence of a flexible worldwide network . . . without any real central body” (Bauer, 2014, p. 61) essentially prompted a reconfiguration of the nature of terrorism.
Social media appears to have accelerated the decentralization of not only extremist networks but of the ideologies that have traditionally brought these networks together. This tendency manifests in a distinct classification, used in counterterrorism efforts in the United Kingdom, which applies to threats with “mixed, unstable, and unclear” (MUU) motivations (Gartenstein-Ross et al., 2023, p. 3). According to a study by Gartenstein-Ross et al. (2023, p. 14), this type of hybridized motivation characterizes “nearly three times as many cases” of violent extremism “between 2017 and 2022 than between 2010 and 2016.” This was also clearly illustrated by the US Capitol insurrection on 6 January 2021, where researchers found “that only 69 of the 940 defendants charged over their actions on January 6 were affiliated with an organized extremist group” (Jones & Comerford, 2023, NA). Instead, extremist actors are more likely to be mobilized by “more amorphous ideological convictions, political grievances, and shared hatred and hostility toward specific groups and institutions. This fractured landscape is being fueled by a range of issues that extend beyond extremism: conspiracy theories, disinformation, and polarized information spaces” (Jones & Comerford, 2023, NA). Under these circumstances, intelligence services, law enforcement, and researchers alike face difficulties when attempting to conceptualize current digital threats of extremist violence. We propose that part of the problem with conceptualizing this new hybridization paradigm stems from the lack of attention paid to the distinct role of social media and the digital infrastructure, which appears to be a key driver of ideological fragmentation. Hybridization, we suggest, occurs not only on an ideological level; it is also a technological and aesthetic hybridization and a hybridization of digital practices.
The workings of hybridized online spaces may be illustrated through explorations of three domains: (1) actors, (2) practices, and (3) content. These three domains are not exhaustive in HYPE spaces but were prominent in our 10 datasets and demonstrate key dynamics and ways in which the hybridization trend manifests on social media.
HYPE actors
In HYPE spaces, a range of “democratically legitimized actors,” including politicians, celebrities, and intellectuals, may boost the visibility and public impact of extreme or conspiratorial content by transferring their credibility and legitimacy onto the content (Rothut et al., 2024, p. 55). HYPE actors may be clearly identifiable individuals, ranging from (in)famous public figures like Elon Musk to lesser-known users such as the Danish conspiracy theory influencer Connie Ringgaard on Facebook, who address their audiences in the typical style of influencers; they may also be anonymous (sometimes meme-based) accounts.
In several cases, influencers who started out in domains typically considered non-political, like fitness or mental health, have gradually moved into political domains, for example, by sharing opinions, spreading conspiracy theories, or using extreme rhetoric. Although the domain of business is certainly also political, Elon Musk has in recent years translated his authority as a successful entrepreneur into a new level of political influence via extreme rhetoric and conspiratorial claims. Since his acquisition of Twitter in 2022, Musk has taken on the role of influencer in a hybrid domain of institutional politics, far-right advocacy, and conspiracism. With an average of over 80 posts or retweets per day (31 May 2024 to 2 January 2025), Musk has frequently promoted conspiracy theories and made multiple, borderline violence-inciting claims about the inevitability of civil war in Europe.
While Musk constitutes an extreme case, his constant flow of controversial communication about a vast variety of topics appears to be a typical strategy of HYPE actors to stay relevant through outrage maintenance (Finlayson, 2023). This points to a professionalization of conspiracy- or hate-based influencer practices (Frejborg & Pettersson, 2024), which was further supported by our observations of a highly active Danish conspiracy theorist, Ringgaard, who in the spring of 2023 led a much-publicized QAnon-inspired harassment campaign on Facebook against the host of the children’s TV program Uncle Shrimp. In addition, this influencer routinely posts about deaths of various public figures, suggesting that the COVID-19 vaccine is responsible, while also encouraging her followers to do their own research about the Bilderberg Group, chemtrails, and the moon landing. In line with Hyzen and van den Bulck (2021, p. 181), it seems this type of communication is “not so much about a theory’s specifics” but more about “higher-order beliefs like distrust of authority.” Maintaining high visibility through constant engagement with conspiracy-themed content serves to generate attention and cultivate a subscriber base first, while ideology comes second. As a HYPE actor, it does not pay to be a one-issue influencer, as platform algorithms promote active users with lots of engagement.
Similarly, during COVID-19, alternative health influencers adapted far-right extremist rhetoric into their non-extreme Instagram aesthetic of pastel colors and beauty filters, consequently blurring the boundaries between what extreme and non-extreme content looks like (Demuru, 2022). The influencer-driven fusion of alternative health tips and aesthetics, ideology, and conspiracy theories is associated with “conspirituality,” a prime example of hybridized narrative constructions (Ballinger & Hardy, 2022). For example, conspiracy influencer and “ideological entrepreneur” Alex Jones operates in a hybrid space of political commentary, conspiracy theories, and the promotion of various health products (Hyzen & van den Bulck, 2021).
The online influencer space is not necessarily less mainstream than the institutionalized circuit of established media brands and news outlets. This means that controversial internet personalities like Andrew Tate or Russell Brand who promote extreme and conspiratorial viewpoints on YouTube cannot be reduced to fringe phenomena. While HYPE actors do not always promote conspiracy narratives directly, they frequently give voice to conspiracy theorists and controversial people, who would typically have been filtered out by established media gatekeepers and broadcasters, ultimately leading audiences in a more extreme direction. The ambiguous, exploratory, and hybrid nature of influencer spaces enables entertainment, fandom, political commentary, and social criticism to co-exist with conspiracy theories and misinformation in the same discursive universe. This is illustrated by the popular podcast The Joe Rogan Experience, which has twice hosted Jones and is frequently subject to controversy and debate. Host Joe Rogan’s level of political influence equals that of well-established legacy media like The New York Times (Munger & Phillips, 2020). Thus, when Rogan in an uncritical way platforms conspiracy theorists like Jones or promotes the Ivermectin conspiracy, as he did during COVID-19 (Baker & Maddox, 2022), larger audiences than before are potentially exposed to it. At the same time, Rogan frequently hosts a vast variety of celebrities, artists, scientists, authors, and politicians with no reference to conspiracies or controversial topics. This illustrates how HYPE actors navigate a space in which ideologically diverse conspiracy theories and extremist views circulate alongside more mainstream and politically moderate content.
HYPE practices
Narratives in HYPE spaces flourish and spread through collaborative, cross-mediated, playful, and forensic practices. Notably, users in HYPE spaces are often engaged in gossiping and speculation, using forensic (Mittell, 2009) participation as their primary interpretative strategy, sometimes as self-labeled “bakers” and “researchers,” a vernacular on 4chan and Reddit that became mainstream in the QAnon movement (Marwick & Partin, 2024). For example, followers of the QAnon conspiracy theory hijacked the comments sections on several interview clips with Tom Hanks on YouTube around the time of his COVID-19 diagnosis in 2020, suggesting, for example, that the videos were deep fakes and that he was in fact imprisoned on a military ship somewhere off the coast of California. The comments referred to details in the videos or in Hanks’s previous work as an actor as perceived evidence for their claims. This practice links to the notion of “collective intelligence,” as defined by Levy (1997), by which digital communities are perceived as “self-organized, or molecular, groups” (p. 51). This molecular structure also shapes HYPE practices, where participants may belong to any demographic, but come together in “search for answers.” These users “find evidence” in other available material online, from YouTube videos and old newspaper articles to content from social media profiles, and encourage each other to “do their own research” (Tripodi et al., 2024) and contribute “findings” to a narrative multiplicity.
The aforementioned 2023 harassment campaign against the main actor of a Danish children’s television show, Uncle Shrimp, reveals similar practices which serve to negotiate in-group boundaries and determine out-group threats (Berger, 2018). The actor was targeted with threats and online harassment following users’ forensic play, searching for clues and evidence of the actor’s alleged aim to groom Danish children to accept satanism and pedophilia through the TV show (see Petersen & Johansen, 2025). Here, we observed how users co-constructed a diverse out-group, which, in addition to the targeted actor, included trans people, Muslims, Jews, elected officials, members of the British royal family, and people affiliated with the World Economic Forum. As a HYPE practice, and as opposed to Mittell’s (2009) concept of forensic fandom, this interpretive strategy is applied to real-world events and current politics (Petersen, 2025), although popular culture is widely used as source material, reaction gifs, and memes. Through sharing screenshots, links, and edited material of publicly available content, users appropriate every little bit of information (or absence of information) to develop theories that suit them. In most instances, forensic play as collaborative practice does not lead to a mystery being solved, but rather to the development and circulation of a conspiracy narrative: “[t]he practice of interpreting conspiracy is repetitive, endless, and faces continual frustration” (Fenster, 2008, p. 100). Rather than ideology, it seems it is this eternal collaborative grind that drives HYPE practices. They are vehicles for endless speculation but also demonstrate how online communities “get drawn into or arise around new forms of political edutainment” (Wurst, 2022, p. 221).
However, in several instances of our data, this practice leaves a space for the possibility of hostile action, for example, in the form of targeted harassment, threats, stalking, and, ultimately, physical violence, moving beyond the prefatory stages.
In our exploratory digital ethnography, we observed a high degree of interconnectedness as users in these grievance-based communities engage with each other’s posts by tagging each other and reacting with emojis, gifs, or memes alongside more or less elaborate analyses. We identified a network of 30 YouTube channels (although the full network is certainly larger) in which users share anti-government sentiments and various conspiracy theories. The channels were connected via comments on each other’s videos and mutual subscriptions. Many of the uploaded videos draw on news aesthetics and popular culture to signal credibility, for example, broadcasting from blue-toned interior studios, making talk shows and documentaries with archived footage, or even music videos. As such, this genre hybridity lends credence and authority to fringe topics and beliefs by drawing on mainstream communicative forms.
In one American music video, social media influencers and prominent figures from the QAnon movement, such as Michael Flynn, came together to perform a song, singing, “We will all stick together, and we’ll never surrender. We won’t give up our freedom,” reaffirming the bonds of the in-group and issuing a call to action to those who understand the underlying hybridized beliefs. For HYPE practices, ideological fragmentation and fusion with fan practices and mainstream formats are emphasized as users engage “in the process of mutation and deterritorialization” (Levy, 1997, p. 51). HYPE practices illustrate textual poaching (Jenkins, 2013) in the truest sense of the phrase, appropriating science, popular culture, and current events to suit their own usage and establish environments of belonging through online sleuthing, transmedia storytelling, and worldbuilding, but creating a prefatory space for hostile and violent action in the process.
HYPE content
The types of content which circulate in HYPE spaces call attention to the transformative nature of memetic culture as a key element in hybridization processes on social media. Memes referencing popular culture can easily be appropriated to a given conspiracy theory or grievance. As pointed out by Lee (2020, p. 101), “memes allowed alt-right propagandists to introduce concepts into mainstream debate, including the term red pill (borrowed from the 1999 film The Matrix) to indicate alt-right affiliates’ awareness of a truth hidden from everyone else.” Conspiracy theories, misinformation, and extremist content are spread through memetic practices as a playful, ephemeral way to emphasize a point of view, or as a pun or joke about popular culture, political ideologies, or current events. The built-in ambiguity and heterogeneity of memetics presents the attentive online participant with an “ironical in-jokey maze of meaning” (Nagle, 2017, p. 7). While publicly available content, from news to social media platforms, is appropriated as “clues” or “evidence” in HYPE practices, popular culture offers levity to HYPE content, mixing the intensity and playfulness of fandom and internet vernacular (see also Shifman, 2014) with the grievances of conspiracy theorists (Figure 1). It offers a common frame of reference and suggests a cultural anchoring, while its inherent ambiguity opens a range of interpretations and meanings. We collected 91 memes from Instagram, and Figure 1 demonstrates how popular culture and internet vernacular are appropriated to communicate conspiracy theories.

Figure 1. HYPE content on Instagram.
Memes constitute a dominant content type in HYPE spaces as “their underlying logics—multimodality, reappropriation, resonance, collectivism, and spread—are greater than any individual text or any individual participant” (Milner, 2016, p. 40). Memes are also characterized by their anonymity or absence of known authors (Marin, 2021). Some of their cultural significance is constituted by the ability to be used and owned by anyone for any purpose. The absence of authorship also allows memetic content to push (or overstep) societal norms and evade censorship and moderation efforts on social media (Lee, 2020). Famously, memes with Pepe the Frog were mobilized and appropriated by the extreme right and used in activist support of Donald Trump’s presidential campaign during the 2016 election (Merrin, 2019). By weaponizing irony, far-right trolls succeeded in “radicalizing potential supporters, challenging progressive ideologies . . ., shifting public conversation, and building a counterpublic” (Greene, 2019, p. 34). Pepe has since been deployed by pro-democracy youth group activism in Myanmar, illustrating its aesthetic and ideological adaptability (Lee, 2020).
Hybridity of content relates not only to the content itself, but also to its digital distribution. The ability of memetics to spread wide and far is supported by platform infrastructures and facilitates the dissemination of conspiracy theories and misinformation through social contagion, that is, “the spread of memes or ideas through personal contact,” enabled by social media across great distances, but “without the loss of intimacy” (Berger, 2015, p. 65). Our digital ethnography on Instagram revealed a large network of memetic media posted under clusters of conspiracy theory hashtags. In these clusters, memes about the Great Reset or the Great Replacement are often tagged with hashtags related to other conspiracy theories (see Figure 2). This hashtag clustering of topics illustrates how hybridization occurs as a result of digital infrastructure, as users tag their posts to boost their visibility in other users’ feeds.

The hybridization process of hashtag clusters on Instagram.
In our social network analysis (1,166 hashtags), we found that the top 50 hashtags in the dataset were a mix of conspiracy theory and anti-government hashtags with generic hashtags such as #weekend, #funny, and #picoftheday. Part of the hybridization of content in HYPE spaces is this convergence of entertainment, popular culture, and everyday content with conspiracy theories and extremist sentiments.
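The clustering described above can be illustrated as a simple hashtag co-occurrence count, the basic building block of such a network analysis. The following is a minimal sketch in Python; the posts and hashtags are hypothetical stand-ins for illustration, not data from the study:

```python
from collections import Counter
from itertools import combinations

# Hypothetical sample: sets of hashtags attached to four posts.
# In the study's dataset, conspiracy-related tags co-occur with
# generic tags such as #weekend or #picoftheday.
posts = [
    {"#greatreset", "#wakeup", "#weekend"},
    {"#greatreset", "#greatreplacement", "#truth"},
    {"#picoftheday", "#funny", "#wakeup"},
    {"#greatreplacement", "#truth", "#wakeup"},
]

def cooccurrence(posts):
    """Count how often each pair of hashtags appears in the same post."""
    pairs = Counter()
    for tags in posts:
        # Sorting makes each pair a canonical (a, b) key.
        for a, b in combinations(sorted(tags), 2):
            pairs[(a, b)] += 1
    return pairs

edges = cooccurrence(posts)
# Pairs recurring across posts (weight >= 2) indicate hashtag clusters.
clusters = {pair: w for pair, w in edges.items() if w >= 2}
print(clusters)  # → {('#greatreplacement', '#truth'): 2}
```

In a full analysis, these weighted pairs would form the edge list of a network in which clusters of conspiracy and generic hashtags become visible as densely connected components.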
Defining HYPE spaces
Through the notion of HYPE spaces, we are able to pinpoint some of the specific communicative ways in which the current trend of hybridization of extremism manifests in digital culture. It is crucial to note that the three central domains—actors, practices, and content—are not separate entities within HYPE spaces. They are contingent upon each other and contribute to the hybridization of prefatory extremism in both overlapping and specific ways, which can be elucidated through four characteristics of HYPE spaces that emerge from our explorations of the central domains.
First, HYPE spaces are counterpublics, that is, anti-mainstream social spaces based on conspiracy-centered interpretive strategies and resistance toward the dominant political discourse (Figenschou & Thorbjørnsrud, 2023). HYPE spaces, like other counterpublics, maintain a level of “awareness of [their] subordinate status” (Warner, 2002, p. 119). This is illustrated by a shared narrative of corruption and persecution, for example, accusations against governments, law enforcement, and mainstream media of constructing false narratives and oppressing alternative interpretations. Paradoxically, users in HYPE spaces typically imitate ideal democratic behavior, for example, by calling out abuses of power, while simultaneously embracing hostile and undemocratic tactics like harassment and threats of violence. Some see themselves as “freedom fighters” who rise up to “#savethechildren” against various satanic conspiracies (Roose, 2020), which is what may ultimately justify extreme hostility or even violence (Berger, 2018).
Second, HYPE spaces are platform-dependent; their publicness is conditioned by social media affordances and logics of spreadability and datafication (Bengtsson & Schjøtt, 2023). In extremism research, conclusions differ as to what role algorithms and affordances actually play in radicalization processes, moving beyond HYPE spaces and into the rabbit holes of violent extremism. Some studies find the affordances and online culture of Reddit to support misogynistic narratives and “toxic technocultures” (Massanari, 2017); others find evidence of algorithmically supported radicalization pipelines on YouTube (Ribeiro et al., 2020). Conversely, others find no support for claims of an inherent toxicity or echo chamber logic on Reddit in relation to conspiracy theories (de Wildt & Aupers, 2023), nor do they support a general understanding of YouTube as particularly prone to radicalization (Ledwich & Zaitsev, 2020). The point is that while different online platforms afford the emergence of HYPE spaces in different ways, the platform infrastructure on which HYPE spaces rest is dynamic and contingent.
Third, HYPE spaces are participatory. They are anchored in participatory media practices, which, in terms of media genres and modes of communication, are no different from other online communities. Users engage in discussion and cultivate a sense of belonging in fan or anti-fan communities, following current events or influencers. Participation in HYPE spaces is exploratory, speculative, and forensic in nature.
Finally, HYPE spaces are aesthetically adaptable. While the genres and modes of engagement are mundane and non-exclusive to HYPE spaces, there are certain patterns of expression. Ambiguous memes often formulate (or hint at) “stigmatized knowledge” (de Wildt & Aupers, 2023; see also Phillips & Milner, 2017) and suppressed opinions through humor or intertextual references. In other instances, content is popularized or mainstreamed by referencing established narratives, incorporating genre aesthetics, or repackaging content in a highly filtered, curated “Insta”-aesthetic or as news-like in the promotion of conspiracy theories. These characteristics are key to understanding the prefatory character of HYPE spaces. Although users explore extremist notions or share conspiracy theories, HYPE spaces are not in and of themselves arenas of violent extremism. They may in fact remain prefatory and never reach the level of hostility needed to turn violent. Nonetheless, however momentary, playful, or otherwise mundanely situated the extremism is in these spaces, it always carries the potential to push users in a more hostile—and then violent—direction.
Conclusion
In HYPE spaces, extremist narratives are molded and softened through participatory practices and mainstream aesthetics to become more digestible to a middle-range audience. HYPE spaces rest on a logic of translation by which a liquid sense of aesthetic and rhetoric makes extreme narrative expression easily adaptable to multiple mainstream media formats, practices, and modalities (Demuru, 2022). This logic, we claim, is part and parcel of the trend toward hybridization. The aesthetic and rhetorical fluidity between domains may introduce extremist expression to new audiences and allow diverse grievance-based communities to share methods of communication. As pointed out by Demuru (2022, p. 597), “Narrative plots, themes, and roles function as bridges between different spaces with the result of creating a space of its own, which does not correspond to any of the original spaces.” In the context of mainstreaming extremism on social media, this space, we suggest, may be termed a HYPE space.
In HYPE spaces, “prefatory” refers to a key characteristic of online hybridization, that is, the practice among HYPE participants of moving in and out of extremist discourse, for example, by sharing hateful content one day and pet videos the next. This means that the extremity of HYPE spaces is always tangled up with mundane messages and entertainment. In this sense, it is a form of extremism which is continuously diluted by other types of content and practices. Conversely, the prevalence of extreme messages also represents a radicalizing potential of HYPE spaces as it may lead users toward increased tolerance or support of violence. HYPE spaces are pockets in mainstream culture, which may serve as a prefatory stage in some people’s trajectory toward violent extremism, because the actors, practices, and content serve as legitimization of hostility and violence. HYPE spaces are gray areas: they are extreme in the sense that they facilitate the mainstreaming of extremist content and cultivate a hostile environment, but also non-extreme in that the hybridized character of the very mainstreaming process may soften or curb the violent aspects of extremist expression so that it is either experienced as more akin to other forms of entertainment or never evolves from prefatory to violent extremism.
The characteristics of HYPE spaces, and the types of beliefs circulating in them, significantly overlap with current trends identified in newer threat research and in law enforcement threat assessments (Kupper et al., 2024). When the ideological reasoning behind violent acts is no longer easily defined and determined, preventing extremist violence becomes even more complicated. In addition, “the field lacks consensus about what it is and why it is occurring” (Gartenstein-Ross et al., 2023, p. 2). However, there remains little disagreement that digital networked culture plays a significant role in this paradigm (Heft & Buehling, 2022), although a thorough understanding of the ways in which digital media may contribute to shaping this emerging threat environment is yet to be established. The notion of HYPE spaces, we suggest, may be useful to fill some of these theoretical gaps, and we thus return the invitation to extremism studies, threat research, media studies, and beyond to investigate the empirical ramifications of processes of hybridized prefatory extremism. Deploying the concept of HYPE spaces brings attention to the collaborative, mediated aspects of the hybridized trend, which in turn offers insights into how these environments shape the diffuse ideological motivations of “lone wolf attackers” (Gartenstein-Ross et al., 2023). Recognizing the inherent characteristics of HYPE spaces may allow researchers and practitioners alike to identify and monitor development toward violent action in these fluid and fragmented online spaces.
Footnotes
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This article is funded by the European Union under project ID 101095290. Views and opinions expressed are, however, those of the author(s) only and do not necessarily reflect those of the European Union or Horizon Europe. Neither the European Union nor the granting authority can be held responsible for them.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
