Abstract
Social media is integral to modern culture and social work, yet research on the impact of internet echo chambers is scarce. Because internet echo chambers contribute to polarization, normalize harmful ideas and violence, and adversely affect clients, this In-Brief seeks to 1) raise awareness about their impact, and 2) initiate a conversation about their harms and how to address them. It examines the effects of echo chambers, particularly regarding violence against women and other minoritized and racialized groups. The In-Brief emphasizes the lack of research on this issue and offers recommendations for social work, including feminist strategies to prevent and stem echo chambers, such as calling in and focusing on systemic harm. It also advocates for integrating digital literacy into social work educational programs. Resources that support understanding and disrupting echo chambers are included.
Against the dismantling of programs and practices that social workers hold dear, the profession needs to take a harder look at the ways that people enter and engage in echo chambers and develop strategies to disrupt them and the misinformation on which they thrive (Haskins, 2024). Also known as information silos, echo chambers are “enclosed media spaces that have the potential to both magnify the messages delivered within it and insulate them from rebuttal” (Arguedas et al., 2022, p. 10). These spaces often act as sites for the absorption and spread of misinformation and can lead to grave harm. Consider Dylann Roof, who was radicalized after the trial of George Zimmerman for the murder of Trayvon Martin, when his internet search for “Black on White crime” led him to the Council of Conservative Citizens’ article, “Brutal black on White murders,” which fueled his white supremacist impulses. In his manifesto, Roof, who murdered nine African Americans in their church, stated that he had “never been the same since that day” when he found the Council of Conservative Citizens (Collins, 2017).
While echo chambers reinforce political polarization, reduce accountability, prey on vulnerable people, influence violent acts, and run counter to social workers’ values, our recent search using the keywords echo chambers or information silos and social work yielded no results. When we used the keywords social work and social media, the results tended to cluster around ways social media can support teaching, research, and practice; victimization and online bullying; social media and the COVID-19-related lockdowns; and social media ethics. The first exception was work on members of the involuntary celibate community, who identify as incels. These individuals are mostly men who are frustrated, and often angered, by their inability to form a romantic and sexual relationship with a woman (Gheorghe, 2023). The second exception was a call for social workers to develop interventions that address conspiracy theories (Cox, 2023). Given the limited attention echo chambers have received in the social work literature, this In-Brief seeks to start a conversation on their implications and the role that social work and feminism can play in countering and disrupting them. One of the first steps toward disrupting the harm of echo chambers is to understand their mechanics and impact.
Echo Chambers: An Overview
Internet echo chambers such as those on Twitter, YouTube, Facebook, Reddit, Telegram, Gab, and other sites create platforms for influencers like self-proclaimed misogynist Andrew Tate. Before his arrest in 2022 on charges that included sex trafficking and rape, Tate had 2 million subscribers and an audience of 1.3 billion people. Tate's influence is evidenced by a significant uptick in the number and severity of misogynistic and sexual harassment incidents, the growth of the “your body, my choice” movement, and his influence on young men (Hume, 2024). Presumably, Tate would reach only those who sought him out. However, YouTube's algorithm likely promoted Tate's content in its recommended section, as research has shown that YouTube's algorithm tends to endorse content that is inherently misogynistic, racist, and conspiracy-laden (Bryant, 2020).
The Pew Research Center provides context for the significance of online platforms. A 2023 Pew survey found that 30 percent of adults in the US use Facebook as their primary source for news; Facebook, like other social media platforms, is built to keep people engaged and on the platform for as long as possible (Pew Research Center, 2023). Sean Parker, the first president of Facebook, explained its system as giving users “a little dopamine hit every once in a while…and that [hit] is going to get you to contribute more content, and that's going to get you more likes and comments” (Fisher, 2022, p. 37). This dosing of hits matters because research has shown that social media amplifies echo chambers, confirmation bias, and hateful narratives, and that once an individual enters an online affinity community, a subculture develops in which members often flatten their preexisting values and beliefs to match those of the community (Liggett O’Malley et al., 2020; Modgil et al., 2024).
The online community of incels illustrates the negative impact of echo chambers: individuals who join the group tend to limit their exposure to diverse points of view, reinforce one another's negative views, and enter additional misogynistic online spaces known collectively as the “manosphere.” As the incel community has expanded rapidly in the last decade, so has the threat of incel violence; by 2020, about fifty people had been murdered by incels or incel-adjacent individuals (Hoffman et al., 2020; Sparks et al., 2022). The literature on incels also provides insight into the relationship between vulnerability and engagement in echo chambers. For example, 60 percent of a sample of 250 incels reported psychological challenges, including depressive symptoms, posttraumatic stress, anxiety, and suicidal ideation; 25 percent had symptoms consistent with autism spectrum disorder; and 63 percent had experienced ostracism in middle school. Of even greater concern, 51.5 percent had sought therapy, and only 6 percent found it helpful (Speckhard & Ellenberg, 2022).
At the same time, Haidt (2024) raises additional concerns about algorithms and mental health. He focuses in particular on girls, who are reported to spend more time on social media and to face added stresses related to the way they process emotions, pressures toward perfectionism, and the looming threat of sexual violence. Alarmingly, Lerman et al. (2024) explored anorexia-related algorithms and found that girls seeking help for an eating disorder are steered toward pro-anorexia content on platforms with low content moderation. These findings underscore the need for prevention efforts that address bullying, mental health concerns, effective therapy, and engagement in echo chambers. In the next section, we offer some strategies and a short list of resources that can help disrupt echo chambers and misinformation.
Disrupting Echo Chambers
Social workers need to recognize the relationship between social isolation, bullying, and mental health concerns on the one hand and echo chambers on the other. Social isolation often drives individuals toward homogenous groups that reinforce their existing beliefs. At the individual level, fostering open, empathetic communication and finding ways to break down social isolation are crucial. Social work practitioners and educators need to intentionally create spaces for inclusive and non-judgmental dialogue, whether through community groups, online forums, social gatherings, classrooms, or enjoying greenspaces together.
Within communities, neighborhoods, institutions, schools, and classrooms, social workers need to help build system-wide alternatives to shame-based interactions; as psychiatrist James Gilligan (1997) reminds us, violence is internalized shame. In this regard, feminist scholar Loretta Ross discusses the importance of engaging, in non-shaming ways, with those who offend us, and even with those who have perpetrated violence. She calls this practice “calling in,” which differs fundamentally from “calling out.” Calling in emphasizes engaging with individuals in ways that open the door to dialogue and understanding. Ross recognizes that calling in involves emotional labor; it should therefore never be imposed, and it should take place only when it is safe to do so. With these caveats, calling in can be an important skill for social workers to hone, especially when working with individual clients, community members, or students who are part of echo chambers. Calling in affirms the importance of human relationships and creates a space where people can have their biases challenged in healthy ways. It also helps to break the logic of echo chambers, which encourages a binary framework that pits “us against them.” The practice of calling in allows room for nuance and for loosening confirmation bias. It is a relational act that opens possibilities for curiosity: it offers people the time and opportunity to reflect on statements they make, space to explore and question information, and support for developing solutions instead of assigning blame (Ross, 2025).
Other tools for disrupting echo chambers include the critical feminist idea that blame needs to be shifted from individuals to systems (Goodkind et al., 2021). An exploration of the systems that support echo chambers needs to include an analysis of the impact of bullying and, in the case of incels, the fetishization of attractiveness. Additionally, stronger calls are needed for research into the drivers of echo chambers and the experiences of people who fall victim to them. Social workers also need to examine the data on the positive impact of cell phone-free schools (Haidt & Rausch, ongoing).
Social workers also need a level of digital literacy to contest algorithms, as many are designed to maximize profit and sustain engagement through content that perpetuates racism, misogyny, sexism, homophobia, and every other system of oppression that harms clients and communities. Social media companies rely on advertising revenue and use a capitalistic framework to defend their platforms. Social workers need to become acquainted with sources of misinformation and gain a better understanding of how misinformation spreads, as well as of the strategies that can counter it. One approach to stemming misinformation, “prebunking,” has gained traction. Prebunking, rooted in inoculation theory, functions like a vaccine: it strengthens individuals’ mental defenses against misinformation through exposure to a weakened, forewarned version of an argument, along with prompts or space to analyze it. While the research on prebunking is mixed, there is enough evidence of efficacy (cf. Roozenbeek et al., 2022) that numerous tools now explore prebunking and debunking, including the Disinformation Debunking Station hosted by Boise State University.
We have used two strategies in the classroom. In the first, students create a social media echo chamber by setting up a fake social media account and engaging with it as they normally would to establish a content baseline. They then watch videos or read posts that are significantly more intense or negative than their usual content. Over the next week, students note the content in their feed and observe the incremental increase in its intensity. This exercise allows them to see how quickly their newsfeed pushes them toward echo chambers. The second strategy is to assign learning modules from the Center for Humane Technology (shared in the resource section) so that students learn more about algorithms. Most people believe they are choosing what they watch or read on social media and have little understanding of how that content is in fact pushed to them. The algorithms inherent in social media platforms raise significant concerns about the impact of social media on people's health and well-being. Direct practitioners must understand how algorithms may sabotage their work with clients and develop safeguards for them. Finally, social workers must be aware of organizations like Issue One: Fix Democracy First, which is working toward social media reforms. This knowledge enables us to advocate for policies that support ethical social media platforms, protect our clients from exploitation, and promote public health by highlighting the harms of social media.
Conclusion
Echo chambers stand in opposition to social work and critical feminism, fields that traditionally center on advancing social justice. Echo chambers suppress divergent points of view, narrow public discourse, and fuel the spread of mis- and disinformation. The problems echo chambers create serve as a powerful call to action. As social workers, we must first educate ourselves about echo chambers and address the issue in our teaching, research, and practice. We must also design spaces where diverse voices and ideas are actively welcomed, encouraging dialogue, promoting reflexive practices, and establishing forums for courageous conversations that bridge divides. The antidote to the allure of echo chambers is learning to appreciate diversity: the unique gifts the other has to offer that can stimulate more thoughtful responses, encourage growth on all levels, and lead to mental, physical, social, and emotional health. Practices that lead us to more inclusive conversations can advance the goals of social justice and equity in critical feminism and social work.
Footnotes
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
