Abstract
This essay reflects on a decade of worsening social media harms—inequality, algorithmic power, precarious labor, and data extraction—arguing for renewed interdisciplinary and collaborative approaches to create just sociotechnical futures.
On the occasion of this journal’s launch, I penned an angry little manifesto called “Social Media and the Struggle for Society” (Baym, 2015). My argument was this: Social media are valuable inasmuch as they support the sociality crucial to our humanity. But their business models, premised on growth and advertising, could never support the lofty goal of strengthening social ties, a goal to which some of their leaders then claimed to aspire. I wish this reflection on that essay could tell a happy story about the state of the struggle. I still believe in the extraordinary value of our ordinary communication. Social media platforms are still where so much of that happens. But this decade has been so much worse than I feared.
In that essay, I pointed to four of the ways social media threatened society: they were exacerbating wealth inequality; their algorithmic feeds served interests we could not know; they encouraged an unrealistic willingness to engage in precarious online creative labor in hopes of getting paid; and we never knew where the data we shared would end up, used by whom, or for what purposes. These problems transcended social media, I wrote: “they concern capitalism, democracy, and the fundamental underpinnings of societal fairness.”
How’s All That Going?
It’s Bad
Take wealth disparity. In 2015, I called out Mark Zuckerberg’s net worth, then a mere $44 billion, as emblematic of the growing inequity between social media platform owners and those they claimed to empower. According to Forbes Wealth Team’s (2026) ranking of billionaires, Zuckerberg started 2026 with a net worth of $216 billion. Sergey Brin, who made his fortune from Google/Alphabet, parent of YouTube, had $237 billion. Twitter, now X, is famously owned by the world’s richest person, who, with a net worth of $730.6 billion, found it trivial to spend $44 billion turning the club where credulous journalists hang out into a propaganda machine to serve his ideological aims. There’s a reason the term “broligarchy,” with its allusion to tech leaders’ political dominance, has taken hold. Wealth inequality is political inequality.
Algorithmic feeds? Certainly, recommendation has its place. I’m as soothed as anyone by an endless scroll of feel-good dog videos. But the center of our interpersonal communication spheres has never been the place for companies to impose venture-capital-driven ideals of what people want. As Burgess and Baym (2020) showed, Twitter shifted from fostering ephemeral social connectedness to amplifying outrage for clicks long before it became X. Whatever they may say, none of the dominant social media sites ever really optimized feeds to push people toward interaction or media that made them better citizens, neighbors, and loved ones. Brave trust and safety officers aside, platforms’ leaders never really tried to eliminate coordinated campaigns, often by shady actors, to shift public opinion and foster division. While the dire state of democracy and societal fairness can hardly be attributed to algorithms that optimize for getting and keeping attention, those algorithms too often enable and amplify harm. As Whitney Phillips (2015) titled her too-prescient book about trolling, this is why we can’t have nice things.
Precarious labor? As I wrote about in Playing to the Crowd (Baym, 2018), being visible on social media—be it TikTok, X, or LinkedIn—is ever-more intrinsic to many people’s plans for earning a living. “Influencer” has become a common career aspiration. In economies where even a full-time job may not pay the bills, the rise of content creation as an aspirational career path is unsurprising. Platforms offer the sense that anyone could create and monetize fame with as little as their phone. Yet just as the dream of launching a startup that makes you the next billionaire was always out of reach for nearly everyone, pouring time into building an influential social media presence will never ensure success, let alone a sustainable career. But it’s one form of fuel for the platforms’ continued influence.
And what of all the data we offered without knowing where it would end up, used by whom, or for what? All the posts, all the pictures, the videos, the memes—the “content” that’s fueled this era of “communicative capitalism” (Dean, 2005)? Of course we got surveillance and government overreach. That was already happening, though perhaps it’s gone further and faster than many anticipated. It was harder to foresee how much of that data, and more, would be appropriated to train generative AI models. It’s easy to point to the scandal of the moment as evidence of horrific unexpected outcomes of social media data sharing. As I write, it’s Grok turning photos of women and children into non-consensual violent porn for distribution on X. But we are starting to see more subtle pernicious concerns as covert-AI outputs infest our feeds, undermining trust and leading responsible readers and viewers to ask “is this AI?” even of material as mundane as feel-good dog videos. Authenticity, schmauthenticity.
I have been studying the internet since 1991. Recent years have tested my understanding of what such work should do. What does it mean for us that despite all our research, much of which called out the serious problems before us, so much just keeps getting worse? What are we to do now?
I still believe that it is important to document where things go right. There are glimmers of alternatives in less-algorithmic decentralized social media like Mastodon or Bluesky. There are still communities of care and support online. Mutual aid networks grow. Friends still keep in touch. Relationships endure. Surprising, entertaining, and informative tales do show up in one’s feeds. We need optimistic and inspiring visions and theories, for ourselves and for those who come after.
It’s also still important to document and theorize the ways power shifts, and to understand the dynamics through which societies are harmed by social media. The world needs analytic lenses that enable everyone, from those who develop and deploy technology, to those who use those technologies, to those who govern them, to make wiser, more considered choices.
Increasingly, though, I worry about critique and celebration for their own sakes. I worry we have not taken account of our own inability to affect how this decade of social media played out nor asked what we must do differently. I worry we underestimate the extent to which we are actors in the worlds we study.
None of us as individuals, nor the fields in which we work, have all that we need to make change. We will be at our most powerful when we collaborate across disciplines and sectors with those who bring expertise, capabilities, and networks we lack. Some in our fields, including many who have published in this journal, are doing this already. For example, they are working with governments, NGOs, worker organizations, and platforms to shape policy. They’re leading boundary-spanning initiatives like the ESRC Digital Good Network that seek alternative ways of imagining and creating what media could be. Like me, they are working in industry, doing what we can to develop and translate findings from research in hopes of nudging technology and its deployment toward more just formations.
These collaborative, interdisciplinary, and cross-sector engagements are complicated and messy. We must know what we bring to the table, not least our capacity to diagnose sociotechnical problems (Gillespie, 2023), while being eager to understand others’ insights, concerns, and the structural conditions that shape their practices as well as our own. This requires time, curiosity, humility, and compromise. But despite the challenges, these kinds of engagements are our best hope if our work is to have the impacts it deserves.
I now look at the closing sentiments of my 2015 piece as encouragement to this dispirited future me. This would be a long struggle, I wrote, one in which we had every reason to be angry, but couldn’t afford to be cynical. That these already-hard problems have gotten so much harder is no reason to stop studying them. It is every reason to rethink how we approach them. Social media have influenced society. Our work must too. Let’s aim to celebrate that the next time Social Media + Society has a significant anniversary.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
