Abstract

I launched this journal 10 years ago because I wanted to redefine an overused term: Social Media. In what I called our “Manifesto Issue,” I invited members of our Editorial Board to do just that: write mini-manifestos of what social media – of all forms – might be. I asked for essays, songs, poems, visuals, lists, rants, and random philosophizing, among other forms of content that would reflect my commitment to doing things otherwise.
Ten years later, we are a different journal. Grown up, but not mature. Child-like, but not childish. Imaginative and Other. I am grateful that people responded to the vision we laid out in that manifesto issue, even though whenever I use the manifesto word in US-based academic circles, people assume I am gearing up for a revolution. Raymond Williams reminds us that revolutions are long, and I add that they have to be long, to attain meaning. And we are nowhere near revolutionizing social media, as societies, and as the contributors to this anniversary issue, illustrate. We remain caught up in affective cycles of using banal media for subversive purposes, at best.
So, what is next? I invited academics who have shaped this journal’s evolution over the past 10 years to write about what’s to come for social media, of all forms. In my own contribution, I offer my thoughts on where we are headed next. When I started the journal, I wrote that we have always been social. Any technology we make, use, or adapt will by definition be social, because it is of us, by us, for us. We overuse the term social and often apply it to things that are not social. But we, as humans, cannot escape being social.
Much of the technology we use today overuses the term machine, frequently to posit that machines are distinct from humans. The term machine is used to signal difference, and to present that difference as reason for trepidation. I am reminded of how the term social is similarly overused. It is deployed to advertise the social properties of platforms. It is bundled into the attention economy. It is monetized into influencer culture. It is routinely unitized and conspicuously reproduced into micro-acts of networked connection. It is performed, affectively and rhetorically, in ways adapted to the logics of new platforms. Finally, it is assumed to be the defining characteristic of platforms presented as social media, as if these platforms were more social than media used in the past, or different from other technologies that were somehow asocial. We have always been social, I proclaimed in starting the journal, and we will always be social.
Similarly, in embracing what lies ahead for our journal, I want to emphatically say that we have always been machines. We will continue to be machines. We have always deep learned, learned through large-scale classification, through mimicry and role modeling, via codes and agents, embodied and non-embodied. We learn with bias and constantly learn how to reject bias, for bias never goes away; it only shifts form. Our anatomy is machinic. We need servicing for parts. We lag and give slow responses when we have not been charged. We need memory upgrades, but they are not easy to come by. Our parts break, but we cannot always replace them. Older models are often phased out, and newer models might be more expensive and less efficient.
DNA is our code, and our environments make up the complex large learning landscapes we draw from as we evolve and role model our way through them. We are machinic by nature and always will be. Consciousness is our sentience. Neurons bind our muscles into webs of motions, affects, and actions. Our hearts beat to the rhythm of the breaths we take to live, and when they do not do so with machine-like accuracy, we fail. When we create machines in our likeness, we first do so by breaking down our own anatomy and functions, as if we were reverse engineering a prototype in order to replicate it. Yet we refuse to see ourselves as machines. We assume that a machine will lack the nuances of a human, given its machinic nature. Yet our own means are machine driven in ways that do not prevent us from being sentient, sensitive, aware.
We simultaneously overestimate and underestimate machines, in ways similar to how we behave with our fellow humans. We insult and congratulate them, in a manner common in our everyday social exchanges. Machines let us down and surprise us, much like people.
What is wrong with identifying with a machine? Why the need to distinguish between creator and creation, as if the creator were not at some point a creation themselves?
Here is where I hope the journal will head next:
- Develop new models for understanding human–machine communication that place less emphasis on the hyphen separating the two terms.
- Turn to models that emphasize the vocabulary of kin, likeness, and companion species, and turn away from language that replicates the term AI – a term we have often criticized for its inadequacy. In plain speak: AI lives rent free in our brains, and it needs to move.
- Machine politics; synthetic politics and synthetic data.
- Synthetic sociality and social robots.
- Platforming consciousness and intelligence.
- Social learning as deep learning.
- Systems of governance for embodied and non-embodied agents.
- Social misinformation; epistemic hardening of categories; fluid science; classification as social science; micro-species classification and sociality.
I am adding to the list as I go along and I look forward to your suggestions.
We are not different from or like machines. We are machines, always and forever trying to find meaning in being human.
