Abstract
Following media-theoretical studies that have characterized digitization as a process of all-encompassing cybernetization, this paper examines the timely and critical potential of Günther Anders’s oeuvre vis-à-vis the ever-increasing power of cybernetic devices and networks. Anders witnessed and negotiated the process of cybernetization from its very beginning, criticizing its tendency to automate and expand, its circular logic and ‘integral power’, and its disruptive consequences for the constitution of the political and the social. In this vein, Anders’s works, particularly his magnum opus Die Antiquiertheit des Menschen [The Obsolescence of Man], shed new light on the technologically organized milieus of the contemporary digital regime and also highlight a new form of cybernetic ‘conformism’. The goal of the essay is therefore not only to emphasize the contemporary nature of Anders’s thought but also to use it to frame a critique of current neo-technocratic and, ultimately, post-political concepts, such as ‘algorithmic regulation’, ‘smart states’, ‘direct technocracy’, and ‘government as platform’. This essay argues that cybernetic capitalism is causing what Anders terms ‘Unfestgelegtheit’ to disappear; that is, we are losing the originary possibility of technologically (re-)structuring our world in alternative ways, particularly given the determinist character of current technologies.
In ‘The Obsolescence of Privacy’, an essay written in 1958, Günther Anders puts forward a thesis that remains thought-provoking and timely today: ‘where bugging devices are used as a matter of course, the main precondition of totalitarianism has been established and totalitarianism is achieved’ (2017: 27). A few lines earlier, Anders offers the following elaboration:

every society that allows itself the use of such devices inevitably acquires the habit of considering humans as fully exposable and as entities that are allowed to be exposed. Whilst acquiring this habit, societies inevitably risk sliding into political totalitarianism. As we have seen, technological innovations are never only technological innovations, and this makes the danger so great, for nothing is more misleading than the assertion that machines are ‘morally neutral’. (2017: 24)
Anders’s timely relevance: On the contemporary debate over technical neutrality
The phrase ‘the obsolescence of privacy’, the title of Anders’s abovementioned essay, is currently being adopted in several TED talks and conferences addressing the digital, albeit under very different auspices. To cite one example, Michal Kosinski, a behavioural psychologist from Stanford University, recently gave a talk entitled ‘The End of Privacy’. Kosinski’s arguments stand in emphatic contrast to Anders’s thoughts on bugging devices. The data scientist claims to have shown that an algorithm is able to know a person better than his/her closest relatives on the basis of only 150 Facebook likes, and even better than his/her partner on the basis of only 300 likes (cf. Kosinski et al., 2015). Moreover, in 2016, Kosinski, according to his own description, ‘only showed that it [the bomb] exists’ (quoted in Grassegger and Krogerus, 2017). The ‘bomb’ is a metaphor for highly individualized psychometric targeting methods, such as the display of particular ads to selected target groups that are considered most vulnerable to the messages shown – methods whose development Kosinski at least indirectly inspired. Subsequently, the notorious data firm Cambridge Analytica began to use these so-called ‘dark ads’ in Donald Trump’s election campaign, and probably even in the British ‘leave.eu’ campaign, to help build, analyse and target millions of online voter profiles. In a panel discussion following his 2017 talk, Kosinski continued to claim that privacy is simply no longer an option given the sublime and unstoppable forces of digitization. Eventually, he even went so far as to argue that it is ‘egoistic not to share your data’. Moreover – and this claim illuminates the contrast with Anders particularly drastically – the researcher and former digital consultant holds that technologies, with the sole exception of machine learning, are in principle ‘as neutral […] as a knife’ (cf. Kaltheuner et al., 2017).
What is interesting here is less that Kosinski’s thesis of neutrality is diametrically opposed to Anders’s earlier findings and more that his views are representative of a powerful common sense in Silicon Valley, a place well known for not putting much effort into preserving digital rights or privacy standards. In fact, Kosinski’s arguments read like a profession of faith of business-minded programmers. For instance, during the debate on the circulation of fake news on Facebook, Mark Zuckerberg declared that it is ‘crazy’ to suggest – in any way – that the platform had any influence on the US election outcome (cf. Solon, 2016). According to Facebook’s CEO, the ‘social’ network is nothing but a neutral tech company, that is, it is not a media outlet, as it is not focused on promoting any particular content. In a similar vein, former Google CEO Eric Schmidt and Jared Cohen, the founder of Google Ideas, state that Silicon Valley usually subscribes to the view ‘that technology is neutral but people are not’. According to Cohen and Schmidt, this is a ‘central truth…[that]…will periodically be lost amid all the noise. However, our collective progress as citizens in the digital age will hinge on our not forgetting it’ (2013: 66).
That the producers of digital technologies are not afflicted by Promethean shame while still singing the song of the ‘Californian ideology’ (Barbrook, 1996) seems unsurprising given the sweeping success of their exploitative economic logic. And yet, this shamelessness undeniably has programmatic-political consequences. Indeed, so-called ‘neutral’ technologies are intimately related to a solutionist logic or, to be more precise, to a problematic rhetoric of suspicion. As Cohen and Schmidt state quite prophetically:

even in an age of post-privacy, there will always be people who resist adopting and using technology, people who want nothing to do with virtual profiles, online data systems or smart phones. Yet a government might suspect that people who opt out completely have something to hide and thus are more likely to break laws, and as a counterterrorism measure, that government will build the kind of ‘hidden people’ registry we described earlier. If you don’t have any registered social-networking profiles or mobile subscriptions, and online references to you are unusually hard to find, you might be considered a candidate for such a registry. You might also be subjected to a strict set of new regulations that includes rigorous airport screening or even travel restrictions. (2013: 173, emphasis added)
It is likely that Anders would have traced Schmidt’s vision back to the immanent logic of technology itself, i.e. the internal dynamics embedded in and inscribed into technical applications. More precisely, Anders’s primary objects of critique might well have been the forms of control exerted by algorithms. 1 Indeed, these mathematical procedures are defined by the formula ‘logic + control’ (Kowalski, 1979), whereby it is irrelevant that algorithmic programs seem variable due to the flexibility of their parameters and the formal possibilities of extension. As sequences of determined orders, they remain dependent on unambiguousness and clear classifications. As the media theorist Roberto Simanowski (2014: 106) argues, personalized algorithms reinforce a ‘narcissism of compartmentalisation’. They are based on a binary play between yes and no, between all or nothing, 1 or 0, the visible or the invisible human. In this vein, they are fundamentally incapable of ambiguity, of forms of tarrying or indecisiveness. They exclude the ‘third’ and thus rest on an essential totality, a moment of including exclusion. If applied to the social, algorithmic if-then clauses thus enforce, more subtly, what the experiments of human engineering already hinted at in Anders’s times: an ‘integral’ form of power (cf. Anders, 1980: 140 ff.; cf. Lanier, 2018: 5–25).
Seen from this perspective, Anders’s elaborations on the ‘prejudicing role’ (2017: 24) of technology – i.e. the fact that the apparatus is never neutral but always implies its use, and that we, ‘irrespective of the political and economic system from within which we turn to it’, are always-already shaped by it (2017: 25) – seem most relevant to the digital condition of the present. It is particularly within a surveillance-capitalist, increasingly mediatized infrastructure that the ‘assiduous and subtle infiltration of the communication channels in the social “flesh”’ (Lyotard, 1993: 213) unfolds with new significance. With regard to the internet of everything, it becomes obvious that the knife is not a knife – or, to put it in the language of Anders, that no ‘means is just […] a means’ (1956: 99) and, furthermore, that digital technologies are defined more than ever by their systemic character.
Most recently, the fact that technology ‘prescribes […] or posits the grounds for’ (Anders, 2017: 24) the maxims of action, instead of maxims determining the use of technology, has led to a partly uncanny revitalization of Anders’s thought. On the one hand, within the dogma of ‘solutionism’ 2 (cf. Morozov, 2014), the emergence of the paradigm of what Anders already critically termed a ‘total machine’ (1980: 114) can once again be witnessed 3 – think, for instance, of the specious fantasies about an AI-based ‘master algorithm’ (Domingos, 2015). On the other hand, experts have called for a ‘moral imagination’, albeit without referring explicitly to Anders. The mathematician Cathy O’Neil, for instance, demands a new ‘moral imagination’ (2016: 204) to anticipate the consequences of the application of algorithms. This concern is echoed by AI researcher Kate Crawford (quoted in Solon, 2017), who calls for an ethically motivated mapping of the unintended consequences of our use of Big Data and AI, specifically with regard to technopolitical issues such as algorithmic voter targeting, the growing influence of the so-called ‘Big 5’ (Apple, Facebook & Co.) on the political (cf. Maschewski and Nosthoff, 2017a, 2017c), data mining in the name of intelligence services, or political problem complexes such as fake news and a fundamental messengerization of the political (cf. Fichter, 2017).
Of relevance here is both the substantive timeliness of Anders’s thought and the fact that his critique of technology (focused as it was on the second and third industrial revolutions) can be directly applied to our current digital age (cf. Fuchs, 2017). Indeed, what is still decodable in the idiom of Silicon Valley apologists is a larger, and ultimately older, movement whose powerful efficacy Anders had already noted, if more implicitly than explicitly: ‘Cybernetics’, writes Simanowski (2014: 49, emphasis added), ‘has always been the codeword for control to which the internet – of humans and of things – conveys more and more areas of human life’. 4 The beginnings of this powerful performative science can be traced back to the early 1940s, while its discursive peak coincides with the publication of the first volume of Die Antiquiertheit des Menschen (1956). Thus, as we will elaborate in the following section, Anders’s critique of technicization and technocracy needs to be read as a general critique of cybernetization, particularly of cybernetic conformism.
‘Adapted to adaptation’: Günther Anders as interpreter of an all-encompassing process of cybernetization
To set the ground for our discussion of the ‘obsolescence’ of politics, it is worth tracking the intersection of cybernetics and Anders’s critique of technology, which deploys a terminology that at times seems overtly shaped by a cybernetic vocabulary, even though his work only occasionally mentions cybernetics as such. Indeed, Anders (1980: 140) writes that ‘conformist societies function as prestabilized harmonic systems’, while his critique of adaptive behaviour 5 can be understood as a central concern of the Antiquiertheit des Menschen (cf. Hörl, 2012). Furthermore, he anticipates the progression of cybernetic feedback processes: He writes of the ‘replacement of “responsibility” by a mechanical “response”’ (1956: 245) that ‘transforms the ought (das Gesollte) into what is merely chess-like (schach-mäßig) “correct”, the prohibited into what is merely chess-like “incorrect”’ (p. 246). He also refers to the ‘circular or the spiral process that sustains the conformist society’ (1980: 145). In Anders’s technosphere, machines enter into relations with one another; Anders thus already envisions technical milieus, ‘environments’, viz., as he notes elsewhere, a ‘Volksgemeinschaft of apparatuses’ (p. 115). 6 Moreover, he anticipates that singular devices, characterized by a substantial urge to expand, will integrate and connect amongst themselves. In this vein, Anders claims that it is the ‘dream of the machines’ (p. 110) to grow into an all-encompassing, complete system, a ‘total machine’ (p. 114). In addition, Konrad Paul Liessmann defines Anders’s concept of information by referring to the social cyberneticist Gregory Bateson’s understanding of information as a ‘difference that makes a difference’ (cf. Liessmann, 2002; Bateson, 1981: 488).
What Anders terms the ‘silent command’ of technology (1980: 154) is thereby increasingly accompanied by informational noise, the ‘sound of a million voices’ (p. 153), which defines the existential condition of the socio-cybernetic machine. As the philosopher explains, its machinery

never runs entirely flawlessly […] as it is in constant danger of forfeiting its already-won form, its coefficients of conformity, since it is always a bit in need of improvement – it, thus, always needs to utilise its means to sustain and correct itself. (p. 154)

As Anders notes elsewhere:

Ultimately individuals are desired who have been tailor-made for impudence; individuals who are ready with ‘their hands on the zipper’ to be encroached on and strip-searched. Only those persuaded to actively and ‘happily’ partake in their own de-privatisation are deemed perfect. (Anders, 2017: 35)

In a related passage, Anders describes the desire for

creatures onto which one can foist responsibility; thus, machines of oracle, electronic automata of consciousness – for, cybernetic computing machines are nothing else; as epitomes of science (and with it, progress, and with it, the under-any-conditions human), computers now purringly assume responsibility, while the human stands beside and washes his hands of responsibility, partly thankful, partly triumphant. (Anders, 1956: 245)

Nor is it merely the case that our existence today exclusively unfolds as instances and processes of receiving deliveries, or even that it has become one single, vast consignment, for there is a complementary process that shapes our existence no less decisively than ‘the supply of world’ (Belieferung), namely, ‘being supplied and surrendered to the world’ (Auslieferung).
Anders’s observations regarding the feedback-logical mechanics of conformism point to a systemic change, hinting at a logic of governance that circulates recursively and in a self-contained manner. The latter is exemplified by what Anders describes as a paradoxical double function, i.e. the strange symbiotic relation of the consumer as producer, the exhibitionist as spy, or Belieferung as Auslieferung. Anders describes this (a-)logic as a ‘loss of category’ (p. 31) or a system of ‘vanished differences’ (p. 84) that marks, in the first place, a totalitarian programme. Strikingly, however, and very much in line with second-order cybernetics avant la lettre, he understands this programme not as a unidirectional machine à gouverner that moves the masses by merely pushing a button. Rather, going beyond the collectivizing effects of the radio or the television, Anders recognizes that in an era of ‘cybernetic capitalism’ (Tiqqun, 2007: 41), mechanisms of control that are all-embracing but far more subtle, smooth and comfortable become dominant. These are mechanisms that are necessarily bound to liberal-promotional propositions, to a form of ‘passivity in the costume of activity’ (Anders, 1980: 145), thus retaining the ‘illusion of freedom’.
Indeed, Anders foresaw that at a time when everyone participates in the conformist noise of ratings and rankings, when everyone looks into the same smart Gehäuse and thus follows standardized presentational patterns for the ‘individual’ self (cf. Maschewski and Nosthoff, 2018), the promise of total functioning manages to install an instrument of power more effective than any externally enforced synchronization of thoughts. In fact, it could be argued that today it is various forms of solutionism that are colonizing the socio-political imaginary – think, for instance, of Google’s Sidewalk Labs in Toronto or China’s Social Credit System. As we will exemplify further in the next section, some of these approaches go so far as to redefine the state as a social network and politics as mere logistics. One primary aim of these endeavours is to optimize the political through a form of mathematical thinking, which subsequently leads to the suspension of politics itself. Strikingly, this development echoes one specific thought of Anders: as technical devices gain relevance, they eventually tend to suspend the political, understood, above all, as a realm of (reasonable) dissensus.
Anders’s critique of technical politics as a critique of (neo-)cybernetic models of politics
More precisely, Anders considered political revolutions in particular to have become obsolescent, with the political remaining at best an irrelevant superstructural phenomenon. ‘Freedom exists only as auto-mobility’, writes Liessmann (1993: 106), with particular regard to Anders’s analysis, ‘equality as TV for all and fraternity as the community of users of databases’. Peter Sloterdijk (quoted in Meerman, 2011) has argued in a similar vein: Instead of being the subject of a revolutionary uprising, the demos rather suffers the ‘revolution’ that designers and programmers impose upon it, a revolution declared against citizens’ everyday lives. Thus, today, one could indeed claim that sovereign is he who decides on the norm. Following Anders’s arguments regarding the technical suspension of political revolutions, sovereign is he who forcefully redesigns and shapes our life forms, thus eventually creating irreversible facts.
Correspondingly, the limitations of the contemporary socio-political imaginary can be witnessed in new disruptive, neo-cybernetic models and practices of governance that culminate in various forms of algorithmic steering – ranging from ‘smart states’, ‘government as platform’ and ‘direct technocracy’ to concepts such as ‘nudging’ and ‘algorithmic regulation’. 9 Politics is hereby understood mainly as an automatically processing system of coordination that reacts to irritations. The allegedly horizontal, direct-democratic freedom from hierarchies – Facebook and Co. are often seen as role models here (cf. Khanna, 2018) – subsequently installs a neo-cybernetic mode of governance that aims at liquidating ‘obsolescent’ institutions (parliamentary democracy, parties, etc.) as well as their intervening regulatory powers (cf. Noveck, 2015). Indeed, these evidence-based processes of automation actualize Anders’s thesis that once technology starts to intervene in, colonize and ultimately suspend the political, 10 its significance ‘gain[s] the upper hand to such an extent that the political occurrence ultimately plays out within its frames [the frames of technology]’ (1980: 108).
The aforementioned critique of cybernetic control can thus also be read against the background of what Anders calls the ‘perfect integration’ (2017: 27) of the state. As he writes in the second volume of Die Antiquiertheit des Menschen, an integral state ideally has no ‘blank spots on its map’ (2017: 26). It either reaches all individuals or produces citizens who are themselves voluntarily ‘eager to accommodate’ it and are transparent and without walls. In brief:

the totalitarian state would be perfect only if ‘discretion’ (as conceived by natural philosophy) did not exist, not to mention selfhood, ‘privacy’ and ‘intimacy’ in the psychological sense. (2017: 26)
Similar to current models of algorithmic regulation, the early forerunners of cybernetic governance were – in line with Shannon and Weaver’s (cf. 1949) influential model of information – focused on the intensity of communication rather than on its content or semantics. Indeed, Karl Deutsch was convinced that the higher the intensity of communication, the more democratic the state. Accordingly, central to political-cybernetic thinking was the registration and subtle steering of the people’s will, as well as the informative integration of that will into a larger systemic context. To this end, cybernetics relied on a socially embedded system of feedback that focused precisely on what Anders so vehemently criticized: adaptive behaviour. In this vein, the aim was to establish spontaneous orders and to avoid content-related dissensus, or any other form of political antagonism.
At the same time, the cybernetician Stafford Beer focused on implementing a dialectics of freedom and control. According to him, freedom was a ‘programmable function of effectiveness’ (Beer, 1973: 6) or, as he described the relationship elsewhere: ‘The freedom we embrace must yet be in control’ (Beer, 1974: 88). Günther Anders also sought to decode this particular conflation: According to him, it is part of the ‘duty of the conformist that he never slips free from freedom’ (1980: 143). In this vein, freedom and control are, particularly in a highly technological age, about to enter a problematic relation, whereby ‘the privation of freedom […] goes hand in hand with the ideology of freedom’ and the ‘abolition of freedom […] usually proceeds in the name of freedom’ (Anders, 1980: 195). In this ‘integral system’ (p. 187), participation is, as Anders writes, reduced to mere ‘acts of playing along’ (Mit-tuns) (p. 184), an observation that indeed anticipates the reductive dimension of the cybernetic understanding of politics, which, as Dieter Mersch has recently noted, encompasses only the dimension of Teilnahme (in the sense of merely attending) and precisely not that of Teilhabe (to partake or have a part). 11
As has already been emphasized, Anders refrains from describing the mechanisms of participatory integrality in the sense of a deterministic logic of direction; rather, cybernetic capitalism works on the grounds of a pre-structured horizon of possibility:

As our gateways (Einfallstore) are wide open because there are no longer walls between us and the system – since we live in ‘congruency’ with its contents […] – it is always self-evident to us […] how far we can exceed the limits of this system and how far we cannot. (1980: 186)
For Anders, technocracy is thus not merely defined as the domination of technicians. It rather highlights the fact that the world and our relation to it are mediated technologically, thus culminating in a universe that is, de facto and unchangeably, always-already technological. Technocracy in Anders’s sense is thus best understood etymologically: It addresses not primarily a particular state form but rather the increasing dominance of a technology that is in the process of becoming absolute. At the end of this process, technology remains the only forceful subject of history (cf. Dries, 2012: 171 ff.). In a similar vein, what political cybernetics understood as systemic adaptation aimed primarily at rendering noise and irritations productive by integrating them automatically – not least to weaken the possibility of actual overthrows. Karl Deutsch (1963: 227), for instance, conceived of the French Revolution mainly as a problem of information. In Deutsch’s own words: in general, the ‘history of revolutions appears to a significant extent as the history of internal intelligence failures in the governments that were overthrown’ (1963: 158).
In this context, it is vital to consider what Anders decodes as technology’s power to act, including its tendency towards automation. According to him, cybernetic technologies become increasingly autonomous, as they tend to ‘join in with another greater machine’, thereby attempting ‘to take over their environment’ (1980: 119). This thesis is currently echoed politically with respect to phenomena such as social bots, the affect-centered dynamics of filter bubbles or ‘filter clashes’ (cf. Pörksen, 2018: 118) and the systemic-integral power of platforms – including their idiosyncratic logic of autonomous adaption and expansion.
However, despite what Anders decodes as the tendency towards automation, he is far from formulating – in any sense – a technical-deterministic approach. It is true that he repeatedly emphasizes the increasing autonomy of networks. Yet he does not refrain from referring to powerful tech representatives – in his own words, to producers and ‘dominators of apparatuses (Gerätebeherrscher)’ (Anders, as quoted in Dries, 2012: 171). Indeed, both factors are significant for understanding the contemporary situation. It is arguably precisely the dialectics between the capital of tech apologists and the quasi-autonomous power of technologies that multiplies the discursive dominance of tech producers. This is also the reason why the social imaginary seems almost incapable of creating alternative visions of the social apart from subscribing to the technocentric pseudo-utopias produced and pitched by Silicon Valley. In Anders’s sense, then, the future seems at best fabricated. It appears produced in the very sense that a potential ‘futurelessness’, i.e. ‘the possibility of its own abandonment’ (1956: 282), is inscribed into it. 12 In this vein, one can indeed conclude that the current form of cybernetization is based on a very peculiar totalitarian logic that tends to bring about the ultimate disappearance of the political as such. What replaces politics, then, is not merely a technocracy in Anders’s sense, that is, a technocracy that totalizes the ‘principle of machines’ (2002b: 49). On the one hand, what manifests itself is a non-ideological ideology, viz. what Anders termed the alluring ‘world of a post-ideological cockaigne’ (1956: 197).
On the other hand, with regard to new, seemingly hyper-ideological phenomena of tech-mediated ‘resistance’, for example the alt-right, whose members primarily interact via Facebook groups, Reddit and so-called ‘alt-tech’ networks (Gab or WrongThink), one can even witness the emergence of a new form of anti-politics. The latter is anti-political both in that it reinforces an intolerant attitude of closure and in that it undermines any meaningful form of discourse by primarily addressing situational affects, not least by communicating heavily through what Anders would term a ‘global flood of images’ (1956: 3) (the contemporary equivalents of which are fake videos, undifferentiated GIFs and memes). At the same time, in the case of the left, it has been argued that current forms of activist clicktivism tend to weaken long-term emancipatory potentials and progressive party politics (cf. White, 2016).
One can interpret the detected correlation between an increasingly cyberneticized form of communication and the disappearance of the political as proof of the adequacy of Anders’s method of prognostic hermeneutics, including its ‘exaggerations in the direction of truth’ (Anders, 2002a). This holds specifically for ‘the end of politics’, which was frequently diagnosed from the 1970s onwards, subsequent to Anders’s earliest illuminating analyses – in particular by Baudrillard (1978), Flusser (2009) and, most recently, Tiqqun (2007). Although Anders grew increasingly sceptical vis-à-vis the potential of his earlier concept of a ‘moral imagination’, his insightful observations reveal, in hindsight, the gelegenheitsphilosophische sharpness of the ‘forward-swept historian (vorwärts gekehrten Historiker[s])’ (1980: 429).
Concluding meditations regarding the question of critique: The potential of Anders’s method
As this paper has shown, the peculiar diachronic contemporariness of Anders’s thought results specifically from the fact that the process of cybernetization, whose speculative beginnings Anders witnessed critically, has now given rise to an extensive new form of governmentality. Indeed, as we have noted, the theoretical end of cybernetics as an institutionalized science went hand in hand with a programmatic actualization of its systemic logic in various areas of our being. Currently, it marks a movement that affects not only the social in its totality but also political thinking and current forms of governance.
Strikingly, in this context, Dieter Mersch diagnoses a ‘discursive totalisation’ (2013: 49). As he claims, any diagnosed problem is often more or less automatically linked to a solutionist logic, which correlates with an excess of data, automation and interconnectedness. Mersch hereby explains the aforementioned imaginary deficits – in quite the sense anticipated by Anders – by referring to the fundamental (cybernetic) ‘misjudgement’ (p. 55) that the ‘nets and channels obtain basic democratic potential, that through them, non-hierarchical spaces could be built, that they are technologically reprogrammable’. Mersch further claims that the opposite is the case:

The nets are regimes of authorization, regimes of dressage. They are such precisely through their openness. […] If the saying of their democratization is meaningful at all, then it is at best in the sense of an egalization of control, its interiorization of self-access. (p. 56)
In this mode, as Anders noted decades ago, effectivity is precisely not grounded in a suspension or suppression of individual communication. Productivity rather means letting that communication expand, enforcing a generalized ‘wall-lessness (Wandlosigkeit)’ (Anders, 1980: 150). Within self-organizing systems, control proceeds through constant ranking, rating and monitoring. It operates on the basis of infinite feedback loops (cf. Bröckling, 2008; Mau, 2017). Power hereby becomes manifest in the management of effects, in the alignment of the channels of communication. The primary aim of cybernetic governmentality can thus be defined as the unconditional perpetuation and expansion of its characteristic forms of circulation (cf. Tiqqun, 2007). Facebook, for instance, is a closed system, yet one focused on installing a constantly expanding system of (self-)regulatory feedback – a space that suppresses, in the long run, any subjective, individual form of expression (cf. Maschewski and Nosthoff, 2019b).
Following Günther Anders, one could argue today that, on the basis of these mechanisms, a new form of conformism emerges: a conformism that need not even focus on synchronizing content or semantics, i.e. on delivering the ‘equal […] and the same material’ to individuals (Anders, 1980: 150). Today, it suffices to determine the channel – or the interface – through which one communicates in order to maintain control over that channel’s direction. Indeed, we have reached a point where even forms of resistance remain systemically productive, as long as they do not negate the circles of communication.
Participation, the ‘necessity to connect (Sichanschließen-müssen)’ (Mersch, 2013: 49) to the channels of communication, including their characteristic forms of individual profiling, thereby reflects the ‘prejudicing role’ of technology in a highly atomistic manner – and yet, those channels of communication determine society as a whole. For example, the individual is provided with a variety of options – Facebook offers around 60 gender options. Yet, rather than providing greater freedom, this vast selection is based on a perverse logic that is decisive for cybernetic capitalism: the more options offered, the more precise the choice, the more determinate the profile, the more valuable the information. In this vein, Andreas Bernard points to the paradox that the ‘promises of freedom of the pioneer years still form the ideological foundations of all new apparatuses. However, the methods of individualisation no longer aim at diffusing the subject but determining it’ (2017: 46) – which is the decisive point.
‘If there are markers today’, writes Günther Anders, ‘then it is not us who mark the apparatuses but it is, inversely, the apparatuses that mark us. We become their impression (Abdrücke), their […] expression [Ausdruck]’ (1980: 424). What is interesting about this claim is not an alleged, hidden technological determinism that media and cultural theorists have repeatedly, and falsely, ascribed to Anders. Nor does Anders betray any forgetfulness of technology; that is, he does not subscribe to a pre-technical position. Rather, what is articulated here is an existential finding linked to Anders’s early anthropological observations, particularly the insight that freedom is only realized in the practice of undetermined technical Einrichten, i.e. a form of existential ‘non-specificity’ (Müller, 2016: 155), a contingent artificiality that is characteristic of the human per se (cf. Anders, 1937). This early intuition is indeed inscribed into his later magnum opus, yet it remains realistic regarding actually existing as well as potential future technologies. Furthermore, for Anders, it is precisely the pharmacological character of technology itself that is threatened by the process of machinization, which subsequently undermines its free use.
Anders’s critique thus strongly reflects its own technological embeddedness – even more so, it redefines this embeddedness in the sense of an aporetic-existential task confronting thinking itself. As Anders writes self-critically, there is ‘no one who is not in any sense brought into line (gleichgeschaltet wäre). This holds too […] for the writer of these lines’ (1980: 141). What follows from this, however, is precisely not unconditional surrender in the face of the technically constructed but rather the necessity of a constant confrontation with it, a testing of human limits against the all-powerful machine. Such a method would, at the very least, give rise to a widening of one’s own imaginary horizon. The formulation of this space of thinking is, for Anders, a practical exercise that aims at a ‘hyperextension (Überdehnung)’ of the ‘habitual achievements of fantasy and emotion (gewohnten Phantasie- und Gefühlsleistungen)’ (1956: 274).
Seen from this perspective, Anders’s critique articulates an existential concern that is uncannily timely: humanity’s fascination with the processes of cybernetization and machinization could, finally, undermine its originary openness, its undeterminedness and disclosedness. As such, it is difficult to catch up with developments in the realm of artificial intelligence via the development of a ‘moral imagination’:

essentially new and unheard-of (unerhört) is the alteration of our body […] because we conduct the transformation of the self for the love of our apparatuses; because we make them the model of our alterations, thus, passing on ourselves as a benchmark and, as a consequence, narrowing or giving up our freedom. (p. 46)

As the king liked little that his son, leaving the controlled roads, roved about cross-country to form his own opinion about the world, he gave him a chariot and a horse. ‘Now you don’t have to walk any longer,’ were his words. ‘Now you must not do it any longer’, was their sense. ‘Now you cannot do it any longer,’ was their effect. (p. 97)
Acknowledgements
An earlier German version of this article was published in Behemoth. A Journal on Civilization 11(1). The authors would like to thank the anonymous reviewers and especially the editors of this special issue, Christopher Müller and David Mellor, for their very helpful remarks. Furthermore, Christian Dries and Wibke Liebhart provided insightful thoughts and comments on an earlier version of the article. Moreover, we would like to express our gratitude to the participants and the organizers of the Günther Anders Conference 2017, Christian Dries and Oliver Marchart. The conference offered a first opportunity to discuss a very early draft of this paper.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
