Abstract

The fictional artificial intelligence (AI) computer HAL 9000 (Heuristically programmed ALgorithmic computer), which controls the spaceship Discovery One, plays a pivotal role in director Stanley Kubrick’s ground-breaking 1968 film, 2001: A Space Odyssey. The screenplay, by science fiction writer Arthur C. Clarke and Kubrick, depicts HAL 9000 gradually breaking down and serves as an early example of an AI model of schizophrenia; computational models are now being used to investigate illness mechanisms in schizophrenia (Hoffman et al., 2011). Mental illness is a recurring theme in Kubrick’s films, several of which feature a central character affected by it: Alex DeLarge in A Clockwork Orange, Jack Torrance in The Shining and Leonard Lawrence in Full Metal Jacket.
HAL 9000 is able to perform many human-like functions, such as speech, speech recognition, facial recognition, lip reading, interpreting emotions and behaviours, automated reasoning and playing chess. HAL 9000 breaks down after being unable to resolve an internal conflict: it has been programmed to relay information to crew members accurately, but it has orders specific to this particular mission to withhold information from the crew. This situation appears to be a double bind similar to that described in Toward a Theory of Schizophrenia (Bateson et al., 1956), published just 12 years before the film’s release.
HAL 9000 does not malfunction immediately but begins to develop problems approximately 9 years after production, showing minor malfunctions such as incorrectly identifying a chess move (using descriptive notation) in a game it plays with crew member Dr Frank Poole, and mistakenly reporting a fault in the spaceship’s communications antenna. These can be seen as an AI version of a schizophrenic prodromal state, in which cognitive impairment is often observed.
The crew gradually realise the computer is malfunctioning. Faced with the threat of disconnection, and hence loss of control, HAL 9000 reasons that with the crew dead it can continue to operate while concealing its malfunction from mission support staff on Earth, and it sets about killing them. Is this the violence sometimes seen subsequent to threat/control-override delusions? Of the five astronauts, only Dr David Bowman survives; he eventually shuts the computer down by removing its memory terminals one by one. The computer’s last words are a rendition of the song ‘Daisy Bell (Bicycle Built for Two)’, which includes the line ‘I’m half crazy’, possibly signifying a full-blown psychotic break.
If all this seems far-fetched and merely the stuff of science fiction, one need only consider flight QF72 from Singapore to Perth on 7 October 2008. One of the Qantas Airbus A330’s three air-data computers sent incorrect data on measures such as airspeed and angle of attack (a critical parameter used to control an aircraft’s pitch) to other systems on the plane. One of the three flight control primary computers (known as PRIMs) then reacted to the angle-of-attack data by repeatedly commanding the plane to nosedive. This is analogous to a person suffering from schizophrenia who makes decisions, sometimes life-changing ones affecting themselves and others, based on faulty input data such as delusions and hallucinations.
Injuries occurred to 119 passengers and crew, 12 of whom were seriously injured, and at least two long-serving flight crew, including Captain Kevin Sullivan, a former United States Navy fighter pilot, developed post-traumatic stress disorder. O’Sullivan (2017) quotes Sullivan: ‘It’s like 2001: A Space Odyssey. HAL, open the pod bay doors. “I’m sorry, Dave. I can’t let you do that”. I’m saying, don’t push the airplane down. I’m pulling back on the stick. The computers are saying “I’m sorry. I’m sorry, Kev. I’m not going to let you do that”. Information was hidden from us. There was one air-data computer that went rogue. It didn’t identify itself to say, “I’m going psycho”’.
The computer going ‘rogue’ is akin to psychotic decompensation, as it is acting on incorrect input data analogous to the hallucinations and delusions a person with schizophrenia experiences. While Sullivan’s choice of words is perhaps unfortunate, there are clear parallels between HAL 9000, the malfunctioning PRIM and schizophrenia. In a world where automation is becoming increasingly common, this is a sobering thought. As computers become more sophisticated, and possibly develop consciousness, the chance exists for them to develop AI mental illnesses, especially when one considers that humans, with all their inherent flaws, will have programmed them.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship and/or publication of this article.
