Abstract
A market for mental health apps, designed to help millions of refugees manage symptoms of Post-Traumatic Stress Disorder and other mental health issues, has proliferated since the outbreak of the so-called refugee crisis in 2015. These bite-size, on-the-go, mindfulness-based apps have emerged at the intersection of new investment models, state-of-the-art AI, and surveillance and border control regimes. Conceived of as a more cost-effective approach to refugee mental health care, mental health apps are part of a larger endeavour to create the 'smart' refugee. Self-monitoring, agile, entrepreneurial and resilient in the face of adversity, the smart refugee is expected to emerge as a node in a network of information flow, constantly connected to digital technology, at once receiving and providing real-time data. Biometric and data markets, some of the fastest growing in the world, have already been eagerly collecting refugee fingerprints, iris scans, facial images and genomic information. To add to this arsenal of data, the new apps are harvesting, storing and selling what I call the mental prints of refugee trauma, turning the human experience of loss, grief and suffering into quantifiable and marketable commodities.
Introduction
With the outbreak of the refugee ‘crisis’ of 2015, big tech companies, small startups, humanitarian agencies and enthusiastic activists have rushed to design and circulate mobile phone apps aimed at helping refugees on their perilous journey of escape and in navigating their new host environments. Thanks to the setting up of accessible Wi-Fi hubs in camps and along escape routes, refugees are able to connect with lost family members, acquire legal support and find information on language courses, medical care, translation facilities and food banks. InfoAid, MeshPoint, RefAid, FindHello, Tarjimly, DxtER and Refugee Care are only a few examples of apps which provide ‘on-the-go’ mobile solutions for the challenges facing refugees fleeing civil wars, climate disasters, ethnic cleansing and other disastrous events. New apps have also emerged to help refugees manage their mental health, particularly within the framework of Post-Traumatic Stress Disorder (PTSD), which has come to dominate humanitarian interventions in the last few decades.
From Germany to Lebanon, Greece, the US and Bangladesh, refugees are advised to use mental-health apps sponsored by humanitarian bodies, such as Sanadak, Balsam, the Application for Mental Health for Refugees (ALMHAR) and HELP@APP, as well as other mental health apps commercially available in the market. In order to heal from trauma and to emerge as resilient subjects who can cope with future uncertainties, refugees are increasingly guided to use apps based on principles central to behavioural psychology, Cognitive Behavioural Therapy and mindfulness techniques (Arthington, 2016; Raveley, 2016; Walsh, 2018). These principles emphasize the need for the individual to exist ‘in the moment’ and to accept and tolerate symptoms of trauma rather than challenge their underlying causes. The guru of mindfulness, Kabat-Zinn, summarises the healing powers of mindfulness, central to these apps, as the ability to help individuals accept things as they are ‘rather than struggling to force them to be as they once were, or as we would like them to be’ (2005: 136–137). In a life described as an '…unending stream of human suffering and misery' (Kabat-Zinn, 1994: 5), ‘smart’ refugees are expected to emerge as digitally-connected, self-advocating individuals, responsible for managing their well-being and ready to face future insecurities with little or no institutional support. Mental health apps for refugees are, in effect, designed to replace face-to-face support from medical professionals.
This article examines refugee mental health apps as a new political technology operating within a larger therapeutic regime for the management of the self and for the creation of mindful neoliberal subjects. It situates the new apps at the intersection of the pervasive expansion of PTSD as a framework for analysis and policy intervention, particularly in the humanitarian sector, on the one hand, and the expanding digitalisation and datafication of human activities on the other. The article first interrogates the political role played by the PTSD framework in managing refugee populations and its persistence, both conceptually and as praxis, despite powerful critique from activists and medical staff as well as feminist, post-colonial and indigenous scholars. The trauma apps examined in this article embody this PTSD framework, which has been critiqued for its focus on individuals rather than communities, on clinical diagnosis rather than political contexts and on singular events rather than sustained and multigenerational processes (Ehrkamp et al., 2021; Linklater, 2014; Marshall and Sousa, 2017; Pain, 2021).
The second part of the article examines the rise of PTSD apps and their circulation as commodities which extend market logic to some of the most intimate experiences and emotions associated with the refugee existence. Heavily entangled with corporate and political interests, these apps particularly feed into a growing world of biometrics and surveillance. Biometric systems, which collect data from millions of refugees by amassing fingerprints, iris scans and DNA, can now add what I call ‘mental prints’, generously shared by the self-reporting refugee, to their data banks. With the help of an arsenal of sensors, algorithms and digital wearables, ‘smart’ refugees are expected to emerge as digitally-connected, mindful, resilient and self-monitoring subjects.
Methodologically, accounting for all refugee PTSD apps is a challenging task. The research for this article has revealed a whole range of digital apps which are either defunct or have never seen the light of day. Navigating through the ‘digital litter’ of ghost websites, broken links and dormant apps which are no longer operational but give the misleading impression of still being updated, one has no clear idea of the size of this market. A study by the Migration Policy Institute, however, reveals that most of the 169 refugee apps and other tech solutions launched in 2015–16 were defunct by 2018 (Benton et al., 2018). Defunct apps are often the result of a lack of regular maintenance and of short-term planning and funding cycles. As donors' initial enthusiasm quickly fizzles out and policy agendas shift, apps are abandoned and new ones are embraced. The fast turnover of technology development also means that apps can become obsolete before any attempt to use them on a large scale.
Among the apps which my search came up with but which were unavailable for download were BALSAM, Karim Chatbot and the Smartphone Mediated Intervention for Learning Emotional Regulation and Sadness (SMILERS). The latter is supposed to be available to Arabic-speaking iPhone users and targets those suffering from mild depression through seven picture-based modules that portray the treatment of a fictional character while also providing information and practical exercises along the way. The EU-funded Syrian REfuGees MeNTal HealTH Care Systems project has also developed HELP@APP, which consists of five weekly sessions based on an educative narrative and interactive exercises for stress management, as well as an activity planner for behavioural activation, targeting Syrian, Palestinian and Lebanese populations. 1 While no first-hand account of these apps was possible, detailed descriptions of their content and rationale and information on their technical operation are readily available on developers’ webpages and on the websites of international organisations such as the EU and the United Nations High Commissioner for Refugees (UNHCR).
For the purpose of this article, two apps have been downloaded and examined. The first is Sanadak, a self-help app that targets PTSD among Syrian refugees by providing ‘low-threshold cognitive behavioural therapy’ in Arabic while equipping them with tools to rise above the problems ‘that life can bring from time to time’. The second is ALMHAR, available in Arabic, Farsi and English, which offers information on common emotional problems faced by refugees including stress, anxiety, traumatic memories, guilt, depression and grief.
The political work of PTSD
The ‘therapeutic turn’ under neoliberalism has seen an increasing medicalisation of all aspects of social life and an ever-growing move towards expanding notions of health and illness into more domains of human experience (Madsen, 2014). In this perspective, individuals are understood as pathologized subjects on a journey of healing, and the fractured nature of life under capitalism as an unavoidable part of human existence rather than a direct result of political and socio-economic structural conditions. There is, incidentally, nothing unique to the neoliberal age in this respect. Since the end of the 19th century, mental health disorders have evolved as socially constructed categories, constantly shaped by political and corporate interests. Many authors see mental health categories as functioning as policing tools, a justification for the prevailing capitalist order and a means for the production and management of particular subjectivities (Abdelrahman, 2021; Cohen, 2016; Navarro, 1980; Rose, 1993). 2 While such classifications have tended to be ideologically and politically driven, the diagnosis and treatment of mental illnesses have typically been divorced from the socio-economic and political contexts in which they emerge and instead reduced to individual pathologies.
Against this background, PTSD has risen as a dominant framework for the analysis, diagnosis and treatment of refugees, who are increasingly seen as victims of multiple traumatic experiences including wartime violence, dangerous escape journeys, protracted displacement, xenophobia, structural racism and precarious living conditions. As a diagnosis, PTSD firmly locates trauma in the biological and the cognitive, explaining its symptoms as a defect in the individual’s processing function (Goozee, 2020). As a treatment, it targets behavioural change and cognitive transformation in individual sufferers and their ability to learn to tolerate their symptoms.
PTSD, like all other mental health classifications, is historically and politically contingent and is held together by practices, technologies and narratives which determine how the ‘traumatised’ are diagnosed, studied, treated and represented (Ehrkamp, Loyd and Secor, 2021: 121). The pervasive power of PTSD in the last few decades has led many authors (see Casper and Wertheimer, 2016; Ehrkamp, Loyd and Secor, 2021) to question the politics of PTSD and why it has come to dominate how we understand ourselves and the things that are done to us (Casper and Wertheimer, 2016: 5). For many critics, the PTSD frame shifts the refugee question from a political concern for rights and justice to one of welfare and professional management (Moganieh and Marranconi, 2017; Munyikwa, 2020; Pain, 2021; Pupavac, 2008; Summerfield, 2008). By substituting care for justice and compassion for politics, the PTSD frame absents issues of international conflict, war crimes and climate injustices from the refugee agenda. Instead, it focuses attention on the refugee’s mind and psyche as an arena of intervention, thus obscuring the fact that refugee trauma is often a collective experience of violence which is located in the social and political contexts in which the suffering occurs (Herman, 1992; Malkki, 1995; Marshall and Sousa, 2017; Pain, 2021; Visser, 2015). Recent calls for ‘decolonizing’ trauma have emerged to challenge this Euro-American, event-based, clinical, individualistic framework of PTSD (Visser, 2015) and to relocate our understanding of trauma in the violence of colonialism, racism and capitalism (Fanon, 1963; Mbembe, 2019), among communities and intergenerationally (Quinn et al., 2000).
Ironically, despite its heavy focus on the individual, locating the problem and the treatment of PTSD within the brain and the cognitive leads to assumptions of universalism about refugee suffering. PTSD frameworks begin from the premise that all refugees experience violence and other atrocities in the same way and that their symptoms can be treated using the same techniques, which train the mind to cope better. The gender, race and class blindness of this analysis erases the unique experience of suffering and ends up creating a fictional refugee figure.
As a political technology, PTSD has also become a powerful tool to confirm or deny the authenticity of claims and rights. In their seminal work discussing the ascendency of trauma analysis in the humanitarian sector, Fassin and Rechtman (2007) argue that a trauma diagnosis has come to entrench a hierarchy of deservedness which distinguishes between ‘good’ and ‘bad’ victims. As such, life and death decisions, such as granting asylum, resettlement or deportation, can sometimes hinge on refugees’ ability to prove their state of trauma (Shuman and Bohmer, 2004). This is often a difficult task, as trauma victims are usually unable to create a coherent narrative in which their trauma can be explained in terms of political persecution. As a result, a market of experts, or what some call trauma brokers, has emerged: lawyers, doctors and NGO staff act as gatekeepers to the humanitarian assistance apparatus, documenting trauma narratives and testimonials and verifying them through physical examination and psychological tests.
The medicalizing and individualising of refugee suffering embodied in the PTSD framework plays another major political role. By reducing refugee trauma to a set of discrete symptoms to be managed, the potential to mobilise trauma as a force for change and resistance is completely undermined. For feminist and post-colonial authors, trauma should be seen as a productive and disruptive force that can challenge power relations and expose the links between capital, race and colonial violence (Ehrkamp, Loyd and Secor, 2021; Espiritu, 2006; Till, 2012). From Iraqi victims of war and occupation (Al-Ali and Al-Najjar, 2013) to Palestinian children standing up to Israeli violence (Marshall and Sousa, 2017) to Cambodian Americans’ cultural production against genocide amnesia (Schlund-Vials, 2012), scholars and activists have shown that trauma symptoms are normal reactions to violence and can, as such, be mobilized to resist this violence (Burstow, 2003; Pain, 2021).
The rise of e-mental health and mental health apps
This section deals with a paradox. While therapeutic governance actively encourages the expansion of new categories of psychological and mental disorder and extends such labels to more and more individuals and communities, official discourse increasingly portrays care for such communities, including refugees, as a burden on the economy (Marshal et al., 2020). As the protracted refugee ‘crisis’ of the 21st century has established itself as a globally permanent affair rather than a moment of exception, refugees’ mental health demands have come to be seen as a huge burden on national and humanitarian budgets, especially under increasing austerity measures and aid cuts. In the UK, the prolonged squeeze on NHS funding coupled with spending cuts for local authorities has drastically affected mental health care. 3 Similarly, international aid budgets have been hard hit by repeated cuts. 4 In a recent example, of the total $1.991 billion needed by the UNHCR to fund the Syrian ‘situation’ in 2020, only 38% was secured, forcing the organization to halt many of its programmes including mental health care (UNHCR, 2020). 5
However, presenting these budget cuts and reduced spending solely as a result of austerity measures and a diminishing welfare regime can be a smoke-screen that hides larger political issues such as anti-immigration policies (Shavisi, 2019). Many countries in fortress Europe have increasingly introduced stringent laws which restrict refugee access to health care services as a way to discourage ‘illegal’ migration and to control the asylum process. The UK’s Hostile Environment Policy towards undocumented persons, for example, often results in denial of care as refugee medical bills contain threats to inform immigration enforcement of their details (Shavisi, 2019). Additionally, migrants from outside the European Economic Area (EEA), especially undocumented migrants, are required to pay an up-front charge equalling 150% of the face-value cost of treatment. In Germany, during the first 15 months after arrival, refugees have no free access to mental health care and when they do become eligible, refugees still have to pay for translators and other associated costs (Golchert et al., 2019). 6
The ‘burden’ of mental health care for refugees, and the political entanglement of these services with questions of legality and rights, has pushed humanitarian agencies and national governments, along with large tech giants and small startups, to devise more cost-effective strategies for refugee mental health care. The WHO has played a particularly important role in these developments by promoting self-help online services. Its online initiative Step-by-Step, for example, was developed as an online self-help version of WHO’s earlier Problem Management+ programme, launched in 2016. The launch of the online service interestingly reflected a change in the title of the programme from ‘Problem Solving’ to ‘Problem Management’. The change came out of concern that the language of ‘problem solving’ would lead ‘clients’ to believe that the interventions could actually offer solutions to the complex problems of war, communal violence and chronic poverty. Problem ‘management’, on the other hand, promised no such solutions but instead offered clients tools to relieve the impact and symptoms of these challenging problems (Carswell et al., 2018).
With the global spread of smartphones replacing computers for internet-based services, mobile mental health quickly emerged as a subfield of e-mental health, creating mobile apps which offer diagnosis and treatment for depression, anxiety and PTSD. By 2017, of the approximately 318,000 health apps on the market, around 10,000 were specifically designed for mental health issues. Apps such as Happify, Mood Tracker, Pacifica, Headspace, Buddhify, Calm Harm, PTSD Coach and a host of chatbots have been promoted as game-changers and marketed as cheap, on-the-go ways of self-managing mental health (Neary and Schueller, 2018; Sangupta, 2019). These apps arose on the back of huge strides made in AI technology, particularly natural language processing, machine learning and computational biology. One example of this technology is IBM’s automated mobile phone speech analysis application, which promises a real-time overview of the patient’s mental health by simply analysing one minute of speech input (IBM blog, 2017). The Karim chatbot, created by the Silicon Valley start-up X2AI, relies on AI-powered text message conversations with Syrian refugees in Lebanon. The technology is based on natural language processing which allegedly captures the individual’s emotions and state of mind by analysing their choice of words (Solon, 2016). Similarly, typing speed, punctuation changes and the amount of phone movement are said to provide enough material for some AI-powered programmes to assess users’ emotions. 7
The rise of e-mental health apps targeting refugees can also be seen through the lens of what Scott-Smith (2016) terms ‘humanitarian neophilia’: digital technologies, gadgets and wearables used in the humanitarian space. These presumably apolitical, technically superior modes of intervention combine an optimistic faith in technology with a commitment to the expansion of markets (Burns, 2015; Collier et al., 2017; Fast, 2017; Sandvik, 2019; Scott-Smith, 2016). The political work of these devices emanates from the promise to help refugees survive their calamities without the institutional support of the state. They further serve the increasing humanitarian retreat and the bunkerisation of aid workers of recent decades (Duffield, 2012). Gadgets such as mobile apps in this sense widen the gap between professionals and aid workers and the people they are meant to assist. They provide ways of managing a crisis from a distance and without the ‘messy complications and entanglements of collective action’ (Collier et al., 2017: 81).
The generic world of trauma and its apps
Trauma apps for refugees are built on a generic model which assumes that the experience of trauma is universal. All apps reviewed for this article begin by providing refugees with general information on the physiological and psychological nature of PTSD, and many start with a definition. ALMHAR, for example, defines trauma ‘as a profound damage to the mind’ which is ‘evoked by experiencing, witnessing or hearing about extremely shocking and overwhelming events, such as the death of a loved one, torture, sexualized violence, a bomb attack and many more’. Apps outline major trauma symptoms (anxiety, sleeping difficulties, negative thoughts, avoidance behaviour, anger, etc.) and how to recognize them (increased heartbeat, superficial breathing, sweating, shaking, muscle tension, nausea, etc.). Narration is an essential part of the apps. A fictional refugee character, Ahmed in ALMHAR and Ali in Sanadak, is used as a vehicle to explain the symptoms and how to recover from them.
The specific experience of the refugee characters is, however, hardly ever mentioned, and only scant reference is made to how they ‘fled’ their home or country, ‘had an extreme experience’, grieved a loved one or their sense of security, or, more euphemistically, ‘had a bad experience with crowds’. Mobile apps emerging from the WHO’s Step-by-Step project, for example, intentionally use illustrated story narratives which do not explain the details of the adversity faced by the character on the app. For the app developers, this is so that the apps can reflect ‘… the experience of loss or illness without defining the cause’ in order to make these stories as relevant to as many groups as possible (Carswell et al., 2018). Whether it is torture, rape, bereavement, fleeing home or any other tragedy that has befallen the refugee, the apps pay no attention to how refugees should understand the mechanics of their own trauma. For Fassin and Rechtman (2007: 19), in this universal view of trauma not only ‘…do scales of violence disappear, but their history is erased’.
Among the core messages these apps communicate to users is that the problem of PTSD resides within the individual’s mind and attitude. ALMHAR explains that trauma symptoms are not unusual in themselves, but that how we deal with them is the main problem: ‘sometimes our way of coping is not the best way to deal with our anxiety’. As such, it is the responsibility of the user to control how they feel, change their outlook and modify their patterns of behaviour in order to best deal with the symptoms. Despite this responsibilisation, app users are not given space to articulate their emotions or to make reference to their individual experiences. Instead, users can only choose from a list of emotions and reactions to communicate their state of mind and to be ‘guided’ on how to deal with them. For example, Sanadak, which offers submodules on different symptoms, invites users to explain their feelings of despair and loneliness by ticking one of three boxes to indicate how they experience these emotions: ‘withdrawal: staying at home, hardly do anything, don’t speak about my feelings’; ‘distract myself: escape my loneliness by constantly keeping busy or being with others’; or ‘stay balanced: I neither withdraw nor distract myself’. Feedback is provided accordingly. If users choose the first option, for instance, the app asks if they are happy with how they deal with loneliness; if the answer is no, the app provides positive feedback through the automated message: ‘Great. You are honest with yourself. Refer to the sub-section on how to strengthen your relationships and how to try something new’. Once the user goes through several more recommended activities (listening to mindfulness audios, doing muscle relaxation and breathing exercises and engaging in ‘pleasant activities’), the app ‘unlocks’ a new level and the user can move on to deal with another trauma symptom.
None of the apps engage with categories of gender, sexuality, age, race or class to explain how individuals might experience and deal with trauma differently. Some, however, attempt to individualize the ‘experience’ for users. Step-by-Step, for example, offers refugees a range of on-screen images which users can choose as online representations of themselves or of narrators. Users can choose a character to reflect their gender (man/woman only) or some identity marker (veiled/unveiled or bearded/non-bearded). They can also create and dress up their own avatars. These gestures at on-screen personalisation notwithstanding, mental health apps for refugees are intentionally designed to be generic, reflecting a universal experience of what Malkki (1995) describes as the ‘transhistorical’ refugee condition.
Refugee mental health apps circulate within a market of digital humanitarian goods. New AI-supported technologies and smart devices which increasingly provide long distance support in crisis situations (e.g. satellites, drones) are the handmaidens of an increasing humanitarian retreat and of a remote refugee management regime (Duffield, 2015; Meier, 2015). Mental health apps function in the same way by removing the ‘human’ contact from humanitarian support. The apps are first and foremost promoted for their cost-effectiveness and, as such, rely on cutting costs by eliminating health care workers from the process of consultation and treatment. Once an app has been identified, refugees only have their smartphones and AI technology to communicate with and trust. Interestingly, however, the apps have given rise to a new category of digital ‘trauma brokers’: e-supporters or e-helpers. E-helpers are human agents who can offer occasional support with the use of some apps, such as those sponsored by the WHO Step-by-Step programme. Mostly university graduates in fields outside of health care, e-helpers often receive no more than six days of training in identifying, assessing, managing and sometimes reporting risk to a professional clinical supervisor (Carswell et al., 2018). In general, e-helpers are more likely to offer support with troubleshooting and tracking progress than with any substantial content. In this large unregulated market of apps, bots and e-minders, it is not always clear who is responsible for the supervision of e-helpers. In some cases, it is professionals hired by the app developers themselves, while in the case of Karim, the ‘human minders’ who can ‘ghost in’ at will during the text chats are said to be typically employees of the health-care company that licensed the bots rather than of X2AI, their Silicon Valley creator (Romeo, 2016).
‘Ghosting in’ is indeed an apt phrase for what really is an attempt to link the refugee to what Seymour (2019: 57) describes as someone who is not actually there or is only there ‘…as a written trace, a ghost in the machine’.
Gamification of trauma
A major feature of all refugee mental health apps is their reliance on gamification, which refers to the use of game design elements and principles in non-game contexts such as education, management and health care. Elements of gamification include narration, leaderboards, customization, goals, points, levels, badges and feedback. The ultimate purpose of gamification is to influence or trigger a desired behavioural change among particular groups (Fleming et al., 2017).
In both Sanadak and ALMHAR, as well as in other apps such as Happy Helping Hand and SMILERS, customization affords users the chance to select from a range of images or avatars which narrate stories and provide regular feedback on activities. Similarly, users proceed from one level of activities to the next only after completing specific tasks (breathing exercises, noting down positive thoughts, reading a book or speaking to a friend). Each accomplishment then allows them to unlock different levels and to move to the next stage. Moreover, all apps include video or audio sequences, sets of exercises or interactive games which can be followed on their own at any time of the user’s choosing. In order to serve the needs of mobile populations, apps are, by definition, designed to be used ‘on the go’. As such, they offer users a menu of several choices of activities and tasks which take only a few minutes each to complete and between which users can easily switch. This ‘snacktivity’, engaging with frequent, brief activities which can be done for a few minutes at a time (Sardi et al., 2017), is a feature of game design which has recently come to be seen as a particularly valuable tool in mental health apps.
Like most other games and gamified applications, mental health apps rely on the use of digital images of icons and facial expressions, known as emojis, to allow users to communicate their feelings. One app designed for female trauma survivors from Somalia and Bhutan in the US (Shapiro, 2016) invites users to choose among a number of emojis which best reflect their emotions: happy face, sad face or anxious face. In another app, refugees are asked to indicate the level of anxiety or sadness they are experiencing through emojis and numbers. In a way, refugee experiences of ethnic persecution, sexual assault, bereavement, starvation and the isolation of living in a foreign country are now reduced to bare, gamified forms of expression better suited to the needs of AI algorithms than to the complexity of trauma experiences.
These gamified systems of rating also extend to guiding refugees in selecting the mental health apps best suited to their needs. Like other consumer groups, they are expected to make choices based on information available to them through word of mouth, advertisements or, increasingly, rating platforms. Several app-rating platforms, such as the (for-profit) Organisation for the Review of Care & Mental Health Applications, PsyberGuide, MindTools, Happitique and the American Psychiatric Association, have sprung up in the last few years. The rating systems of these platforms do not necessarily reflect scientific evidence or specific diagnoses but instead mostly focus on aesthetics, visual design and sometimes issues of privacy and security. While app-rating platforms are infamous for engineering positive reviews and developers often pay others to ‘like’ their apps (Neary and Schueller, 2018), tragically, ‘likes’, star buttons, emojis and inane anonymous reviews have become the compass which refugees have to rely on to find a way to heal their trauma.
Gamification in general universalises the human experience by turning trauma, and how to overcome its consequences, into discrete, quantifiable steps and exercises. Gamified apps support the PTSD framework by placing responsibility squarely on the individual, who needs to ‘level up’ through a process of self-optimisation. The smart refugee, connected to these apps, therefore emerges as someone who understands precarity and danger as conditions to be invested in rather than rebelled against, as symptoms to be overcome rather than structural conditions to be questioned.
A market of suffering and of data
‘Smart’ refugees and mental health apps designed for them, first and foremost, operate within a highly unregulated market which lacks institutional oversight over questions of scientific and ethical standards. The commodification of mental health services, in general, relies on a process of abstraction necessary for the creation of standardized treatments which are easier to ‘package’ and sell (Tamimi, 2011: 158). In particular, services like mental health apps have a short life cycle, which means that by the time independent research is carried out on one app, new apps would have already been introduced on the market to replace it (Marshal et al., 2020). This marketization essentially means that there is a dearth of rigorous research on these apps, which, in some cases, has led to what some researchers have euphemistically called ‘wrong self-management advice’ for the users (Neary and Schueller, 2018).
In this new consumer market, investment in apps has been alarmingly diverted from spending on conventional services and mental health infrastructure. New partnerships between public funding agencies and large tech companies mean that resources which could otherwise be invested in scientific research and public service funding are now going into for-profit products. One glaring example is the app developed by the Canadian Connecting Culture initiative for refugee mental health, which is funded by the National Institute on Minority Health and Health Disparities in partnership with the tech company Gametheory. The public funding received by this initiative covered the first phase of the project, in which data collected on the effectiveness of the app among refugees allowed the producers to receive a second public grant in order to ‘…test, tweak and prepare the app for commercial sale’. 8 In a sense, the public is now funding experiments on refugees in order to make private companies even richer.
However, the most valuable commodity in this market of mental health apps is not the gadgets and treatment models themselves but the huge amounts of data they generate from users. Under neoliberal capitalism, one of the most thriving markets is that of personal data. Described as the ‘new oil’,9 personal data have become the latest currency with which consumers pay for services increasingly provided on digital platforms (van Dijk, 2014). The value of data resides in their reusability and in their predictive powers. ‘Dataveillance’, ‘data commodification’ and ‘data colonialism’ have thus emerged as powerful concepts which refer to the processes by which personal data have become a frontier for capitalist exploitation and a resource to be harvested, mined and accumulated (Couldry and Mejias, 2019; Sandvik, 2019; Thatcher et al., 2016; van Dijk, 2014). Data colonialism, in this regard, combines the predatory, extractive practices of historical colonialism with state-of-the-art methods of quantification (Couldry and Mejias, 2019; Sandvik, 2019).
Until now, data markets have remained largely fragmented and lacking in standardisation. In the health care industry, for example, highly valuable data are spread across uncoordinated repositories, including electronic medical records, laboratory and imaging systems, physician notes and health-insurance claims (Bresnik, 2017). In response to growing demand for more aggregated and clean data, specialised data brokers, such as Cerner, Prognos and Explorys, have emerged. 10 The invaluable contribution of these companies to the health-care market is celebrated by a McKinsey (2017) report which highlights how their work increases the profitability of life and health insurance providers as well as pharmaceutical and medical companies. The personal data which fuel this industry are increasingly derived from gadgets, wearables and other AI-powered technologies, such as refugee mental health apps, which billions of individuals are pushed to use on a daily basis.
The datafied mind of the refugee
Within this thriving data market, the global market for biometrics has become one of the fastest growing and is estimated to reach almost $68.6 billion by 2025. Centred on the body of the refugee, UN agencies, border control regimes and national governments are prominent clients of biometric data and technology. 11 New devices located on or inside the bodies of humanitarian targets are used to generate data out of refugees’ ‘digital bodies’ by tracking and recording their every detail, making them legible and quantifiable (Burns, 2015; Fast, 2017). These include user stamps, wristbands, fingerprints, photographs, tokens, registration cards and other gadgets. The data collated by these technologies are promptly deposited in data banks of fingerprints, iris scans, facial images, DNA and other bodily-coded information.
So far, in this increasingly sophisticated world of AI-operated structures of governance, the mind of the refugee has eluded the system. While the body of the refugee has been accounted for, the psyche and the emotions have remained untapped. Fingerprints have long been collected, but now, through mental health apps, what I call ‘mental prints’ can also be harvested, mined and stored for future use. Mental health apps rely on users who willingly self-monitor, self-report and generously share personal data with their machines. Arguably, these apps can only help traumatised refugees overcome their symptoms when individual users offer as much information as possible on their most intimate thoughts and feelings. As such, users are invited, and constantly sent reminders, by the apps to evaluate and to share details of their state of mind and patterns of behaviour in writing, in spoken words as voice notes or by selecting images. For example, HELP@APP states that it applies a self-test that can be completed at any time in order to monitor symptom severity and to provide automated, individually tailored feedback regarding progress and potential problems. One of the ‘positive thinking’ exercises in Sanadak gives ample space to users to narrate their daily activities. Another asks them to detail their skills and what they think they are good at. In order to encourage the creation and strengthening of the refugee’s social networks, the app asks the user to list the three people closest to them and follows up by asking them to write a message of gratitude to one of them. Similarly, in order to encourage traumatised refugees to improve their memory and concentration, ALHAMAR suggests they ‘Start a notebook or a list (on your phone) where you write down everyday things you want to remember’. Another personal journal is also recommended by the app for noting down ideas related to grief and depression. 12
Twitter has been called a ‘sentiment detector’ of people’s political predilections, while Facebook ‘likes’ can now easily predict specific personal attributes such as ethnicity, intelligence, happiness and gender (Kosinski et al., 2012). In the final analysis, users of mental health apps similarly generate a wealth of visual, oral and textual data which AI can then process. AI detects patterns in phrasing, diction, typing speed, sentence length, voice or other reporting methods in order to discern the various emotional states of the refugee. Diaries and activities, on the other hand, give details of their daily lives, their preferences, expectations, skill lists and innermost dark thoughts. For a new business model which ‘…refuses to specify what it is looking for, and must therefore capture everything’ (Davies, 2021), recorded notes on grief, anxiety and shattered lives could prove invaluable for a future enterprise.
Beyond their market value, data collection and sharing are also important for surveillance. The regime of medicalisation under neoliberal capitalism is built on the normalisation of surveillance of the individual’s mind and most intimate emotions. Cohen (2016: 70) argues that under a regime of ‘psychiatric hegemony’, ‘Our behaviour, our personalities, our relations and even our shopping trips are now closely observed and judged’. With time, the individual is trained in the bio-politics of self-monitoring using the same discourse of illness and wellness. The phenomenon of function creep, in which data are used for purposes for which they were not collected, is raising serious questions about the use of mental health apps among vulnerable populations such as refugees. Beyond the information users volunteer about their lives, thoughts, emotions and other intimate details, the apps can indirectly be used for other forms of surveillance. For example, many apps provide location identification, which makes the surveillance of particular communities a growing concern. Several apps targeting Muslim users are a case in point. Apps such as Muslim Pro, Salaat First and Qibla Finder, as well as Muslim dating apps, which are downloaded by millions of Muslims worldwide, have been collecting location data which are then sold to data brokers, US military contractors and intelligence agencies.13
Conclusion
The pathologisation of trauma through the increasing deployment of PTSD frameworks in ever more settings is part of a wider process of normalising the medicalisation of everyday life under neoliberal capitalism. Recently, PTSD has grown not only as a diagnostic framework but also as a political tool for population management. As its critics have long demonstrated, reducing the human experience of trauma to a PTSD diagnosis and treatment plan effectively reduces historical structures of violence to separate, random events and collective suffering to individual symptoms. At the same time, the datafication of human bodies, through the use of digital platforms for almost every human activity, has also become a normalised practice which feeds the predictive analytical models necessary for new forms of capitalist expansion and consumption. In this article, I have worked to explain how, at the intersection of these two complementary processes, refugee mental health apps have emerged as a form of political technology tasked with heralding a new resilient, entrepreneurial, efficient and, above all, ‘smart’ refugee. In an increasingly digitalised, neoliberal age, the smart refugee is expected to function as a node in a network of information flow by being constantly connected to digital technology. Hard-wired by sensors, facial recognition technologies, surveillance networks and digital gadgets, the mobile bodies of refugees collect and transmit huge volumes of data, creating a constant flow of algorithmically linked data points which feed into the development of more sophisticated technologies and future business ventures.
The mental health apps examined in this article might not themselves survive into the future. Already, the scant information available on the distribution and uptake of these apps by humanitarian agencies, and the digital landscape of broken links and defunct applications in which they circulate, are signs that these apps could soon be overtaken by other ‘solutions’ rooted in digital technology and the behavioural sciences. Rising doubts about the inherent weaknesses of the self-reporting assessments at the heart of Sanadak and other applications mean that a new range of supposedly more objective trauma detection techniques will soon be made available. Recent experiments have already created algorithms able to identify specific vocal characteristics associated with PTSD, while an AI-enabled blood test can now show accurate markers of the disorder. According to their developers, the superiority of these new technologies is that they can measure trauma ‘…objectively, inexpensively, remotely, and noninvasively’ (NYU Langone Health, 2019/2020). In a way, they will usher in a new phase in which histories of violence are erased and reliable, algorithm-friendly evidence replaces human interaction and expression.
However, mental health apps, and the new technologies soon to replace them, face resistance from the very communities they are supposed to help. Across camps and borderlands, refugees are increasingly abandoning, tinkering with and destroying the gadgets and wearables foisted on them by governments and humanitarian agencies. In one example, Rohingya refugees and their camp leaders in Bangladesh staged several days of protest against the introduction of smart ID cards by UNHCR. The protests were motivated by the increasing data collection linked to the ID cards and by fears of how this could be used by the Myanmar authorities (refworld, 2018). With regard to mental health apps in particular, humanitarian agencies are also reporting high levels of abandonment and drop-out compared to services offered face-to-face. Frantz Fanon long ago showed how colonial authorities understood resistance to colonial medicine as a type of madness in need of curing (1963). Today’s refugee acts of defiance against the pervasive use of technology are similarly explained by the humanitarian sector as an outcome of cultural and religious superstitions. At best, they are interpreted as a sign of a lack of self-discipline, a malady which can be cured through more cognitive training, behavioural change, and the measurement and quantification of refugee lives.
Footnotes
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
