Abstract

More than 2 decades ago, the IBM supercomputer Deep Blue beat world chess champion Garry Kasparov in a 6-game match. The 1997 win marked a turning point—drawing on artificial intelligence (AI), it was the first time a computer had bested a human champion under tournament conditions. Looking back, Mr. Kasparov commented, “A game of chess was vulnerable to very powerful machines with sufficient algorithms and bigger databases and very-high-speed processors.” 1
Could Mental Illness Also Be Vulnerable to Very Powerful Machines?
For most people, the stakes are far higher than a game of chess. Across the world, people with mental health problems and illnesses face significant challenges in accessing care. Only half of Canadians with depression receive “potentially adequate treatment.” 2 Although patients prefer psychotherapy to medications, the former is often unavailable as a treatment choice; in a recent study, just 13% of British Columbians with depression had any access to therapy or counseling. 3 The availability of (human) psychiatric services is unlikely to change in the coming years. 4 Even in England, where cognitive behavioural therapy (CBT) is publicly funded, the attrition rate is high, suggesting that access to traditional, in-person services may not be helpful or convenient for some. 5 In developing countries, basic mental health care may be absent; according to the World Health Organization (WHO), 45% of the world’s population lives in countries with fewer than 1 psychiatrist per 100,000 people. 6
This In Review series offers 2 articles on digital psychiatry that could potentially help bridge the access gap.
In “Chatbots and Conversational Agents in Mental Health: A Review of the Psychiatric Landscape,” Vaidyam et al. 7 consider programs “that use machine learning and artificial intelligence methods to mimic human-like behaviors and provide a task-oriented framework with evolving dialogue able to participate in conversation.” Chatbots are increasingly used in society for access to general information (think Amazon’s Alexa or Apple’s Siri), but what about for access to mental health care? The authors provide a literature review with 10 relevant studies. Chatbots have various roles, including providing CBT for people with depression and anxiety, encouraging medication adherence, and offering psychoeducation. The authors find high satisfaction rates among users and little risk of harm—although results are hard to compare, given the lack of uniform reporting or even measurement. They conclude that “chatbots are an emerging field of research in psychiatry, but most research today appears to be happening outside of mental health.” Still, they find great potential, though they warn that the use of chatbots must be done “correctly and ethically.”
We share that enthusiasm: unlike human therapists, chatbots remember and learn from everything people have told them, and they are always available. In addition, they are not distracted from the task at hand by unrelated thinking, as we human clinicians can be; users have their undivided, if artificial, attention as well as intelligence.
If Vaidyam et al. 7 consider a very new development (chatbots), Andersson et al. 8 consider an intervention first used around the time Kasparov and Deep Blue faced off: Internet-delivered CBT (iCBT), psychotherapy based on CBT principles, delivered via the Internet by an individual or program remote from the client. In “Internet Interventions for Adults with Anxiety and Mood Disorders: A Narrative Umbrella Review of Recent Meta-Analyses,” the authors note a rich literature and focus on iCBT for anxiety and mood disorders in adults, drawing on studies from recent years. They identify 9 meta-analyses and note diverse applications of iCBT, with different lengths of treatment but collectively involving human therapist guidance in addition to web-based access to CBT modules. They conclude, “Overall, evidence is now accumulating suggesting that therapist-supported Internet interventions, and in particular iCBT, can be effective. In this overview, we found meta-analytic support for panic disorder, social anxiety disorder, GAD, PTSD, and major depression with moderate to large average effect sizes overall.”
The conclusion, reinforced with respectable effect sizes, bolsters our view that solutions to access for evidence-based interventions can successfully use innovative approaches—like iCBT.
And there is much going on in digital psychiatry not covered by these 2 articles. Consider the following:

Virtual reality (VR). VR is exposure therapy 2.0, and studies suggest promising results for anxiety disorders. 9

Digital phenotyping. Drawing on passive data collection from smartphones, digital phenotyping seeks to consider everything from sleep habits to geographic movement to keystroke speed to find patterns that may indicate early relapse in known mental illnesses. 10

Apps. There are over 315,000 mobile health apps; some are designed for people with psychiatric conditions, helping patients remember when to take medications, track their mood over time, or improve their sleep. 11 (Chatbots, which could be apps, are considered earlier.)
While not offering a solution for every patient in every circumstance, digital psychiatry may even be attractive to a subset of patients who would prefer a digital interaction to a human one, whether for financial and other pragmatic considerations (e.g., a single parent of 3 children, on public assistance, for whom getting to an office appointment is a major logistical problem and even financial hardship) or for more psychological and interpersonal reasons (e.g., someone with autism spectrum disorder for whom the lack of human connection with a chatbot may be helpful).
These are early days. While there are many apps, they are of heterogeneous quality; in a recent article, Shen et al. 12 find that, when a basic quality standard was applied (such as revealing the source of information), only 25% of the apps studied met that standard. We note that AI in health care is an active field of both medical research and clinical experimentation, yet to date, the results have been mixed; IBM Watson’s project with the MD Anderson Cancer Center was cancelled after 4 years, in part because of basic problems with electronic health records. 13
While the experimentation continues to be vigorous, we offer a few notes of caution.
Privacy in a Facebook World
Where does all this digital information go? How is it protected? Can it be hacked (in the same way someone can break into a psychiatrist’s office and purloin her charts)? These questions are important to consider as people increasingly turn to digital solutions. Clinicians already worry about the security of electronic health records; for our patients who actively use apps, the information they enter into their smartphones may be even more personal.
Enthusiasm (and Entrepreneurialism) Overtaking Evidence
Digital tools are shiny new things. Apps and online tools proliferate. The continuous updating of versions and so-called bug fixes defy traditional research evaluation methodology where a pill’s ingredients are fixed by patent or a psychotherapy is manualized and audited. We need new research paradigms for new solutions. And the interests of Bay Street and Wall Street—market penetration and return on investment—may not be the interests of clinicians.
The Potential Harm of Digital Tools
Is there a number needed to harm for an app? In a recent article, Torous et al. 14 note that some apps provided unhelpful answers when people discussed suicidal thoughts. And an available tool is not necessarily used. For example, the PTSD Coach app, developed by the US Department of Veterans Affairs, has reportedly been downloaded more than 150,000 times, yet only 14% of individuals had used the app the day after downloading it. 15
What role could digital psychiatry have in the future? The concept of stepped care, matching severity and complexity of need to intensity of service, used to begin with self-help in the form of reading a book. For more than 2 decades, generations of clinicians have recommended texts such as Mind over Mood. (The old term bibliotherapy, already in vogue in 19th-century American asylums, was revived to sound more medical and interventionist than “read this book.”) As digital tools develop, they may enter this world, aiding self-help and augmenting therapies, likely for people with lower acuity of illness.
Although we note possible pitfalls with digital psychiatry, we also see great potential, empowering patients and enabling them to get more timely access to care more broadly defined. We also see task-shifting not simply to other human professionals but also to computers, leaving the more complex needs to psychiatrists. Very powerful machines, thus, could remake psychiatry.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
