Abstract
Digital conversational agents (“chatbots”) hold promise as a technology for delivering mental health interventions and supports to people in need. Youth are heavy users of digital technology and are widely affected by mental health conditions; they might therefore be particularly interested in using digital conversational agents to support their mental health. We reviewed the literature on digital conversational agents for mental health, then conducted a two-pronged qualitative study of these tools for youth with mental health conditions. While the evidence base is limited, there is preliminary evidence of the utility and acceptability of this technology to users. In our qualitative study, youth were cautiously optimistic about using a digital conversational agent for mental health and identified potential benefits, but also substantial concerns. In terms of content and format, they wanted reliable, accurate, validated information, a flexible format, and a friendly interaction style, with attention to confidentiality and security. From these findings, we propose eight recommendations to guide the rigorous development of digital conversational agents for youth mental health. By following these recommendations, it may be possible to build tools that leverage the strengths and potential of modern conversational agents while mitigating the potential harms to youth.
Generative digital conversational agents (“chatbots”) available today can hold conversations with users, employing artificial intelligence and natural language processing to support human-like conversational experiences. They respond intelligently to user inputs to provide support and information. Conversational agents have been used to support youth in the mental health sector, focusing on emotional health and sometimes providing direct mental health-related supports such as psychoeducation, information about coping skills, cognitive-behavioral therapy exercises, and support with making a diagnosis.1–4 Indeed, the first chatbot ever developed was in the field of mental health.5
The evidence on conversational agents for mental healthcare today is limited.3,4 However, existing research suggests that this technology can simulate empathy6 and is capable of building something like a therapeutic alliance,7 showing promise for mental healthcare. Other potential benefits include flexibility, accessibility, cost-effectiveness, and the provision of a stigma-free service.8–10 However, a range of ethical concerns have been raised with regard to conversational agents for mental health, such as ensuring confidentiality and transparency, as well as safety in crisis situations.11 Other ethical priorities include testing efficacy, encouraging the use of human services, not purporting to treat severe mental illness, and protecting against technology addiction or dependent use.11
Digital technologies are omnipresent among youth, including those with mental health concerns. In Canada, a national report indicates that 67.9% of Canadian youth have gone online to find health information.12 A 2024 report found that more than half of youth had used generative artificial intelligence,13 a number that continues to grow as publicly available conversational agents and other forms of generative artificial intelligence expand in implementation and scale. Anecdotally, youth in our networks tell us that many are already using publicly available conversational agents to support their mental and emotional health. Considering that youth are intensive users of digital technologies, conversational agents need to be developed with youth in mind. Indeed, conversational agents may be a form of e-health innovation that is particularly relevant to young people.14
Since about 50% of youth have experienced at least one mental health or substance use health condition in their lifetime,15 providing accessible, stigma-free care is a priority. Mental health concerns among youth can have long-term impacts on a broad range of life domains, including education, vocational development, and quality of life, as they interfere with developmental milestones.16–20 However, rates of unmet need for mental healthcare services are high.21 Conversational agents may help fill this void in support for youth mental health, if the evidence points in this direction. We therefore conducted a set of studies to explore this possibility.
Our research to date
First, we conducted a scoping review of the literature to understand the current state of research on conversational agents.22 Ten published articles were reviewed, focusing on conversational agents for treatment-seeking youth with mental health conditions in clinical contexts. Most studies reported high acceptability and positive perceptions of digital conversational agents among youth populations. However, research examining conversational agents for youth mental health in clinical contexts is at a preliminary stage, with minimal established outcomes in terms of efficacy or effectiveness. Given the fast pace of technological development and the substantial time required to generate rigorous evidence, research on the most recent generative conversational agent technology in use today is limited.
We then discussed conversational agents with a diverse sample of 28 youth with a range of mental health conditions.23,24 Through qualitative interviews, we asked them about their perceptions of such a tool, their willingness to use it, desirable content and use cases, advantages and disadvantages, and any concerns they might have. Participants described cautious optimism about the use of a conversational agent. While they recognized potential benefits, they expressed a marked preference for in-person services, as well as substantial concerns around ethics, confidentiality, and privacy. Most, though not all, participants favored the development of a conversational agent for youth mental health. In terms of content, they expressed that a conversational agent should offer reliable, accurate information that has been user tested and professionally validated. They wanted a highly flexible and customizable tool with a friendly, human-like interaction style. They clearly expressed that careful attention should be paid to confidentiality and security features.
The research conducted to date provides sufficient evidence to cautiously pursue the development of conversational agents for youth with mental health conditions. However, this readiness comes with a number of caveats. We therefore offer a set of recommendations to guide the development of conversational agents for youth with mental health conditions, based directly on the literature and our qualitative findings. Guided by a pragmatism paradigm and a lived experience engagement framework,25,26 we further consulted with young people on the recommendations and refined them accordingly.
Recommendations for the development of digital conversational agents for youth with mental health conditions
Conclusions
Conversational agents appear to have the potential to contribute to mental health support for vulnerable youth seeking care. However, this potential is tempered by substantial ethical and usability concerns. Given that some youth are already using publicly available conversational agents for mental health support, thoughtful, rigorous e-health research is warranted to develop mental health conversational agents that are ethically sound, effective, and designed specifically for youth mental health applications. For youth with limited access to traditional care, such e-health tools could serve as low-threshold, easily accessible supports.
By following these recommendations, which were derived from the literature, our research, and youth consultation, it may be possible to build tools that leverage the strengths and potential of modern conversational agents while mitigating the potential harms to youth. However, research in this area must continue, specifically involving treatment-seeking youth with mental health concerns and keeping pace with rapidly advancing digital technology and conversational agents. Collaborative partnerships and supportive policy environments are also essential to ensure responsible development and implementation.
Footnotes
Acknowledgements
We thank the members of the youth advisory group for their feedback on the recommendations.
Ethics approval
This is a perspectives paper without original data; ethics approval was not required.
Contributorship
LDH led the project in terms of conceptualization, design, and supervision. All authors contributed to the conceptualization and edited and approved the final manuscript.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was financially supported by the Ontario Brain Institute.
Conflict of interest
The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Brian Ritchie is affiliated with Kamazooie Development Corporation (“Kama.AI”) as its founder and CEO, a company specializing in artificial intelligence and conversational agents. While this expertise informed the study's design and analysis, all efforts were made to ensure that the research process and conclusions remained independent and unbiased. The other authors declare no conflicts of interest.
Data availability statement
Not applicable.
