Abstract
Objective
This qualitative study aims to examine the key features and design elements of a mental health digital conversational agent (“Digital Conversational Agent” or “DCA”) for youth with multiple mental health conditions.
Methods
Twenty-eight youth participants aged 14 to 25 were recruited from the Toronto Adolescent and Youth (TAY) Cohort study. Data were collected through focus groups guided by a semi-structured interview guide. Focus group discussions were audio-recorded, and transcripts were analyzed using codebook thematic analysis. Youth engagement was integrated throughout the study.
Results
Four key themes were generated from the focus group data: (1) the importance of a customizable and flexible design for personalization; (2) confidentiality, privacy, and risk mitigation features; (3) the need for reliable, informative content that is user tested and validated; (4) a friendly and human-like interaction style.
Conclusions
The study identified key design features that may enhance youth engagement and trust in DCAs for mental health support. Collaborating with youth engagement specialists and industry partners underscored the value of a co-designed approach in preparing to develop relevant, feasible, and ethical DCAs.
Background
The prevalence of mental health conditions among youth has become a growing concern globally.1,2 Adolescence and young adulthood are critical periods for mental health, as many mental health and substance use disorders first appear during these years.3 In the USA, nearly 50% of youth are affected by at least one mental health or substance use disorder, with about 27% meeting criteria for severe impairment and about one in five experiencing two or more such conditions.4 In Canada, recent estimates showed that nearly one in five individuals aged 15 and older meet criteria for at least one mental health or substance use disorder.5 These disorders can have long-term impacts on education, vocation, and quality of life, as they interfere with key developmental milestones facing young people.6–10 Given the stigma surrounding mental health and substance use conditions and the barriers to accessing timely and effective care,11,12 there is an increasing demand for youth-friendly service and support approaches that address young people's unique needs.
Youth with multiple, or co-occurring, mental health conditions face heightened challenges in accessing treatment.13 The co-occurrence of mental health conditions can be described either as multimorbidity, when all mental health conditions present in an individual youth are considered without identifying a “primary index disorder,” or as comorbidity, when a co-occurring condition is considered secondary to a primary index disorder.14 The youth teams associated with this project prefer the term “multiple mental health conditions” over “multimorbidity,” and we therefore use their preferred term. Co-occurrence of multiple mental health conditions has been associated with more academic difficulties, higher rates of service utilization, suicide attempts, poor role functioning,15 and social and psychological functional impairment16,17 compared with a single mental health diagnosis. Research has indicated that the co-occurrence of multiple mental health conditions is linked to elevated treatment barriers,18 such as the complexity of navigating public and private care systems, limited insurance or financial support, and fears around disclosing risky behaviors to parents.19
Given the high use of digital technologies among youth, digital conversational agents (DCAs) have emerged as a promising tool for addressing youth mental health challenges.20 According to Statistics Canada, 99.2% of youth aged 15‒24 use the Internet at least occasionally, with 67.9% specifically searching online for health information.21,22 A 2024 report suggested that over half of youth have used some form of generative artificial intelligence (AI) in their lives.23 Research has suggested that youth recognize the benefits of digital health interventions, including their flexibility, accessibility, ease of use, cost-effectiveness, and ability to provide support without the stigma often associated with traditional mental health services.24–26 Some digital mental health interventions have demonstrated acceptability among youth, including DCAs accessible via mobile apps27–29 or web applications,30 as well as daily activity prompts31 and gamified interventions.32
DCAs, commonly known as “chatbots,” are programs that engage in human-like conversations by integrating and responding to user inputs. Some advanced DCAs use AI and natural language processing and are generative in nature.33,34 In mental health contexts, these advanced DCAs can respond intelligently to users and can deliver services and support, such as psychoeducation, coping strategies, cognitive-behavioural therapy, diagnostic support, and basic companionship.27,29,35–37
The research on the use of DCAs in mental health is still emerging. Recent systematic reviews suggested that DCAs can effectively reduce psychological distress and are generally well-accepted, but there is a specific lack of research on their application to youth with mental health conditions.35,38,39 In addition to these reviews, other studies have shown that DCAs have the potential to promote mindfulness, emotion regulation,40 and medication adherence41 in patients with mental health challenges, yet research on their use specifically for treatment-seeking youth populations remains limited. Some evidence has indicated that DCAs can foster something akin to a therapeutic alliance and provide responses that are rated as “empathetic” and high-quality.20,42 However, their efficacy, effectiveness, and safety in youth with mental health conditions remain unclear.
Objective
Given high rates of co-occurrence of mental health conditions among young people and the potential for unique needs and preferences in this population, this qualitative descriptive study aimed to take a first step toward considering whether to develop a relevant and youth-friendly mental health DCA by examining the perspectives of youth with multiple mental health conditions on its key features and design elements.
Method
This qualitative descriptive study incorporated youth engagement principles43 and a pragmatic paradigm.44 Our reporting followed the Consolidated Criteria for Reporting Qualitative Research (COREQ) checklist.45
Participants
The study included 28 youth participants with multiple mental health conditions (MMHC). A total of 30 participants consented, but two did not complete the study (one due to scheduling conflicts and one who decided not to participate), resulting in a final sample of 28. To be eligible, participants had to (1) be aged 14‒25, (2) be part of the Toronto Adolescent and Youth (TAY) Cohort at the Centre for Addiction and Mental Health (CAMH), and (3) meet criteria for at least two mental health disorders based on research-administered diagnostic assessments from the source study.46 Age and MMHC eligibility were confirmed through assessments from the TAY Cohort study, based on the Kiddie Schedule for Affective Disorders and Schizophrenia - Present and Lifetime (K-SADS-PL)47 and the Structured Clinical Interview for DSM-5 (SCID),48 and confirmed in a team consensus meeting.46 A quota sampling approach was used to ensure diversity, targeting balanced representation across ages, genders, and ethnic/cultural backgrounds. Exclusion criteria were (1) inability to provide informed consent and (2) inability to communicate in English. The sample size was chosen based on practical considerations and existing recommendations for qualitative research.49
Recruitment
Participants were recruited through targeted referrals from the TAY Cohort study,46 a large longitudinal study at the same research institution, CAMH, that follows youth accessing hospital-based mental health services in the province of Ontario, Canada. Eligible individuals who had previously consented to be contacted were approached by TAY study staff, provided with a study flyer or brief study description, and referred to this study's research analyst (RA) upon expressing interest. Participants were contacted by the RA via email, text, or phone call, depending on their prior communication preference. The RA then met with them virtually, provided detailed study information, confirmed eligibility, and obtained electronically signed informed consent through the REDCap system.50 While the RA's professional role was explained, personal motivations for conducting the research were not discussed. No prior relationship existed between researchers and participants before study recruitment. The source and current studies were approved by the CAMH Research Ethics Board. In accordance with the Research Ethics Board approval, parental consent was not required for youth participation. Figure 1 provides an overview of the recruitment and participation process.

Recruitment and participation flowchart.
Procedure
Following consent, the current study's RA requested that the TAY Cohort team transfer the participant's demographic information, as the TAY team had obtained consent for its use in sub-studies. Participants were then scheduled into one of the available virtual focus groups. Each focus group session included 4‒6 participants. Only the participants and researchers were present during the focus groups. Focus groups lasted from 65 to 100 minutes of recorded discussion time, with an additional preamble and a 2-minute introductory animated video. The video, co-designed with youth, was presented at the beginning of each session and aimed to provide a youth-friendly introduction to DCAs and their potential in mental health care. After this, two RAs (JH and another research team member) facilitated the conversation using a semi-structured interview guide. Participants received a $50 gift-card honorarium for completing the focus group discussion. Sessions were conducted between June and August 2024. No official field notes were recorded during the focus groups.
After completing four focus groups, we began analyzing the data (below) and felt that more data were required. Once we reached six focus groups, we observed that similar perspectives emerged consistently in the later sessions, suggesting that data saturation had been reached. However, recognizing the challenges in defining and confirming saturation,51 we employed additional strategies to further enhance reliability, trustworthiness, and credibility beyond perceived saturation, including ongoing engagement of Youth Engagement Specialists (i.e., youth with lived experience) throughout the process to ensure a comprehensive understanding of their perspectives.52,53 No repeat interviews were conducted. Transcripts were not returned to participants for comment or correction.
Measures
Participants’ demographic information and MMHC data were gathered from the TAY Cohort study, which is described elsewhere.46 The semi-structured interview guide for the focus groups was developed in collaboration with study team members, including the Youth Engagement Specialists (Appendix A). The RE-AIM framework informed the development of the interview guide to ensure comprehensive coverage of relevant topics and to provide both breadth and depth in understanding participants’ experiences and perspectives.54 Participants were prompted with open-ended questions to explore the potential role of DCAs for youth with MMHC, discussing topics such as accessibility, specific content and interaction features, user engagement, and potential benefits and challenges. In discussion with participants, we referred to the digital tool as a “chatbot” rather than a “Digital Conversational Agent,” as this term is more commonly recognized and relatable for youth.
Data analyses
Focus group discussions were transcribed verbatim and anonymized. Demographic and MMHC data were described using counts and proportions. Qualitative data from focus groups were analyzed using codebook thematic analysis,55 combining inductive and deductive strategies. This analysis was carried out by the study RA (JH), with ongoing input from the research lead (LDH) and discussions with Youth Engagement Specialists, industry partners, and other team members. NVivo software was used for transcript management and coding.56 Initially, after transcribing four of six focus groups, the RA (JH) and research lead (LDH) collaboratively developed a codebook to guide the coding process. The transcripts were then coded using this codebook; since saturation had not been reached, two additional focus groups were conducted, with iterative refinements made to the codebook during biweekly meetings as more data were processed. At six focus groups, the codebook was stable and saturation was reached. The codes were reviewed by two Youth Engagement Specialists to ensure the credibility of the analysis. Themes were developed by the RA by organizing the refined codes into cohesive patterns to provide a synthesized view of the data. Each theme was analyzed in depth, with input from Youth Engagement Specialists and industry partners. Anonymous representative quotes were selected to illustrate key findings. The iterative analytical process allowed for progressive refinement of codes and themes to ensure the findings reflected the data. Participants did not provide feedback on the findings.
Stakeholder engagement
This study integrated youth engagement as part of its methodology.43 Youth Engagement Specialists were involved from study design to reporting. Youth Engagement Specialists are paid youth staff with lived experience of mental health and/or substance use conditions and expertise in research engagement. They co-developed the focus group guide, the youth-friendly introductory video, and the related scoping review. They were presented with codes and tentative themes, contributed to the interpretation of results, and served as co-authors to ensure the findings were informed by lived experience. In addition to youth engagement, industry partners specializing in AI-based DCAs were engaged to provide feedback and align the DCA's design with real-world applications. They provided input at the study design stage, as well as during data analysis and reporting, and one is a co-author of this work. Industry partners did not have access to the raw data, and we considered their feedback carefully to avoid any conflicts of interest or undue influence. The combination of youth and industry engagement ensured that the study addressed both the needs of users and the technical capabilities of natural language AI-assisted engagement tools.
Positionality
The lead data analyst and focus group facilitator (JH) is a Chinese female with an M.Ed. in psychology, who studied in and immigrated to Canada. She currently works as a Research Analyst at CAMH. Her interest in AI's potential role in mental health care grew significantly following the release of AI-powered products such as ChatGPT in recent years. The lead researcher (LDH) is a White female Canadian with a PhD in psychology, specializing in the engagement of people with lived experience in mental health and substance use research at CAMH. She is new to AI and has experienced its benefits and challenges through the course of the study.
Results
The demographic characteristics of the 28 participants are detailed in Table 1. The sample included a diverse range of youth in terms of age, gender, and ethnic/cultural backgrounds. Participants’ average age was 18.8 (SD = 3.3) years. Regarding the mental health conditions listed in Table 1, participants met diagnostic criteria for an average of 4.0 (SD = 1.6) mental health conditions.
Demographic characteristics of participants.
Through six focus group discussions, participants highlighted perspectives that were grouped into four key themes regarding the design and functionality of a DCA for youth with multiple mental health conditions. These themes are: (1) a customizable and flexible design for personalization; (2) confidentiality, privacy, and risk mitigation features; (3) reliable, informative content that is user tested and validated; (4) friendly and human-like interaction style. Table 2 provides a summary of themes and subthemes.
Summary of themes and subthemes.
Theme 1: Customizable and flexible design for personalization
Participants expressed a strong need for a personalized, flexible DCA experience. This included customization of the look and feel, such as being able to choose avatars, names or aliases, interface themes, and the voice of the DCA. They also preferred flexible multi-modal interactions, that is, the use of features such as voice-to-text, voice messages, images, and emojis to enhance engagement.
Participants emphasized the importance of creating a user-friendly and accessible interface suited to a wide range of users. They wanted the design to be simple and intuitive, minimizing the number of navigation steps required, to make the tool “easy to use” [Participant #1].

I think the accessibility of the chatbot, you want to make it very user friendly and make sure that anybody who might have accessibility challenges is still able to use it. [Participant #1]
Participants wanted the ability to interact with the DCA through multi-modal features, including voice-to-text, voice messages, images, and emojis. Participants felt that the ability to speak to the DCA, rather than just typing, could provide ease during moments of emotional distress.

I find like saying things out loud helps me get through them easier than like typing them, so I feel like that might make a difference. [Participant #4]

I think what might be cool is if the AI could use like images and videos and send you links and stuff like that, as, like, an aid for what it might be talking to you about. That might make it seem cooler or more engaging. [Participant #6]

But I wouldn't want celebrity voice or honestly even emojis, ‘cause I feel like emojis can be misinterpreted. [Participant #1]
Theme 2: Confidentiality, privacy, and risk mitigation features
Participants emphasized the need for transparency and strong privacy controls in the DCA design to ensure confidentiality, privacy, and risk mitigation. Notably, they were particularly concerned about how user data would be handled and how it would be determined that confidentiality should be broken in crisis situations to mitigate risk. They wanted a balance between providing effective support and adhering to ethical standards in designing DCAs for mental health.
Participants felt that confidentiality and privacy could be enhanced by providing personal accounts that they can log into, alongside a guest mode for those who prefer to remain anonymous or do not want their data stored. They felt that the log-in feature of the DCA should allow users to tailor their interaction experience based on their preferences for data control.

In terms of youth, they don't always have a device of their own or reliable internet access at all times, and so I think it would be helpful to create, instead of it needs to be that app on your phone, it could also just be a website and you can log in. [Participant #4]

I think there's a way that we could combine the two solutions where it would be like, like a summary and not the word for word, or it would be saving it for like 24 hours, a week… I think that could also be both implemented at the same time if you close the chat, you can get a little box and you can say having both options like, “don't save,” “save a summary,” “save word for word,” or “don’t save,” “save for 24 hours,” “save for a week,” “save for a month,” “save forever.” [Participant #9]

I also think if you decide to do that, then parts of the terms of service should also be like, it gets to have your location, because let's say the human evaluator decides that this person's in trouble now, it should have your location in my opinion. [Participant #4]
Theme 3. Reliable, informative content that is user tested and validated
Participants felt that the DCA should provide reliable, informative content, and that this content should be validated by mental health professionals and tested by users to ensure accuracy and effectiveness. They also suggested human oversight and a clear verification process to ensure the accuracy and quality of the information provided by the DCA.
Participants felt that one of the most important forms of reliable, informative content a DCA should provide is psychoeducation. Participants emphasized that many young people may not recognize their mental health conditions and need help navigating these experiences. One participant suggested that DCAs could help users “spot some warning signs of trauma” [Participant #10], which could be especially valuable for survivors of childhood trauma who might not know where to seek support. Additionally, participants mentioned the slow and challenging process of accessing professional care; DCAs were seen as a potential bridge to human help, not only by providing information but also by offering actionable resources and directing users to alternative support services.

Accessing mental health services is a really arduous process right now. (…) I think having AIs there to help should also include having ways for them to help people GET to human help, not just “recommending” it. Having suicide prevention information, being able to send users to other sites where they can be better supported, providing not only data, but also alternate courses of action. (…). That being said, they should also be able to just listen, as well as providing alternate resources. [Participant #9]
Some participants stated they would trust the DCA more if its content was validated by a “healthcare practitioner” or a “healthcare organization” [Participant #11]. They emphasized that while simple suggestions like “go outside” [Participant #11] might not need complex professional validation, more specific advice, such as coping techniques or psychological strategies, should be backed by experts in the field to ensure reliability.

So you know if that advice comes just from the AI and what it knows, probably that would be a little bit shady for me, but then if some of the techniques and all the important stuff is written by healthcare professionals or revised by them at least… [Participant #11]

It would be important to me that the chatbot itself was like tested and has a success rate, but, I was actually speaking to whatever coping skills or resources the chatbot suggests. Like, I would want it to say, like, four out of five people find that this coping strategy worked, or something like that. [Participant #4]

I think it's really important to vet the resources that the chatbot provides. Because AI generally speaking has like a lot of problems with giving you resources that don't exist or resources that are just commonly available on the internet, but probably don't apply to youth specifically because of just the general nature of generative learning. [Participant #14]
Theme 4. Friendly and human-like interaction style
Participants preferred that the DCA be human-like, emulating empathy, respecting all users’ backgrounds, and adapting responses to reflect each user's unique experiences and needs.
Participants highlighted the importance of responses that emulate empathy. Rather than immediately offering solutions, they wanted the DCA to provide emotional support, especially during moments of crisis. One participant explained that they would prefer emotional validation before being offered coping strategies.

I also personally wouldn't want it to respond with a solution right away. I would want it to, like, if I'm in crisis, I want support for a while before I'm even ready to consider a solution. So I would not want to like type out a whole thing, about like how upset I am, and then it goes, oh, just like, just do XYZ you'll feel better, you'll… it'll help you cope. I don't like that. [Participant #4]

Yeah, basically, like for it to be able to like not only just like compare, like ‘oh this same thing happened to this person and that person, so must the same level of like trauma.’ So that it doesn't only make that observation, because again, be able to sympathize with people, like, how it personally affect them. [Participant #8]
Respect for diverse identities and backgrounds was crucial for participants. Some participants suggested that the DCA could use inclusive language by asking “what pronouns do you use” or “things related to personal values, what's the most important to you” [Participant #8] to apply the correct pronouns and appropriate responses for each individual user. There was a strong preference for the DCA to adapt to users’ needs and tailor its interactions to individual circumstances and preferences. This included considering the user's emotional state, backgrounds, and experience to offer more personalized support.

I think it'd be ideal if it would naturally adjust based off of certain cues of like what the client needs. Like, a therapist might do that – might naturally figure out what you need by talking to you. Because I think it's especially for if you're looking to help teens or you or have teens use this, they might not know what they need or what they want, you know. [Participant #6]

Again like important to find the balance between it being personalized and it being safe and accurate in what it says cause, even if it's like a vetted response still it can be personalized based on, your name, the tone you prefer to talk with, and just generally patterns, it can be personalized but it doesn't mean that the information it provides must be different. Like, it just doesn't have to provide any false information or like misinformation. That's the only thing. [Participant #11]

Chatbots would need to be able to have a lot of different personalities per se, in order to cater to all the different ways that a person might need to be responded to in the moment. [Participant #8]

Let's say you're talking about another story and you're like, “oh, I reacted this way.” Like the chat, the AI chatbot is already like ‘Oh, because he went through this in the beginning, he told me this in the beginning. I don't have to review that.’ [Participant #17]
Discussion
This study explored the preferences and needs of treatment-seeking youth with multiple mental health conditions regarding the design and functionality of AI-based DCAs. Four key themes were generated by the analysts based on participant dialogue: a need for a customizable and flexible design; strong confidentiality, privacy and risk mitigation features; reliable and validated content; and a friendly, human-like interaction style. These findings provide exploratory insights into the design of a mental health DCA that considers youth’s unique needs, preferences, and circumstances.
A dominant theme was the importance of the DCA emulating empathy in its responses. It is important to note that true empathy is not possible from a DCA, but emulated empathy remains a target for young people with multiple mental health conditions seeking support in clinical contexts. While many participants expressed the desire for the DCA to emulate empathy, some youth were skeptical about the genuineness of the DCA's support, revealing a tension in this recommendation. Existing literature on AI's ability to express something akin to empathy is limited but emerging. Some research has presented skepticism about DCAs’ ability to create empathetic communication, as they inherently lack emotional resonance and experienced empathy.57,58 In contrast with the wish for empathetic responses, a recent study suggested that focusing on instrumental support, such as practical assistance without attempting deep emotional engagement, may better align with users’ perceptions of AI, potentially increasing trust without raising authenticity concerns.59
Haque and Rubya60 highlight what they call “therapeutic misconception,” where users may mistakenly believe the DCA can provide the same empathetic support as a human therapist. This is important to consider in the context of youth with multiple mental health conditions who may be seeking human-like support. In contrast, in another study, DCA responses were rated as more empathetic than the responses of physicians,61 which may be because AI uses algorithms that allow it to adapt to the communication styles of users. DCAs that adapt to the empathy needs of young people with multiple mental health conditions might be able to provide the more extensive level of support that young people need outside of clinical care. DCA developers are encouraged to address concerns around the provision of empathetic responses versus the therapeutic misconception, for example, by being transparent about the DCA's capabilities and by prioritizing instrumental assistance and comprehension.
Participants expressed a strong need for the DCA to offer a range of personalization features, such as user settings and profile preferences, which would allow for a more personalized experience. However, many of these features fall within the scope of digital application development (e.g., log-in settings, interface design) rather than DCA programming and development (e.g., adaptive responses, language models). To deliver the customizable experience participants expected, any DCA design process would need to be a collaboration between AI experts and application developers.
Confidentiality and crisis management emerged as an important risk mitigation feature for participants, as many expressed concerns about how the DCA would handle mental health crisis situations. Participants suggested that the DCA might have pre-set permissions for human escalation, with clearly defined thresholds for crisis intervention. For example, if a user expresses suicidal intent or self-harm, the DCA could prompt crisis resources and encourage contacting a trusted individual or, if explicitly consented to, escalate to a live support service. A clear explanation of these procedures should be presented in the terms and agreements so that users understand when and how escalation occurs and what level of intervention the DCA can and cannot provide. Ethical issues like confidentiality and crisis management have been extensively discussed, for example by Kretzschmar et al.62 and Obadinma et al.63 These factors should be carefully negotiated with youth and other important stakeholders, such as caregivers and clinicians, to identify an optimal solution and ethical framework for any DCA designed for youth with complex needs.
Algorithmic bias occurs when AI systems reinforce existing inequities related to socioeconomic status, race, gender, cultural background, disability, and other factors that lead to inequities in the health system.64 This is another key ethical concern, as a lack of diversity in training data may result in the DCA providing responses that do not accurately reflect the varied experiences and needs of youth from different backgrounds, further limiting its effectiveness. Participants emphasized the importance of content validation by both professionals and users to ensure accuracy and relevance. This aligns with broader bias mitigation strategies, as engaging diverse users and professionals in validation and evaluation may help identify gaps in representation and potential biases in responses. Since youth can help identify and address algorithmic bias,65 involving them in co-design and evaluation can improve the DCA's equity by ensuring that interactions are more inclusive and responsive to different lived experiences.
Based on youth preferences and the literature on intervention development, it is important that any digital mental health agent be co-designed with youth to ensure that it meets the specific needs and preferences of young people. 66 Co-designing with stakeholders not only enhances relevance and acceptability, but also supports potential efficacy and effectiveness. 67 Moreover, involving clinicians and crisis response experts in the design and testing process can help further refine features such as crisis intervention protocols and privacy safeguards, aligning the DCA's functions with ethical standards,62,63 although the tension between professionally vetted and AI-generated content must be addressed. Engaging youth and professionals at each stage, from initial design through iterative testing, would ensure that DCAs embody user-centred design principles and are rigorously evaluated for safety, usability, and efficacy. This must be an ongoing process: in a climate of fast-paced technology development and social change, tools designed with today's technology for today's youth might not be relevant or acceptable to the next generation of young people.
This study highlights several implications for clinical practice and future research. As emerging literature on AI agent development and anecdotal evidence demonstrate, DCAs are already being used by youth with mental health challenges; the opportunity is therefore upon us to build a DCA that meets the unique needs and preferences of youth with multiple mental health conditions while being safe and effective. In the clinical context, DCAs could function as a low-threshold, accessible support option that can reach youth who face barriers to traditional care. 68 Offering customized and user-centered interactions would allow DCAs to reach youth who may be hesitant to seek in-person help due to stigma or other barriers. 69 In terms of research, our findings contribute to the growing literature on digital mental health tools by highlighting the specific needs of youth with multiple mental health conditions. Future research should test a DCA developed with these design elements in real-world settings to evaluate its acceptability to youth with multiple or complex mental health presentations rather than only in preventive contexts, as well as the impact of DCA use on mental health outcomes and overall user satisfaction for this population. Appropriate safety mechanisms must be put into place when testing a DCA not yet demonstrated to be safe and effective for vulnerable youth. Longitudinal studies will be needed to understand these DCAs' long-term effects and sustainability in supporting treatment-seeking youth with complex mental health conditions.
Strengths and limitations
This study is unique in its focus on youth with multiple mental health conditions, revealing exploratory themes that describe how a DCA might be relevant to youth with complex needs. In addition, the study involved youth engagement specialists who provided valuable contributions and expertise, from study design through to reporting of the results. Collaboration with industry partners who specialize in AI DCAs further strengthened the study by ensuring that the study materials were aligned with technological capabilities in the field. However, the study's small sample size and reliance on qualitatively reported preferences may limit the generalizability of its findings. Despite diversity in the sample, it is not possible to derive findings for specific cultural or demographic subgroups of the population. Future research should consider the needs of priority subgroups, such as Indigenous youth, 2SLGBTQI+ youth, newcomer youth, and various racial and cultural subgroups. Additional focus groups conducted with clinicians or cybersecurity experts could identify key privacy, data usage, and safety considerations. The actual use of any developed DCA should be assessed on an ongoing basis, since predictions of future use may differ from actual use. The use of online recruitment and data collection is a limitation given the digital divide, which may exclude youth from underserved communities with limited Internet access; however, since youth who are likely to use DCAs are also likely to be connected to the Internet, we may have reached the intended target population. Future studies should explore alternative access strategies, such as text-based or offline AI tools and community-based digital hubs, to bridge accessibility gaps. Additionally, a therapeutic misconception may arise if users mistakenly perceive the DCA as providing human-like empathy or clinical-level support. Ensuring clear communication about the tool's capabilities and setting realistic user expectations will be critical for responsible implementation.
Conclusion
This study provides insights into the design features of AI-based DCAs that youth with multiple mental health conditions would prefer. The results highlighted critical features that could enhance engagement and trust in DCAs for treatment-seeking youth: customization, confidentiality and privacy, reliable content, and human-like interactions. Co-designed with youth engagement specialists and industry partners, this study illustrated the importance of collaboration in preparing to develop tools that are not only relevant, but also feasible and ethically responsible. Although future research is needed to test these design elements in real-world settings, this study lays the groundwork for co-designing youth mental health DCAs that could bridge gaps in mental health supports for treatment-seeking youth and expand access to care.
Acknowledgment
We would like to thank all of the participants for their contributions. We thank Shelby McKee for co-facilitating the focus groups.
Guarantor
Lisa D. Hawke
Ethical Considerations
This study was approved by the Centre for Addiction and Mental Health (CAMH) Research Ethics Board (REB No. 2023/182) on April 15, 2024. Respondents gave signed informed consent before participating in the focus groups.
Author contributions/CRediT
LDH led the study in terms of conceptualization, design, management, and supervision. JH collected data, analyzed data, and drafted the manuscript. LG acquired funding for the project. All authors contributed to the design of the project and edited and approved the final manuscript.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Ontario Brain Institute. The source study from which participants were recruited was made possible with financial support from the CAMH Discovery Fund.
Conflicting interests
The authors declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Brian Ritchie is affiliated with Kamazooie Development Corporation (“Kama.AI”) as its founder and CEO, a company specializing in AI and conversational agents. While this expertise informed the study’s design and analysis, all efforts were made to ensure that the research process and conclusions remained independent and unbiased. The other authors declare no conflicts of interest.
Data availability
The data underlying this project are governed by the Research Ethics Board of the Centre for Addiction and Mental Health.
