Abstract
Objective
The integration of Human-Computer Interaction (HCI) and healthcare technologies is transforming the landscape of mental health interventions. Despite the growing adoption of mental health apps, current evaluation methods often neglect the interplay between interface design, personalization, emotional resonance, privacy, and community engagement. These gaps limit the capacity of digital tools to meet therapeutic goals while maintaining user trust and long-term engagement. This study examined the role of Xin Dao Diary, an AI-assisted platform, in enhancing emotional well-being through innovative design and user engagement strategies.
Methods
With a mixed-methods approach, including walkthrough methods, diary studies, and sentiment analysis of user feedback, we explored how digital interfaces can facilitate effective mental health care.
Results
Our findings reveal that intuitive interface design and personalized AI interventions improve user satisfaction and emotional health outcomes. However, challenges remain in data privacy, algorithmic transparency, and the authenticity of emotional responses, which may undermine user trust and limit long-term engagement.
Conclusion
The present research proposes a Holistic AI Care Design that emphasizes the integration of multiple factors, including user needs, AI personalization, privacy, and community building, into app design. It also incorporates usability, user engagement, and ethical considerations into the evaluation of AI-assisted mental health apps. This research underscores the importance of interdisciplinary approaches in advancing digital health solutions, offering valuable insights for developers and healthcare practitioners aiming to optimize user experience and therapeutic efficacy.
1. Introduction
Digital health technologies, especially artificial intelligence (AI), are emerging as a transformative force in the field of healthcare. As the global mental health crisis intensifies, the growing prevalence of mental health disorders is placing unprecedented pressure on healthcare systems worldwide. 1 Traditional face-to-face counseling and treatment models are proving inadequate, falling short in meeting the immense demand for scalable, accessible, and personalized mental health interventions. In this context, AI and HCI have together begun to reshape how mental health services are designed, delivered, and experienced. These technologies introduce new paradigms for emotionally intelligent systems that resonate with user needs. 2 By using large-scale data, machine learning algorithms, and natural language processing, AI-assisted mental health applications can deliver interventions that dynamically respond to individual emotional states, behavioral patterns, and contextual factors. In parallel, HCI serves as the critical infrastructure for ensuring these tools are intuitive, usable, and emotionally engaging. The intersection of AI and HCI, therefore, holds promise for advancing mental health care, yet remains underexplored in academic literature, especially from a multidisciplinary and user-centered perspective. In particular, current evaluation methods often neglect the interplay between interface design, personalization, emotional resonance, privacy, and community engagement, limiting the capacity of digital tools to meet therapeutic goals while maintaining user trust and long-term engagement.
Specifically, while individual aspects of HCI, such as interface usability and AI personalization, have been studied in isolation within digital mental health research, the integrated examination of how these factors collectively shape user experience, therapeutic efficacy, and sustained engagement remains significantly underexplored. 3 Current evaluation frameworks typically focus on single dimensions such as clinical efficacy, usability, or user satisfaction without accounting for the interdependencies between interface design, AI-driven personalization, privacy and trust, feedback mechanisms, and community support features. This fragmented approach obscures the holistic user experience and limits our understanding of how technology can be optimized to support both clinical outcomes and user wellbeing.
Thus, the present research addresses the need for a comprehensive methodological framework that integrates multiple factors essential for designing and evaluating AI-assisted mental health applications. We propose a Holistic AI Care Design that foregrounds user-centered interface design, adaptive AI personalization, ethical data practices, active feedback mechanisms, and social connectivity. This interdisciplinary approach aligns with the principles of responsible innovation, emphasizing both technical performance and emotional relevance in mental health technologies. It also responds to critical limitations in existing evaluation paradigms, which often focus narrowly on technical accuracy or clinical efficacy, while overlooking the lived experiences of users, the interpretability of AI recommendations, or the psychosocial implications of digital engagement. 3 To ground this investigation, we conducted a case study on Xin Dao Diary, a Chinese AI-assisted mental health application that exemplifies the convergence of therapeutic content and social networking features. Launched in 2020, Xin Dao Diary has grown into an emotional support platform with more than two million users. It integrates AI-driven features such as mood tracking, personalized emotional reports, and virtual therapy sessions, alongside community-based functionalities that enable anonymous peer support and shared emotional expression. The app’s design philosophy bridges psychological self-help with participatory media environments, making it particularly relevant to both HCI and digital healthcare research. Its dual role—as both a mental health intervention and a digital community—invites critical examination of how interface, interaction, and emotional intelligence intersect to shape user experiences and outcomes.
This study seeks to explore how Xin Dao Diary supports emotional well-being through innovative design and user engagement strategies. Specifically, the study addresses four interconnected research questions: (1) How are principles of HCI implemented within the app to support users’ mental health experiences, particularly through interface design that facilitates emotional expression and routine self-reflection? (2) How do AI-driven features contribute to the delivery of personalized mental health interventions that respond dynamically to users’ emotional states and behavioral patterns? (3) How do user interactions and feedback influence the app’s continuous development and psychological efficacy? 4,5 (4) What implications does Xin Dao Diary offer for evaluating and designing AI-assisted mental health apps? These questions focus on the intersection of HCI and healthcare, emphasizing user experience, AI interaction, and mental health outcomes. In this sense, this integrated approach advances our understanding of how technology can simultaneously achieve clinical effectiveness and user-centered design excellence.
1.1 Human-computer interaction and AI healthcare design
HCI is a critical bridge between technology and user behavior in digital health interventions, shaping the effectiveness of healthcare systems. 6 Interface design and user experience affect the usability, acceptability, and safety of healthcare applications, thereby influencing their long-term effectiveness. 7 Intuitive navigation and visually coherent interfaces build user trust in mental health platforms and improve operational efficiency in integrated healthcare systems, enabling collaboration between patients and providers. Effective HCI design also supports health monitoring and timely emergency response. AI-assisted mental health applications, particularly those leveraging large-scale data, machine learning algorithms, and natural language processing, have demonstrated therapeutic efficacy in delivering personalized interventions that respond to individual emotional states and behavioral patterns. 1 Recent evidence supports the effectiveness of these approaches across multiple mental health conditions, 1 with particular success in specialized domains such as addiction care monitoring 8 and personalized mental health interventions. 9 Despite its importance, Blandford underscores a persistent gap in the field, 10 noting that “insufficient integration of human factors insights into healthcare technology design remains widespread.” Suboptimal interface design carries tangible risks: studies demonstrate that poorly conceived user interfaces in complex healthcare systems can directly compromise patient safety, even leading to life-threatening errors. 11 Consequently, optimizing user interface (UI) design is imperative to strengthen interactions between healthcare technologies and their users. 12 Notably, reliance on prior knowledge among trained professionals fails to mitigate errors stemming from flawed system architecture, emphasizing the non-negotiable role of usability in clinical adoption.
Usability and user experience jointly dictate the real-world implementation of healthcare technologies. Enhanced usability correlates strongly with increased device utilization, while superior user experience drives iterative improvements in system design. 10 Ultimately, the synergy of human-centered user experience and robust technical systems forms the cornerstone of achieving measurable healthcare outcomes, as evidenced by longitudinal studies on technology-integrated care models. 13 However, implementing AI-assisted mental health applications presents significant challenges that must be carefully addressed. Surveillance fatigue remains a critical concern, as continuous monitoring and data collection may lead to user discomfort and reduced engagement. 14 Additionally, maintaining appropriate therapeutic boundaries between AI systems and users is essential, as unclear boundaries can undermine the therapeutic relationship and user trust. 15 The effectiveness of AI interventions also depends on robust feedback loops that enable continuous system refinement based on user outcomes and clinical effectiveness. 16 These implementation challenges underscore the importance of integrating rigorous HCI principles with ethical considerations to ensure that AI-assisted mental health applications enhance rather than compromise user well-being and therapeutic efficacy.
HCI principles promote sustained user engagement in healthcare technologies by minimizing cognitive demands and integrating emotionally resonant design. One of the key strategies involves leveraging gamification elements to incentivize consistent use, as demonstrated in behavioral health interventions. 17 Given that poor usability remains a critical barrier to adoption, healthcare applications must prioritize intuitive, user-friendly interfaces that balance simplicity with clinical efficacy. Gamified components, such as progress tracking and reward systems, not only foster positive psychological engagement but also mitigate frustration and cognitive overload during system interactions. Simultaneously, real-time interactive features, such as mood diaries and symptom trackers, amplify users’ capacity for self-reflection. These tools exemplify the broader HCI imperative for information exchange in smart mobile medical systems. At the intersection of HCI and emotional health research, self-tracking enables users to monitor activities, dietary habits, and mood fluctuations, while smartphone or web-based platforms facilitate real-time condition management, thereby reinforcing self-regulatory behaviors. 18 Central to effective HCI design is the systematic analysis of users’ environmental contexts and device interactions, which generates activity logs and historical data to underpin reflective practices. However, a critical consideration lies in balancing functional sophistication with usability: over-engineered tools risk overwhelming users, as evidenced by Ben-Zeev et al., 9 who observed that functionally redundant mental health apps exacerbated cognitive strain in anxious populations, resulting in reduced adherence. 
Consequently, the central challenge in healthcare HCI is transforming technological complexity into seamless, intuitive experiences—often described as “senseless services” in user-centered design—that support effortless engagement without compromising clinical utility.
The integration of AI into digital health interventions introduces opportunities for personalization and adaptability in mental healthcare. AI-driven personalization represents a paradigm shift, enabling the creation of tailored treatment plans that account for individual behavioral patterns, cognitive profiles, and real-time physiological data. 1 Such precision in intervention design improves therapeutic efficacy by aligning strategies with patients’ unique challenges and strengths, thereby enhancing treatment adherence, shortening recovery timelines, and elevating user satisfaction. In addiction care, AI systems demonstrate particular utility through continuous behavioral monitoring. By analyzing triggers, stressors, and substance use patterns, these systems can identify high-risk scenarios in real time, alerting both clinicians and patients to imminent relapse risks and enabling proactive adjustments to treatment protocols. 8 This capability underscores adaptability as a critical metric for evaluating AI’s impact in mental health interventions. Machine learning algorithms dynamically assess user progress and iteratively refine therapeutic approaches, circumventing the inefficiencies of traditional trial-and-error methods while optimizing outcomes through data-driven iteration. Notably, AI enhances evidence-based practices such as cognitive behavioral therapy (CBT) by adapting interventions to users’ evolving needs. For instance, algorithms can detect users’ cognitive patterns and modulate therapeutic content to target these traits specifically, thereby improving CBT’s relevance and effectiveness across diverse populations. Concurrently, AI-enabled continuous monitoring serves as a preventive tool, detecting early indicators of mental health deterioration—such as deviations in sleep, activity levels, or speech patterns—in conditions like depression and bipolar disorder. 
These alerts empower timely clinical interventions while fostering longitudinal care through continuous data collection. The aggregation of longitudinal datasets further enriches therapeutic decision-making, offering clinicians actionable insights into patients’ mental health trajectories and guiding personalized intervention strategies. Beyond clinical settings, AI-powered tools such as chatbots and virtual therapists can also scale mental healthcare delivery. By providing timely and accessible support, these solutions expand access to underserved populations, reduce treatment costs, and alleviate systemic shortages in mental health resources. 19 Thus, AI not only helps bridge gaps in service delivery but also broadens access to patient-centered care.
Despite these advantages, the integration of AI into mental healthcare faces two key challenges: ethical dilemmas and barriers to user acceptance. Data privacy concerns remain paramount due to the sensitive nature of mental health data. Robust safeguards for patient information are imperative, particularly as persistent anxieties about large-scale data surveillance undermine user confidence. 2 Meanwhile, AI’s inherent limitations in human empathy and contextual understanding, qualities essential to therapeutic relationships, risk eroding trust. Surveillance fatigue, poor management of therapeutic boundaries, and unintended disclosure of non-clinical personal data may further weaken users’ trust, a critical concern given that trust is foundational to effective digital mental health interventions. Another concern is the opacity of algorithmic decision-making, often termed the “black box” problem, which compromises transparency and fuels skepticism about AI-driven conclusions. When AI influences clinical decisions, accountability becomes ambiguous, particularly in crisis scenarios where questions arise about ultimate responsibility for harmful outcomes. 14 Thus, AI algorithms must address concerns about transparency and algorithmic decision-making to build user trust. Explainable Artificial Intelligence (EAI), which aims to make AI decision-making processes transparent and interpretable to users, has emerged as a critical solution to enhance credibility, accountability, and trust in high-stakes applications such as mental health interventions. Moreover, the implementation of anthropomorphic AI designs requires cautious calibration, as excessive human-like features may provoke resistance to perceived mechanization of care. 15 Thus, the successful adoption of AI in mental healthcare hinges not only on technological advancement but also on developing user-comprehensible interpretive frameworks.
Addressing these challenges demands proactive ethical governance and regulatory structures. Ensuring AI tools align with both ethical standards and therapeutic objectives requires participatory frameworks that embed clinicians, patients, and policymakers as co-designers throughout the technology lifecycle. Co-design approaches with end-users have been shown to enhance designer empathy and foster more responsive, culturally appropriate solutions. 20 Participatory frameworks, which involve users in the design and refinement process, have similarly demonstrated their effectiveness in fostering user trust and ensuring that digital mental health applications align with user needs and preferences. 16 By incorporating these co-design and participatory principles into the development of AI-assisted mental health applications, developers can ensure that technology solutions genuinely address user needs while maintaining high standards of safety and efficacy.
User feedback serves as a central catalyst for the iterative optimization of digital health platforms, underpinning collaborative efforts among stakeholders. As a cornerstone of HCI and user-centered design (UCD) methodologies, feedback mechanisms critically inform the development and refinement of these systems. 6 Distinct from alternative approaches, UCD prioritizes users throughout the HCI lifecycle, enabling the creation of intuitive tools that align with clinical workflows and user expectations. By systematically integrating user insights, digital health technologies can enhance engagement while ensuring compatibility with existing practices—a prerequisite for sustained adoption. 16 In conversational interfaces such as chatbots, precise interpretation of user inputs and contextually appropriate responses are essential for maintaining seamless engagement, directly influencing utilization rates and long-term retention. 21 This principle holds heightened significance in mental health applications, where user feedback proves indispensable for identifying unintended adverse effects of technologies like behavioral tracking. For instance, Sanches et al. emphasize the necessity of designing tracking systems that mitigate discouragement by carefully framing negative data and providing supportive user interactions. 22 Co-design practices that engage service users and caregivers as domain experts are vital for developing ethical and effective mental health AI. Feedback loops bridge the innovation-acceptability divide, ensuring technologies resonate with user preferences and therapeutic goals. However, scholars caution against overreliance on self-reported user needs, as individuals may lack awareness of their latent requirements or the technical possibilities available. 
6 Consequently, improving digital health applications demands a dual focus: fostering end-user participation while employing agile development frameworks that enable continuous product iteration informed by empirical evaluation.
The adaptability of digital health platforms can be enhanced through iterative design processes that integrate real-world user insights. For instance, a study analyzing AI-driven Conversational Agents (CAs) in mental health, which are designed as “diary-like” applications, revealed communication breakdowns as a primary contributor to user disengagement. 23 Such breakdowns occur when conversational agents inadequately interpret or contextually respond to user inputs, thereby diminishing intervention efficacy and fostering negative experiences. These findings underscore the imperative to refine conversational agents’ capabilities in natural language processing and response generation, directly informing iterative design improvements. In digital mental health, user feedback derived from mixed-methods approaches, such as triangulating qualitative data (interviews or questionnaires) with quantitative engagement metrics, provides critical insights into technology integration and its psychosocial impacts. Researchers have observed new user behaviors, including clinicians repurposing online interventions as adjunct therapeutic tools or patients using mood diaries to raise sensitive topics during therapy. 21 These emergent usage patterns and unanticipated challenges constitute actionable feedback for adaptive system redesign. Stakeholder-centered design methodologies further illuminate nuanced user requirements and contextual constraints. Co-designing prototypes with end-users cultivates designer empathy, ensuring solutions align with lived experiences while preemptively addressing usability barriers. By embedding user perspectives into development cycles, such participatory frameworks not only enhance experiential outcomes but also foster trust in digital interventions—a prerequisite for sustained adoption.
HCI lays the foundation of user experience, AI technology empowers personalized service, and user feedback drives continuous optimization of the system. The synergy of the three rests on a “human-centered” design philosophy: from the ease of interface interaction, to the interpretability of AI decision-making, to the initiative of user participation, all must be centered on users’ emotional needs and cognitive characteristics. Despite these contributions, current methodological approaches to AI-assisted mental health app design and evaluation often remain fragmented and limited in scope. Many studies prioritize either clinical efficacy, algorithmic functionality, or usability in isolation, without sufficiently accounting for the interconnected nature of users’ psychological, technological, and social experiences. This disciplinary siloing results in evaluation tools that overlook how emotional resonance, interface design, AI personalization, privacy expectations, and social dynamics collectively shape user engagement and therapeutic effectiveness. Moreover, critical concerns—such as trust in algorithmic decision-making, ethical data handling, and the sustainability of user participation—are frequently underexplored. These gaps highlight the pressing need for an integrative evaluative framework that brings together multiple interdependent factors: HCI principles, usability and user experience, AI-driven personalization, privacy and trust safeguards, feedback loops, and mechanisms for social support and community building. Such a comprehensive, interdisciplinary approach ensures that mental health technologies are not only functionally sound but also emotionally relevant, ethically responsible, and socially resonant, laying a stronger foundation for effective, user-centered digital mental health interventions.
Building on these theoretical foundations, our experimental investigation focuses on five interconnected HCI elements that represent critical components of effective digital mental health interventions. First, we examine interface design and usability through analysis of navigation structures and affordances that facilitate user interaction. Second, we assess emotional resonance through design elements, including soft music, visual cues, and pre-set emotional tags that enhance the emotional support experience. Third, we investigate personalization through AI features such as adaptive recommendations and tailored interactions that respond to individual user needs. Fourth, we evaluate feedback mechanisms and iterative design processes that enable continuous refinement based on user input. Fifth, we analyze community engagement features that foster peer support and social connection. By systematically examining these elements through our mixed-methods approach, we can provide comprehensive evidence for how thoughtful HCI design, combined with AI capabilities and community features, creates effective digital mental health interventions.
2. Methods
This research examines Xin Dao Diary, an emotional healing app in China that doubles as a social media platform. Launched in 2020, Xin Dao Diary (Figure 1) combines therapeutic content (e.g., mindfulness exercises, AI counseling), user-generated support communities, and social networking features, and has over 2 million users. 24
Xin Dao Diary is characterized by its integration of social media attributes with emotion management features. The platform facilitates this engagement by enabling users to document their emotional experiences in digital diaries, which the app then analyzes to generate personalized emotional reports. This study was conducted in mainland China between September and November 2024. It combines (1) a researcher-led technical walkthrough of the production app, (2) a two-week remote diary study with 11 university students, and (3) a sentiment analysis of public user reviews from the Xiaomi App Store. All activities were carried out remotely on participants’ own smartphones; no clinical sites were involved. The walkthrough is a researcher-led, HCI-focused audit intended to systematically examine the app’s environment of expected use and its interface materiality and constraints; it is designed to complement, rather than replace, the real-user data reported in the diary study and sentiment analysis. Because Xin Dao Diary is a mature platform launched in 2020 with over 2 million users, our study is not a pilot of the app but a case-study evaluation of app usage. Figure 1. Push notification of Xin Dao Diary (English text translated by the corresponding author).
2.1. Walkthrough method
The walkthrough method, a standard HCI approach, is effective for systematically identifying usability problems. It is valued for its structured approach.20,25 The method requires few resources and can be conducted by designers, developers, or researchers, and thus it is cost-effective for iterative design and evaluation cycles.25,26 The method has been successfully adapted for various application areas, including mobile apps, health applications, and domain-specific tools, and can be combined with other techniques for more comprehensive results.25–27 This study employs the walkthrough method to investigate the design, interface, and features of the Xin Dao Diary app from an HCI perspective. 28 This approach enables a comprehensive analysis of user interactions and interface usability. Specifically, we examined the environment of expected use by looking at the vision and operating model. We analyzed the strategies employed by the app provider to manage and control user interactions in order to maintain the app’s operational model and achieve its vision. We also examined information regarding user data applications, privacy, and safety. 28 Subsequently, throughout the technical walkthrough process, we adopted the perspective of a user, interacted with the app’s interface, navigated through screens, pressed buttons, and investigated the menu options. In this phase, we generated detailed field notes. We focused on the app’s materiality, such as the actions it demands and directs users to perform, and considered how users might interpret these as affordances or constraints. 28 To be specific, we examined user interface arrangement, functions and features, textual content and tone, and symbolic representation.
By navigating the app as users, researchers evaluated the interface’s usability and its role in facilitating effective human-computer interaction. This method allowed for the identification of design elements, such as emotion logging and pre-set tags, which enhance user engagement and emotional expression. The analysis highlighted how these features align with user expectations for mental health care.
To address the limitations of the walkthrough method, such as its lack of analysis regarding user content, activities, or attitudes, we incorporated supplementary data from diary studies and app reviews. This combination provides a richer understanding of user attitudes and activities, enhancing our evaluation of the app’s impact on mental health outcomes and its role in AI-assisted mental health care.
2.2. Diary studies
Demographic information of diary study participants.
In the diary study, participants recorded their daily experiences with the app, including their expectations and perceptions of the app’s empathetic qualities. They also noted how closely they adhered to the app’s guidance and instructions, and reported on their sense of autonomy while using the app. At the end of the study, participants provided an overall evaluation of their experience, assessing the app’s influence on their emotions both online and offline.
Prior to the study, participants were briefed on its purpose, procedures, and privacy protection measures, and provided informed consent. Given the sensitive nature of mental health topics, participants were informed of the possibility of psychological discomfort and were free to withdraw from the study at any time. To ensure privacy, all data were anonymized. Participants received a compensation of 100 RMB (approximately 13 USD) upon completing the study.
The diary studies provided valuable insights into the app’s healthcare impacts by capturing users’ emotional experiences and interactions over time. Through detailed personal accounts, we gained an understanding of how the app affects users’ emotional well-being and its potential role in emotion regulation. Participants’ detailed accounts of their interactions with AI features, like the “Forest Healing Room,” provided qualitative insights into how these elements offer personalized mental health interventions. The study captured user experiences with AI-driven sessions that employ psychological techniques (e.g., CBT, Dialectical Behavior Therapy (DBT)), illustrating how these contribute to user satisfaction and emotional well-being.
2.3. Sentiment analysis
Using a web crawler, we collected 9,346 reviews of Xin Dao Diary from the Xiaomi Store. After excluding irrelevant comments and emojis, we processed 9,171 reviews spanning August 2021 to August 2024 using sentiment analysis with Atlas.ti. This approach allowed us to systematically evaluate user satisfaction and emotional responses to the app. By examining the sentiment in user feedback, researchers can identify trends and patterns that reflect the app’s impact on mental health outcomes. Positive feedback might indicate beneficial interactions and successful outcomes, while negative feedback could suggest areas where the app needs improvement.
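The cleaning and labeling workflow can be illustrated with a minimal sketch. The actual analysis was performed with Atlas.ti on Chinese-language reviews; the mini-lexicons, emoji pattern, and function names below are invented for demonstration only and do not reproduce the study's instrument.

```python
import re

# Hypothetical English mini-lexicons for illustration; the study itself used
# Atlas.ti on Chinese reviews, not lexicon lookups like these.
POSITIVE = {"healing", "helpful", "warm", "calm", "love"}
NEGATIVE = {"crash", "ads", "annoying", "broken", "sad"}

# Rough emoji ranges (emoticons, symbols, dingbats) for the exclusion step.
EMOJI_PATTERN = re.compile(r"[\U0001F300-\U0001FAFF\u2600-\u27BF]")

def clean_review(text: str) -> str:
    """Strip emojis and surrounding whitespace, mirroring the exclusion step."""
    return EMOJI_PATTERN.sub("", text).strip()

def classify(text: str) -> str:
    """Label a cleaned review as positive, negative, or neutral by lexicon hits."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

reviews = ["Such a healing app, I love it \U0001F60A", "Too many ads, very annoying", ""]
cleaned = [clean_review(r) for r in reviews if clean_review(r)]  # drop empty entries
labels = [classify(r) for r in cleaned]
```

In practice, proportions of positive and negative labels over time would then be aggregated to surface the trends discussed above.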
3. Results
3.1. User interface and experience
Xin Dao Diary’s interface and features effectively facilitated user interaction, emotional tracking, and engagement, creating a healing and supportive environment that aligned with user expectations for emotional expression and well-being. The app’s design promoted interaction through its digital diary feature, which allowed users to log their feelings using pre-set emotional tags. This structure not only helped users articulate their emotions but also organized those emotions systematically, promoting mental health care. Specific interface elements, such as push notifications (Figure 1), soft music, and inspirational quotes, further enhanced user engagement by reminding them to document their emotions and providing comfort when needed.
For instance, when a user was not actively using the app, it sent push notifications prompting them to record their emotions. Upon the user’s next login, a feedback interface appeared, asking if they had new feelings to share. The app might also push comforting content, like soft music or inspirational quotes, to offer emotional support. Within the diary feature, users were encouraged to document their emotions using pre-set tags, which helped them better understand and articulate their feelings. The app employed algorithms to match users’ emotion records with similar ones, displaying these on their page. When a user clicked on another person’s mood record, they were directed to a comment interface, where the app prompted them to “warmly respond and empathize.” Additionally, users had the option to view calming images or watch short videos related to emotional healing during the session.
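The tag-based matching described above can be sketched as a simple set-overlap filter. This is a hypothetical illustration under the assumption that records carry the app's pre-set emotion tags; the app's actual matching algorithm is not public, and the user names and tags below are invented.

```python
# Minimal sketch of matching mood records by shared pre-set emotion tags.
# Records, user names, and tag vocabulary are hypothetical; Xin Dao Diary's
# real matching algorithm is not documented.

records = [
    {"user": "A", "tags": {"sad", "lonely"}},
    {"user": "B", "tags": {"sad", "anxious"}},
    {"user": "C", "tags": {"happy"}},
]

def similar_records(me, others, min_shared=1):
    """Return other users' records sharing at least `min_shared` tags."""
    return [
        r for r in others
        if r["user"] != me["user"]
        and len(r["tags"] & me["tags"]) >= min_shared
    ]

# A's record matches B's (both tagged "sad") but not C's.
matches = similar_records(records[0], records)
```

Surfacing only overlapping records is what lets the interface group users who "share the same emotion," as shown later in Figure 3.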
Beyond notifications, the app implemented soft control mechanisms that structured user activity and access. These included time-gated resource collection, staged capability cultivation, episodic event triggers, and community access barriers. For instance, “Daily Heart Collection and Tasks” is a time-gated loop in which users collect “hearts” at fixed intervals, complete daily tasks, and level up; higher levels yield more hearts per unit time. This progression scaffolds continued engagement while controlling users’ gains. Real-time interactions in the “Floating Island” were gated by level and other criteria, creating an entry threshold for new users.
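The time-gated loop can be illustrated with a small sketch. The collection interval and the rule that yield scales with level are assumptions for illustration only; the app's actual parameters are unknown.

```python
# Sketch of a time-gated "hearts" collection loop as described above.
# The one-hour interval and level-proportional yield are invented values.

import time


class HeartWallet:
    """Accrue hearts at fixed intervals; higher levels yield more per collection."""

    def __init__(self, level=1, interval_s=3600):
        self.level = level
        self.interval_s = interval_s
        self.last_collect = 0.0
        self.hearts = 0

    def collect(self, now=None):
        """Collect hearts if the time gate has elapsed; return hearts gained."""
        now = time.time() if now is None else now
        if now - self.last_collect < self.interval_s:
            return 0  # gate closed: nothing to collect yet
        self.last_collect = now
        gained = self.level  # assumed rule: yield scales with level
        self.hearts += gained
        return gained


wallet = HeartWallet(level=2, interval_s=3600)
wallet.collect(now=3600)  # gate open: hearts are gained
wallet.collect(now=3601)  # within the interval: nothing gained
```

The gate is what makes this a "soft control": users are never blocked outright, but the pacing of rewards scaffolds when and how often they return.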
Results of sentiment analysis generated from app reviews.
With 75% of the reviews reflecting positive emotions, users were satisfied with the app, appreciating its ability to provide timely solutions to emotional challenges, a healing environment, and anonymous social interactions. The following quotes are extracted from the diary studies. Participants were instructed to document their daily experiences with the app, and these data were analyzed to identify themes related to user satisfaction, emotional impact, and perceived therapeutic benefit. Participant identifiers (P01-P11) are used to preserve anonymity while enabling traceability.
Participant P04 said “I have certain expectations for the app: I hope it can help me regulate negative emotions and become my “emotional trash can,” allowing me to express many “unspeakable” little secrets. These expectations were partially fulfilled. Specifically, the app indeed provided me with a space to vent my emotions and document my life. Additionally, the posts I shared received many comments from ‘island friends (other users).’ They gave me comfort and empathy, making me feel cared for by strangers. ” Similarly, P07 acknowledged “before using the app, I hoped it could truly replace a traditional paper diary, as the app might make regulating emotions more convenient and effective. The app could be opened at any time, allowing for instant recording without the need for pen and paper, making recording emotions much more accessible. Upon entering the app’s diary recording interface, a character would inquire whether my mood has improved. If not, the app would offer me some suggestions for emotional improvement. This feature compensated for the lack of interaction in traditional diaries, where the absence of communication can diminish the therapeutic effect. As a result, the app served better as a companion for emotional support.” The app’s promotional video also highlighted its strengths in providing a healing environment to meet users’ expectations.
User feedback highlighted satisfaction with the app’s structured yet flexible environment for emotional expression. Users such as P04 noted the convenience of being able to record emotions instantly, enhancing the therapeutic effect compared to traditional diaries. However, some users, such as P01, raised concerns about privacy, suggesting the need for more robust privacy settings to maintain user trust and satisfaction.
3.2. AI-driven personalization in mental health care
The AI features within Xin Dao Diary played a crucial role in delivering personalized mental health care by offering tailored recommendations and interventions. A standout feature was the app’s AI-powered “Forest Healing Room,” where users could engage in a 10-minute “psychological therapy” session with an animal avatar powered by a large language model (Figure 2). During these sessions, the animal therapist actively inquired about the user’s recent emotional state and employed psychological techniques, such as CBT, DBT, and Motivational Interviewing, to help regulate emotions when appropriate.
“Forest Healing Room,” an AI-driven psychological therapy session in Xin Dao Diary (screenshot of Xin Dao Diary).
Users reported that these sessions provided valuable insights and aided in emotional regulation. For instance, P08 shared her experience with the AI animal therapists, who helped her navigate negative emotions. She added, “I feel the AI animal therapists grasp my emotions and perspectives. Though they may not fully empathize, their programmed companionship occasionally enlightens and aids me through challenges. With their help, I will not be constantly stuck in negative emotions.”
AI-driven recommendations were generally perceived as effective, with users appreciating personalized content that resonated with their current emotional states. However, trust in AI varied among users, with some expressing skepticism about the authenticity of emotional records generated by the algorithms. Several participants openly admitted to having low trust in the app, feeling hesitant to post content involving personal privacy due to the lack of robust privacy settings and the high likelihood that their published diaries could be seen by others. P04 stated, “I feel hesitant to post personal content because the app lacks privacy settings, and it seems like anyone could view my diary.” Similarly, P11 held a critical view of the application, including the AI animal therapists, questioning the authenticity of emotional records, the rationality of emotional quantification, and the actual effectiveness of emotional regulation. These concerns highlight the need for continuous improvement in AI personalization to enhance user trust and the effectiveness of mental health interventions.
3.3. User interaction and feedback
Xin Dao Diary’s responsive adaptation to user feedback led to the integration of innovative features and community-driven activities, fostering a supportive environment that enhanced users’ mental health and sense of belonging. The platform actively evolved based on user input, as evidenced by the introduction of features like AI animal therapists, which were added following user suggestions. Additionally, community activities that gave a voice to people with depression were implemented to address users’ suggestions and their need for psychological counseling.
The app’s adaptability not only improved the user experience but also garnered positive feedback. One user on social media remarked, “As a ‘veteran’ user of Xin Dao Diary, I have seen many aspects of the app. The most touching aspect for me is the attitude of the app developer. The app is the most conscientious app I have used so far.” The user further noted that when the online community “Floating Island” faced operational issues, “the app management team promptly adjusted and came up with plans. They are responsible and do far more than we think.”
Social networking features played a pivotal role in fostering a supportive community within the app. The algorithmic matching of users experiencing similar emotions encouraged mutual support and empathy, creating a connected environment that enhanced mental health outcomes. For example, as shown in Figure 3, users sharing the same emotion, such as sadness, appeared on the same interface, and a user could click the emotion icon to interact with another user.
Social networking feature blended with emotion in Xin Dao Diary (English text was translated by the corresponding author; screenshot from the author’s phone).
Through the app, users developed a sense of self-identity by reflecting on their emotional needs and hoping the app could satisfy them. Specifically, using the app enhanced users’ self-awareness and self-efficacy, while also fostering a sense of social belonging. For example, participant P06 noted that the app allowed her to express her feelings, which strengthened her self-awareness and provided a sense of satisfaction.
Users also gained a sense of social belonging. Participant P08 shared, “I really enjoy reading letters from other users; most of the letters are sincere long texts, and the ones I receive are from peers facing similar dilemmas. Waiting for the letters to arrive makes me cherish them even more, and I keep all the letters I get. For me, having genuine help documented gives me more motivation to support more people.” Through sincere communication with peers, resonance with shared struggles, anticipation and appreciation for incoming letters, and the act of saving letters, users developed a sense of identification with and belonging to the community, which inspired their internal drive to support others.
Users’ self-efficacy was enhanced when they discovered they could help others and accomplish tasks within the app. Participant P07 shared her experience of comforting a user who was struggling to adapt to the stress of senior high school through the app’s messaging function. The user’s gratitude for her response gave P07 a sense of self-efficacy in providing psychological support to others.
Users like P04 and P10 shared experiences of receiving virtual empathy and support from other users, reinforcing the community’s positive atmosphere. Beyond verbal communication, users employed emojis and icons to convey understanding and resonance with others’ emotions. P04 recounted, “After posting a diary entry on feeling emotionally drained, strangers (other users) ‘hugged’ me virtually and reassured me that things would improve. In other entries, strangers express shared feelings, indicating a moment of shared emotions, allowing others to empathize or project themselves into my narrative, thus experiencing similar emotions.” Similarly, P10 mentioned, “In the app, we comfort and support each other, virtually lighting a bonfire to give a hug. This interaction makes me feel a genuine connection with others.” Through these specific acts of shared vulnerability and virtual comfort, users actively built the empathetic, positive atmosphere that defines the app’s community.
4. Discussion
4.1. Development of the holistic AI care design framework
The Holistic AI Care Design framework emerged systematically from our mixed-methods analysis. The walkthrough method identified specific HCI elements implemented within Xin Dao Diary’s interface, documenting design features such as pre-set emotional tags, push notifications, and visual affordances. Concurrently, diary studies with 11 participants revealed rich user experiences related to interface design, AI personalization, privacy concerns, and community engagement features over the two-week study period. By synthesizing findings across all three data sources, we identified four interconnected dimensions that collectively characterize effective AI-assisted mental health design: user-centered interface design, AI-driven personalization, privacy and trust, and community support features. Each pillar of the framework was validated by specific findings from our methodology: interface design elements identified through the walkthrough method demonstrated measurable improvements in usability; personalization features revealed in the diary study correlated with higher user satisfaction; privacy concerns documented by participants informed the trust and security dimension; and community features emerged as critical enablers of user engagement and self-efficacy. Furthermore, sentiment analysis of app reviews complemented these qualitative findings, quantifying overall user satisfaction levels. This data-driven approach ensured that the framework directly reflects the real-world experiences and needs of users, rather than representing theoretical constructs disconnected from actual implementation.
4.2. Framework components
Holistic AI care design.
The study on Xin Dao Diary underscores the pivotal role of user-centered design in AI-assisted mental health applications. A well-crafted interface can significantly enhance user engagement and emotional tracking by creating a soothing and supportive environment. For instance, incorporating soft music and inspirational quotes fosters a healing ambiance, while emotion tagging and intuitive navigation address diverse user needs. Personalization further amplifies user satisfaction by tailoring features such as social interactions and AI-driven therapy sessions. These interventions support emotional regulation, ensuring that users feel seen and supported.
Privacy and trust are fundamental to the success of AI-assisted mental health apps. Xin Dao Diary has faced challenges in this area, as some users expressed reluctance to share personal content due to concerns about privacy protections. A more transparent and ethical framework for user data usage and processing is necessary to address these concerns. For example, stronger privacy settings, including options for anonymization and selective content sharing, can build trust and encourage users to engage more openly with the app.
The app leverages AI-driven personalization to deliver unique emotional support experiences. One standout feature is the “Forest Healing Room,” where users interact with animal avatars for personalized support sessions. By employing large language models to analyze user emotions and behaviors, Xin Dao Diary tailors interactions in real time, integrating evidence-based techniques such as CBT and Motivational Interviewing. This adaptive learning ensures that interventions remain relevant and effective for each user, reinforcing the app’s role as a trusted mental health tool.
Community support is another essential element of Xin Dao Diary’s design. The app fosters a sense of belonging by connecting users with similar emotional experiences, promoting empathy and mutual support. Features such as “Floating Island” and comment interfaces that encourage users to “warmly respond and empathize” create a supportive environment where individuals feel understood.
Evaluation criteria include usability, user engagement, and ethical implications. Usability focuses on interface design, accessibility, and user experience, with features like the digital diary and emotion tagging earning positive feedback for providing a structured, user-friendly environment. User engagement is tracked through usage frequency and community participation, with push notifications and algorithmic emotional support proving effective in sustaining interest. Evaluation criteria for ethical considerations encompass beneficence, assessed through user reports of healing and satisfaction; nonmaleficence, reflected in privacy-related hesitation, erosion of trust, and distress events; autonomy, evidenced by use of visibility controls and gains in self-efficacy; explicability and transparency, measured by the clarity of AI rationales and user comprehension; privacy and data protection, evaluated via granular settings, uptake of anonymization, and rates of incidents; accountability, shown by prompt developer responses and corrective updates; stakeholder engagement, captured by the frequency and impact of feedback-driven changes; lifecycle and iterative improvement, indicated by feature evolution aligned with user input; risk identification, demonstrated through documented risks and mitigation actions; and metrics and monitoring, tracked with sentiment trends and usage indicators.
While anonymization processes and transparent policies address some concerns, ongoing user feedback and regular audits are crucial to mitigate biases, enhance privacy protections, and ensure responsible AI use.
Mapping of HCI elements, evaluation methods, and outcomes.
5. Conclusion
This study examined the role of Xin Dao Diary, an AI-assisted mental health platform, in enhancing emotional well-being through innovative design and user engagement strategies. With a mixed-methods approach, including walkthrough methods, diary studies, and sentiment analysis of user feedback, we explored how digital interfaces can facilitate effective mental health care. Our findings reveal that intuitive interface design and personalized AI interventions are associated with higher user satisfaction and self-reported emotional health improvements. The app cultivates an emotionally supportive user experience through a combination of minimalist design, emotionally resonant features, and adaptive support mechanisms. Specific elements, such as pre-set emotional tags, timely push notifications, calming audio-visual stimuli, and AI-generated avatars, assist users in articulating and regulating emotions more effectively than traditional self-help methods. Moreover, the app’s community-oriented features, including anonymous diary sharing and peer-to-peer interaction, foster a sense of belonging and emotional reciprocity, thereby addressing the social isolation commonly associated with mental health challenges. Nonetheless, the study also highlights several critical concerns, particularly regarding data privacy, algorithmic transparency, and the authenticity of emotional responses, which may undermine user trust and limit long-term engagement. The present research proposes a Holistic AI Care Design, which emphasizes the integration of multiple factors, including user needs, AI personalization, privacy, and community building, in app design. It also incorporates usability, user engagement, and ethical considerations into the evaluation of AI-assisted mental health apps.
This research underscores the importance of interdisciplinary approaches in advancing digital health solutions, offering valuable insights for developers and healthcare practitioners aiming to optimize user experience and therapeutic efficacy.
Theoretically, this study advances the field of HCI and digital mental health by moving beyond fragmented evaluations of individual features toward a holistic, integrative framework. Existing literature has tended to examine discrete design elements—such as usability, personalization, or community support—in isolation, without accounting for their interdependencies or collective therapeutic impact. The Holistic AI Care Design framework proposed here bridges this gap by demonstrating how affective, functional, social, and ethical dimensions of app design mutually reinforce one another to produce sustained user engagement and emotional well-being. This contribution extends relational perspectives in HCI research by specifying how AI-mediated environments can be designed not merely to respond to users, but to co-constitute supportive emotional ecologies. In doing so, the framework offers a transferable conceptual vocabulary for evaluating AI-assisted mental health interventions across diverse cultural and clinical contexts.
From a practical standpoint, these findings carry concrete implications for app developers, mental health practitioners, and platform policymakers. Developers should prioritize not only technical functionality but also the emotional resonance and ethical architecture of their products, ensuring that personalization features respect user autonomy and data privacy. Practitioners can employ the Holistic AI Care Design framework to evaluate whether a given application is therapeutically appropriate and aligned with user needs. Policymakers and institutional stakeholders, particularly in China and other contexts where mental health stigma and service gaps persist, should recognize AI-assisted platforms as complementary tools that can extend care reach, while establishing regulatory safeguards around algorithmic transparency and informed consent. The present case study of Xin Dao Diary suggests that embedding these considerations into design from the outset—rather than retrofitting them after deployment—may help foster user trust and long-term engagement.
Several limitations of this study merit acknowledgment and point toward directions for future research. First, the diary study sample was relatively small and demographically homogeneous—comprising predominantly young female university students—which may limit the generalizability of findings to broader populations, including older adults, clinical users, or individuals from different socioeconomic backgrounds. Second, as a single-platform case study, the research reflects the specific design philosophy and user community of Xin Dao Diary; cross-platform comparative studies would strengthen the external validity of the Holistic AI Care Design framework. Third, the two-week study duration precludes conclusions about long-term engagement trajectories or the sustained therapeutic impact of repeated app use. Future research should employ longitudinal designs, incorporate clinically validated psychological outcome measures, and expand to diverse user populations and cultural contexts. Comparative studies examining how different implementations of the Holistic AI Care Design principles perform across varying app ecosystems would further advance evidence-based guidelines for AI-assisted mental health intervention development.
These findings inform a Holistic AI Care Design framework that integrates user-centered interface design, AI-driven personalization, privacy protections, active feedback mechanisms, and community support features to create effective mental health applications. This comprehensive approach is essential because these elements collectively build trust, ensure safety, and maximize the real-world impact of AI-assisted mental health tools while fostering individualized care. By systematically distinguishing between the specific HCI elements implemented, the rigorous evaluation methods employed to assess them, and the documented outcomes demonstrating improved usability and satisfaction, this research clarifies how thoughtful design choices translate into measurable improvements in user experience. The framework demonstrates that successful digital mental health interventions require not simply the presence of individual features, but rather their thoughtful integration, rigorous evaluation, and alignment with user needs and preferences. Human-centered design approaches help identify potential risks such as over-reliance on AI, ensuring that technology augments rather than replaces human judgment and empathy. Given the sensitive nature of mental health data, robust privacy protections, transparent data practices, and active user feedback remain critical for building trust and ensuring responsible AI deployment. Ultimately, this integrated approach ensures that technology enhances rather than undermines the quality and safety of mental health care, providing a model for future development of digital mental health interventions. As the global mental health crisis continues to evolve, such holistic and user-informed approaches are not only timely but essential for realizing the promise of digital well-being.
Footnotes
Acknowledgments
The authors would like to thank Lin Gao and Peng Ding for their help in data collection and discussion. Special thanks go to the participants in the diary studies. The authors used AI writing assistance tools for manuscript organization and language refinement. All data, analyses, and key intellectual contributions were produced by the authors. The use of AI did not compromise the scientific integrity of the research.
Ethical consideration
There was no ethics or institutional committee in place at the researchers’ institution at the time the study was conducted.
Consent to participate
Written informed consent was obtained from all participants prior to study initiation. Each participant signed a written informed consent form before engaging in any study activities. All collected data were anonymized to protect participant identities.
Author contributions
ZZ: Writing – review & editing, Conceptualization, Methodology; XT: Writing – original draft, Conceptualization, Methodology, Formal Analysis.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research is supported by China Postdoctoral Science Foundation (2024M762277).
Declaration of conflicting interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data Availability Statement
All data generated or analyzed during the study are included in the published article.
Copyright declarations
No copyrighted figures from third-party publications have been reproduced. All app screenshots were captured by the research team and are used strictly for non-commercial academic research, critical commentary, and scholarly discussion purposes.
