A Scoping Review of AI-Driven Digital Interventions in Mental Health Care: Mapping Applications Across Screening, Support, Monitoring, Prevention, and Clinical Education
Abstract
1. Introduction
2. Methods
2.1. Research Aims
2.2. Design and Scope of the Study
2.3. Identification and Selection of Studies
2.4. Search Strategy and Data Extraction
3. Results
3.1. Summary of Reviewed Results
3.2. Analysis and Synthesis of the Results
Key Findings in AI Technologies in Mental Health Care
3.3. Applying Natural Language Processing
3.4. Applying Machine Learning
3.5. Applying Deep Learning
Key Findings in the Applications in Mental Health Care
3.6. Applications in the Pre-Treatment Stage
3.7. Applications in the Treatment and the Post-Treatment Stage
3.8. Applications in General Support, Improvement, and Prevention
3.9. Clinical Education
4. Discussion
4.1. AI Modalities Applied Across Phases
4.2. AI-Clinician Collaboration
4.3. Strengths, Weaknesses, Opportunities, and Threats
4.4. Advancing Mental Health Prevention and Improvement
4.5. Policy Implications
4.6. Limitations
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Coombs, N.C.; Meriwether, W.E.; Caringi, J.; Newcomer, S.R. Barriers to healthcare access among U.S. adults with mental health challenges: A population-based study. SSM Popul. Health 2021, 15, 100847. [Google Scholar] [CrossRef] [PubMed]
- Hidaka, B.H. Depression as a disease of modernity: Explanations for increasing prevalence. J. Affect. Disord. 2012, 140, 205–214. [Google Scholar] [CrossRef]
- Uutela, A. Economic crisis and mental health. Curr. Opin. Psychiatry 2010, 23, 127–130. [Google Scholar] [CrossRef]
- Torous, J.; Myrick, K.J.; Rauseo-Ricupero, N.; Firth, J. Digital mental health and COVID-19: Using technology today to accelerate the curve on access and quality tomorrow. JMIR Ment. Health 2020, 7, e18848. [Google Scholar] [CrossRef]
- Arean, P.A. Here to stay: Digital mental health in a post-pandemic world—Looking at the past, present, and future of teletherapy and telepsychiatry. Technol. Mind Behav. 2021, 2, e00073. [Google Scholar]
- Friis-Healy, E.A.; Nagy, G.A.; Kollins, S.H. It is time to REACT: Opportunities for digital mental-health apps to reduce mental-health disparities in racially and ethnically minoritized groups. JMIR Ment. Health 2021, 8, e25456. [Google Scholar] [CrossRef]
- Prescott, M.R.; Sagui-Henson, S.J.; Chamberlain, C.E.W.; Sweet, C.C.; Altman, M. Real-world effectiveness of digital mental-health services during the COVID-19 pandemic. PLoS ONE 2022, 17, e0272162. [Google Scholar] [CrossRef]
- Wu, T.; He, S.; Liu, J.; Sun, S.; Liu, K.; Han, Q.L.; Tang, Y. A brief overview of ChatGPT: History, status-quo and potential future development. IEEE/CAA J. Autom. Sin. 2023, 10, 1122–1136. [Google Scholar] [CrossRef]
- Lattie, E.G.; Stiles-Shields, C.; Graham, A.K. An overview of and recommendations for more accessible digital mental-health services. Nat. Rev. Psychol. 2022, 1, 87–100. [Google Scholar] [CrossRef]
- Adeshola, I.; Adepoju, A.P. The opportunities and challenges of ChatGPT in education. Interact. Learn. Environ. 2023, 32, 6159–6172. [Google Scholar] [CrossRef]
- Biswas, S.S. Role of ChatGPT in public health. Ann. Biomed. Eng. 2023, 51, 868–869. [Google Scholar] [CrossRef] [PubMed]
- D’Alfonso, S. AI in mental health. Curr. Opin. Psychol. 2020, 36, 112–117. [Google Scholar] [CrossRef] [PubMed]
- Su, S.; Wang, Y.; Jiang, W.; Zhao, W.; Gao, R.; Wu, Y.; Tao, J.; Su, Y.; Zhang, J.; Li, K.; et al. Efficacy of Artificial Intelligence-assisted psychotherapy in patients with anxiety disorders: A prospective, national multicentre randomized controlled trial protocol. Front. Psychiatry 2022, 12, 799917. [Google Scholar] [CrossRef] [PubMed]
- Sedlakova, J.; Trachsel, M. Conversational artificial intelligence in psychotherapy: A new therapeutic tool or agent? Am. J. Bioeth. 2022, 23, 4–13. [Google Scholar] [CrossRef]
- Henson, P.; Wisniewski, H.; Hollis, C.; Keshavan, M.; Torous, J. Digital mental-health apps and the therapeutic alliance: Initial review. BJPsych Open 2019, 5, e15. [Google Scholar] [CrossRef]
- Vilaza, G.N.; McCashin, D. Is the automation of digital mental health ethical? Front. Digit. Health 2021, 3, 689736. [Google Scholar] [CrossRef]
- Roumeliotis, K.I.; Tselikas, N.D. ChatGPT and Open-AI models: A preliminary review. Future Internet 2023, 15, 192. [Google Scholar] [CrossRef]
- Adamopoulou, E.; Moussiades, L. An overview of chatbot technology. In Artificial Intelligence Applications and Innovations: AIAI 2019; Springer: Cham, Switzerland, 2020; pp. 261–280. [Google Scholar]
- Bayani, A.; Ayotte, A.; Nikiema, J.N. Transformer-based tool for automated fact-checking of online health information: Development study. JMIR Infodemiol. 2025, 5, e56831. [Google Scholar] [CrossRef]
- Hang, C.N.; Yu, P.D.; Chen, S.; Tan, C.W.; Chen, G. MEGA: Machine-learning-enhanced graph analytics for infodemic-risk management. IEEE J. Biomed. Health Inform. 2023, 27, 6100–6111. [Google Scholar] [CrossRef]
- Rollwage, M.; Juchems, K.; Habicht, J.; Carrington, B.; Hauser, T.; Harper, R. Conversational AI facilitates mental-health assessments and is associated with improved recovery rates. medRxiv 2022. medRxiv:2022.11.03.22281887. [Google Scholar] [CrossRef]
- Lewis, R.; Ferguson, C.; Wilks, C.; Jones, N.; Picard, R.W. Can a recommender system support treatment personalisation in digital mental-health therapy? In CHI ’22 Extended Abstracts; ACM: New York, NY, USA, 2022; p. 3519840. [Google Scholar]
- Liu, J.M.; Li, D.; Cao, H.; Ren, T.; Liao, Z.; Wu, J. ChatCounselor: A large-language-model-based system for mental-health support. arXiv 2023, arXiv:2309.15461. [Google Scholar]
- Shaik, T.; Tao, X.; Higgins, N.; Li, L.; Gururajan, R.; Zhou, X.; Acharya, U.R. Remote patient monitoring using AI: Current state, applications and challenges. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2023, 13, e1485. [Google Scholar] [CrossRef]
- Trappey, A.; Lin, A.P.C.; Hsu, K.K.; Trappey, C.; Tu, K.L.K. Development of an empathy-centric counselling chatbot system capable of sentimental-dialogue analysis. Processes 2022, 10, 930. [Google Scholar] [CrossRef]
- Hadar-Shoval, D.; Elyoseph, Z.; Lvovsky, M. The plasticity of ChatGPT’s mentalizing abilities. Front. Psychiatry 2023, 14, 1234397. [Google Scholar] [CrossRef] [PubMed]
- Kapoor, A.; Goel, S. Applications of conversational AI in mental health: A survey. In Proceedings of the ICOEI 2022, Tirunelveli, India, 28–30 April 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1013–1016. [Google Scholar]
- Moilanen, J.; Visuri, A.; Suryanarayana, S.A.; Alorwu, A.; Yatani, K.; Hosio, S. Measuring the effect of mental-health-chatbot personality on user engagement. In Proceedings of the MUM 2022, Lisbon, Portugal, 27–30 November 2022; ACM: New York, NY, USA, 2022; pp. 138–150. [Google Scholar]
- Moulya, S.; Pragathi, T.R. Mental-health assist and diagnosis conversational interface using logistic-regression model. J. Phys. Conf. Ser. 2022, 2161, 012039. [Google Scholar] [CrossRef]
- Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.J.; Horsley, T.; Weeks, L.; et al. PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef]
- Thieme, A.; Hanratty, M.; Lyons, M.; Palacios, J.; Marques, R.F.; Morrison, C.; Doherty, G. Designing human-centred AI for mental health. ACM Trans. Comput. Hum. Interact. 2023, 30, 15. [Google Scholar] [CrossRef]
- Tutun, S.; Johnson, M.E.; Ahmed, A.; Albizri, A.; Irgil, S.; Yesilkaya, I.; Ucar, E.N.; Sengun, T.; Harfouche, A. An AI-based decision-support system for predicting mental-health disorders. Inf. Syst. Front. 2023, 25, 1261–1276. [Google Scholar] [CrossRef]
- Park, G.; Chung, J.; Lee, S. Effect of AI-chatbot emotional disclosure on user satisfaction. Curr. Psychol. 2023, 42, 28663–28673. [Google Scholar] [CrossRef]
- Sharma, A.; Lin, I.W.; Miner, A.S.; Atkins, D.C.; Althoff, T. Human–AI collaboration enables empathic conversations. arXiv 2022, arXiv:2203.15144. [Google Scholar]
- Amanat, A.; Rizwan, M.; Javed, A.R.; Abdelhaq, M.; Alsaqour, R.; Pandya, S.; Uddin, M. Deep Learning for Depression Detection from Textual Data. Electronics 2022, 11, 676. [Google Scholar] [CrossRef]
- Rathnayaka, P.; Mills, N.; Burnett, D.; De Silva, D.; Alahakoon, D.; Gray, R. A mental-health chatbot with cognitive skills. Sensors 2022, 22, 3653. [Google Scholar] [CrossRef]
- Siemon, D.; Ahmad, R.; Harms, H.; De Vreede, T. Requirements and solution approaches to personality-adaptive conversational agents in mental health care. Sustainability 2022, 14, 3832. [Google Scholar] [CrossRef]
- Blease, C.; Kharko, A.; Annoni, M.; Gaab, J.; Locher, C. Machine learning in clinical psychology education. Front. Public Health 2021, 9, 623088. [Google Scholar]
- Rollwage, M.; Habicht, J.; Juechems, K.; Carrington, B.; Stylianou, M.; Hauser, T.U.; Harper, R. Using conversational AI to facilitate mental-health assessments. JMIR AI 2023, 2, e44358. [Google Scholar] [CrossRef]
- Fulmer, R.; Joerin, A.; Gentile, B.; Lakerink, L.; Rauws, M. Using psychological AI (Tess) to relieve symptoms of depression and anxiety. JMIR Ment. Health 2018, 5, e64. [Google Scholar] [CrossRef] [PubMed]
- Nichele, E.; Lavorgna, A.; Middleton, S.E. Challenges in digital mental-health moderation. SN Soc. Sci. 2022, 2, 217. [Google Scholar] [CrossRef]
- Chen, M.; Shen, K.; Wang, R.; Miao, Y.; Jiang, Y.; Hwang, K.; Hao, Y.; Tao, G.; Hu, L.; Liu, Z. Negative information measurement at AI edge: A new perspective for mental health monitoring. ACM Trans. Internet Technol. 2022, 22, 1–16. [Google Scholar] [CrossRef]
- Jin, S.; Choi, H.; Han, K. AI-augmented art psychotherapy. In Proceedings of the CIKM 2022, Atlanta, GA, USA, 17–21 October 2022; ACM: New York, NY, USA, 2022; pp. 4089–4093. [Google Scholar]
- Shao, R. An empathetic AI for mental-health intervention: Conceptualizing and examining artificial empathy. In Empathy-Centric Design Workshop 2023; ACM: New York, NY, USA, 2023; pp. 1–6. [Google Scholar] [CrossRef]
- Maurya, R.K. Qualitative content analysis of ChatGPT’s client simulation role-play for practicing counselling skills. Couns. Psychother. Res. 2023, 24, 614–630. [Google Scholar] [CrossRef]
- Elyoseph, Z.; Hadar-Shoval, D.; Asraf, K.; Lvovsky, M. ChatGPT outperforms humans in emotional-awareness evaluations. Front. Psychol. 2023, 14, 1199058. [Google Scholar] [CrossRef] [PubMed]
- Levkovich, I.; Elyoseph, Z. Identifying depression and its determinants with ChatGPT. Fam. Med. Community Health 2023, 11, e002391. [Google Scholar] [CrossRef]
- Danieli, M.; Ciulli, T.; Mousavi, S.M.; Riccardi, G. Conversational AI agent for a mental-health-care app. JMIR Form. Res. 2021, 5, e30053. [Google Scholar] [CrossRef] [PubMed]
- Zhang, M.; Scandiffio, J.; Younus, S.; Jeyakumar, T.; Karsan, I.; Charow, R.; Salhia, M.; Wiljer, D. Adoption of AI in mental-health care—Perspectives from professionals. JMIR Form. Res. 2023, 7, e47847. [Google Scholar] [CrossRef]
- Wrightson-Hester, A.R.; Anderson, G.; Dunstan, J.; McEvoy, P.M.; Sutton, C.J.; Myers, B.; Egan, S.; Tai, S.; Johnston-Hollitt, M.; Chen, W. An artificial therapist to support youth mental health. JMIR Hum. Factors 2023, 10, e46849. [Google Scholar] [CrossRef]
- Kleinerman, A.; Rosenfeld, A.; Rosemarin, H. ML-based routing of callers in a mental-health hotline. Isr. J. Health Policy Res. 2022, 11, 25. [Google Scholar] [CrossRef] [PubMed]
- Heston, T.F. Safety of large-language models in addressing depression. Cureus 2023, 15, e50729. [Google Scholar] [CrossRef]
- Kaywan, P.; Ahmed, K.; Ibaida, A.; Miao, Y.; Gu, B. Early detection of depression using a conversational AI bot. PLoS ONE 2023, 18, e0279743. [Google Scholar] [CrossRef]
- Kannampallil, T.; Ronneberg, C.R.; Wittels, N.E.; Kumar, V.; Lv, N.; Smyth, J.M.; Gerber, B.S.; A Kringle, E.; A Johnson, J.; Yu, P.; et al. Virtual voice-based coach for problem-solving treatment. JMIR Form. Res. 2022, 6, e38092. [Google Scholar] [CrossRef]
- Danieli, M.; Ciulli, T.; Mousavi, S.M.; Silvestri, G.; Barbato, S.; Di Natale, L.; Riccardi, G. Assessing the impact of conversational AI for stress and anxiety in aging adults: Randomized controlled trial. JMIR Ment. Health 2022, 9, e38067. [Google Scholar] [CrossRef]
- Straw, I.; Callison-Burch, C. AI in mental health and the biases of language-based models. PLoS ONE 2020, 15, e0240376. [Google Scholar] [CrossRef] [PubMed]
- Dawoodbhoy, F.M.; Delaney, J.; Cecula, P.; Yu, J.; Peacock, I.; Tan, J.; Cox, B. AI in patient-flow management in mental-health units. Heliyon 2021, 7, e06993. [Google Scholar] [CrossRef]
- Alowais, S.A.; Alghamdi, S.S.; Alsuhebany, N.; Alqahtani, T.; Alshaya, A.I.; Almohareb, S.N.; Aldairem, A.; Alrashed, M.; Saleh, K.B.; Badreldin, H.A.; et al. Revolutionizing healthcare: AI in clinical practice. BMC Med. Educ. 2023, 23, 689. [Google Scholar] [CrossRef]
- Rahman, S.M.A.; Ibtisum, S.; Bazgir, E.; Barai, T. The significance of machine learning in clinical-disease diagnosis: A review. Int. J. Comput. Appl. 2023, 185, 10–17. [Google Scholar] [CrossRef]
- Lejeune, A.; Le Glaz, A.; Perron, P.A.; Sebti, J.; Walter, M.; Lemey, C.; Berrouiguet, S. Artificial intelligence and suicide prevention: A systematic review. Eur. Psychiatry 2022, 65, e19. [Google Scholar] [CrossRef] [PubMed]
- Kruszyńska-Fischbach, A.; Sysko-Romańczuk, S.; Napiórkowski, T.M.; Napiórkowska, A.; Kozakiewicz, D. Organizational e-health readiness: How to prepare the primary healthcare providers’ services for digital transformation. Int. J. Environ. Res. Public Health 2022, 19, 3973. [Google Scholar] [CrossRef]
- Hunter, A.; Riger, S. Meaning of community in community mental health. J. Community Psychol. 1986, 14, 55–71. [Google Scholar] [CrossRef]
- Lawrence, H.R.; Schneider, R.A.; Rubin, S.B.; Mataric, M.J.; McDuff, D.J.; Jones Bell, M. Opportunities and risks of LLMs in mental health. arXiv 2024, arXiv:2403.14814. [Google Scholar]
- Timmons, A.C.; Duong, J.B.; Simo Fiallo, N.; Lee, T.; Vo, H.P.Q.; Ahle, M.W.; Comer, J.S.; Brewer, L.C.; Frazier, S.L.; Chaspari, T.; et al. A call to action on assessing and mitigating bias in AI for mental health. Perspect. Psychol. Sci. 2023, 18, 1062–1096. [Google Scholar] [CrossRef]
- Holohan, M.; Fiske, A. “Like I’m talking to a real person”: Exploring transference in AI-based psychotherapy applications. Front. Psychol. 2021, 12, 720476. [Google Scholar] [CrossRef]
- Lin, B.; Cecchi, G.; Bouneffouf, D. Psychotherapy AI companion with reinforcement-learning recommendations. In WebConf 2023 Companion; ACM: New York, NY, USA, 2023; p. 3587623. [Google Scholar]
- Knight, A.; Bidargaddi, N. Commonly available activity tracker apps and wearables as a mental health outcome indicator: A prospective observational cohort study among young adults with psychological distress. J. Affect. Disord. 2018, 236, 31–36. [Google Scholar] [CrossRef] [PubMed]
- Beardslee, W.R.; Chien, P.L.; Bell, C.C. Prevention of mental disorders: A developmental perspective. Psychiatr. Serv. 2011, 62, 247–254. [Google Scholar] [CrossRef]
- Kazdin, A.E. Adolescent mental health: Prevention and treatment programs. Am. Psychol. 1993, 48, 127–141. [Google Scholar] [CrossRef]
- Garrido, S.; Millington, C.; Cheers, D.; Boydell, K.; Schubert, E.; Meade, T.; Nguyen, Q.V. What works and what doesn’t work? A systematic review of Digital mental-health interventions for depression and anxiety in young people. Front. Psychiatry 2019, 10, 759. [Google Scholar] [CrossRef] [PubMed]
- World Economic Forum. Global Governance Toolkit for Digital Mental Health. Available online: https://www3.weforum.org/docs/WEF_Global_Governance_Toolkit_for_Digital_Mental_Health_2021.pdf (accessed on 10 April 2025).
- Mental Health Commission of Canada. Artificial Intelligence in Mental Health Services: An Environmental Scan. Available online: https://mentalhealthcommission.ca (accessed on 10 April 2025).
- Volkmer, S.; Meyer-Lindenberg, A.; Schwarz, E. Large language models in psychiatry: Opportunities and challenges. Psychiatry Res. 2024, 339, 116026. [Google Scholar] [CrossRef] [PubMed]
- Xian, X.; Chang, A.; Xiang, Y.T.; Liu, M.T. Debate and dilemmas regarding generative AI in mental health care: Scoping review. Interact. J. Med. Res. 2024, 13, e53672. [Google Scholar] [CrossRef]
- Guo, Z.; Lai, A.; Thygesen, J.; Farrington, J.; Keen, T.; Li, K. Large language models for mental health applications: Systematic review. JMIR Ment. Health 2024, 11, e57400. [Google Scholar] [CrossRef]
- Liu, Z.; Bao, Y.; Zeng, S.; Yang, L.; Zhang, X.; Wang, Y. Large language models in psychiatry: Current applications, limitations, and future scope. Big Data Min. Anal. 2024, 7, 1148–1168. [Google Scholar] [CrossRef]
- Obradovich, N.; Khalsa, S.S.; Khan, W.U.; Suh, J.; Perlis, R.H.; Ajilore, O.; Paulus, M.P. Opportunities and risks of large language models in psychiatry. NPP Digit. Psychiatry Neurosci. 2024, 2, 8. [Google Scholar] [CrossRef]
- Kolding, S.; Lundin, R.M.; Hansen, L.; Østergaard, S.D. Use of generative artificial intelligence (AI) in psychiatry and mental health care: A systematic review. Acta Neuropsychiatr. 2025, 37, e37. [Google Scholar] [CrossRef]
- Holmes, G.; Tang, B.; Gupta, S.; Venkatesh, S.; Christensen, H.; Whitton, A. Applications of large language models in the field of suicide prevention: Scoping review. J. Med. Internet Res. 2025, 27, e63126. [Google Scholar] [CrossRef] [PubMed]
- Blease, C.; Rodman, A. Generative artificial intelligence in mental healthcare: An ethical evaluation. Curr. Treat. Options Psychiatry 2025, 12, 5. [Google Scholar] [CrossRef]
Label | Theme |
---|---|
AI Chatbots | Rule-based or scripted dialogue systems that deliver predefined psychoeducation or CBT prompts; no statistical/ML adaptation; always user-interfacing. |
Conversational AI Agents | Multi-turn dialogue systems that incorporate traditional NLP components, such as intent classification or sentiment analysis, to tailor replies, but do not employ large language model generation. |
Machine Learning (ML) Models | Supervised algorithms (e.g., logistic regression, SVM, and gradient boosting) trained on structured or text features to classify diagnoses, predict outcomes, or model service flow; generally backend analytics. |
Natural Language Processing (NLP) Tools | Standalone language-processing pipelines (tokenization, topic modelling, emotion detection) used either to support a conversational agent or to analyze text corpora; excludes LLMs. |
Large Language Models (LLMs) | Transformer-based generative models with >1 billion parameters (e.g., GPT-3.5/4) capable of free-text generation, contextual memory, and zero-shot reasoning; typically fine-tuned for multi-turn counselling. |
Deep Learning (DL) Models | Neural networks such as CNNs or RNNs applied to non-text signals (images, sensor streams) or structured clinical data for pattern recognition and outcome prediction. |
AI Prediction Modelling (RPM) | Any ML or DL model embedded in a remote patient-monitoring pipeline that ingests physiological or behavioral signals to stratify risk or trigger alerts in real time. |
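To make the "Machine Learning (ML) Models" row concrete, the following is a minimal, hypothetical sketch (plain Python, invented toy data, not drawn from any reviewed study) of the kind of supervised text classifier that category describes: bag-of-words features fed to a logistic regression trained by stochastic gradient descent to flag free-text screening responses for follow-up.

```python
import math

# Toy corpus (invented for illustration only; not clinical data).
# Label 1 = flag response for follow-up, 0 = no flag.
DOCS = [
    ("i feel hopeless and tired all the time", 1),
    ("nothing brings me joy anymore", 1),
    ("i cannot sleep and feel worthless", 1),
    ("i had a great walk and feel rested", 0),
    ("work went well and i feel calm", 0),
    ("enjoying time with friends lately", 0),
]

def featurize(text, vocab):
    """Bag-of-words count vector over a fixed vocabulary."""
    counts = [0.0] * len(vocab)
    for tok in text.split():
        if tok in vocab:
            counts[vocab[tok]] += 1.0
    return counts

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(docs, epochs=200, lr=0.5):
    """Per-sample SGD on the log-loss of a logistic regression."""
    vocab = {w: i for i, w in enumerate(
        sorted({t for text, _ in docs for t in text.split()}))}
    w = [0.0] * len(vocab)
    b = 0.0
    for _ in range(epochs):
        for text, y in docs:
            x = featurize(text, vocab)
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return vocab, w, b

def predict(text, vocab, w, b):
    """Probability that a response should be flagged."""
    x = featurize(text, vocab)
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

vocab, w, b = train(DOCS)
print(predict("i feel hopeless and worthless", vocab, w, b) > 0.5)
print(predict("i feel calm and rested", vocab, w, b) > 0.5)
```

Real systems of this kind use far larger feature sets and validated labels; the point here is only the structural contrast with the rule-based "AI Chatbots" row, where no parameters are learned from data.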
Clinical Phase/Scenario | Descriptions |
---|---|
Pre-treatment/Screening | Interventions used before formal care begins, including online self-referral, triage, or risk screening. |
Treatment | AI components integrated during active psychotherapy, pharmacotherapy, or combined treatment phases. |
Post-treatment/Monitoring | AI tools used for follow-up care, symptom monitoring, risk assessment, or treatment adjustment after formal treatment. |
General support and prevention | Standalone tools aimed at maintaining well-being, reducing stress, or preventing mental health problems in non-clinical or community populations. |
Clinical education | AI tools used to train, assess, or upskill mental health professionals, clinical students, or educators. |
Functional Category | Descriptions |
---|---|
Assessment | Structured intake or self-report instruments automated by AI to collect clinical or mental health information. |
Diagnosis | Tools designed to output diagnostic labels or severity assessments of mental health conditions. |
Patient monitoring | Tools providing continuous or periodic tracking of symptoms, behaviors, or physiological markers. |
Treatment outcome prediction | Models that forecast treatment responses, dropout risks, or recovery trajectories. |
Mental health counseling/therapy | AI-delivered psychotherapeutic interventions, such as cognitive behavioral therapy (CBT), behavioral activation, or problem-solving therapy. |
Mental health treatment (Clinical decision support) | AI systems recommending treatment plans, medication, or therapy adjustments, supporting clinician decision making. |
Mental health support | Low-intensity support services, such as psychoeducation, emotional assistance, or peer facilitation without formal therapy claims. |
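As a concrete, minimal illustration of the "Assessment" row (a sketch of our own, not taken from any reviewed system), scoring a structured self-report instrument such as the PHQ-9 depression questionnaire is straightforward to automate: nine items rated 0–3, summed, and mapped to the standard severity bands. The function name is hypothetical.

```python
def score_phq9(answers):
    """Score a PHQ-9 questionnaire: nine items, each rated 0-3.

    Returns the total score (0-27) and the conventional severity band,
    using the standard PHQ-9 cut-offs (5/10/15/20).
    """
    if len(answers) != 9 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("PHQ-9 requires nine answers, each in 0-3")
    total = sum(answers)
    if total <= 4:
        band = "minimal"
    elif total <= 9:
        band = "mild"
    elif total <= 14:
        band = "moderate"
    elif total <= 19:
        band = "moderately severe"
    else:
        band = "severe"
    return total, band

print(score_phq9([1, 1, 0, 2, 1, 0, 1, 0, 0]))  # (6, 'mild')
```

The AI contribution in the reviewed "Assessment" tools lies upstream of a scoring function like this: eliciting the items conversationally, handling free-text answers, and deciding when to escalate.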
Reference | Scenario/ Application | AI Technology | Purpose | Main Result |
---|---|---|---|---|
[21] | Mental health (MH) assessment | AI chatbot | Examined “Limbic Access” AI in enhancing mental illness recovery in NHS services. | Use of the AI tool was associated with an increase in recovery rates from 47.1% pre-implementation to 48.9% post-implementation. |
[23] | Patient monitoring | AI prediction modelling | AI-based RPM model incorporates RFID for monitoring mental health, targeting vital signs and activity classification. | The implementation can help to monitor patients with mental illnesses sufficiently and effectively support the treatment teams to provide timely interventions, improve patient safety, and prevent incidents such as self-harm. |
[31] | Treatment outcomes prediction | Machine learning (ML), RNN | AI applications in iCBT predict mental health outcomes. | The developed AI models demonstrated good accuracy in predicting patient outcomes, resulting in approximately 87% accuracy after three clinical reviews. |
[32] | Detection and diagnosis | ML | AI tool improves MHM assessment, using fewer questions while maintaining diagnostic precision. | The system achieved 89% accuracy in diagnosing mental disorders while asking only 28 questions per diagnostic session, reducing question volume and encouraging better participation. |
[33] | MH counseling/therapy | AI chatbot | AI chatbots’ emotional interactions enhance user satisfaction and retention. | Emotional disclosure in chatbot significantly increases satisfaction and reuse intention. Users’ emotional disclosure intention and the perceived intimacy with a chatbot also mediate the effect. |
[34] | Peer support | Conversational AI agent | HAILEY AI system fosters empathy in text-based peer mental health support. | The study showed a substantial increase in users’ empathic responses arising from the human–AI collaboration. |
[23] | MH support | Large language model (LLM) | ChatCounselor, an AI tuned with real counseling data, evaluated for mental health aid. | ChatCounselor demonstrated improved performance in a counseling-specific benchmark, showing promising potential in providing mental health support. |
[28] | MH support | LLM, natural language processing (NLP) | Studies the effect of chatbot personality traits (e.g., the Big Five) on user engagement in mental health. | Chatbots high in conscientiousness drive user engagement, and variation in user preferences indicates that matching chatbot and user personalities could influence engagement. |
[27] | MH support | Various, mainly NLP | Survey reveals conversational AI’s acceptance for mental health among students. | The survey result highlights the increasing awareness and positive perception among students toward the use of conversational AI technologies for mental health support. |
[29] | Diagnosis, MH support | Conversational AI agent | Developed a conversational AI agent that acts like a human therapist, providing accessible emotional support and mental health diagnosis. | The logistic regression model improved the accuracy of emotion prediction, and the chatbot interacted effectively with users and responded quickly. |
[35] | Diagnosis | Deep learning model | Developed an early diagnostic system for detecting depression by analyzing textual data using deep learning techniques. | The model achieved 99% accuracy in detecting depression from textual data, outperforming other frequency-based deep learning models. |
[36] | MH counseling/therapy, patient monitoring | NLP | AI chatbot with Behavioral Activation enhances mood in users. | The pilot study demonstrated effectiveness in mood improvement, with significant improvements observed in mood scores from pre-usage to post-usage. |
[37] | MH support | Conversational AI agent | Study on personality-adaptive conversational agents (PACAs) in mental health care outlines benefits and risks. | The study highlighted both the potential of PACAs to provide accessible mental health support and serious concerns regarding trust and privacy, and derived corresponding design requirements. |
[38] | Clinical education | ML | Clinical psychology students show strong interest in AI/ML education. | Identifies high interest and perceived importance among students for AI/ML in their education, with high needs for greater formal instruction in these domains. |
[39] | Patient referral and clinical assessment | AI chatbot | Conversational AI “Limbic Access” boosts psychotherapy efficiency and patient outcomes. | Improved clinical efficiency and reduced the time that clinicians spent on assessments; improved patient outcomes such as shorter wait times, lower dropout rates, more accurate treatment allocation, and higher recovery rates. |
[40] | MH support | AI chatbot, NLP, ML | RCT evaluates “Tess” AI in reducing depression and anxiety in college students. | Showed a significant decline in symptoms of depression and anxiety relative to the control group, along with increased engagement and satisfaction. |
[25] | MH counseling/therapy | AI chatbot | VR empathy chatbot for college stress shows stress and sensitivity score reductions. | Indicated a reduction in mean stress level and psychological sensitivity scores of the subjects following the intervention by the VRECC system. |
[41] | MH counseling/therapy, MH support | AI chatbot, NLP | Focuses on the challenges and needs of moderating digital interventions and counseling offered at Kooth. | Outlines some of the main challenges, including controlling time, interpreting user communication, and addressing hidden risks. |
[42] | MH monitoring | Deep learning model | AI-edge computing system monitors the mental-health impact of pandemic-related negative information. | The system can measure negative information and its influence on well-being, offering an effective strategy for mental health monitoring during a pandemic. |
[22] | MH treatment | ML | Assessed the feasibility of recommender systems in personalizing treatment within digital mental health therapy. | Suggested that recommender systems could successfully individualize therapeutic recommendations and improve outcomes. |
[43] | Diagnosis, treatment outcomes prediction | Deep learning model | M2C model in art therapy accurately predicts stress levels through art analysis. | The M2C model had 88.5% accuracy in predicting stress levels based on the data from offered art psychotherapy tests. |
[44] | MH counseling/therapy | AI chatbot | Study compares an AI-driven counselor’s artificial empathy with human counselors. | AI-based empathetic counseling was rated as less helpful than human counseling. |
[45] | Clinical education | ChatGPT | ChatGPT’s effectiveness in simulating client scenarios for counseling practice assessed. | Suggested that while ChatGPT can simulate various aspects of client interactions, it has limitations, such as lack of non-verbal cues. |
[46] | Diagnosis, MH support | ChatGPT | ChatGPT’s emotional understanding in conversations evaluated using LEAS. | ChatGPT’s emotional awareness was significantly higher compared to human norms, with improvement over time. It achieved high accuracy in fitting emotions to scenarios. |
[26] | MH treatment | ChatGPT | ChatGPT’s response adaptability for BPD and SPD conditions analyzed. | For BPD, ChatGPT responded with higher emotional resonance than for SPD, suggesting its potential use in personalized mental health interventions. |
[47] | MH treatment | ChatGPT | Comparison of ChatGPT’s and physicians’ depression treatment recommendations. | ChatGPT recommended psychotherapy more frequently than physicians for mild depression and aligned with guidelines for severe depression treatment, showing no biases. |
[48] | MH support | Conversational AI agent | A mental health app design involving participatory protocol with therapists and users for stress management. | Indicated notable symptom improvement and therapist endorsement for AI app integration, highlighting patient engagement benefits. |
[49] | Various, mainly MH treatment | Various | Explores mental health professionals’ views on the needs of AI in care. | Identified themes around practice change, faster AI adoption, organizational readiness, and educational needs for optimizing care with AI. |
[50] | MH counseling/therapy | AI chatbot | Examined AI therapist MYLO’s test among youth with mental health issues. | Feedback was positive, noting decreased distress and improved goal conflict resolution. |
[51] | MH support | ML | Use machine learning to enhance mental health hotline efficiency by optimizing caller-to-counselor routing. | Significantly increased call volume handled and improved chat quality, outperforming traditional methods, notably during peak times. |
[52] | MH counseling/therapy | LLM | Evaluates ChatGPT 3.5’s response to severe depression and suicidal tendencies in simulations. | Agents escalated cases or shut down at critical risk; some provided crisis resources. |
[53] | Diagnosis | AI chatbot | Research on DEPRA chatbot’s depression detection effectiveness in Australians. | DEPRA effectively identified depression levels; high satisfaction reported. |
[54] | MH support | Conversational AI agent | Assesses Lumen, a voice-based virtual coach for problem solving, through user experience, workload, and interviews. | Users found Lumen highly usable and engaging, despite some feeling rushed in sessions. |
[55] | MH counseling/therapy | Conversational AI agent | Study assessed TEO’s effect on aging workers’ stress and anxiety. | No major differences in symptom reduction across groups; however, TEO with therapy improved well-being. |
[56] | MH treatment | NLP | Analyzed NLP model biases in psychiatry across demographics, examining health inequality risks. | Found that language-based models encode demographic biases, underscoring the risk of reinforcing health inequalities in psychiatric applications. |
[57] | MH treatment | ML, NLP | Explores AI to enhance patient flow in NHS mental health units, through literature and expert insights, targeting administrative and clinical efficiency. | Underscored AI’s role in enhancing patient management via administrative tasks, real-time support for clinicians, and personalized care through digital phenotyping. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ni, Y.; Jia, F. A Scoping Review of AI-Driven Digital Interventions in Mental Health Care: Mapping Applications Across Screening, Support, Monitoring, Prevention, and Clinical Education. Healthcare 2025, 13, 1205. https://doi.org/10.3390/healthcare13101205