Conversational AI in Pediatric Mental Health: A Narrative Review
Abstract
1. Introduction
Research Questions
- What types of conversational AI applications have been developed or proposed for supporting pediatric mental health?
- What is the current evidence regarding the effectiveness, acceptability, and safety of these applications?
- What unique considerations apply to the use of conversational AI with children and adolescents compared to adults?
- What ethical, technical, and implementation challenges have been identified?
- What gaps exist in the current research landscape?
2. Methods
2.1. Information Sources and Search Strategy
2.2. Study Selection
2.3. Data Synthesis
3. Current Landscape of Pediatric Mental Health Challenges
Challenges of Current Mental Health Delivery
- Workforce Shortages: The global shortage of child psychiatrists, psychologists, and specialized mental health professionals creates a fundamental capacity limitation [24]. In many regions, the ratio of child mental health specialists to the pediatric population falls dramatically below recommended levels, creating bottlenecks in service delivery and extending wait times for initial assessments and ongoing treatment [25].
- Access Barriers: Even when services exist, multiple barriers impede access, including geographical limitations, transportation challenges, scheduling constraints, high costs, and insurance coverage limitations [26]. For many families, particularly those in rural or underserved areas, the nearest appropriate provider may be hours away, making regular attendance at appointments impractical or impossible [27].
- Fragmented Systems: Pediatric mental healthcare often spans multiple systems including healthcare, education, juvenile justice, and social services. Poor coordination between these systems creates fragmented care pathways, administrative burdens for families, and opportunities for vulnerable children to “fall through the cracks” [28].
- Detection and Referral Challenges: Many mental health conditions in children present initially as somatic complaints, behavioral problems, or academic difficulties and may not be readily recognized as mental health concerns by parents, teachers, or primary care providers [29]. These challenges contribute to delays in identification and appropriate referral, particularly in contexts where mental health literacy remains limited [30].
- Stigma and Help-Seeking Barriers: Despite some progress, mental health conditions continue to carry significant stigma that can discourage help-seeking behaviors among young people and their families [31]. Adolescents, in particular, often express concerns about confidentiality, judgment from peers, and reluctance to engage with traditional clinical environments [32].
- Developmental Considerations: Children’s mental health needs vary significantly across developmental stages, requiring age-appropriate assessment tools and intervention approaches that many systems struggle to properly differentiate and implement [33]. What works for an adolescent may be entirely inappropriate for a young child, yet services often lack the flexibility to adequately address these differences [34].
- Treatment Adherence and Engagement: Even when children do access care, engagement and adherence challenges are common, with dropout rates from traditional mental health services estimated at 40–60% [35,36]. Factors contributing to poor engagement include practical barriers, perceived lack of cultural competence, misalignment with youth preferences, and failure to involve families effectively [37].
4. Emergence of AI Conversational Agents in Mental Health
4.1. Historical Context and Technological Evolution
4.2. Types of Conversational AI in Mental Health Applications
- Rule-Based Systems: Rule-based systems follow predetermined conversation flows and decision trees [51]. While limited in handling unexpected user inputs, they offer precise control over therapeutic content and safety guardrails [52]. Examples include Woebot and Wysa, which deliver structured cognitive behavioral therapy exercises through guided conversations [53].
- Retrieval-Based Systems: Retrieval-based systems select appropriate responses from a database based on user input patterns. They provide more flexibility than rule-based systems while maintaining consistency in therapeutic approaches [54].
- Generative AI Systems: Generative AI systems generate novel responses based on patterns learned during training rather than selecting from predefined options [55]. Modern LLMs fall into this category, offering unprecedented conversational flexibility but raising questions about consistency and safety [56].
- Hybrid Approaches: Many deployed systems combine elements of these approaches, using rule-based frameworks to guide the overall therapeutic structure while employing generative or retrieval-based techniques for specific conversational components [57]; a minimal code sketch of this layered pattern follows the comparison table below.
Type | Key Characteristics | Examples in Mental Health | Strengths | Limitations | Pediatric Considerations |
---|---|---|---|---|---|
Rule-Based Systems | Predetermined conversation flows and decision trees; Script-based interactions | Woebot, Wysa | Precise control over therapeutic content; Safety guardrails; Consistent delivery of interventions | Limited flexibility with unexpected user inputs; May feel mechanical; Cannot easily adapt to novel situations | Can be designed with age-appropriate content; Safety-focused; Less likely to generate inappropriate responses |
Retrieval-Based Systems | Select responses from existing database based on user input patterns | Tess, Replika (in more structured modes) | More flexible than rule-based systems; Consistency in therapeutic approach; Can incorporate evidence-based responses | Limited to existing response database; Less adaptable to unique user needs | Can incorporate developmentally appropriate response sets; May struggle with child-specific language patterns |
Generative AI Systems | Generate novel responses based on training data patterns; Not limited to predefined responses | ChatGPT and Claude (when used for mental health support) | Highly flexible conversations; Can address unexpected inputs; More natural dialogue flow | Less predictable responses; Safety and accuracy concerns; Potential for harmful content | Higher risk of developmentally inappropriate responses; Require substantial safety measures; Often trained primarily on adult language |
Hybrid Approaches | Combine rule-based frameworks with generative or retrieval capabilities | Emerging integrated platforms; Wysa’s newer versions | Balance of structure and flexibility; Combine safety controls with conversational naturalness | More complex to develop and maintain; May have inconsistent interaction quality | Potentially optimal for pediatric applications; Can incorporate developmental safeguards while maintaining engagement |
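To make these architectures concrete, the sketch below cascades the three layers just described: scripted safety rules run first, a curated retrieval bank handles recognizable inputs, and a generative fallback (stubbed here) covers open-ended conversation. This is an illustrative sketch only; all names, thresholds, and canned responses are hypothetical and are not drawn from any deployed system.

```python
# Minimal sketch of a hybrid conversational pipeline: scripted safety rules,
# then retrieval from a curated response bank, then a generative fallback.
# All names, thresholds, and responses are hypothetical, not a clinical system.
from difflib import SequenceMatcher
from typing import Optional

CRISIS_TERMS = {"suicide", "kill myself", "hurt myself", "abuse"}

# Curated, clinician-approved response bank (retrieval layer).
RESPONSE_BANK = {
    "i feel anxious about school": "School worries are really common. Want to try a slow-breathing exercise together?",
    "i can't sleep": "Sleep can be tough when our minds are busy. Let's talk about a wind-down routine.",
}

def retrieve(user_text: str, threshold: float = 0.6) -> Optional[str]:
    """Return the canned response whose key best matches the input, if close enough."""
    best_key = max(RESPONSE_BANK, key=lambda k: SequenceMatcher(None, user_text, k).ratio())
    score = SequenceMatcher(None, user_text, best_key).ratio()
    return RESPONSE_BANK[best_key] if score >= threshold else None

def generate(user_text: str) -> str:
    """Placeholder for a guarded call to a generative model (LLM)."""
    return "Tell me more about that. What was happening when you started feeling this way?"

def respond(user_text: str) -> str:
    text = user_text.lower()
    # 1. Rule layer: safety checks always run first and override everything else.
    if any(term in text for term in CRISIS_TERMS):
        return ("It sounds like something serious is going on. "
                "Let me connect you with a person who can help right now.")
    # 2. Retrieval layer: prefer vetted content when a close match exists.
    canned = retrieve(text)
    if canned:
        return canned
    # 3. Generative layer: open-ended fallback, ideally behind further filters.
    return generate(text)

print(respond("I feel anxious about school"))  # retrieval layer
print(respond("My dog ran away yesterday"))    # generative fallback
```

In a pediatric deployment the ordering matters: deterministic safety and age-appropriateness checks constrain the less predictable layers, not the reverse.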
- Screening and Assessment Tools: Conversational interfaces that gather information about symptoms and experiences to support early identification of mental health concerns [58]; a toy branching example follows this list.
- Psychoeducational Agents: Systems that provide information about mental health conditions, coping strategies, and available resources through interactive dialogue rather than static content [59].
- Guided Self-Help Programs: Structured therapeutic interventions delivered through conversational interfaces, often based on evidence-based approaches like cognitive behavioral therapy [60].
- Emotional Support Companions: Applications designed primarily for empathetic listening and validation rather than formal therapeutic interventions [61].
- Adjuncts to Traditional Therapy: Tools that complement professional care by supporting homework completion, skills practice, or monitoring between sessions [62].
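As a toy illustration of how a rule-based screening conversation branches, consider the sketch below. The two questions are invented for this example and do not constitute a validated instrument; any real screener would use psychometrically validated items and clinician-designed routing.

```python
# Toy branching screener: each node asks a question and routes on the answer.
# Questions are invented for illustration; not a validated clinical instrument.

SCREENER = {
    "start": {
        "question": "Over the last two weeks, have you often felt sad or down? (yes/no)",
        "yes": "duration", "no": "end_ok",
    },
    "duration": {
        "question": "Has this made it hard to do schoolwork or see friends? (yes/no)",
        "yes": "end_refer", "no": "end_monitor",
    },
}

OUTCOMES = {
    "end_ok": "Thanks for checking in! Remember you can talk to me anytime.",
    "end_monitor": "Thanks for sharing. Let's check in again tomorrow.",
    "end_refer": "Thank you for telling me. Talking to a counselor might help; I can help you share this with a trusted adult.",
}

def run_screener(answers):
    """Walk the decision tree using a scripted list of yes/no answers."""
    node = "start"
    while node not in OUTCOMES:
        step = SCREENER[node]
        print("BOT:", step["question"])
        answer = answers.pop(0)  # in a real system: await the child's input
        print("CHILD:", answer)
        node = step[answer]
    print("BOT:", OUTCOMES[node])

run_screener(["yes", "no"])  # routes start -> duration -> end_monitor
```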
4.3. Initial Evidence in Adult Populations
5. Evidence for Efficacy and Therapeutic Mechanisms
5.1. Evidence in Adult Populations
5.2. Emerging Evidence in Pediatric Populations
- Feasibility and Acceptability Evidence: Multiple studies demonstrate that children and adolescents generally engage well with conversational AI mental health applications. High completion rates (often exceeding 90%) have been reported in short-term feasibility studies [65,66], with qualitative feedback suggesting that young people appreciate the accessibility, privacy, and non-judgmental nature of these interactions. User satisfaction appears particularly strong among adolescents, who value the autonomy and confidentiality these systems provide [67,68].
- Implementation Challenges: Studies exploring real-world implementation have identified several important barriers, including technical difficulties, variable engagement over time, parental concerns about supervision, and integration challenges with existing care systems [68,69]. Dropout rates increase substantially in longer-term implementations, suggesting that initial novelty effects may diminish without ongoing adaptation and support.
- Professional Perspectives: Research examining healthcare providers’ views reveals a consistent pattern of cautious optimism balanced with concerns about clinical oversight and safety. Primary care physicians and mental health specialists generally view conversational AI as potentially valuable for supplementing human care rather than replacing it [8,70]. Professional stakeholders consistently emphasize the need for transparent clinical governance and clear pathways for escalation to human providers when needed.
- Ethical and Developmental Considerations: Several studies specifically examine ethical dimensions of pediatric applications, highlighting tensions between autonomy and protection, privacy and safety, and intended versus actual use patterns [69,71]. These analyses consistently emphasize that developmental considerations must be central rather than peripheral in system design and implementation.
- Limitations of Current Evidence: The pediatric literature shares several methodological limitations with the broader digital mental health field. Sample sizes remain relatively small, with most studies including fewer than 50 participants. Comparison conditions are often absent, making it difficult to distinguish the specific effects of conversational AI from non-specific effects of digital engagement. Follow-up periods are typically brief (2–8 weeks), providing limited insight into sustainable benefits or potential developmental impacts over time.
5.3. Therapeutic Mechanisms
- Reduced Barriers to Self-Disclosure: The perception of anonymity and non-judgment appears to facilitate disclosure of sensitive information that users might hesitate to share with human providers [72]. This may be particularly relevant for adolescents navigating identity development and heightened sensitivity to peer evaluation [73].
- Cognitive Change: Structured conversational interventions based on cognitive behavioral principles appear capable of promoting cognitive reframing and challenging maladaptive thought patterns [28]. Text-based interactions may provide opportunities for reflection and cognitive processing that differ from face-to-face exchanges [74].
- Emotional Validation: Analysis of user–chatbot interactions suggests that even simple acknowledgment and reflection of emotions by AI systems can provide a sense of validation that users find supportive [75], aligning with fundamental therapeutic processes identified in human psychotherapy research.
- Behavioral Activation: Conversational agents have demonstrated effectiveness in promoting engagement in positive activities and behavioral experiments, core components of evidence-based treatments for depression [76]. The interactive format and the ability to send reminders may enhance adherence to behavioral recommendations.
- Skill Development and Practice: Regular interaction with conversational agents provides opportunities for repeated practice of coping skills and emotion regulation strategies in naturalistic contexts [77]. This distributed practice may enhance skill acquisition compared to less frequent traditional therapy sessions.
- Bridging to Human Care: Several studies suggest that conversational agents may serve as “digital gateways” that increase willingness to seek professional help among those who might otherwise avoid traditional services [78]. This bridging function may be especially valuable for adolescents, who typically show low rates of help-seeking for mental health concerns [79].

The current evidence suggests that conversational AI applications hold promise for supporting pediatric mental health, particularly for common conditions such as anxiety and depression and for specific functions such as psychoeducation, skills practice, and bridging to professional care. However, the field remains in its early stages, with a substantial need for larger, more rigorous studies designed specifically for pediatric populations across developmental stages [80].
- Pediatric-Specific Mechanisms: Several therapeutic mechanisms appear uniquely relevant or modified in pediatric populations. Children and adolescents, as digital natives, often demonstrate greater comfort with technology-mediated communication than adults, potentially facilitating more natural engagement with conversational AI. Research suggests that younger populations may form different types of relationships with non-human entities, with some studies indicating children more readily attribute social presence and therapeutic alliance to AI systems [81]. Age-specific engagement patterns have also been observed, with gamification elements proving particularly effective for younger children, while adolescents often value privacy and autonomy features more highly. Additionally, the reduced power differential between user and AI (compared to adult–child therapeutic relationships) may facilitate different disclosure patterns, particularly among adolescents navigating authority relationships.
6. Special Considerations for Pediatric Applications
6.1. Developmental Considerations
- Cognitive and Language Development: Children’s cognitive and language abilities evolve significantly throughout development, necessitating age-appropriate adjustments to conversational complexity, vocabulary, abstract concepts, and interaction patterns [82]. What works for adolescents may be incomprehensible to younger children, while content designed for younger children may appear patronizing to adolescents.
- Emotional Development: Children’s ability to identify, articulate, and regulate emotions develops gradually, impacting how they express mental health concerns and engage with therapeutic content [83]. Conversational agents must adapt to varying levels of emotional awareness and vocabulary across developmental stages.
- Identity Formation: Adolescence in particular is characterized by intensive identity exploration and formation [84]. Interactions with AI systems during this sensitive period may influence self-concept and beliefs in ways that require careful consideration and safeguards.
- Suggestibility and Critical Thinking: Younger children typically demonstrate greater suggestibility and less developed critical thinking skills, potentially increasing their vulnerability to misinformation or inappropriate advice [85]. This necessitates heightened attention to content accuracy and age-appropriate framing of information.
- Digital Literacy: While often characterized as “digital natives”, children and adolescents show significant variation in digital literacy skills that affect their ability to understand AI’s limitations and interpret AI-generated content appropriately [86]. Educational components may be necessary to establish appropriate expectations and boundaries.
- Attention Span and Engagement Preferences: Children’s attention spans and engagement preferences differ from adults and vary across developmental stages, requiring adaptations to conversation length, interaction style, and multimedia integration [87]. Gamification elements may enhance engagement but must be developmentally appropriate [88].
6.2. Clinical and Therapeutic Considerations
- Presentation of Mental Health Concerns: Mental health conditions often present differently in children than adults, with more somatic complaints, behavioral manifestations, and developmental impacts [89]. Conversational agents must be trained to recognize and respond appropriately to these pediatric-specific presentations.
- Assessment Challenges: Accurate assessment of mental health in children often requires multi-informant approaches (child, parent, teachers) due to varying perspectives and limited self-awareness [90]. This complicates the design of conversational assessment tools that typically rely on single-user interaction.
- Parental Involvement: Effective mental health interventions for children generally involve parents/caregivers, raising questions about how conversational AI should manage family involvement while respecting the child’s growing autonomy and privacy needs [91]. Different models of parent–child–AI interaction may be needed across developmental stages.
- Comorbidity and Complexity: Children with mental health concerns frequently present with comorbid conditions or complex contextual factors that may exceed the capabilities of narrowly focused conversational interventions [92]. Clear pathways for escalation to human providers are essential when complexity emerges.
- School Context: For many children, mental health supports are accessed primarily through educational settings rather than healthcare systems [93]. This suggests the potential value of developing conversational agents specifically designed for school-based implementation with appropriate integration into existing support structures.
- Illness Severity Considerations: Conversational AI applications must incorporate robust assessment of symptom severity with clear protocols for cases requiring higher levels of care. Certain conditions such as active suicidality, psychosis, severe eating disorders, or substance use disorders typically require immediate human intervention and may be inappropriate for stand-alone AI management. Systems must be designed to recognize their limitations, effectively triage based on severity, and facilitate appropriate referrals when needed. This consideration is particularly important in pediatric contexts where symptom presentation may differ from adults and where certain high-risk behaviors require mandatory reporting obligations [94].
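The tiered escalation logic described in this last point can be made explicit in code. The sketch below is schematic only: the severity tiers, indicator names, and mappings are assumptions invented for illustration, and any real protocol would be clinician-designed and validated.

```python
# Schematic severity triage: map detected indicators to a minimum care level.
# Tiers, indicators, and mappings are illustrative, not clinically validated.
from enum import Enum

class CareLevel(Enum):
    SELF_GUIDED = 1       # AI-delivered skills practice may continue
    CLINICIAN_REVIEW = 2  # flag for asynchronous professional review
    IMMEDIATE_HUMAN = 3   # warm hand-off to crisis services / reporting pathways

ESCALATION_RULES = {
    "mild_low_mood": CareLevel.SELF_GUIDED,
    "worsening_symptoms": CareLevel.CLINICIAN_REVIEW,
    "active_suicidality": CareLevel.IMMEDIATE_HUMAN,
    "abuse_disclosure": CareLevel.IMMEDIATE_HUMAN,   # may trigger mandatory reporting
    "psychosis_indicators": CareLevel.IMMEDIATE_HUMAN,
}

def triage(indicators: list[str]) -> CareLevel:
    """Return the most conservative care level implied by the detected indicators.
    Unrecognized indicators default to clinician review rather than self-guided use."""
    if not indicators:
        return CareLevel.SELF_GUIDED
    levels = [ESCALATION_RULES.get(i, CareLevel.CLINICIAN_REVIEW) for i in indicators]
    return max(levels, key=lambda level: level.value)

print(triage(["mild_low_mood", "abuse_disclosure"]))  # CareLevel.IMMEDIATE_HUMAN
```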
6.3. Safety and Ethical Considerations
- Content Safety: Heightened responsibility exists for ensuring age-appropriate content and preventing exposure to harmful, frightening, or developmentally inappropriate information [95]. How mental health concepts are explained and discussed therefore requires careful consideration.
- Crisis Detection and Response: Robust protocols for detecting and responding to crisis situations, including suicidality, abuse disclosure, or emergent safety concerns, are particularly critical in pediatric applications [96]. Clear pathways for human intervention must exist when necessary.
- Privacy and Confidentiality: Complex balancing is required between respecting the growing need for privacy among older children and adolescents and ensuring appropriate adult oversight for safety and care coordination [97]. Different approaches may be needed across age groups and risk levels.
- Data Protection: Special protections apply to children’s data under various regulatory frameworks (e.g., COPPA in the US, GDPR in Europe), necessitating stringent data handling practices and transparent communication about data usage [98]. Beyond regulations specifically addressing children’s data, health information shared with conversational AI should be considered protected health information under regulations such as HIPAA in the US, requiring appropriate security measures, breach notification protocols, and limitations on data use and disclosure.
- Developmental Impact: Long-term effects of regular interaction with AI systems during critical developmental periods remain largely unknown, necessitating ongoing monitoring and research to identify potential unintended consequences [99].
- Autonomy and Agency: The complex patchwork of laws regarding minor consent to mental health treatment creates significant challenges for AI deployment across different regions. Age thresholds vary widely—from 13 in Washington state to 16 in Texas and 18 in many other jurisdictions—requiring systems to implement location-specific protocols for consent, parental involvement, and information sharing. Respect for developing autonomy requires giving children appropriate voice in decisions about using AI mental health tools while acknowledging their evolving capacity for informed consent [100]. This balance shifts across developmental stages.
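One way a deployment might operationalize these jurisdictional differences is as a configuration table consulted during onboarding. The sketch below encodes only the three example thresholds cited above; it is illustrative, not legal advice, and the identifiers and function names are hypothetical.

```python
# Jurisdiction-specific consent configuration (illustrative only, not legal advice).
# Ages reflect the examples cited in the text; real deployments need legal review.
CONSENT_AGE_BY_JURISDICTION = {
    "US-WA": 13,    # Washington state
    "US-TX": 16,    # Texas
    "DEFAULT": 18,  # many other jurisdictions
}

def onboarding_requirements(age: int, jurisdiction: str) -> dict:
    """Determine consent and parental-involvement requirements at sign-up."""
    threshold = CONSENT_AGE_BY_JURISDICTION.get(
        jurisdiction, CONSENT_AGE_BY_JURISDICTION["DEFAULT"]
    )
    return {
        "minor_may_self_consent": age >= threshold,
        "parental_consent_required": age < threshold,
        # Age-appropriate assent is sought even when parents provide consent.
        "child_assent_sought": True,
    }

print(onboarding_requirements(age=14, jurisdiction="US-WA"))
# {'minor_may_self_consent': True, 'parental_consent_required': False, ...}
```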
6.4. Implementation Considerations
- Access and Equity: Digital divides affect children disproportionately, with socioeconomic factors influencing access to devices, internet connectivity, and private spaces for sensitive conversations [101]. Implementation strategies must address these disparities to avoid exacerbating existing inequities.
- Integration with Support Systems: For maximal effectiveness, conversational AI applications for children should integrate with existing support ecosystems, including schools, primary care, mental health services, and family systems [102]. Standalone applications may have limited impact without these connections.
- Cultural Responsiveness: Children develop within specific cultural contexts that shape understanding of mental health, help-seeking behaviors, and communication styles [103]. Conversational agents must demonstrate cultural humility and adaptability to diverse perspectives.
- Supervised vs. Independent Use: Decisions about whether and when children should engage with mental health AI independently versus under adult supervision require balancing safety concerns with developmental needs for privacy and autonomy [104]. Graduated independence may be appropriate across age ranges.
- Educational Support: Implementation in pediatric contexts may require more substantial educational components for both children and adults to establish appropriate expectations, boundaries, and understanding of AI limitations [105].
7. Discussion
7.1. The Promise and Limitations of Current Evidence
7.2. Developmental Appropriateness as a Fundamental Challenge
7.3. Balancing Innovation with Safety and Ethics
7.4. Integration Rather than Replacement
7.5. Equity and Access Considerations
7.6. Research Gaps and Future Directions
8. Conclusions
- What types of conversational AI applications have been developed for pediatric mental health? Our review identified diverse applications spanning rule-based, retrieval-based, generative, and hybrid systems. These technologies support functions including screening and assessment, psychoeducation, guided self-help, emotional support, and augmentation of traditional therapy. The most promising implementations are those designed specifically for pediatric populations rather than adapted from adult systems, with developmental considerations integrated throughout the design process.
- What is the current evidence regarding effectiveness, acceptability, and safety? The evidence base remains nascent, with most robust empirical studies focused on adult populations. Preliminary research with pediatric populations shows promising engagement metrics and user satisfaction, particularly for common conditions like anxiety and depression. However, efficacy studies with rigorous methodology, appropriate controls, and sufficient sample sizes are largely absent. Safety protocols appear inconsistent across implementations, with limited systematic evaluation of potential harms or unintended consequences.
- What unique considerations apply to children and adolescents compared to adults? Developmental considerations emerged as fundamental rather than peripheral factors, necessitating adaptations across cognitive, linguistic, emotional, and ethical dimensions. Children’s evolving capabilities for abstract thinking, emotional regulation, and decision-making require age-appropriate content and interaction patterns. The need for parental involvement balanced with growing autonomy presents unique challenges, as do the complex legal frameworks governing minor consent and data protection across jurisdictions.
- What ethical, technical, and implementation challenges have been identified? Critical challenges include ensuring privacy and confidentiality while enabling appropriate oversight, developing robust crisis detection and response protocols, addressing digital divides that may limit access, ensuring cultural responsiveness, and integrating systems with existing support networks rather than creating standalone interventions. Technical challenges involve adapting language models predominantly trained on adult text to pediatric communication patterns and ensuring age-appropriate content filtering.
- What gaps exist in the current research landscape? Substantial gaps include the need for developmental validation studies examining how children at different stages interact with conversational AI, longitudinal outcomes research assessing medium- to long-term impacts, implementation science approaches for integration into care systems, rigorous safety monitoring protocols, and equity-focused design and evaluation to ensure these technologies reduce rather than exacerbate existing disparities.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
AI | Artificial Intelligence |
LLM | Large Language Model |
CBT | Cognitive Behavioral Therapy |
ACT | Acceptance and Commitment Therapy |
NLP | Natural Language Processing |
RCT | Randomized Controlled Trial |
COPPA | Children’s Online Privacy Protection Act |
GDPR | General Data Protection Regulation |
ELIZA | Early Natural Language Processing Computer Program |
References
- Gustavson, K.; Knudsen, A.K.; Nesvåg, R.; Knudsen, G.P.; Vollset, S.E.; Reichborn-Kjennerud, T. Prevalence and stability of mental disorders among young adults: Findings from a longitudinal study. BMC Psychiatry 2018, 18, 65. [Google Scholar] [CrossRef] [PubMed]
- Jones, P.B. Adult mental health disorders and their age at onset. Br. J. Psychiatry 2013, 202, s5–s10. [Google Scholar] [CrossRef]
- Meade, J. Mental Health Effects of the COVID-19 Pandemic on Children and Adolescents. Pediatr. Clin. N. Am. 2021, 68, 945–959. [Google Scholar] [CrossRef]
- Imran, N.; Zeshan, M.; Pervaiz, Z. Mental health considerations for children & adolescents in COVID-19 Pandemic. Pak. J. Med. Sci. 2020, 36, S67–S72. [Google Scholar] [CrossRef]
- Toure, D.M.; Kumar, G.; Walker, C.; Turman, J.E.; Su, D. Barriers to Pediatric Mental Healthcare Access: Qualitative Insights from Caregivers. J. Soc. Serv. Res. 2022, 48, 485–495. [Google Scholar] [CrossRef]
- Singh, S.P.; Tuomainen, H. Transition from child to adult mental health services: Needs, barriers, experiences and new models of care. World Psychiatry 2015, 14, 358–361. [Google Scholar] [CrossRef] [PubMed]
- McGorry, P.D.; Goldstone, S.D.; Parker, A.G.; Rickwood, D.J.; Hickie, I.B. Cultures for mental health care of young people: An Australian blueprint for reform. Lancet Psychiatry 2014, 1, 559–568. [Google Scholar] [CrossRef] [PubMed]
- Imran, N.; Hashmi, A.; Imran, A. Chat-GPT: Opportunities and Challenges in Child Mental Healthcare. Pak. J. Med. Sci. 2023, 39, 1191–1193. [Google Scholar] [CrossRef]
- Cheng, S.; Chang, C.; Chang, W.; Wang, H.; Liang, C.; Kishimoto, T.; Chang, J.P.; Kuo, J.S.; Su, K. The now and future of ChatGPT and GPT in psychiatry. Psychiatry Clin. Neurosci. 2023, 77, 592–596. [Google Scholar] [CrossRef]
- Carlbring, P.; Hadjistavropoulos, H.; Kleiboer, A.; Andersson, G. A new era in Internet interventions: The advent of Chat-GPT and AI-assisted therapist guidance. Internet Interv. 2023, 32, 100621. [Google Scholar] [CrossRef]
- Thapa, S.; Adhikari, S. GPT-4o and multimodal large language models as companions for mental wellbeing. Asian J. Psychiatry 2024, 99, 104157. [Google Scholar] [CrossRef]
- Torous, J.; Bucci, S.; Bell, I.H.; Kessing, L.V.; Faurholt-Jepsen, M.; Whelan, P.; Carvalho, A.F.; Keshavan, M.; Linardon, J.; Firth, J. The growing field of digital psychiatry: Current evidence and the future of apps, social media, chatbots, and virtual reality. World Psychiatry 2021, 20, 318–335. [Google Scholar] [CrossRef] [PubMed]
- Ennis, E.; O’Neill, S.; Mulvenna, M.; Bond, R. Chatbots supporting mental health and wellbeing of children and young people; applications, acceptability and usability. In Proceedings of the European Conference on Mental Health, Ljubljana, Slovenia, 12–15 September 2023; p. 57. [Google Scholar]
- Jang, S.; Kim, J.-J.; Kim, S.-J.; Hong, J.; Kim, S.; Kim, E. Mobile app-based chatbot to deliver cognitive behavioral therapy and psychoeducation for adults with attention deficit: A development and feasibility/usability study. Int. J. Med. Inform. 2021, 150, 104440. [Google Scholar] [CrossRef] [PubMed]
- Jamil Abusamra, H.N.; Ali, S.H.M.; Khidir Elhussien, W.A.; Ahmed Mirghani, A.M.; Alameen Ahmed, A.A.; Abdelrahman Ibrahim, M.E. Ethical and Practical Considerations of Artificial Intelligence in Pediatric Medicine: A Systematic Review. Cureus 2025, 17, e79024. [Google Scholar] [CrossRef]
- Meadi, M.R.; Sillekens, T.; Metselaar, S.; van Balkom, A.; Bernstein, J.; Batelaan, N. Exploring the Ethical Challenges of Conversational AI in Mental Health Care: Scoping Review. JMIR Ment. Health 2025, 12, e60432. [Google Scholar] [CrossRef]
- “Child and Adolescent Mental Health,” in 2022 National Healthcare Quality and Disparities Report [Internet], Agency for Healthcare Research and Quality (US). 2022. Available online: https://www.ncbi.nlm.nih.gov/books/NBK587174/ (accessed on 24 February 2025).
- Buecker, S.; Petersen, K.; Neuber, A.; Zheng, Y.; Hayes, D.; Qualter, P. A systematic review of longitudinal risk and protective factors for loneliness in youth. Ann. N. Y. Acad. Sci. 2024, 1542, 620–637. [Google Scholar] [CrossRef]
- Jabarali, A.; Williams, J.W. Effects of COVID-19 Pandemic on Adolescents’ Mental Health Based on Coping Behavior—Statistical Perspective. Adv. Data Sci. Adapt. Data Anal. 2024, 16, 2450003. [Google Scholar] [CrossRef]
- Elliott, T.R.; Choi, K.R.; Elmore, J.G.; Dudovitz, R. Racial and Ethnic Disparities in Receipt of Pediatric Mental Health Care. Acad. Pediatr. 2024, 24, 987–994. [Google Scholar] [CrossRef]
- Prichett, L.M.; Yolken, R.H.; Severance, E.G.; Young, A.S.; Carmichael, D.; Zeng, Y.; Kumra, T. Racial and Gender Disparities in Suicide and Mental Health Care Utilization in a Pediatric Primary Care Setting. J. Adolesc. Health 2024, 74, 277–282. [Google Scholar] [CrossRef]
- Zhang, Y.; Lal, L.S.; Lin, Y.-Y.; Swint, J.M.; Zhang, Y.; Summers, R.L.; Jones, B.F.; Chandra, S.; Ladner, M.E. Disparities and Medical Expenditure Implications in Pediatric Tele-Mental Health Services During the COVID-19 Pandemic in Mississippi. J. Behav. Health Serv. Res. 2025, 52, 109–122. [Google Scholar] [CrossRef]
- Adams, D.R. Availability and Accessibility of Mental Health Services for Youth: A Descriptive Survey of Safety-Net Health Centers During the COVID-19 Pandemic. Community Ment. Health J. 2024, 60, 88–97. [Google Scholar] [CrossRef]
- Hoffmann, J.A.; Attridge, M.M.; Carroll, M.S.; Simon, N.-J.E.; Beck, A.F.; Alpern, E.R. Association of Youth Suicides and County-Level Mental Health Professional Shortage Areas in the US. JAMA Pediatr. 2023, 177, 71–80. [Google Scholar] [CrossRef] [PubMed]
- Shaligram, D.; Bernstein, B.; DeJong, S.M.; Guerrero, A.P.S.; Hunt, J.; Jadhav, M.; Ong, S.H.; Robertson, P.; Seker, A.; Skokauskas, N. “Building” the Twenty-First Century Child and Adolescent Psychiatrist. Acad. Psychiatry 2022, 46, 75–81. [Google Scholar] [CrossRef] [PubMed]
- Hoffmann, J.A.; Alegría, M.; Alvarez, K.; Anosike, A.; Shah, P.P.; Simon, K.M.; Lee, L.K. Disparities in Pediatric Mental and Behavioral Health Conditions. Pediatrics 2022, 150, e2022058227. [Google Scholar] [CrossRef] [PubMed]
- Oluyede, L.; Cochran, A.L.; Wolfe, M.; Prunkl, L.; McDonald, N. Addressing transportation barriers to health care during the COVID-19 pandemic: Perspectives of care coordinators. Transp. Res. Part A Policy Pract. 2022, 159, 157–168. [Google Scholar] [CrossRef]
- Bringewatt, E.H.; Gershoff, E.T. Falling through the cracks: Gaps and barriers in the mental health system for America’s disadvantaged children. Child. Youth Serv. Rev. 2010, 32, 1291–1299. [Google Scholar] [CrossRef]
- Office of the Surgeon General (OSG). Protecting Youth Mental Health: The U.S. Surgeon General’s Advisory. In Publications and Reports of the Surgeon General; US Department of Health and Human Services: Washington, DC, USA, 2021. Available online: http://www.ncbi.nlm.nih.gov/books/NBK575984/ (accessed on 24 February 2025).
- Johnson, C.L.; Gross, M.A.; Jorm, A.F.; Hart, L.M. Mental Health Literacy for Supporting Children: A Systematic Review of Teacher and Parent/Carer Knowledge and Recognition of Mental Health Problems in Childhood. Clin. Child. Fam. Psychol. Rev. 2023, 26, 569–591. [Google Scholar] [CrossRef]
- Powell, K.; Huxley, E.; Townsend, M.L. Mental health help seeking in young people and carers in out of home care: A systematic review. Child. Youth Serv. Rev. 2021, 127, 106088. [Google Scholar] [CrossRef]
- Viksveen, P.; Bjønness, S.E.; Cardenas, N.E.; Game, J.R.; Berg, S.H.; Salamonsen, A.; Storm, M.; Aase, K. User involvement in adolescents’ mental healthcare: A systematic review. Eur. Child. Adolesc. Psychiatry 2022, 31, 1765–1788. [Google Scholar] [CrossRef]
- Henning, W.A. The Complete Infant and Early Childhood Mental Health Handbook: A Comprehensive Guide to Understanding, Supporting, and Nurturing Young Minds; Winifred Audrey Henning. 2025. Available online: https://www.barnesandnoble.com/w/the-complete-infant-and-early-childhood-mental-health-handbook-winifred-audrey-henning/1146812873 (accessed on 10 January 2025).
- Ratheesh, A.; Loi, S.M.; Coghill, D.; Chanen, A.; McGorry, P.D. Special Considerations in the Psychiatric Evaluation Across the Lifespan (Special Emphasis on Children, Adolescents, and Elderly). In Tasman’s Psychiatry; Tasman, A., Riba, M.B., Alarcón, R.D., Alfonso, C.A., Kanba, S., Ndetei, D.M., Ng, C.H., Schulze, T.G., Lecic-Tosevski, D., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 1–37. [Google Scholar] [CrossRef]
- Lehtimaki, S.; Martic, J.; Wahl, B.; Foster, K.T.; Schwalbe, N. Evidence on Digital Mental Health Interventions for Adolescents and Young People: Systematic Overview. JMIR Ment. Health 2021, 8, e25847. [Google Scholar] [CrossRef]
- Achilles, M.R.; Anderson, M.; Li, S.H.; Subotic-Kerry, M.; Parker, B.; O’Dea, B. Adherence to e-mental health among youth: Considerations for intervention development and research design. Digit. Health 2020, 6, 2055207620926064. [Google Scholar] [CrossRef] [PubMed]
- Hellström, L.; Beckman, L. Life Challenges and Barriers to Help Seeking: Adolescents’ and Young Adults’ Voices of Mental Health. Int. J. Environ. Res. Public Health 2021, 18, 13101. [Google Scholar] [CrossRef] [PubMed]
- Butzner, M.; Cuffee, Y. Telehealth Interventions and Outcomes Across Rural Communities in the United States: Narrative Review. J. Med. Internet Res. 2021, 23, e29575. [Google Scholar] [CrossRef]
- Chang, J.E.; Lai, A.Y.; Gupta, A.; Nguyen, A.M.; Berry, C.A.; Shelley, D.R. Rapid Transition to Telehealth and the Digital Divide: Implications for Primary Care Access and Equity in a Post-COVID Era. Milbank Q. 2021, 99, 340–368. [Google Scholar] [CrossRef]
- Benton, T.D.; Boyd, R.C.; Njoroge, W.F.M. Addressing the Global Crisis of Child and Adolescent Mental Health. JAMA Pediatr. 2021, 175, 1108–1110. [Google Scholar] [CrossRef] [PubMed]
- Bhugra, D.; Moussaoui, D.; Craig, T.J. Oxford Textbook of Social Psychiatry; Oxford University Press: Oxford, UK, 2022. [Google Scholar]
- Li, H.; Zhang, R.; Lee, Y.-C.; Kraut, R.E.; Mohr, D.C. Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. NPJ Digit. Med. 2023, 6, 236. [Google Scholar] [CrossRef]
- Berry, D.M. The Limits of Computation: Joseph Weizenbaum and the ELIZA Chatbot. Weizenbaum J. Digit. Soc. 2023, 3, 3. [Google Scholar] [CrossRef]
- Hatch, S.G.; Goodman, Z.T.; Vowels, L.; Hatch, H.D.; Brown, A.L.; Guttman, S.; Le, Y.; Bailey, B.; Bailey, R.J.; Esplin, C.R.; et al. When ELIZA meets therapists: A Turing test for the heart and mind. PLoS Ment. Health 2025, 2, e0000145. [Google Scholar] [CrossRef]
- Aisha, M.A.; Jamei, R.B. Conversational AI Revolution: A Comparative Review of Machine Learning Algorithms in Chatbot Evolution. East J. Eng. 2025, 1, 1–18. [Google Scholar]
- Rajaraman, V. From ELIZA to ChatGPT. Reson 2023, 28, 889–905. [Google Scholar] [CrossRef]
- Singh, R.; Thakur, J.; Mohan, Y. A Historical Analysis of Chatbots from Eliza to Google Bard. In Proceedings of the Fifth Doctoral Symposium on Computational Intelligence, Calabria, Italy, 10 May 2024; Swaroop, A., Kansal, V., Fortino, G., Hassanien, A.E., Eds.; Springer Nature: Singapore, 2024; pp. 15–39. [Google Scholar] [CrossRef]
- Annepaka, Y.; Pakray, P. Large language models: A survey of their development, capabilities, and applications. Knowl. Inf. Syst. 2025, 67, 2967–3022. [Google Scholar] [CrossRef]
- Malik, J.; Tan, M. Modern AI Unlocked: Large Language Models and the Future of Contextual Processing. Balt. Multidiscip. Res. Lett. J. 2025, 2, 1–7. [Google Scholar]
- Boucher, E.M.; Harake, N.R.; Ward, H.E.; Stoeckl, S.E.; Vargas, J.; Minkel, J.; Parks, A.C.; Zilca, R. Artificially intelligent chatbots in digital mental health interventions: A review. Expert Rev. Med. Devices 2021, 18, 37–49. [Google Scholar] [CrossRef]
- van der Waa, J.; Nieuwburg, E.; Cremers, A.; Neerincx, M. Evaluating XAI: A comparison of rule-based and example-based explanations. Artif. Intell. 2021, 291, 103404. [Google Scholar] [CrossRef]
- Dong, Y.; Mu, R.; Zhang, Y.; Sun, S.; Zhang, T.; Wu, C.; Jin, G.; Qi, Y.; Hu, J.; Meng, J.; et al. Safeguarding Large Language Models: A Survey. arXiv 2024, arXiv:2406.02622. [Google Scholar] [CrossRef]
- Vagwala, M.K.; Asher, R. Conversational Artificial Intelligence and Distortions of the Psychotherapeutic Frame: Issues of Boundaries, Responsibility, and Industry Interests. Am. J. Bioeth. 2023, 23, 28–30. [Google Scholar] [CrossRef] [PubMed]
- Qian, H.; Dou, Z.; Zhu, Y.; Ma, Y.; Wen, J.-R. Learning Implicit User Profile for Personalized Retrieval-Based Chatbot. In Proceedings of the 30th ACM International Conference on Information & Knowledge Management, in CIKM ’21, New York, NY, USA, 1–5 November 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 1467–1477. [Google Scholar] [CrossRef]
- Bandi, A.; Adapa, P.V.S.R.; Kuchi, Y.E.V.P.K. The Power of Generative AI: A Review of Requirements, Models, Input–Output Formats, Evaluation Metrics, and Challenges. Future Internet 2023, 15, 260. [Google Scholar] [CrossRef]
- Chen, Y.; Esmaeilzadeh, P. Generative AI in Medical Practice: In-Depth Exploration of Privacy and Security Challenges. J. Med. Internet Res. 2024, 26, e53008. [Google Scholar] [CrossRef]
- Beredo, J.L.; Ong, E.C. Analyzing the Capabilities of a Hybrid Response Generation Model for an Empathetic Conversational Agent. Int. J. As. Lang. Proc. 2022, 32, 2350008. [Google Scholar] [CrossRef]
- Balcombe, L.; De Leo, D. Human-Computer Interaction in Digital Mental Health. Informatics 2022, 9, 14. [Google Scholar] [CrossRef]
- Huq, S.M.; Maskeliūnas, R.; Damaševičius, R. Dialogue agents for artificial intelligence-based conversational systems for cognitively disabled: A systematic review. Disabil. Rehabil. Assist. Technol. 2024, 19, 1059–1078. [Google Scholar] [CrossRef] [PubMed]
- Fitzpatrick, K.K.; Darcy, A.; Vierhile, M. Delivering Cognitive Behavior Therapy to Young Adults with Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Ment. Health 2017, 4, e7785. [Google Scholar] [CrossRef] [PubMed]
- Chu, Y.; Liao, L.; Zhou, Z.; Ngo, C.-W.; Hong, R. Towards Multimodal Emotional Support Conversation Systems. arXiv 2024, arXiv:2408.03650. [Google Scholar] [CrossRef]
- Jiang, M.; Zhao, Q.; Li, J.; Wang, F.; He, T.; Cheng, X.; Yang, B.X.; Ho, G.W.K.; Fu, G. A Generic Review of Integrating Artificial Intelligence in Cognitive Behavioral Therapy. arXiv 2024, arXiv:2407.19422. [Google Scholar] [CrossRef]
- Fulmer, R.; Joerin, A.; Gentile, B.; Lakerink, L.; Rauws, M. Using Psychological Artificial Intelligence (Tess) to Relieve Symptoms of Depression and Anxiety: Randomized Controlled Trial. JMIR Ment. Health 2018, 5, e9782. [Google Scholar] [CrossRef]
- Gaffney, H.; Mansell, W.; Tai, S. Conversational Agents in the Treatment of Mental Health Problems: Mixed-Method Systematic Review. JMIR Ment. Health 2019, 6, e14166. [Google Scholar] [CrossRef]
- Vertsberger, D.; Naor, N.; Winsberg, M. Adolescents’ Well-being While Using a Mobile Artificial Intelligence–Powered Acceptance Commitment Therapy Tool: Evidence From a Longitudinal Study. JMIR AI 2022, 1, e38171. [Google Scholar] [CrossRef]
- Beaudry, J.; Consigli, A.; Clark, C.; Robinson, K.J. Getting Ready for Adult Healthcare: Designing a Chatbot to Coach Adolescents with Special Health Needs Through the Transitions of Care. J. Pediatr. Nurs. 2019, 49, 85–91. [Google Scholar] [CrossRef]
- Nicol, G.; Wang, R.; Graham, S.; Dodd, S.; Garbutt, J. Chatbot-Delivered Cognitive Behavioral Therapy in Adolescents with Depression and Anxiety During the COVID-19 Pandemic: Feasibility and Acceptability Study. JMIR Form. Res. 2022, 6, e40242. [Google Scholar] [CrossRef]
- Fujita, J.; Yano, Y.; Shinoda, S.; Sho, N.; Otsuki, M.; Takayama, M.; Moroga, T.; Yamaguchi, H.; Ishii, M. Challenges in Implementing an AI Chatbot Intervention for Depression Among Youth on Psychiatric Waiting Lists: A Study Termination Report. medRxiv 2024. [Google Scholar] [CrossRef]
- Opel, D.J.; Kious, B.M.; Cohen, I.G. AI as a Mental Health Therapist for Adolescents. JAMA Pediatr. 2023, 177, 1253–1254. [Google Scholar] [CrossRef] [PubMed]
- Ghadiri, P.; Yaffe, M.J.; Adams, A.M.; Abbasgholizadeh-Rahimi, S. Primary care physicians’ perceptions of artificial intelligence systems in the care of adolescents’ mental health. BMC Prim. Care 2024, 25, 215. [Google Scholar] [CrossRef] [PubMed]
- Moore, B.; Herington, J.; Tekin, Ş. The Integration of Artificial Intelligence-Powered Psychotherapy Chatbots in Pediatric Care: Scaffold or Substitute? J. Pediatr. 2025, 280, 114509. [Google Scholar] [CrossRef]
- Papneja, H.; Yadav, N. Self-disclosure to conversational AI: A literature review, emergent framework, and directions for future research. Pers. Ubiquit Comput. 2024. [Google Scholar] [CrossRef]
- Somerville, L.H. The Teenage Brain: Sensitivity to Social Evaluation. Curr. Dir. Psychol. Sci. 2013, 22, 121–127. [Google Scholar] [CrossRef]
- Kehrwald, B. Understanding social presence in text-based online learning environments. Distance Educ. 2008, 29, 89–106. [Google Scholar] [CrossRef]
- Al-Shafei, M. Navigating Human-Chatbot Interactions: An Investigation into Factors Influencing User Satisfaction and Engagement. Int. J. Hum. Comput. Interact. 2025, 41, 411–428. [Google Scholar] [CrossRef]
- Otero-González, I.; Pacheco-Lorenzo, M.R.; Fernández-Iglesias, M.J.; Anido-Rifón, L.E. Conversational agents for depression screening: A systematic review. Int. J. Med. Inform. 2024, 181, 105272. [Google Scholar] [CrossRef]
- Car, L.T.; Dhinagaran, D.A.; Kyaw, B.M.; Kowatsch, T.; Joty, S.; Theng, Y.-L.; Atun, R. Conversational Agents in Health Care: Scoping Review and Conceptual Analysis. J. Med. Internet Res. 2020, 22, e17158. [Google Scholar] [CrossRef]
- Balan, R.; Dobrean, A.; Poetar, C.R. Use of automated conversational agents in improving young population mental health: A scoping review. NPJ Digit. Med. 2024, 7, 1–9. [Google Scholar] [CrossRef]
- Sarkar, S.; Gaur, M.; Chen, L.K.; Garg, M.; Srivastava, B. A review of the explainability and safety of conversational agents for mental health to identify avenues for improvement. Front. Artif. Intell. 2023, 6, 1229805. [Google Scholar] [CrossRef]
- Ekellem, E.A.F. Enhancing Mental Health and Academic Performance in Youth with ADHD and Related Disorders through Conversational AI. Available online: https://www.authorea.com/doi/full/10.36227/techrxiv.170555330.07605038 (accessed on 24 February 2025).
- Xu, Y.; Thomas, T.; Yu, C.-L.; Pan, E.Z. What makes children perceive or not perceive minds in generative AI? Comput. Hum. Behav. Artif. Hum. 2025, 4, 100135. [Google Scholar] [CrossRef]
- Paul, R.; Norbury, C. Language Disorders from Infancy Through Adolescence—E-Book; Elsevier Health Sciences: Amsterdam, The Netherlands, 2012. [Google Scholar]
- Cole, P.M.; Michel, M.K.; Teti, L.O. The Development of Emotion Regulation and Dysregulation: A Clinical Perspective. Monogr. Soc. Res. Child Dev. 1994, 59, 73–100. [Google Scholar] [CrossRef]
- Crocetti, E. Identity Formation in Adolescence: The Dynamic of Forming and Consolidating Identity Commitments. Child Dev. Perspect. 2017, 11, 145–150. [Google Scholar] [CrossRef]
- Klemfuss, J.Z.; Olaguez, A.P. Individual Differences in Children’s Suggestibility: An Updated Review. J. Child Sex. Abus. 2020, 29, 158–182. [Google Scholar] [CrossRef] [PubMed]
- Ali, S.; DiPaola, D.; Lee, I.; Sindato, V.; Kim, G.; Blumofe, R.; Breazeal, C. Children as creators, thinkers and citizens in an AI-driven future. Comput. Educ. Artif. Intell. 2021, 2, 100040. [Google Scholar] [CrossRef]
- Mahone, E.M.; Schneider, H.E. Assessment of Attention in Preschoolers. Neuropsychol. Rev. 2012, 22, 361–383. [Google Scholar] [CrossRef]
- Kirk, H.E.; Spencer-Smith, M.; Wiley, J.F.; Cornish, K.M. Gamified Attention Training in the Primary School Classroom: A Cluster-Randomized Controlled Trial. J. Atten. Disord. 2021, 25, 1146–1159. [Google Scholar] [CrossRef]
- Reale, L.; Bonati, M. Mental disorders and transition to adult mental health services: A scoping review. Eur. Psychiatry 2015, 30, 932–942. [Google Scholar] [CrossRef]
- De Los Reyes, A.; Augenstein, T.M.; Wang, M.; Thomas, S.A.; Drabick, D.A.G.; Burgers, D.E.; Rabinowitz, J. The validity of the multi-informant approach to assessing child and adolescent mental health. Psychol. Bull. 2015, 141, 858–900. [Google Scholar] [CrossRef]
- Wild, C.E.K.; Rawiri, N.T.; Taiapa, K.; Anderson, Y.C. In safe hands: Child health data storage, linkage and consent for use. Health Promot. Int. 2023, 38, daad159. [Google Scholar] [CrossRef] [PubMed]
- Dulcan, M.K.; Ballard, R.R.; Jha, P.; Sadhu, J.M. Concise Guide to Child and Adolescent Psychiatry, 5th ed.; American Psychiatric Pub: Washington, DC, USA, 2017. [Google Scholar]
- Burns, B.J.; Costello, E.J.; Angold, A.; Tweed, D.; Stangl, D.; Farmer, E.M.Z.; Erkanli, A. Children’s Mental Health Service Use Across Service Sectors. Health Aff. 1995, 14, 147–159. [Google Scholar] [CrossRef] [PubMed]
- Pyland, C.P.; Williams, M.G.; Mollen, D. Ethical and diversity considerations of mandatory reporting: Implications for training. Train. Educ. Prof. Psychol. 2024, 18, 297–304. [Google Scholar] [CrossRef]
- Yu, Y.; Sharma, T.; Hu, M.; Wang, J.; Wang, Y. Exploring Parent-Child Perceptions on Safety in Generative AI: Concerns, Mitigation Strategies, and Design Implications. arXiv 2024, arXiv:2406.10461. [Google Scholar] [CrossRef]
- Annapragada, A.V.; Donaruma-Kwoh, M.M.; Annapragada, A.V.; Starosolski, Z.A. A natural language processing and deep learning approach to identify child abuse from pediatric electronic medical records. PLoS ONE 2021, 16, e0247404. [Google Scholar] [CrossRef] [PubMed]
- Padmore, J. The Mental Health Needs of Children and Young People: Guiding You to Key Issues and Practices in CAMHS; McGraw-Hill Education: London, UK, 2016. [Google Scholar]
- Pellizzari, A. Navigating the Intersections of AI and Data Protection: A Comparative Analysis of the EU and US Approach to Healthcare. 2025. Available online: https://unitesi.unipv.it/handle/20.500.14239/27705 (accessed on 24 February 2025).
- Ho, A. Live Like Nobody Is Watching: Relational Autonomy in the Age of Artificial Intelligence Health Monitoring; Oxford University Press: Oxford, UK, 2023. [Google Scholar]
- Zidaru, T.; Morrow, E.M.; Stockley, R. Ensuring patient and public involvement in the transition to AI-assisted mental health care: A systematic scoping review and agenda for design justice. Health Expect. 2021, 24, 1072–1124. [Google Scholar] [CrossRef]
- Livingstone, S.; Helsper, E. Gradations in digital inclusion: Children, young people and the digital divide. New Media Soc. 2007, 9, 671–696. [Google Scholar] [CrossRef]
- Constantinides, P. Digital Transformation in Healthcare: An Ecosystem Approach; Routledge: London, UK, 2023. [Google Scholar] [CrossRef]
- Guo, S.; Nguyen, H.; Weiss, B.; Ngo, V.K.; Lau, A.S. Linkages between mental health need and help-seeking behavior among adolescents: Moderating role of ethnicity and cultural values. J. Couns. Psychol. 2015, 62, 682–693. [Google Scholar] [CrossRef]
- Wies, B.; Landers, C.; Ienca, M. Digital Mental Health for Young People: A Scoping Review of Ethical Promises and Challenges. Front. Digit. Health 2021, 3, 697072. [Google Scholar] [CrossRef]
- Mohammed, P.S.; ‘Nell’ Watson, E. Towards Inclusive Education in the Age of Artificial Intelligence: Perspectives, Challenges, and Opportunities. In Artificial Intelligence and Inclusive Education: Speculative Futures and Emerging Practices; Knox, J., Wang, Y., Gallagher, M., Eds.; Springer: Singapore, 2019; pp. 17–37. [Google Scholar] [CrossRef]
- Wang, G.; Zhao, J.; Van Kleek, M.; Shadbolt, N. Informing Age-Appropriate AI: Examining Principles and Practices of AI for Children. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 30 April–5 May 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 1–29. [Google Scholar] [CrossRef]
- Danieli, M.; Ciulli, T.; Mousavi, S.M.; Silvestri, G.; Barbato, S.; Natale, L.D.; Riccardi, G. Assessing the Impact of Conversational Artificial Intelligence in the Treatment of Stress and Anxiety in Aging Adults: Randomized Controlled Trial. JMIR Ment. Health 2022, 9, e38067. [Google Scholar] [CrossRef]
- Barfield, J.K. Evaluating the self-disclosure of personal information to AI-enabled technology. In Research Handbook on Artificial Intelligence and Communication; Edward Elgar Publishing: Northampton, MA, USA, 2023; pp. 355–375. Available online: https://www.elgaronline.com/edcollchap/book/9781803920306/book-part-9781803920306-33.xml (accessed on 24 February 2025).
- Yıldız, T. The Minds We Make: A Philosophical Inquiry into Theory of Mind and Artificial Intelligence. Integr. Psych. Behav. 2025, 59, 10. [Google Scholar] [CrossRef] [PubMed]
- Chaudhry, B.M.; Debi, H.R. User perceptions and experiences of an AI-driven conversational agent for mental health support. Mhealth 2024, 10, 22. [Google Scholar] [CrossRef] [PubMed]
- McGinty, B. The Future of Public Mental Health: Challenges and Opportunities. Milbank Q. 2023, 101, 532–551. [Google Scholar] [CrossRef]
- Atkins, S.; Badrie, I.; Otterloo, S. Applying Ethical AI Frameworks in practice: Evaluating conversational AI chatbot solutions. Comput. Soc. Res. J. 2021, 1, 1–6. [Google Scholar] [CrossRef]
- McTear, M. Conversational AI: Dialogue Systems, Conversational Agents, and Chatbots; Springer Nature: Singapore, 2022. [Google Scholar]
- Balcombe, L.; Leo, D.D. Digital Mental Health Challenges and the Horizon Ahead for Solutions. JMIR Ment. Health 2021, 8, e26811. [Google Scholar] [CrossRef]
- Hussain, S.A.; Bresnahan, M.; Zhuang, J. The bias algorithm: How AI in healthcare exacerbates ethnic and racial disparities—A scoping review. Ethn. Health 2025, 30, 197–214. [Google Scholar] [CrossRef]
- Aleem, M.; Zahoor, I.; Naseem, M. Towards Culturally Adaptive Large Language Models in Mental Health: Using ChatGPT as a Case Study. In Companion Publication of the 2024 Conference on Computer-Supported Cooperative Work and Social Computing, in CSCW Companion ’24; Association for Computing Machinery: New York, NY, USA, 2024; pp. 240–247. [Google Scholar] [CrossRef]
- Björling, E.A.; Rose, E. Participatory Research Principles in Human-Centered Design: Engaging Teens in the Co-Design of a Social Robot. Multimodal Technol. Interact. 2019, 3, 8. [Google Scholar] [CrossRef]
Study | Population | AI System & Therapeutic Approach | Study Design | Key Findings | Limitations |
---|---|---|---|---|---|
Vertsberger et al. (2022) [65] | Adolescents | Kai.ai (Acceptance and Commitment Therapy) | Longitudinal study | Improvements in stress management and emotional well-being | Self-reporting biases; Lack of long-term follow-up |
Papneja and Yadav (2024) [72] | Youth (various ages) | Multiple systems, including Wysa (CBT) | Systematic review | AI valuable as supplement to traditional therapy; Cannot replace human clinicians | Limited long-term efficacy data |
Nicol et al. (2022) [67] | Adolescents with depression and anxiety | CBT-based chatbot | Feasibility and acceptability study | High engagement; Perceived helpfulness | Small sample size; No control group |
Ghadiri et al. (2024) [70] | Primary care physicians providing adolescent mental healthcare | N/A (Physicians’ perceptions of AI) | Qualitative study | Skepticism about AI reliability; Recognition of potential as supplementary tool | Limited to physician perspectives |
Fujita et al. (2024) [68] | Adolescents on psychiatric waiting lists | Emol | Implementation study | Identified technical challenges and dropout rates | Focus on implementation rather than outcomes |
Beaudry et al. (2019) [66] | Teenagers | Mood management conversational agent | Feasibility study | 97% completion rate; High user satisfaction | Small sample; No efficacy measures |
Moore et al. (2025) [71] | Children and adolescents | Various AI chatbots | Ethical analysis | AI should support rather than replace therapy; Concerns about parental consent | Limited empirical data |
Opel et al. (2023) [69] | Adolescents | AI mental health therapist systems | Systematic analysis | Improved accessibility; Need for regulation to mitigate risks | Theoretical focus; Limited experimental evidence |