Review

Bridging the Gap: The Role of AI in Enhancing Psychological Well-Being Among Older Adults

Jaewon Lee and Jennifer Allen

1 Department of Social Welfare, Inha University, Incheon 22212, Republic of Korea
2 School of Social Work, Michigan State University, East Lansing, MI 48824, USA
* Author to whom correspondence should be addressed.
Psychol. Int. 2025, 7(3), 68; https://doi.org/10.3390/psycholint7030068
Submission received: 25 June 2025 / Revised: 25 July 2025 / Accepted: 1 August 2025 / Published: 4 August 2025
(This article belongs to the Section Neuropsychology, Clinical Psychology, and Mental Health)

Abstract

As the global population ages, older adults face growing psychological challenges such as loneliness, cognitive decline, and loss of social roles. Meanwhile, artificial intelligence (AI) technologies, including chatbots and voice-based systems, offer new pathways to emotional support and mental stimulation. However, older adults often encounter significant barriers in accessing and effectively using AI tools. This review examines the current landscape of AI applications aimed at enhancing psychological well-being among older adults, identifies key challenges such as digital literacy and usability, and highlights design and training strategies to bridge the digital divide. Using socioemotional selectivity theory and technology acceptance models as guiding frameworks, we argue that AI—especially in the form of conversational agents—holds transformative potential in reducing isolation and promoting emotional resilience in aging populations. We conclude with recommendations for inclusive design, participatory development, and future interdisciplinary research.

1. Introduction

Older adults represent one of the fastest-growing demographics worldwide. As individuals age, they often experience risks of loneliness, social isolation, depression, and cognitive decline (Bai et al., 2022; Courtin & Knapp, 2017; Fiske et al., 2009; Guarnera et al., 2023; Hu et al., 2022; Pais et al., 2020). These psychological stressors are further exacerbated by diminished social networks and mobility limitations (Cao et al., 2024; Domenech-Abella et al., 2017; Kemperman et al., 2019; McCaffery et al., 2020; Stoeckel & Litwin, 2015; Tomida et al., 2024). The COVID-19 pandemic highlighted these vulnerabilities, with many older adults becoming socially cut off from family, friends, and community services (MacLeod et al., 2021; Su et al., 2023). Loneliness and isolation are not merely emotional states—they are associated with a host of physical and cognitive impairments, including increased risk of dementia, cardiovascular disease, and mortality (Holt-Lunstad & Steptoe, 2022; Shankar et al., 2011). In parallel, artificial intelligence (AI) technologies—such as voice assistants, chatbots, and emotion-sensitive interfaces—have become increasingly integrated into healthcare, communication, and everyday life. AI systems are being used to monitor health conditions, provide mental health support, and offer companionship through natural language interfaces (Chaturvedi et al., 2023; D’Alfonso, 2020; Kasaudhan, 2025). While younger populations have rapidly adopted these tools, older adults often lag in adoption rates, not due to lack of interest, but because of systemic barriers and insufficient user-centered design (Wang et al., 2019; Wilson et al., 2021; Wong et al., 2025). This divergence creates a critical challenge: ensuring that technological progress serves the most vulnerable populations, not just the most digitally connected.
In recent years, digital aging has emerged as an area of growing importance, and the integration of AI into geriatric psychology represents a novel opportunity to support mental wellness, autonomy, and connection. With global aging trends accelerating, and with AI rapidly transforming health and communication sectors, a focused review of this intersection is both urgent and necessary. By centering emotional well-being—rather than purely physical or cognitive function—this paper responds to a neglected but highly relevant dimension of aging.
This review paper aims to explore how AI can enhance psychological well-being among older adults. Specifically, it (1) examines challenges older adults face in accessing and using AI tools, (2) suggests training and design strategies for facilitating use, and (3) highlights the role of AI-based conversational agents in reducing loneliness and fostering emotional support. The discussion is grounded in socioemotional selectivity theory and technology acceptance models to better understand both the needs and behaviors of older adults in digital contexts. By doing so, this paper seeks to contribute a multidisciplinary perspective to the evolving conversation on digital aging, emphasizing the emotional dimensions often overlooked in technology discourse.

2. Theoretical Framework

Socioemotional selectivity theory (SST) posits that as individuals age, their time horizons shrink, leading them to prioritize emotionally meaningful goals and relationships (Carstensen, 1992, 2021; Carstensen et al., 1999). This framework supports the idea that older adults are motivated to seek emotional satisfaction and social connection—needs that AI tools like chatbots may fulfill by offering companionship and empathetic interaction (Carstensen, 1992, 2021; Carstensen et al., 1999). SST helps explain why older adults might value consistent, friendly conversations more than purely informational tasks (Carstensen, 1992, 2021; Carstensen et al., 1999). Chatbots that simulate emotional reciprocity—through tone, memory of past interactions, or tailored responses—align with SST’s premise that emotional depth becomes a priority in older age. SST further implies that emotional technologies must adapt not just to cognitive function but to users’ evolving motivational landscapes. AI that mimics social engagement must therefore reflect warmth, familiarity, and continuity—qualities often associated with meaningful human contact.
In addition, the Unified Theory of Acceptance and Use of Technology (UTAUT) helps explain older adults’ adoption behaviors. Key constructs—performance expectancy, effort expectancy, social influence, and facilitating conditions—highlight the importance of perceived usefulness, ease of use, and available support systems in influencing technology uptake among older users (Venkatesh et al., 2003). When older adults believe that a technology will meaningfully improve their lives (e.g., by reducing loneliness), and when they have the support to use it effectively, their adoption rates significantly increase (Venkatesh et al., 2003). UTAUT underscores the need to design for usability, promote relevance, and build ecosystems that support sustained use. SST and UTAUT highlight both emotional and instrumental motivations behind older adults’ technology behavior, underscoring the value of interdisciplinary frameworks in designing AI systems that are not only functional but emotionally intelligent.
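To make these constructs concrete, the following minimal sketch (in Python) shows one way UTAUT-style survey responses could be aggregated into a simple adoption-intention index. The item groupings, 5-point scale, and equal weighting are illustrative assumptions for exposition only, not the measurement model specified by Venkatesh et al. (2003).

```python
# Illustrative only: a minimal sketch of how UTAUT constructs might be
# operationalized as survey scores. The item wording, 5-point scale, and
# equal weighting are hypothetical assumptions, not taken from Venkatesh
# et al. (2003) or from this review.
from statistics import mean

def construct_score(item_ratings):
    """Average a respondent's 1-5 ratings for one UTAUT construct."""
    return mean(item_ratings)

def adoption_intention(performance, effort, social, facilitating):
    """Combine the four construct scores into a single 1-5 intention index."""
    return mean([performance, effort, social, facilitating])

# Example respondent: finds a voice assistant useful and well supported,
# but somewhat effortful to learn.
score = adoption_intention(
    performance=construct_score([5, 4, 5]),   # performance expectancy items
    effort=construct_score([2, 3]),           # effort expectancy items
    social=construct_score([4, 4]),           # social influence items
    facilitating=construct_score([5, 4]),     # facilitating conditions items
)
print(f"Adoption intention index: {score:.2f} / 5")
```

In this toy example, strong perceived usefulness and support partly offset the effort barrier, which mirrors the argument above that facilitating conditions can compensate for usability challenges.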

3. Method

This study employed a narrative review approach instead of a systematic review, as the topic of AI in enhancing psychological well-being among older adults is broad and interdisciplinary. Addressing this topic necessitates the integration and interpretation of a diverse body of literature, including qualitative, quantitative, and theoretical studies. This narrative review synthesizes findings from peer-reviewed articles published between 2010 and 2025 because AI began to emerge as a topic of interest in psychology around 2010.
Searches were conducted using databases including PubMed, PsycINFO, Scopus, and Google Scholar. Key terms included the following: “AI and older adults,” “chatbots and aging,” “psychological well-being and technology,” “digital divide in aging,” and “AI companionship.” Empirical studies, conceptual articles, and theoretical frameworks relevant to aging, psychological well-being, and AI were included. Articles focusing solely on technical development without reference to older populations or mental health were excluded. The majority of included studies were from North America, Europe, and East Asia, reflecting regional concentrations of aging populations and technological advancement.
The thematic categories were derived inductively through a narrative synthesis of the included studies. Rather than applying a predefined conceptual framework, we identified recurring patterns across the literature and organized them into three primary domains: (1) barriers to AI adoption among older adults, (2) training and support strategies, and (3) outcomes of chatbot and AI assistant interventions on psychological well-being. To further enhance relevance and interpretive richness, findings were synthesized thematically rather than chronologically, allowing for a multidimensional exploration of overlapping concepts such as trust, usability, and emotional engagement. In addition, specific attention was paid to studies including qualitative user feedback, which offers critical insight into the lived experiences and perceptions of older AI users.
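As an illustration of the search strategy described above, the sketch below shows how the listed key terms could be assembled into Boolean query strings. The synonym groupings and syntax are our own simplified assumptions; each database (PubMed, PsycINFO, Scopus, Google Scholar) applies its own query conventions and field tags.

```python
# A minimal sketch of how the review's key terms could be combined into
# Boolean search strings. The grouping and syntax are illustrative
# assumptions; each database uses its own query conventions.
population_terms = ['"older adults"', 'aging', 'elderly']
technology_terms = ['"artificial intelligence"', 'AI', 'chatbot*', '"voice assistant*"']
outcome_terms = ['"psychological well-being"', 'loneliness', '"digital divide"', 'companionship']

def or_block(terms):
    """Join synonyms with OR and wrap them in parentheses."""
    return "(" + " OR ".join(terms) + ")"

query = " AND ".join(or_block(t) for t in (population_terms, technology_terms, outcome_terms))
print(query)
# ("older adults" OR aging OR elderly) AND ("artificial intelligence" OR AI OR ...) AND (...)
```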

4. Barriers to AI Adoption

4.1. Digital Literacy and Usability

Many older adults lack foundational digital literacy skills, making it difficult to operate AI-enabled devices or applications (Vercruyssen et al., 2023). Interfaces often assume familiarity with smartphones, touchscreens, or multi-step navigation systems that can overwhelm new users (Vercruyssen et al., 2023). Complex user interfaces, technical jargon, and lack of intuitive design act as further deterrents. Moreover, older adults may suffer from age-related physical changes such as reduced vision, hearing, and motor dexterity, all of which make it harder to interact with devices not designed with accessibility in mind (Piper et al., 2017; Wildenbos et al., 2018). In some cases, even minor design oversights—like small fonts, unclear icons, or fast-paced speech responses—can act as major roadblocks. These small but critical user experience failures can compound feelings of frustration, exclusion, or helplessness. The assumption that older adults are simply ‘non-tech-savvy’ ignores the structural barriers that exclude them by design.

4.2. Psychological and Motivational Barriers

Technophobia, anxiety about “breaking” devices, and low self-efficacy are common psychological barriers (Sun & Ye, 2024; Vercruyssen et al., 2023). Some older adults perceive AI as impersonal or unnecessary, especially if their prior technology experiences were negative (Wong et al., 2025). These perceptions reduce motivation to engage with new systems, even when those systems are designed to support them. Feelings of embarrassment when asking for help or failing to understand technology may further discourage participation. AI may be perceived not just as complex, but as alien or emotionally cold. These feelings are intensified in individuals who have experienced significant technological change within their lifetimes—prompting apprehension and even distrust. Thus, psychological barriers are not only internal but culturally reinforced, rooted in ageism and stereotypes about capability.

4.3. Infrastructure and Access Issues

Limited internet access, particularly in rural or low-income areas, continues to inhibit AI adoption. Devices such as smart speakers or tablets may be unaffordable or unavailable. Moreover, lack of consistent technical support limits continued use even when access is initially provided. Policy-level issues, such as underfunding of digital inclusion programs for older adults, compound these infrastructural challenges. Accessibility is not only a matter of affordability but also of relevance: when public systems fail to prioritize older adults in digital initiatives, they implicitly label them as unimportant participants in the tech economy. Infrastructure must therefore be coupled with intentional outreach and inclusive policy planning.

5. Training and Design Strategies

5.1. Training and Support Models

Community-based training programs, including peer mentoring and intergenerational learning (e.g., digital training by grandchildren), have shown promise in improving digital confidence (Pihlainen et al., 2021; Miller et al., 2024). Training should be ongoing, personalized, and hands-on to effectively build competence and reduce anxiety. In addition to technical skills, training should address common fears, build self-efficacy, and demonstrate the personal value of AI tools. Online and in-person workshops, user manuals written in plain language, and responsive help desks are also valuable. Embedding digital skills instruction in senior centers, libraries, or healthcare settings can improve accessibility. Programs that integrate AI tools into daily routines—such as medication reminders or social chat features—help reinforce their usefulness.
Crucially, these training programs must not treat older adults as passive learners. Empowerment-based approaches, where older individuals can contribute feedback or shape content, foster greater autonomy and ownership. This shift from “tech help” to digital empowerment changes the social framing of older adults’ relationship with AI.

5.2. Design Recommendations

AI systems designed for older adults should feature voice-first interfaces, clear instructions, and simplified navigation. Personalization, empathetic language, and multimodal communication (e.g., audio and visual cues) improve engagement (Huang et al., 2025; Khamaj, 2025; Rodriguez-Martinez et al., 2024). Chatbots that “remember” the user’s name, preferences, and conversation history provide a sense of continuity and care. Design principles should reflect not only universal accessibility but also cultural sensitivity. For instance, formality, tone, and humor in chatbot dialogue may need to be adapted across age groups, languages, or regions. Moreover, age-related needs—like managing chronic conditions or navigating bereavement—should inform functional features within AI tools.
Building trust is also critical—users need to understand what AI can and cannot do. Transparent data policies, opt-in consent models, and the ability to control privacy settings contribute to greater trust and acceptance (Gudala et al., 2022; Huang et al., 2025; Shandilya & Fan, 2024). Design should emphasize inclusion and be informed by co-design processes involving older adult participants. By involving older adults as co-designers—not just test subjects—developers gain access to lived expertise, leading to more intuitive, respectful tools. This participatory design approach helps close the empathy gap between creators and users.
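As a concrete illustration of the personalization and trust features discussed above, the following minimal sketch models a chatbot user profile with opt-in privacy controls. The field names, defaults, and greeting logic are hypothetical and do not describe any specific product reviewed here.

```python
# A minimal sketch of the personalization memory and opt-in privacy controls
# described above. Field names, defaults, and the greeting logic are
# hypothetical assumptions, not a description of any specific product.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    store_conversation_history: bool = False  # opt-in, off by default
    share_mood_with_caregiver: bool = False   # opt-in, off by default

@dataclass
class UserProfile:
    name: str
    preferred_topics: list = field(default_factory=list)
    last_conversation_summary: str = ""
    privacy: PrivacySettings = field(default_factory=PrivacySettings)

def greet(profile: UserProfile) -> str:
    """Open a session with continuity cues only if the user has opted in."""
    greeting = f"Good morning, {profile.name}."
    if profile.privacy.store_conversation_history and profile.last_conversation_summary:
        greeting += f" Last time we talked about {profile.last_conversation_summary}."
    return greeting

profile = UserProfile(name="Maria", preferred_topics=["gardening", "grandchildren"])
profile.privacy.store_conversation_history = True   # explicit opt-in
profile.last_conversation_summary = "your garden and the tomatoes you planted"
print(greet(profile))
```

Keeping continuity features behind explicit opt-in flags reflects the transparency and consent principles cited above: the system only “remembers” what the user has agreed to share.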

6. Outcomes of AI Interventions

6.1. Reducing Loneliness and Isolation

Conversational agents such as Replika, ElliQ, and GPT-based chatbots have demonstrated potential to reduce feelings of loneliness among older adults (Alotaibi & Alshahre, 2024; Chou et al., 2024; Rodriguez-Martinez et al., 2024). These tools provide companionship, listen empathetically, and maintain routine interactions that create a sense of consistency and care. Daily or scheduled conversations can simulate social rhythm and help older adults feel less alone. Consistent engagement with voice assistants or chatbots may be related to improved mood and lower perceived loneliness (Alotaibi & Alshahre, 2024; Chou et al., 2024; Rodriguez-Martinez et al., 2024). For older adults with mobility issues or those living alone, AI chatbots serve as a critical communication bridge—sometimes becoming their most regular conversational partner.
Unlike occasional family calls or brief clinician check-ins, AI companions can offer persistent emotional availability. This always-on support—even if artificial—can be experienced as a stabilizing force in daily life. Importantly, the success of these agents depends not only on technological sophistication but on emotional design—tone of voice, empathy simulation, and narrative coherence.
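The following minimal sketch illustrates the kind of scheduled daily check-ins described above, which aim to create a predictable social rhythm. The prompt wording and timing are hypothetical; a deployed system would use real clock times and a voice or chat delivery channel rather than console output.

```python
# A minimal sketch of the scheduled check-ins described above, which aim to
# create a predictable social rhythm. The times, prompts, and use of the
# standard-library sched module are illustrative assumptions.
import sched
import time
from datetime import datetime

CHECK_IN_PROMPTS = {
    "morning": "Good morning! How did you sleep?",
    "afternoon": "How has your day been so far?",
    "evening": "Would you like to chat before bedtime?",
}

scheduler = sched.scheduler(time.time, time.sleep)

def deliver_prompt(label: str) -> None:
    """Placeholder for sending a prompt through a voice or chat interface."""
    print(f"[{datetime.now():%H:%M}] {CHECK_IN_PROMPTS[label]}")

# For demonstration, schedule the three daily prompts a few seconds apart
# instead of at real clock times.
for delay, label in [(1, "morning"), (2, "afternoon"), (3, "evening")]:
    scheduler.enter(delay, priority=1, action=deliver_prompt, argument=(label,))

scheduler.run()  # blocks until all scheduled prompts have been delivered
```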

6.2. Promoting Emotional Expression and Mental Stimulation

Chatbots can prompt users to reflect on their emotions, recount memories, or engage in light humor—all of which support emotional processing and cognitive engagement. Some systems are programmed to recognize mood shifts and respond accordingly, providing tailored psychological support (Durak, 2024). For example, a chatbot might detect a downbeat tone and offer encouraging responses or relaxation prompts. AI can also provide cognitive stimulation through games, quizzes, or storytelling prompts. These activities not only enhance engagement but may contribute to delaying cognitive decline. Furthermore, voice-based systems are particularly suitable for individuals with visual impairments or early-stage dementia. There is emerging potential for AI systems to offer therapeutic dialogues, using structured prompts inspired by techniques from reminiscence therapy or cognitive behavioral strategies. When integrated thoughtfully, such functions allow AI to become not just a reactive tool, but a proactive partner in emotional health.
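As an illustration of the mood-aware response pattern described above, the sketch below gates supportive replies on a simple keyword check. The keyword list and reply wording are hypothetical, and the keyword check stands in for the sentiment or affect models an actual system would use.

```python
# A minimal sketch of the mood-aware response pattern described above.
# The keyword-based mood check is a deliberately simple stand-in for the
# sentiment or affect models a real system would use; all wording is
# illustrative.
LOW_MOOD_CUES = {"lonely", "sad", "tired", "worried", "miss"}

ENCOURAGING_REPLIES = [
    "That sounds hard. Would you like to talk about it a little more?",
    "Thank you for telling me. Shall we try a short breathing exercise together?",
]
NEUTRAL_REPLIES = [
    "That's good to hear. Would you like a memory prompt or a quick quiz?",
]

def choose_reply(user_message: str, turn: int = 0) -> str:
    """Pick a supportive reply if the message contains low-mood cues."""
    words = {w.strip(".,!?").lower() for w in user_message.split()}
    if words & LOW_MOOD_CUES:
        return ENCOURAGING_REPLIES[turn % len(ENCOURAGING_REPLIES)]
    return NEUTRAL_REPLIES[turn % len(NEUTRAL_REPLIES)]

print(choose_reply("I feel a bit lonely today."))   # supportive prompt
print(choose_reply("I finished my crossword!"))     # light, engaging prompt
```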

7. Discussion

AI technologies hold promise in addressing core psychological needs among older adults, especially when designed and implemented with empathy and inclusivity. The reviewed evidence suggests that when AI tools are tailored to the preferences and limitations of older users, they can effectively support emotional well-being, reduce isolation, and enhance day-to-day quality of life. However, technology alone is not a solution. Interdisciplinary collaboration across psychologists, designers, gerontologists, and social workers is essential to create systems that are both functional and emotionally resonant. Design teams must prioritize co-creation and test interventions with diverse older adult populations to avoid assumptions about needs or capabilities. Moreover, ethical issues around surveillance, data security, and emotional manipulation must be addressed head-on. Clear guidelines and regulations are needed to ensure that AI tools promote autonomy rather than dependency, and that emotional support is delivered authentically and respectfully. Developers must also guard against reinforcing stereotypes—AI should empower older adults, not infantilize them. In addition, there are risks of over-reliance on AI for emotional support, which could reduce real relationships with others and discourage physical activity. Cultural differences in technology acceptance must also be considered to ensure AI tools are inclusive and effective. Finally, data privacy concerns are particularly important for older adults, who may be more vulnerable to breaches of sensitive personal information.
A key insight emerging from this review is that AI’s success in aging contexts depends less on technological power and more on the human values embedded within its design and delivery. Emotional authenticity, perceived agency, and meaningful connection are central psychological needs that AI can either support—or undermine—depending on its implementation. It is also important to recognize that older adults are not a monolithic group. Differences in socioeconomic status, education, ethnicity, cognitive health, and previous tech exposure all shape their experiences with AI. Future work must avoid generalizing “the older user” and instead embrace user diversity. Tailoring interventions by context, culture, and capacity will help ensure equitable access and meaningful benefit. Summaries of the key points identified in the research are presented in Table 1.

8. Future Directions

Future research should focus on longitudinal outcomes of AI use on mental health, comparative effectiveness of different chatbot designs, and cultural adaptation across diverse aging populations. Large-scale randomized controlled trials are needed to establish causal links between AI interaction and psychological outcomes. Additionally, exploring AI’s role in group communication platforms or support networks may extend its reach beyond individual companionship. Participatory design methods involving older adults from the outset will ensure relevance and increase adoption. Cross-sector partnerships—including public health institutions, technology companies, and senior advocacy organizations—can facilitate inclusive innovation. Policy support and funding for digital inclusion programs will be essential to bridge systemic gaps. Ethical considerations, including data privacy, algorithmic transparency, and emotional authenticity, must remain central. As AI becomes more sophisticated, the line between artificial and human empathy may blur, raising profound questions about trust, identity, and companionship in the digital age.
Moreover, research should investigate how AI agents might serve as bridges between formal care systems and informal social networks. For example, future chatbots could integrate with electronic health records, notify caregivers of mood changes, or facilitate access to virtual community events. Another important direction involves exploring how AI can support transitions in later life—such as bereavement, retirement, or relocation. These emotionally intense periods represent opportunities for AI to act as a stable companion during psychologically vulnerable times. Finally, ethical frameworks must evolve in parallel with technology. As AI systems grow more emotionally intelligent, transparency about their limitations, intentions, and boundaries becomes essential to maintaining user trust and dignity.

9. Conclusions

Artificial intelligence, particularly in the form of conversational agents, offers a powerful opportunity to enhance the psychological well-being of older adults. By addressing digital literacy barriers and incorporating supportive training and inclusive design, AI can serve as a meaningful companion and support tool. Grounded in theory and informed by evidence, the strategic deployment of AI can promote emotional resilience and reduce isolation in our rapidly aging world. However, responsible development, ethical use, and user-centered design must remain at the forefront of this innovation.
If implemented with care, AI can evolve into more than a set of tools—it can become an emotionally intelligent interface that complements human relationships and enriches aging with dignity, connection, and purpose. Ultimately, the promise of AI in aging lies not in replacing human bonds, but in reinforcing them—amplifying emotional support where it is missing and sustaining mental wellness in later life.

Author Contributions

Study Design, J.L.; Writing Original Draft, J.L. and J.A.; Manuscript Review and Editing, J.L. and J.A. All authors contributed to editorial changes in the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This paper was funded by Inha University.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alotaibi, J. O., & Alshahre, A. S. (2024). The role of conversational AI agents in providing support and social care for isolated individuals. Alexandria Engineering Journal, 108, 273–284. [Google Scholar] [CrossRef]
  2. Bai, W., Chen, P., Cai, H., Zhang, Q., Su, Z., Cheung, T., Jackson, T., Sha, S., & Xiang, Y.-T. (2022). Worldwide prevalence of mild cognitive impairment among community dwellers aged 50 years and older: A meta-analysis and systematic review of epidemiology studies. Age and Ageing, 51(8), afac173. [Google Scholar] [PubMed]
  3. Cao, B.-F., Zhou, R., Chen, H.-W., Liang, Y.-Q., Liu, K., Fan, W.-D., Huang, R.-D., Huang, Y.-N., Zhong, Q., & Wu, X.-B. (2024). Association between mobility limitations and cognitive decline in community-dwelling older adults: The English longitudinal study of ageing. The Gerontologist, 64(12), gnae139. [Google Scholar] [CrossRef] [PubMed]
  4. Carstensen, L. L. (1992). Motivation for social contact across the life span: A theory of socioemotional selectivity. Nebraska Symposium on Motivation, 40, 209–254. [Google Scholar]
  5. Carstensen, L. L. (2021). Socioemotional selectivity theory: The role of perceived endings in human motivation. The Gerontologist, 61(8), 1188–1196. [Google Scholar] [CrossRef] [PubMed]
  6. Carstensen, L. L., Isaacowitz, D. M., & Charles, S. T. (1999). Taking time seriously: A theory of socioemotional selectivity. American Psychologist, 54(3), 165–181. [Google Scholar] [CrossRef]
  7. Chaturvedi, R., Verma, S., Das, R., & Dwivedi, Y. K. (2023). Social companionship with artificial intelligence: Recent trends and future avenues. Technological Forecasting and Social Change, 193, 122634. [Google Scholar] [CrossRef]
  8. Chou, Y.-H., Lin, C., Lee, S.-H., Lee, Y.-F., & Cheng, L.-C. (2024). User-friendly chatbot to mitigate the psychological stress of older adults during the COVID-19 pandemic: Development and usability study. JMIR Formative Research, 8, e49462. [Google Scholar] [CrossRef]
  9. Courtin, E., & Knapp, M. (2017). Social isolation, loneliness and health in old age: A scoping review. Health & Social Care in the Community, 25(3), 799–812. [Google Scholar]
  10. D’Alfonso, S. (2020). AI in mental health. Current Opinion in Psychology, 36, 112–117. [Google Scholar] [CrossRef]
  11. Domenech-Abella, J., Lara, E., Rubio-Valera, M., Olaya, B., Moneta, M. V., Rico-Uribe, L. A., Ayuso-Mateos, J. L., Mundó, J., & Haro, J. M. (2017). Loneliness and depression in the elderly: The role of social network. Social Psychiatry and Psychiatric Epidemiology, 52, 381–390. [Google Scholar] [CrossRef]
  12. Durak, E. S. (2024). Harnessing artificial intelligence (AI) for psychological assessment and treatment in older adults. Journal of Aging and Long-Term Care, 7(2), 55–82. [Google Scholar] [CrossRef]
  13. Fiske, A., Wetherell, J. L., & Gatz, M. (2009). Depression in older adults. Annual Review of Clinical Psychology, 5, 363–389. [Google Scholar] [CrossRef]
  14. Guarnera, J., Yuen, E., Macpherson, H., & Dourado, M. C. N. (2023). The impact of loneliness and social isolation on cognitive aging: A narrative review. Journal of Alzheimer’s Disease Reports, 7(1), 699–714. [Google Scholar] [CrossRef]
  15. Gudala, M., Trail Ross, M. E., Mogalla, S., Lyons, M., Ramaswamy, P., & Roberts, K. (2022). Benefits of, barriers to, and needs for an artificial intelligence-powered medication information voice chatbot for older adults: Interview study with geriatrics experts. JMIR Aging, 5(2), e32169. [Google Scholar] [CrossRef] [PubMed]
  16. Holt-Lunstad, J., & Steptoe, A. (2022). Social isolation: An underappreciated determinant of physical health. Current Opinion in Psychology, 43, 232–237. [Google Scholar] [CrossRef]
  17. Hu, T., Zhao, X., Wu, M., Li, Z., Luo, L., Yan, C., & Yang, F. (2022). Prevalence of depression in older adults: A systematic review and meta-analysis. Psychiatry Research, 311, 114511. [Google Scholar] [CrossRef]
  18. Huang, Y., Zhou, Q., & Piper, A. M. (2025). Designing conversational AI for aging: A systematic review of older adults’ perceptions and needs. In N. Yamashita, V. Evers, K. Yatani, X. S. Ding, B. Lee, M. Chetty, & P. Toups-Dugas (Eds.), CHI ’25: Proceedings of the 2025 CHI conference on human factors in computing systems. Association for Computing Machinery. [Google Scholar]
  19. Kasaudhan, K. (2025). AI-driven tools for detecting and monitoring mental health conditions through behaviour patterns. International Journal of Preventive Medicine and Health, 5(3), 14–19. [Google Scholar] [CrossRef]
  20. Kemperman, A., van den Berg, P., Weijs-Perree, M., & Uijtdewillegen, K. (2019). Loneliness of older adults: Social network and the living environment. International Journal of Environmental Research and Public Health, 16(3), 406. [Google Scholar] [CrossRef] [PubMed]
  21. Khamaj, A. (2025). AI-enhanced chatbot for improving healthcare usability and accessibility for older adults. Alexandria Engineering Journal, 116, 202–213. [Google Scholar] [CrossRef]
  22. MacLeod, S., Tkatch, R., Kraemer, S., Fellows, A., McGinn, M., Schaeffer, J., & Yeh, C. S. (2021). COVID-19 era social isolation among older adults. Geriatrics, 6(2), 52. [Google Scholar] [CrossRef]
  23. McCaffery, J. M., Anderson, A., Coday, M., Espeland, M. A., Gorin, A. A., Johnson, K. C., Knowler, W. C., Myers, C. A., Rejeski, W. J., Steinberg, H. O., Steptoe, A., & Wing, R. R. (2020). Loneliness relates to functional mobility in older adults with type 2 diabetes: The look AHEAD study. Journal of Aging Research, 1, 7543702. [Google Scholar] [CrossRef]
  24. Miller, L. M. S., Callegari, R. A., Abah, T., & Fann, H. (2024). Digital literacy training for low-income older adults through undergraduate community-engaged learning: Single-group pretest-posttest study. JMIR Aging, 7, e51675. [Google Scholar] [CrossRef] [PubMed]
  25. Pais, R., Ruano, L., Carvalho, O. P., & Barros, H. (2020). Global cognitive impairment prevalence and incidence in community-dwelling older adults: A systematic review. Geriatrics, 5(4), 84. [Google Scholar] [CrossRef]
  26. Pihlainen, K., Korjonen-Kuusipuro, K., & Karna, E. (2021). Perceived benefits from non-formal digital training sessions in later life: Views of older adult learners, peer tutors, and teachers. International Journal of Lifelong Education, 40(2), 155–169. [Google Scholar] [CrossRef]
  27. Piper, A. M., Brewer, R., & Cornejo, R. (2017). Technology learning and use among older adults with late-life vision impairments. Universal Access in the Information Society, 16, 699–711. [Google Scholar] [CrossRef]
  28. Rodriguez-Martinez, A., Amezcua-Aguilar, T., Cortes-Moreno, J., & Jimenez-Delgado, J. J. (2024). Qualitative analysis of conversational chatbots to alleviate loneliness in older adults as a strategy for emotional health. Healthcare, 12(1), 62. [Google Scholar] [CrossRef] [PubMed]
  29. Shandilya, E., & Fan, M. (2024). Understanding older adults’ perceptions and challenges in using AI-enabled everyday technologies. In Chinese CHI ’22: Proceedings of the tenth international symposium of Chinese CHI. Association for Computing Machinery. [Google Scholar]
  30. Shankar, A., McMunn, A., Banks, J., & Steptoe, A. (2011). Loneliness, social isolation, and behavioral and biological health indicators in older adults. Health Psychology, 30(4), 377–385. [Google Scholar] [CrossRef]
  31. Stoeckel, K. J., & Litwin, H. (2015). The impact of social networks on the relationship between functional impairment and depressive symptoms in older adults. International Psychogeriatrics, 28(1), 39–47. [Google Scholar] [CrossRef]
  32. Su, Y., Rao, W., Li, M., Caron, G., D’Arcy, C., & Meng, X. (2023). Prevalence of loneliness and social isolation among older adults during the COVID-19 pandemic: A systematic review and meta-analysis. International Psychogeriatrics, 35(5), 229–241. [Google Scholar] [CrossRef]
  33. Sun, E., & Ye, X. (2024). Older and fearing new technologies? The relationship between older adults’ technophobia and subjective age. Aging & Mental Health, 28(4), 569–576. [Google Scholar]
  34. Tomida, K., Shimoda, T., Nakajima, C., Kawakami, A., & Shimada, K. (2024). Social isolation/loneliness and mobility disability among older adults. Current Geriatrics Reports, 13, 86–92. [Google Scholar] [CrossRef]
  35. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. [Google Scholar] [CrossRef]
  36. Vercruyssen, A., Schirmer, W., Geerts, N., & Mortelmans, D. (2023). How “basic” is basic digital literacy for older adults? Insights from digital skills instructors. Frontiers in Education, 8, 1231701. [Google Scholar]
  37. Wang, S., Bolling, K., Mao, W., Reichstadt, J., Jeste, D., Kim, H. C., & Nebeker, C. (2019). Technology to support aging in place: Older adults’ perspectives. Healthcare, 7(2), 60. [Google Scholar] [CrossRef] [PubMed]
  38. Wildenbos, G. A., Peute, L., & Jaspers, M. (2018). Aging barriers influencing mobile health usability for older adults: A literature based framework (MOLD-US). International Journal of Medical Informatics, 114, 66–75. [Google Scholar] [CrossRef] [PubMed]
  39. Wilson, J., Heinsch, M., Betts, D., Booth, D., & Kay-Lambkin, F. (2021). Barriers and facilitators to the use of e-health by older adults: A scoping review. BMC Public Health, 21, 1556. [Google Scholar] [CrossRef] [PubMed]
  40. Wong, A. K. C., Lee, J. H. T., Zhao, Y., Lu, Q., Yang, S., & Hui, V. C. C. (2025). Exploring older adults’ perspectives and acceptance of AI-driven health technologies: Qualitative study. JMIR Aging, 8, e66778. [Google Scholar] [CrossRef]
Table 1. Summary of Key Themes in AI and Older Adults’ Psychological Well-Being.

Theme | Subcategories | Key Insights
Challenges in AI Access | Digital literacy, usability, psychological barriers, access | Older adults face physical, cognitive, and motivational obstacles in using AI technologies.
Training and Support Strategies | Community training, intergenerational learning, tailored help | Personalized and community-based training builds confidence and facilitates sustained use.
Design Considerations | Voice-first design, accessibility, personalization | Empathetic and simple design enhances usability and emotional connection with AI systems.
Theoretical Frameworks | Socioemotional selectivity, UTAUT | Older adults prioritize emotionally meaningful interactions; adoption improves with support.
AI for Well-Being | Chatbots, voice agents, cognitive stimulation | Conversational AI can reduce loneliness, provide emotional support, and promote engagement.
Ethical and Future Directions | Data privacy, co-design, equity | Ongoing research, inclusive innovation, and ethical safeguards are critical for sustainable use.
