Human-Computer Interaction in Digital Mental Health

Human-computer interaction (HCI) has contributed to the design and development of efficient, user-friendly, cost-effective, and adaptable digital mental health solutions. However, HCI has not been well integrated into technological developments, resulting in quality and safety concerns. Digital platforms and artificial intelligence (AI) have considerable potential to improve prediction, identification, coordination, and treatment in mental health care and suicide prevention services. AI drives web-based and smartphone apps; mostly it is used for self-help and guided cognitive behavioral therapy (CBT) for anxiety and depression. Interactive AI may enable real-time screening and treatment in outdated, strained, or under-resourced mental health care systems. Barriers to the use of AI in mental health care include accessibility, efficacy, reliability, usability, safety, security, ethics, suitable education and training, and socio-cultural adaptability. Apps, real-time machine learning algorithms, immersive technologies, and digital phenotyping are notable prospects. In general, there is a need for faster and better application of human factors in combination with machine interaction and automation, higher levels of effectiveness evaluation, and the application of blended, hybrid, or stepped care in an adjunct approach. HCI modeling may assist in the design and development of usable applications, help to effectively recognize, acknowledge, and address the inequities of mental health care and suicide prevention, and support the digital therapeutic alliance.


Introduction
Human-computer interaction (HCI) involves the design and development of computer technology with a focus on facilitating its use, drawing on accumulated influences [1]. HCI also encompasses the implementation and evaluation of technologies, with an emphasis on user experience (UX).

Early Evolution of Psychological Science in Human-Computer Interaction
Although HCI emerged in the 1950s, its psychological aspects did not arise until the 1970s, when cognitive engineering interfaced with computer science and technology [2]. HCI prospered in the 1980s, becoming an important factor for computer scientists in designing and developing successful computer UXs, notably graphical interfaces via multiple tiled windows with various applications (most commonly text processing and spreadsheets, but also video games and the World Wide Web) [3]. In the early 1980s, there was strong clinical resistance to the use of automation in mental health, stemming from a lack of understanding of the psychological factors of HCI [4]. In the mid-1980s, the human needs addressed by HCI were overshadowed by the technological advances of the harder (more technical) sciences and constrained by obstacles related to the potential of its body of knowledge (low impact, scope, affect, and user application) [5]. Innovative computer system design in the late 1980s sought solutions to a range of cognitive, social, and organizational problems [6].
Developments in computer technology continued to dominate HCI theory development in the 1990s because of the inherently unclear conceptual structure of cognitive psychology [7]. Mental health care offered a progressive and challenging test domain for technological enhancement and application [8]. Few computer services were used in regular mental health care, mainly because of usability and adoption issues, but randomized controlled trials (RCTs) demonstrated the potential of self-help and clinical integration from the use of desktop or laptop computers for screening and tracking for phobic, anxiety, panic, and obsessive-compulsive disorders, nonsuicidal depression, obesity, and smoking cessation [9].

Modern Developments in Human-Computer Interaction and Digital Mental Health
Themes of early modern developments in HCI (since 2005) suggest that researchers grappled with human factors. An evaluation of an automated telephony mental health care system informed HCI with the finding that users preferred a system that sounded and spoke like a human professional rather than a machine [10]. Information and communication technologies (ICTs) were primarily designed for commercial purposes; there were suggestions of bias, misinformation, and disinformation because no web-based technologies and designs met the needs of people recovering from a severe mental illness who were seeking employment (codesign for HCI and mental health innovation was generally lacking for marginalized populations) [11]. Young people, however, were very satisfied with internet-based self-help [12].
A lack of understanding of socio-cultural values and a lack of evaluation of the patterns of use [13] led to a call for culturally sensitive user interfaces and more inclusive theory and evaluation processes [14] to improve access to culturally appropriate mental health care for Indigenous Australians [15,16]. After meaningful community engagement, a conceptual framework of an Indigenous Australian project was changed from 'digital mental health' to the culturally relevant 'digital social and emotional wellbeing', which led to outcomes mostly through an online therapy program and comprehensive website [17].
Strong evidence was established for technology-based mental health interventions in anxiety and depression treatment [18]. Design and evaluation guidelines described the accumulated HCI knowledge [19]. A review of behavioral intervention technologies (BITs) in mental health discussed the varied levels of evaluation (ranging from basic development and evaluation to implementation studies) for web-based and mobile interventions, social media, and virtual worlds, as well as virtual humans and gaming [20]. There was a call for the integration of theoretical and research concepts from behavioral and psychological sciences, engineering, computer science, HCI, and communications, and for designs to improve development and evaluation methodologies, as well as to develop implementation models for tests [20]. Computational innovations from natural language processing (NLP) used in machine learning algorithms were not yet captured [21]. Other HCI challenges included design interfaces, sustainable adoption and the unobtrusive use of mobile phones and wearable sensors [22], digital phenotyping for mental health [23], and wearable interfaces, online communities, and positive computing [24].
Digital mental health treatments did not make a successful transition from positive efficacy trials to implementation [25]. Although the potential of digital mental health (including machine learning and gaming) means the field must generally be considered in the context of real-world use [26], a research scan found that only 30% of the papers consisted of clinical studies [27]. It was recommended to separately consider the efficacy and retention of digital tools for interventions, as well as the opportunity for creativity in computer and games design, to precisely build and assess preventive or therapeutic tools [27]. An integrative review found a general oversight of HCI by mental health care practitioners in the deployment of e-mental health interventions for therapeutic purposes, resulting in many applications falling short of safety and quality assurance [28]. The use of digital technologies in mental health care was hindered by privacy, trust, and UX issues (e.g., substantial cognitive load and a lack of personalization), which negatively affected engagement and retention [29][30][31][32]. Future digital health studies were recommended to give increased priority to the human factors in combination with machine interaction and automation categories of HCI [33]. To our knowledge, there is no literature specifically focusing on these two categories of HCI in relation to digital mental health.
The COVID-19 pandemic overwhelmed mental health care resources [34]. Web-based therapies and apps facilitated access and follow-up (although lacking in evidence from population studies) [35]. Virtual psychiatric care visits became standard practice, and there was a marked increase in the use of mental health care apps (e.g., Talkspace) [36]. The strong evidence base for telemental health meant that it was recommended for rapid scale-up and envisaged as an adjunct approach in the long term (especially for youth and Indigenous populations) [37]. However, there are inequities stemming from accessibility to and competency with digital mental health care, in addition to safety and privacy issues from insecure digital platforms and the lack of a stepped model of care [36]. It was recommended to rectify these issues and focus investment on secure, stable digital platforms and interactive, personalized artificial intelligence (AI)-based apps for real-time monitoring and treatment [36].
Although the use of digital mental health tools has increased, the literature suggests that HCI is among several issues that need to be addressed if these tools are to be effective. Therefore, this integrative review explores some recent and relevant literature on digital mental health and HCI. The aim is to provide an integrated summary of the potential for various stakeholders to support HCI in stimulating better and faster digital tools and technology, resulting in safe, higher-quality mental health care.

Methods
Two authors independently assessed all abstracts against the inclusion and exclusion criteria according to the five-step amendment [38] (see Table 1) of a modified integrative review framework [39]. This methodology was applied to purposively sample, critique, and synthesize the empirical and theoretical literature converging "digital mental health" and "human-computer interaction" (i.e., with a focus on effectiveness, feasibility, accessibility, sociocultural inclusion, rigor, and readiness for adoption and upkeep). The review followed the HCI categories outlined by Stowers and Mouloua [33] (i.e., usability; safety; security, privacy and trust; automation; training and simulation; information/patient records; virtual mental healthcare; and human factors-machine interaction). The literature search especially focused on the human factors-machine interaction and automation categories, as recommended by Stowers and Mouloua [33]. Searches were conducted within the Science Direct, Sage, Google Scholar, CrossRef, and ACM Digital Library databases, as well as the reference lists of relevant reviews. Search terms included human-computer interaction, digital mental health, web-based and smartphone technology, artificial intelligence, digital interventions, digital phenotyping, telehealth, phone, email, internet, virtual reality, video games, and combinations of these terms. Journal articles and book chapters informed a summary of the historical period of HCI in mental health (1970 to 2004), and conference proceedings, journal articles, media articles, and websites were analyzed for relevant contributions in the modern era (2005 to 2022).
The purposive sample results were classified from the main themes of investigation/discussion according to the main groups outlined in our definition of digital mental health: web-based and smartphone technologies; artificial intelligence; digital phenotyping; and immersive technologies [40]. The results summarize the body of knowledge and contextualize the opportunities and challenges of HCI in pertinent digital mental health solutions. The discussion critically elaborates on the key HCI challenges and the potential of digital mental health in an assistive capacity. It also expands upon recently addressed issues: ethics, the therapeutic relationship, and applicable models of care for an adjunct approach to mental health care [40].

Web-Based and Smartphone Technologies
Digital mental health interventions are predominantly delivered by self-guided or clinician-supported (guided) approaches via online programs and apps (web-based and smartphone technologies) providing evidence-based therapy to the user [30]. Although web-based interventions (using text-based didactic information, audio, video, and animation) were noted in 2013 as being effective in standalone or coach- or therapist-assisted treatment, little was known about their design and implementation (e.g., the relationship between the quality and design of websites and user retention and outcomes) [20]. A systematic review conducted in the same year found that Internet-based interventions with a cognitive behavioral focus were the most promising in reducing symptoms of depression in young people (with regard to efficacy, adherence, and engagement) [41]. Although limited by the high number of heterogeneous web-based interventions and a low number of included studies, a systematic review and meta-analysis found effectiveness in terms of a reduction of depression and anxiety and enhanced quality of life and mindfulness skills (e.g., in those with clinical anxiety) [42]. An abundance of mental health-related apps via mobile and desktop devices increased the accessibility and use of internet-based mental health screening, treatment, and after-care [43], as well as design for mental wellbeing [44]. Meta-analyses of RCTs including computerized tools found positive outcomes for different mental health disorders [45,46], but there was difficulty in translating these findings into clinical practice because of low engagement [27,47,48].
Digital mental health interventions require innovative methods to increase their potential. A mixed-methods study investigated the behavior and experiences of web-based and smartphone intervention users with the intent to increase engagement; the data analysis distinguished between the overall intervention and different aspects of it, with the aim of demonstrating how passive data can help to individualize treatment and improve and assure quality [49]. The limited uptake of some evidence-based services prompted calls for higher levels of empirical evidence and adherence to engagement for web-based interventions (including in trials); comprehensive evaluation was recommended to increase patient safety and fidelity to clinical service guidelines [30]. An RCT of a virtual clinic for university students with psychological distress found utility in the results; although the clinic was not effective in reducing symptoms of depression, anxiety, or psychological distress, there was satisfaction with the virtual approach and there were suggestions for investigating guided and/or tailored treatments in a stepped model of care [50]. The stepped approach was described as targeting those with mild to moderate anxiety and depression: those people could start with self-help or chatbots, followed by therapist-guided digital therapy where there is no improvement; non-responders are referred to face-to-face therapy [36]. An evaluation of Internet-delivered treatment systems noted their clinical effectiveness but recommended that guidelines be established for usability testing with regard to up-to-date and relevant design and technology [51].
Varied levels of empirical evidence have been noted for digital platforms, with suggestions for hybrid and stepped models of care to increase usability. A platform that integrates standalone e-mental health services with a global network of face-to-face youth mental health services (MOST+) was noted as being reliable, acceptable, and scalable at the pilot evaluation stage [52]. An RCT with Australian secondary schools established a small but positive effect on help-seeking intentions for mental health from a web-based mental health service that integrates screening with stepped intervention and CBT [53]. Future trials were recommended to determine a threshold for clinically meaningful significance. A scoping review of online preventive (early) interventions for youth found wide-ranging effectiveness, usability, and acceptability but recommended the codesign of clinical trials based on the clinical staging model [54]. A qualitative study recommended that service users with a severe mental illness and mental health workers cooperatively use e-mental health resources (e.g., an interactive website) in a community mental health practice [55]. Although an investigation of online parents' forums providing informational support on children's mental health found a significant amount of evidence-based knowledge, the quality of mental health information shared on digital platforms is relatively unknown [56].
A youth-codesigned mobile phone app was proposed as potentially useful for self-monitoring and the management of mood symptoms in young people with depression, suicidal ideation, and self-harm [57]. However, various studies have concluded that apps used for self-help for a range of mental health disorders are of poor quality [27]. The efficacy and usability of apps is less than that of digital platforms (a small number of RCTs demonstrated an effect on the primary outcome measure) [58]. It was suggested to evaluate the potential of apps beyond empirical evidence, as well as their usefulness as a monitoring tool for digital phenotyping [58]. A cluster RCT with adolescents evaluated the effectiveness of a mental health self-monitoring mobile app; efficacy was not found, but there is a potential assistive capacity with regard to providing real-time summaries of client data to therapists. Codesign was noted as important to increase engagement, but digital platforms/websites may be better investments for providing mental health information [59]. A review of the usability of mobile mental health apps found that it was not the focus of HCI evaluation; there was a suggestion to establish a usability questionnaire to measure prospects beyond effectiveness, efficiency, and satisfaction [60].
HCI methods and knowledge may help to foster the digital therapeutic alliance (e.g., in mental health apps) [61]. A conceptual study proposed that HCI theories (i.e., persuasive system design, affective computing, eudemonic psychology and positive computing, and the human-smartphone connection) may contribute to a befitting, customized measure of the digital therapeutic alliance (as opposed to one translated from traditional measures of the therapeutic alliance) [61]. A review found clients were generally satisfied with the therapeutic alliance through videoconference therapy, but computers may assist or interfere with the client-psychologist relationship [62]. A qualitative exploration with university students of the acceptability of a digital mental health platform delivering CBT via a virtual coach found engagement issues related to a lack of interpersonal factors in the digital therapeutic alliance; it suggested improving the platform's functionality and changing the avatar to be less humanlike to increase usability and effectiveness [63].
Social media use and the associated analysis of behavioral change are proposed to have potential in explaining and intervening in mental health. A meta-review analyzed computer-mediated communication (CMC) via ICTs (e.g., email, mobile texting, instant messenger, and social network sites) in terms of their operationalization (i.e., technology-centered or user-centered) and association with a diversity of mental health outcomes [64]. There was a very small negative effect from the use of social network sites; more comprehensive mental health outcomes are required, as well as a more rigorous understanding of the characteristics of interactions and transmitted messages (rather than just analyzing screen time) [64]. An unsupervised approach (clustering via machine learning algorithms that used NLP) was applied in an analysis to understand users' behavioral features on a social network site (mainly Twitter) and distinguish normal users from at-risk users (the latter were characterized by the scale of change in their use) [65]. The proposed early intervention approach lacked discussion of the potentially hindering human factors-machine interaction, in that a clinical psychologist would need to be resourced and trained to observe and verify findings (determining true and false positive cases) as well as to provide counseling and treatment. A similar study that identified patterns of language in social media users aimed to distinguish between users diagnosed with a mental disorder and healthy users with a model of emotion evolution, to assist clinicians in diagnosing patients (with depression, anorexia, and self-harm tendencies) [66].
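The behavioral-change signal described above, where at-risk users are characterized by the scale of change in their platform use, can be illustrated with a minimal sketch. The chosen feature (mean week-over-week change in posting volume) and the outlier rule below are hypothetical simplifications for illustration, not the clustering pipeline used in [65]:

```python
from statistics import mean, pstdev

def change_score(weekly_counts):
    """Mean absolute week-over-week change in posting volume."""
    deltas = [abs(b - a) for a, b in zip(weekly_counts, weekly_counts[1:])]
    return mean(deltas) if deltas else 0.0

def flag_at_risk(users, z_threshold=2.0):
    """Flag users whose behavioral-change score is an outlier relative
    to the cohort (a crude stand-in for the unsupervised clustering step).
    users: dict mapping user id -> list of weekly posting counts."""
    scores = {u: change_score(c) for u, c in users.items()}
    mu, sigma = mean(scores.values()), pstdev(scores.values())
    if sigma == 0:
        return []
    return [u for u, s in scores.items() if (s - mu) / sigma > z_threshold]

# Example cohort: user "d" swings heavily between silence and bursts.
cohort = {"a": [5, 5, 5, 5], "b": [4, 5, 4, 5],
          "c": [6, 5, 6, 5], "d": [0, 20, 0, 20]}
print(flag_at_risk(cohort, z_threshold=1.5))
```

As the text notes, any such flag would still require a trained clinician to verify true and false positives before intervention.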
A blended approach, possibly using a combination of digital tools in addition to pharmaceuticals, may elicit enhanced insights. A machine learning algorithm applied an NLP method in a content analysis of SMS text messages on a digital mental health service platform (i.e., Talkspace) to identify users' COVID-19 pandemic-related concerns [67]. This study demonstrated the potential to increase understanding of contextual mental ill-health from a blend of internet-based technology with digital phenotyping and AI. The overall significant increase in anxiety was drawn from a blended approach (i.e., unstructured therapy transcript data as well as clinical assessment for anxiety and depression). A bioaffective-digitalism theoretical framework was proposed in a case study noting the potential of mixing human and non-human factors through an ingestible sensor (Abilify MyCite) that is connected to an online portal on the Internet and smartphones, as well as trackers, wearable patches, apps, and programming [68].
A blend of peer-to-peer and digital platform approaches may be a formidable adjunct to clinical care. In 2015, a systematic review of online peer-to-peer support for young people found a lack of studies and effectiveness despite increased use as an active intervention [69]. More recently, there is the suggestion that utilizing (para-)professionals offers good potential to fill gaps in mental health care (i.e., if traditional mental health care systems collaborate effectively). A framework of engagement was applied in the analysis of patterns of use in online peer-to-peer support platforms (Reddit and Talklife), which facilitate mental health support but require users to interact and engage [70]. Increased engagement with online platforms may be derived from the use of mental health subcommunities (e.g., as per Reddit) and mutual discourse for peer-supporter retention (particularly relevant on Talklife) [70]. However, it was acknowledged that encouraging users to report self-disclosures may have potentially negative consequences with regard to their vulnerability. There is increasing use of peer support workers, but it is not yet known whether this approach can successfully blend with clinical support and engagement [71]. A scalable peer-to-peer paraprofessional training and supervision program in an American university setting demonstrated reliability, replicability, and adaptability for supporting CBT delivery in a hybrid model of care [72].

Artificial Intelligence
AI technologies such as computer vision, NLP, machine learning, and reinforcement learning systems enable machines to perform sophisticated and anthropomorphic functions [73]. Early evidence of HCI through affective NLP found that automated assistive systems have the potential to respond emotionally by interpreting human language and an accumulation of sentiment from text and speech [74]. In recent years, AI has been applied in multifaceted ways: predictive tools are being tested and used to determine mental ill-health and suicide risks [75] and to coordinate tailored treatment plans [76]. Furthermore, therapeutic chatbots provide readily available support [77,78] and interventions [79], including through video games [80]. A review of AI and mental health outlined data sources such as electronic health records, mood rating scales, brain imaging data, monitoring systems, and social media platforms used to predict, organize, or subgroup a range of mental ill-health and suicidality [81]. Machine learning automates processes, analyzes big data, and assists mental health care practitioners with making decisions on an individual's mental ill-health or suicide risk, but there has yet to be any accurate prediction of specific risks across populations [40,82].
AI technology design and use in mental health care has increased quality, accessibility, affordability, convenience, and efficiency [83][84][85][86]. The main advantages of machine learning are that it is scalable and highly accurate in mental ill-health prediction, but it is mostly conceptual and lacking in empirical evidence, which limits its clinical application [87]. The main disadvantages are a lack of information on model building and uncertain accuracy for suicide risk prediction [81,88,89], a lack of external evaluation of population studies [90,91], different evaluation approaches in cohort studies [92,93], and a lack of user-centered design processes, which thwarts HCI [87].
Usability challenges for machine learning include the considerable skills and time required to develop and run models, users lacking trust in the models, and the difficulty rooted in disagreement between humans and machine learning outputs [82]. Human-centered AI (HAI) was suggested to counter HCI deficiencies; it is a user-centered HCI approach consisting of human factors design, ethically aligned design, and technology that reflects human intelligence [94]. Human factors design can benefit from cost-benefit analyses to provide information on expectations and to clarify the prediction target [82]. In addition, it is important to build trust, decrease disagreement, improve responsibility, explain a model's logic, quantify specific contributions to the prediction, assess the performance metrics, and illustrate historical predictions from previous studies [82].
The adjunct potential of chatbots in mental health care is a good example of how HAI can be applied in research. A review of chatbots and conversational agents used in mental health found a small number of academic psychiatric studies with limited heterogeneity; there is a lack of high-quality evidence for diagnosis, treatment, or therapy, but there is high potential for effective and agreeable mental health care if correctly and ethically implemented [95]. A major research constraint is that chatbots and predictive algorithms may be biased and perpetuate inequities among the underserved and the unserved [96][97][98][99]. The ethics of a patient-therapist relationship and the limited skills and emotional intelligence of chatbots require a solution [100]. NLP and other machine learning algorithms could potentially help solve these problems by identifying an ideal digital therapeutic bond [101].
The failure to achieve effectiveness and external evaluation of useful, real-world problem-solving AI solutions in the first two waves of AI (1950s-1970s and 1980s-1990s) led to the call for optimized and ethical user-centered design (UCD) for explainable, comprehensible, useful, and usable AI [94]. The most notable UCD application is in Explainable AI (XAI); e.g., it may enable users to understand the algorithm and psychological theory parameters as well as the outputs (characterized by evaluation of strengths and weaknesses) to assist in increasing decision-making efficiency [94,102]. XAI is mostly applied with predictive technologies: a user interface (UI) provides a holistic understanding of the machine-patient-therapist relationship to improve safety and efficacy and instill responsibility [103]. The concept of Explainable Robotics introduced the potential of human-robot interactions whereby machine learning algorithms give explanations that help robots communicate with humans in a trustworthy and acceptable way [104].
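As a minimal illustration of the XAI idea of quantifying each input's contribution to a prediction, the sketch below attributes a linear risk score to its features (weight times value) and ranks them by magnitude. The feature names and weights are hypothetical, and real XAI tooling for non-linear models is considerably more involved; this shows only the simplest form of attribution:

```python
def explain_linear_prediction(weights, features, bias=0.0):
    """Per-feature contributions to a linear risk score:
    contribution_i = weight_i * feature_i (the simplest XAI attribution).
    Returns the total score and features ranked by absolute contribution."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = bias + sum(contributions.values())
    ranked = sorted(contributions.items(),
                    key=lambda kv: abs(kv[1]), reverse=True)
    return score, ranked

# Hypothetical inputs for one individual.
weights = {"sleep_disruption": 0.5, "negative_sentiment": 1.0,
           "social_withdrawal": 0.25}
features = {"sleep_disruption": 2.0, "negative_sentiment": 3.0,
            "social_withdrawal": 2.0}
score, ranked = explain_linear_prediction(weights, features)
print(score, ranked)
```

Presenting the ranked contributions alongside the score is one way a UI could explain a model's logic to a clinician, as the XAI literature recommends.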
The call for blending clinical and AI approaches in the screening and treatment of psychiatric disorders may potentially boost UCD in AI. The lack of research focusing on the prevention of the sequelae of psychiatric disorders and the vulnerability of ex-COVID patients to mental health disorders led to a conceptual analysis that explored blending clinical approaches (i.e., drawn from clinical rating scales and self-rating questionnaires) with state-of-the-art AI-based tools for expedited and thorough diagnosis, prevention, and treatment of psychiatric disorders [105]. A multimodal psychopathology prediction protocol combined AI methods with comprehensive psychological, neurophysiological, semantic, acoustic, and facial/oculometric measurements and features [105]. It was suggested that these different categories of multidisciplinary data be integrated and coordinated with large-scale research efforts. However, the lack of an HAI framework, together with the global need for more fruitful and efficient strategies for coping with post-COVID mental health deterioration, requires direction and support. It was suggested that the World Health Organization (WHO) may assist by establishing a multinational interdisciplinary task force for policy design, planning, and development of more advanced AI-based innovations in digital psychiatry [105].

Digital Phenotyping
Digital phenotyping is personal sensing based on the capture of metadata. It unobtrusively measures how a user interacts with a device and might provide a depiction of cognitive traits and affective states, as well as add precision to mental health diagnoses and outcomes, by combining sensor data, speech and voice data, and HCI [106]. An HCI protocol involving digital biomarkers for cognitive function, namely a psychometric assessment in conjunction with monitoring of the use of a smartphone app (which ran unobtrusively in the background and captured perceptible user activity, e.g., swipes, taps, and keystroke events), found that it could possibly act as a continuous ecological surrogate for laboratory approaches [107].
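A minimal sketch can show the kind of unobtrusive feature extraction such a protocol implies: summarizing raw tap/keystroke timestamps into inter-key intervals and session counts. The feature set and the session-gap threshold below are illustrative assumptions, not the actual biomarkers used in [107]:

```python
from statistics import mean, median

def keystroke_features(timestamps, session_gap=5.0):
    """Summarize keystroke/tap event timestamps (in seconds) into simple
    digital-phenotyping features. Gaps of `session_gap` seconds or more
    are treated as boundaries between interaction sessions."""
    ts = sorted(timestamps)
    intervals = [b - a for a, b in zip(ts, ts[1:])]
    within = [d for d in intervals if d < session_gap]  # pauses within a session
    sessions = 1 + sum(1 for d in intervals if d >= session_gap)
    return {
        "events": len(ts),
        "sessions": sessions,
        "median_iki": median(within) if within else None,  # inter-key interval
        "mean_iki": mean(within) if within else None,
    }

# A short burst of typing, a long pause, then another burst.
print(keystroke_features([0.0, 0.2, 0.4, 10.0, 10.3]))
```

Tracking how such summary features drift over days or weeks, rather than any single value, is what would make them a candidate ecological surrogate for laboratory assessment.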
The potential of digital phenotyping for assisting with effective care of young people with psychological distress called for research to address the practicalities of its future clinical application [108]. A novel hybrid study addressed unmet needs of HCI, namely the need to increase the adoption and reuse of clinically relevant apps and platforms as well as digital phenotyping [109]. The design of a freely available smartphone platform suitable for youth and underserved populations accounted for HCI considerations, e.g., patient demands for trust, control, and community, as well as clinician demands for transparent, data-driven, and translational tools [109]. The large array of smartphone apps and the lack of reproducibility influenced a digital phenotyping design (tracking via a smartphone app) for young adults with a psychotic illness (noting that population-level models are not yet possible) [108]. The individualized actionable clinical insights were noted as feasible but require replication studies and the training of clinicians to use the data effectively, as well as the integration of apps into clinical care [110].
Digital phenotyping is potentially useful to predict abnormal behavior, but it does not provide a causal explanation or psychological understanding of it [111]. Digital sensory phenotyping may provide objective and continuous assessment, which may facilitate better clinical interventions, but data security needs to be improved and further research is needed to determine the utility of its data, evaluation, and efficiency [112]. Privacy, confidentiality, and data sharing concerns were noted about unregulated digital phenotyping and the associated digital neuromarketing, which potentially undermines human-human interaction [113]. The following recommendations were made to deter negative consequences: technical and public evaluation of technologies and media before release; regulatory processes with careful monitoring; public awareness and education on apps; and taxing information gathering [113].
The benefits of a blended approach of digital phenotyping with clinical assessment have been proposed. For example, digital phenotyping of passive data from smartphones and wearables, in addition to questionnaire results, could potentially measure the suicidal ideation process through ecological momentary assessment [114]. After analysis of a systematic review [115], digital phenotyping was not recommended for random screening of mental disorders because of the risk of 'false positives', but RCTs were advised for mental health monitoring pilots and for digital phenotyping as an assistive tool for clinicians to predict recovery or early relapse [116]. The main challenges of digital phenotyping were noted as reliability, clinical utility, privacy, regulation, and application; it was suggested that clinicians help to resolve quality, safety, and data security issues by guiding the use of approved and reliable apps for voluntary monitoring [116].

Immersive Technologies
HCI for computer and video games was limited to entertainment purposes until the 2000s, when pilot studies of serious video games emerged as an adjunct to psychotherapy for adolescents [117] and as a clinical intervention for schizophrenia [118], anxiety disorders [119], and attention deficit hyperactivity disorder (ADHD) [120]. A controlled longitudinal study analyzed the effectiveness of a video game on the PlayMancer platform [121]. Enhanced HCI included emotion recognition from speech audio data and integration of user requirements with the game scenario. This alternative therapeutic approach assisted with coping and self-control strategies in patients with diagnosed eating disorders and pathological gambling; however, the evaluation did not progress beyond trials. A prototype evaluation study used an adapted eye-tracking device as a novel HCI integration in video games: 73% of users reported that it was easy to fix their visual attention on a point for a couple of seconds to trigger a system action [122]. The design and integration of this system was noted as low-cost, as the eye-tracker hardware is affordable and the software was built from open-source code libraries [122]. Eye-tracking has been established for a decade with the aim of improving the human factor aspects of HCI in psychological research [123], but it has yet to be determined for which psychological symptoms or disorders it is useful.
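The gaze-triggered interaction described above is commonly implemented as a dwell-time selection: an action fires once gaze samples stay near a target for a fixed window. The sketch below is a minimal, hypothetical version of that logic (the radius, sample rate, and function names are assumptions, not details from the cited prototype).

```python
from math import hypot

def dwell_trigger(samples, target, radius=40.0, dwell_samples=120):
    """Return the sample index at which a dwell selection fires, or None.

    `samples` are per-frame (x, y) gaze points; at an assumed 60 Hz,
    dwell_samples=120 corresponds to a two-second fixation.
    """
    run = 0
    for i, (x, y) in enumerate(samples):
        if hypot(x - target[0], y - target[1]) <= radius:
            run += 1
            if run >= dwell_samples:
                return i  # system action triggered here
        else:
            run = 0  # gaze left the target; restart the dwell timer
    return None

# Gaze rests near a button at (300, 200) for 150 consecutive frames:
stream = [(295, 203)] * 150
print(dwell_trigger(stream, target=(300, 200)))  # -> 119
```

The design choice of resetting the counter on any excursion keeps false triggers low, which matters in clinical games where an accidental selection could disrupt the intervention.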
An assortment of studies demonstrates that video games may be effective as an adjunct to the treatment of ADHD. An evaluation of clinical trials with ADHD adolescents found that video games help the therapeutic relationship, but the HCI design needs to better account for the target audience (e.g., literacy difficulties) to increase engagement and improve the user experience [124]. A systematic review and meta-analysis of RCTs on the effectiveness of serious games for the treatment of mental disorders found a moderate positive effect on symptoms [125]. A randomized, double-blind, parallel-group, controlled trial using a video-game interface in a digital therapeutic intervention for pediatric patients with ADHD found no adverse effects, and the interface is potentially useful for objectively measuring inattention [126]. Effectiveness has also been shown in a medication-treated pediatric ADHD population, with a noted need for pragmatic RCTs to build evidence generation, complemented by qualitative human-centered investigations [80]. An RCT tested the effectiveness of a serious video game (i.e., its development and usability) and found that it may complement the current multimodal approach to treating ADHD [127].
A narrative review [128] and a systematic review [129] found efficacy for stress and anxiety reduction from the use of commercial off-the-shelf video games (i.e., exergames, casual video games, action games, action-adventure games, and augmented reality games used on various gaming platforms, including consoles, personal computers, smartphones, mobile consoles, and virtual reality systems). Although commercial video games have design features that instil a sense of flow [128], the systematic review acknowledged that custom-made games (i.e., those for serious purposes like education, training, or behavior modification) better integrate biofeedback techniques for relaxation and are more appropriate for adults' stress and anxiety responses [129]. Future studies were suggested to include a diversity of age groups, a variety of video game genres, the most recent, popular, and widely used gaming platforms, and the number of uses required for optimal effect; to follow methodological guidelines for reporting research findings; and to describe the individual characteristics of users (e.g., personality and cognitive ability) and their preferences for genre and gaming platform.
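The biofeedback integration noted for custom-made relaxation games can be reduced to a simple control loop: a physiological signal modulates a game parameter so that calming the body visibly calms the scene. The sketch below is a toy version of that loop; the baseline, sensitivity, and "storm" metaphor are illustrative assumptions, not details from the cited reviews.

```python
def storm_level(heart_rate_bpm, baseline_bpm=65.0, sensitivity=0.05):
    """Map heart rate to a 0..1 game-intensity parameter.

    Heart rate above a resting baseline raises an in-game "storm" level,
    so slowing the heart (e.g., via paced breathing) clears the weather.
    Values are illustrative; a real system would calibrate per user.
    """
    excess = max(0.0, heart_rate_bpm - baseline_bpm)
    return min(1.0, excess * sensitivity)

for hr in (62, 70, 85, 100):
    print(hr, round(storm_level(hr), 2))
# 62 -> 0.0, 70 -> 0.25, 85 -> 1.0 (capped), 100 -> 1.0
```

Clamping the output keeps the game state stable under noisy sensor readings, one reason purpose-built games can integrate biofeedback more cleanly than retrofitted commercial titles.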
Virtual reality (VR) has been noted as becoming widely accepted in psychology and neuroscience as the most advanced form of HCI, allowing individuals to act, communicate, and be present in a computer-generated environment [130]. Researchers and VR video game companies are working together on mental health support. The use of a VR biofeedback video game led to a decrease in trait anxiety, but, as in previous studies, the game design did not meet expectations, requiring more spatial opportunities for optimized engagement (no improvements were observed over a control application for guided relaxation) [131]. Commercially available VR games were used to demonstrate the potential to effectively decrease state anxiety and increase positive emotions in users; the pilot study reinforced the emotional design conceptual framework that is widely adopted in designing appealing immersive solutions [132]. Preliminary evidence suggests VR can improve some symptoms of psychosis in those with schizophrenia [133], with a clinical trial finding excellent usability of a therapeutic VR human-human interface [134]. But VR is mostly noted for use in CBT interventions for depressive and anxiety disorders; there is good potential to expand immersive interventions for psychological symptoms more generally (e.g., exposure therapy) [128].

Human-Computer Interaction Challenges and Digital Mental Health as an Adjunct to Care
Logical applications were found to be lacking in digital mental health implementation [25], which called for HCI challenges to be addressed. In 2018, an HCI symposium focused on understanding the users of mental health technology, their context of use, how therapeutic technology is used, and the importance and methods of design [135]. Five key challenges were noted (i.e., entrepreneurship, publishing, funding, theory, and outcomes), and interdisciplinary collaboration was advised to evolve from lessons learned and to increase and integrate knowledge and practice toward faster and better global mental health [135]. HCI was recognized as a key factor in increasing accessibility for diverse communities and decreasing the inequities of mental healthcare through a human-centered approach, as well as in delivering enjoyable experiences. Human-centered design (HCD) aims to connect with and understand the needs of service users while retaining a systems viewpoint; HCD methods include journey mapping, prototyping, and user testing [136].
The traditional practices of psychology and psychiatry have inherent issues, e.g., clinical mental health diagnostic systems perform poorly in detecting the early stages of mental disorders [137], and there are inequities arising from constrained healthcare systems: those with severe mental disorders take up the bulk of resources, in effect marginalizing the underserved (predominantly those with low to moderate mental distress) [138]. Social and environmental factors are often not considered enough in predicting and explaining mental illnesses and disorders [139]. The integration of predictive models with digital screening tools was proposed for faster and better assessment of the underserved for preventive and early intervention in mental health problems via 'majority web-delivered and minority in-clinic' care [138]. The expected rise in the future burden of mental ill-health is likely to be compounded by the socio-economic challenges of the COVID-19 response and widening inequalities (putting some people more at risk) [140]. In addition to prevention via advocacy for vulnerable communities with and without pre-existing psychiatric conditions, mental health practitioners and researchers need to prepare for faster and better treatment through the provision, maintenance, and improvement of existing services [140].
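The 'majority web-delivered and minority in-clinic' idea can be sketched as a triage rule over a validated screening instrument. The toy example below uses the standard PHQ-9 depression severity bands; the routing rules themselves are an illustrative assumption, not a clinical protocol from the cited work.

```python
def phq9_severity(score):
    """Standard PHQ-9 severity bands (total score range 0-27)."""
    if score <= 4:
        return "minimal"
    if score <= 9:
        return "mild"
    if score <= 14:
        return "moderate"
    if score <= 19:
        return "moderately severe"
    return "severe"

def route(score):
    """Illustrative triage: low-moderate distress to web-delivered care,
    higher severity to in-clinic assessment (NOT a clinical protocol)."""
    band = phq9_severity(score)
    if band in ("minimal", "mild", "moderate"):
        return band, "web-delivered self-help / guided CBT"
    return band, "in-clinic assessment"

print(route(7))   # -> ('mild', 'web-delivered self-help / guided CBT')
print(route(21))  # -> ('severe', 'in-clinic assessment')
```

In a real stepped-care pathway such a rule would sit alongside risk items, clinician review, and escalation criteria; the sketch only shows where a predictive or screening model plugs into routing.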
Digital approaches are broadly considered to have good potential as a mental health care adjunct [63,141-143]. For example, a noted prospect is merging facial recognition and NLP emotion-detection software to provide a complex picture of mood and mental states [79]. Demand outstripping the supply of mental health resources during COVID-19 led to the increased use of telehealth, but investment is needed through funding, research, policy changes, training, and equity to provide better access and quality [35]. HCI was suggested as useful for uncovering UX and clinical design implications from clinical trials to create patient-centered telehealth solutions [144]. Although digital interventions via common digital tools (e.g., video conferences, social networks, telephone calls, and emails) are effective at the population level for common disorders, mental health care practitioners are grappling with difficulties in connecting with other digital tools (e.g., web-based screening and intervention, AI, smartphone, immersive and wearable technologies, the Internet of Things (IoT), and digital phenotyping) [40]. Specifications are needed for subpopulations [145,146] and for the psychological disorders and symptoms for which different types of digital tools are effective and readily useful [147].

Ethics, the Digital Therapeutic Alliance and Blended, Hybrid and Stepped Models of Care
Human factors in combination with machine interaction and automation may positively or negatively affect ethical, quality, safe, and secure research and clinical care. A case study highlighted the ethical predicament of promoting a mental health app that uses digital phenotyping to predict negative mood states: the use of anonymized behavioral data (e.g., digital biomarkers) for commercial purposes was argued to provide an incomplete assessment (e.g., one that does not address the psychosocial and sociopolitical determinants of mental health or the context in which people experience emotional distress) [148]. There is a range of ethical risks related to the security of sensitive data, socio-cultural adaptability, and appropriate education and training of medical professionals [81,149,150]. In addition, there should be consideration of quality therapeutic aspects (e.g., the need for empathetic and inclusive care), which makes the digital therapeutic relationship important to delivering effective, efficient, and patient-centered care [151] in a blended [84], hybrid [152,153], or stepped model of care [36,50,53].
A relatively new HCI theory, concordance, was applied in a three-month user study that found encouraging evidence that users can develop a therapeutic alliance with an interactive online support system [154]. This small feasibility trial suggested that larger-scale studies employ a design approach involving peer/moderator support as well as automated feedback [154]. A narrative review investigated the psychological aspects of HCI via assessment of the digital therapeutic alliance (DTA) for people with serious mental illnesses and recommended that evidence-based studies facilitate responsible outcomes from a three-tiered approach covering the perspectives of patients/users, mental health care practitioners, and machines [155]. The therapeutic relationship is central to driving positive change in mental health care [101].
The use of multimodal digital platforms for Technology Enabled Clinical Care (TECC) [156] and informatics infrastructure [157] is emerging as a way to assist faster and better mental health care pathways, including for severe psychological distress and suicidality. A fundamental issue is how not to keep adding to the array of digital platforms, apps, and electronic medical record systems that effectively serve in isolation from one another [156]. The design and development of TECC using dynamic simulation modeling and health service implementation research is crucial to how these technologies are adopted and implemented if service efficiency gains and clinical outcomes are to be attained, especially for youth mental health services [156]. The building and testing of HCI models of the UI components of different types of digital health interventions (i.e., predictive HCI modeling of applications), along with the development and evaluation of UIs for digital health systems such as electronic health record systems, was proposed to complement HCD processes and heuristic techniques [158]. Integrating predictive modeling with HCD (i.e., adding real humans into the loop of simulations run by computer algorithms executing human-created models) may be useful for informing evidence-based UI design guidelines to support the development of safer and more effective UIs for digital health interventions [158].
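A classical instance of predictive HCI modeling of UI components is Fitts's law, which estimates target-acquisition time from a target's distance and width. The sketch below uses the Shannon formulation; the coefficients a and b are assumed for illustration and would in practice be fitted from user-testing data, which is where the human-in-the-loop element above comes in.

```python
from math import log2

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predicted pointing time in seconds, Shannon formulation:
    MT = a + b * log2(D / W + 1). Coefficients are illustrative;
    real values are regressed from observed user trials."""
    return a + b * log2(distance / width + 1)

# A model like this lets designers compare UI layouts before testing:
# a larger, closer button is predicted to be faster to acquire.
print(round(fitts_movement_time(distance=400, width=20), 3))  # small, far
print(round(fitts_movement_time(distance=100, width=80), 3))  # big, near
```

Such per-component predictions are what "predictive HCI modeling of applications" aggregates: summing modeled interaction times across a task flow gives an evidence-based estimate of UI efficiency before a clinical system is built.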

Conclusions
The adjunct approach of integrating digital mental health solutions into clinical care is promising, but mental health practitioners are required to play a larger role in overcoming the challenges of HCI and in collaborating with researchers, policymakers, governing bodies, and developers/entrepreneurs on ways for products and services to be effectively designed, developed, used, strategized, funded, and scaled. HCI research and development has unfulfilled potential in furthering long-term evidence for mental health and suicidality. Web-based interventions require higher levels of empirical evidence (e.g., on quality and design), engagement, and user retention. A significant finding of this integrative review, with regard to the synthesis of theoretical and empirical literature (focusing on effectiveness, feasibility, accessibility, sociocultural inclusion, rigor, and readiness for adoption and upkeep), is the potential to design, develop, and use an integrated multimodal digital mental health platform in a stepped, blended, or hybrid approach. The evidence base suggested a lack of quality, useful, usable apps. Digital phenotyping is potentially useful in combination with psychometric assessment (to predict abnormal behavior). Serious video games may serve as a complementary approach to treating ADHD in young people, while custom-made games are appropriate for reducing stress and anxiety in adults. VR is an advanced form of HCI: there is potential for using immersive games to help those with schizophrenia as well as in CBT for those with depressive and anxiety disorders.
Although at a conceptual stage, HCI theories are central to tailoring an apt DTA. The HAI framework is of note (i.e., consideration of human factors design, ethically aligned design, and technology that fully reflects human intelligence); the most relevant to HCI professionals is human factors design. There is a need to increase the capacity of explainable, comprehensible AI and useful, usable AI in a stepped, blended, or hybrid approach. The human factors, machine interaction, and automation categories of HCI are marked by the need for clinical mental health practitioners to observe and verify real-time machine learning findings from screening and treatment. There is potential for peer-to-peer approaches to help fill mental health care gaps. A primarily web-delivered approach to mental health care has the potential to benefit the underserved (often assessed as low to moderate cases of anxiety or depression) as well as to assist in better and faster coordination of individualized care for those at risk of suicide (TECC). There can be severe and compounding consequences of not grasping the window of opportunity for effectively intervening in the sequelae of mental ill-health or suicidality.
It is important that HCI is better incorporated into technological developments for digital mental health (e.g., digital platforms and AI), resulting in higher quality, safety, and usability. HCI modeling may help achieve these results by advancing evidence-based digital health system design. This points toward the integration of predictive modeling with usability and software engineering approaches (e.g., up-to-date and relevant patient/user safety and clinical guidelines). More specifically for digital mental health, there also needs to be pragmatic codesign incorporating demands from mental health practitioners and users to help strengthen the HCD process and to instil an understanding of how an application achieves real-world effectiveness. Mental health care practitioners may assist in working toward effective, responsible, and fast digital mental health tools, and user populations may assist in aiming toward excellent, enjoyable UX and consistency in validated practices.
Summary of key points:

• The integrative review found that HCI has long needed to be better integrated into technological developments for mental health care.

• The design, development, implementation, and evaluation of digital mental health tools has the potential to help resolve systemic mental health care issues (e.g., through better and faster service for the underserved with low to moderate anxiety and depression as well as TECC for those at risk of suicide).

• Digital mental health tools best serve as an adjunct to mental health care: users and mental health practitioners can help improve effective outcomes through codesign of HCI (e.g., the DTA, clinical guidelines on validating machine learning findings, and stepped models of care that utilize supporting resources such as peer workers).

• There are many web-based or smartphone technology products and services available (especially apps) which serve in telehealth and (self-)guided digital interventions as well as AI, immersive technologies, and digital phenotyping. But a lack of HCI investment has resulted in unrealized potential (e.g., a secure, trusted and eminent integrated multimodal digital platform using AI has yet to be effectively designed, developed, used, strategized, funded and scaled).

• Future research for enhanced quality, safety and usability may benefit from integrating a predictive model with HCD (i.e., adding real humans into the loop of simulations by computer algorithms that run human-created models).

Institutional Review Board Statement: Not applicable.

Conflicts of Interest: The authors declare that they have no conflicts of interest.