Review

Research Trends in the Study of Acceptability of Digital Mental Health-Related Interventions: A Bibliometric and Network Visualisation Analysis

by
Maria Armaou
School of Health Sciences, University of Nottingham, Nottingham NG7 2HA, UK
Soc. Sci. 2024, 13(2), 114; https://doi.org/10.3390/socsci13020114
Submission received: 16 January 2024 / Revised: 7 February 2024 / Accepted: 9 February 2024 / Published: 12 February 2024

Abstract
The acceptability of digital health interventions is a multifaceted concept that is central to user engagement. It is influenced by cultural and social norms, and it is also a key consideration for intervention development and evaluation. For this reason, it is important to have a clear overview of how research on the acceptability of digital interventions has evolved, which measures or assessments have been used most frequently, and what the implications may be for the knowledge area and future research directions. The purpose of this bibliometric and network visualisation analysis was to explore the main research patterns in the study of the acceptability of digital mental health interventions and to highlight the key characteristics of knowledge production on this topic. The Web of Science was searched for relevant primary studies, with 990 documents selected for inclusion in this bibliometric analysis. Publication metrics, text and author keyword analyses, and bibliographic coupling of the documents provided insights into how technological developments, specific research interests, research priorities, and contexts have shaped research in the field. The main differentiation in acceptability approaches emanated from the studies’ research designs, the stage of intervention development and evaluation, and the extent to which there was a focus on user attitudes, experience, and engagement. These differentiations further indicate the importance of being clear about which concepts or elements of acceptability a study addresses, as well as of approaches that have the potential to address the complexities of acceptability.

1. Introduction

The examination of intervention acceptability is considered necessary in the iterative process of the design, implementation, and evaluation of complex interventions (Skivington et al. 2021). Digital mental health interventions (DMHIs) are complex and they can include a variety of therapeutic components, be delivered via different technologies, be self-guided or offer different types of support, offer access to treatment, and aim to prevent poor mental well-being in the general population (Gega et al. 2022; Renfrew et al. 2021). There is an array of definitions of acceptability including “social acceptability”, “treatment acceptability”, and acceptability frameworks, while other studies have explained their approach to acceptability and how it is measured within their research design (Park et al. 2022; Sekhon et al. 2017). As a result, acceptability is frequently addressed in interventions at different stages of development and evaluation with authors adopting different approaches and assessment methods.
Multiple reviews of studies reporting digital mental health interventions have shown that, although there is evidence for their effectiveness, their main shortcoming is poor patient engagement (Lipschitz et al. 2023). Such findings have driven calls to develop clear criteria for reporting on intervention engagement (Lipschitz et al. 2022, 2023). At the same time, low engagement with digital mental health interventions and high attrition lead to a loss of participants in clinical trials, which reduces the prospect that ‘digital therapeutics’ will complete regulatory trials (Nwosu et al. 2022). For this reason, there is a growing need for pragmatic studies to evaluate the usefulness of digital mental health tools (Hekler et al. 2016; Torous and Haim 2018).
Digital mental health is frequently hailed as an opportunity to transform mental health services, providing instantaneous and at-scale access to tailored mental health support across diverse populations and contexts (Hunter et al. 2023; Roland et al. 2020). Many healthcare systems across the world have been turning to telehealth as a means to boost access to mental health services, a trend that accelerated further with the response to the COVID-19 pandemic (Adepoju 2020; The Scottish Government 2021). However, emerging evidence also highlights the importance of addressing persistent gaps in access to mental health treatment across different demographics within high-, low-, and middle-income countries (Lu et al. 2022). Specific challenges can involve barriers faced by vulnerable groups in accessing digital health services, which can include communication difficulties and inadequate formal and informal support for service users (Gama et al. 2022; Goodman et al. 2021; Kaihlanen et al. 2022). At the same time, sociocultural diversity, poor infrastructure, and low bandwidth can limit intervention adoption (Banerjee et al. 2021).
Intervention acceptability is a separate concept from intervention engagement, but the two form a dynamic relationship that pertains to users’ attitudes, users’ engagement with the intervention, and intervention usability (Perski and Short 2021). For example, patients’ engagement with digital mental health interventions (DMHIs) has been defined in terms of their adoption rates and the degree of users’ sustained interactions with them (Arnold et al. 2021; Borghouts et al. 2021). Studies on the adoption of digital mental health interventions often deploy technology acceptance theories, such as the Technology Acceptance Model, precisely because they address the impact of technology acceptance constructs (i.e., perceived usefulness and perceived ease of use) on users’ intentions to use technology (Sawrikar and Mote 2022). As shown by previous research, an assessment of users’ intentions to use technology can be a predictor of the adoption of mHealth services and e-mental health for different stakeholders (Alam et al. 2021; De Veirman et al. 2022; Semwanga et al. 2021). Perski and Short’s dynamic model of intervention acceptability demonstrates the complex relationship between acceptability and user engagement (Perski and Short 2021). The key point in Perski and Short’s model is that acceptability is defined as an emergent outcome of the relationship between people’s cognitions, beliefs, and affective responses; these include individuals’ evaluation of the intervention’s potential burden, usefulness, and fit within their own value systems. Within this model, individuals’ sociocultural contexts are considered to have a significant influence on people’s beliefs, which in turn shape and are shaped by user needs. This approach is reflected in previous reviews of the literature on barriers and facilitators of user engagement with digital health interventions (Borghouts et al. 2021; Gauthier-Beaupré and Grosjean 2023).
A systematic review of user engagement in DMHIs for common mental health problems identified three clusters of user-centered, program-related, and technology- and environment-related factors that can influence user engagement (Borghouts et al. 2021). User-centered factors included demographic variables, personal traits, beliefs, mental health status, and previous experiences with technology. Program-related constructs included type of content, perceived fit with the user’s culture, level of guidance, social connectedness, and impact of the intervention. Finally, technology- and environment-related constructs included cost and usability, privacy and confidentiality, social influence, and implementation factors (Borghouts et al. 2021). Similarly, a meta-ethnographic study on the acceptability of digital health technologies among francophone communities across the world showed how the perspectives and attitudes of different stakeholders towards different applications of technology (mobile technologies, robot technologies, telemedicine, sensors, and wearable technologies) can determine their willingness and commitment to use those technologies (Gauthier-Beaupré and Grosjean 2023).
With the global digital mental health market size valued at USD 19.5 billion in 2022 and with a projection of it reaching USD 72.3 billion in 2032 (Gotadki 2024), it is vital to understand the trends of knowledge production that pertain to intervention acceptability. A bibliometric analysis allows an overview of knowledge patterns and identifies knowledge gaps and emergent research fields, which is invaluable, especially when there is a fragmented area of knowledge (Hernández-Torrano et al. 2020; Skute et al. 2019).

2. Objectives

The aim of this study was to explore the knowledge production patterns associated with the concept of acceptability of digital mental health-related interventions in primary research studies. For this reason, the study objectives were to:
(a) Analyse the evolution of publications of primary studies from 2008 to 2023.
(b) Analyse the main contributors to the knowledge area.
(c) Report on authors’ approaches to intervention acceptability based on an analysis of the documents’ bibliometric features.

3. Methods

3.1. Data Sources and Search Strategy

All editions of citation indexes in the Web of Science Core Collection (WOSCC) were used to search for articles in January 2024. The WOSCC was selected for this analysis as it is regarded as one of the most authoritative databases, and because it is more selective than other databases and has greater keyword sensitivity (Niel et al. 2015). Although its overall coverage is smaller than that of Scopus, the use of only one database allows for a better quality of citation analysis (Caputo and Kargina 2022; Gusenbauer 2022).
The search period was set from January 2008 to December 2023. The year 2008 was selected as that was the first year following the release of the first iPhone. Searches included all primary research studies in all languages. The following query was used to search for relevant topics:
TS = (digital or online or on-line or internet-based or internet* or web-based or web* or computer-based or app* or smartphone or mobile* or mobile-assisted or computer* or wearable or virtual or chatbot or augmented reality) AND
TS = (mental health or mental disorder or psychological wellbeing or mental wellbeing or mental well-being or mental illness) AND
TS = (acceptability or usability) NOT (ALL = (protocol) OR ALL = (meta-analysis) OR ALL = (review) OR ALL = (health records))
TS in the WOS collection includes searches in titles, abstracts, and keywords.
The papers included were articles or proceedings papers. Papers were excluded if they were any of the following: book chapters, review or editorial papers, retracted or withdrawn publications, book reviews, corrections, or data papers.
The keyword searches for papers addressing the concept of acceptability included only the keywords “acceptability” or “usability” as they represent two distinct but greatly interlinked generic concepts. This strategy allowed the inclusion of empirical studies that discuss different ways in which intervention acceptability has been operationalised.

3.2. Eligibility Criteria

Titles and abstracts were screened by applying the following inclusion criteria: study population (any), study design (any), intervention (any digital intervention), outcomes (any). Documents were excluded (a) if they did not make explicit reference to digital tools or interventions, (b) if they reported a digital intervention that was not mental health related either in relation to the population or its outcomes, (c) if they reported on analysis of health records, or (d) if they only reported on product capabilities or characteristics. Furthermore, review papers and theoretical pieces were excluded.

3.3. Data Selection

All selected papers were saved as “a marked list” on the researcher’s WOS account. The final screening included checking the content of this list to exclude any duplicates and irrelevant papers. A final list of the selected documents was saved as a .txt document.

3.4. Analysis

The analysis was conducted via WOS statistics features, while VOSviewer (v. 1.6.19) was used to produce tables and visualisations of text and keyword co-occurrence, citations and publications, and the documents’ bibliographic coupling. Bibliographic coupling is a similarity measure based on the number of references that two documents share: the more items their reference lists have in common, the more strongly the documents are coupled and, as a result, the more likely they are to address shared research themes.
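To make the coupling measure concrete, the following is a minimal sketch of how bibliographic coupling strength can be computed from documents’ reference lists; it is not the VOSviewer implementation, and the document and reference identifiers are hypothetical.

```python
from itertools import combinations

def coupling_strength(refs_a, refs_b):
    """Bibliographic coupling strength: the number of references
    shared by two documents' reference lists."""
    return len(set(refs_a) & set(refs_b))

# Hypothetical documents, each mapped to the set of references it cites.
documents = {
    "doc1": {"ref_A", "ref_B", "ref_C"},
    "doc2": {"ref_B", "ref_C", "ref_D"},
    "doc3": {"ref_E"},
}

# doc1 and doc2 share two references, so they are strongly coupled;
# doc3 shares none with the others and is uncoupled.
for a, b in combinations(documents, 2):
    print(a, b, coupling_strength(documents[a], documents[b]))
```

Clustering algorithms such as the one used by VOSviewer then group documents whose pairwise coupling strengths are high, which is why coupled clusters tend to correspond to shared research themes.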
An early review of the selected documents showed that the terms “acceptability” and “usability” were always present in the selected documents’ titles, but frequently the abstract and keywords did not provide adequate information on how those two concepts were assessed. For this reason, the content of 10% (n = 100) of all selected documents was checked, and specific information was extracted on how those concepts were addressed in each document, which informed the analytic strategy for a comprehensive review of the documents’ bibliometric features.

4. Results

(A) Descriptives
A total of 2514 references were identified in the WOS from 2008 to 2023 and were screened against the study’s inclusion and exclusion criteria. In total, 990 documents were selected, involving 5598 authors, 1672 institutions, and 78 countries. Figure 1 shows the publication and citation rates. The year 2022 had the highest publication count (192) and citation count (3020), followed by 2023, with 191 documents and 3131 total citations.
(B) Analysis of knowledge production
A co-authorship analysis was conducted to identify the most productive research authors, research teams, journals, and organisations.

4.1. Authors

Of the 5598 authors, 36 produced at least five papers. Table 1 shows the top 10 most productive authors of the extracted documents. The most productive author was Nick Titov from Macquarie University in Australia with 14 papers, followed by Blake F. Dear from Macquarie University, David C. Mohr from Northwestern University in the USA, Mario Alvarez-Jimenez from the University of Melbourne, and Gerhard Andersson from Linköping University in Sweden, with 11 papers each.
Figure 2 depicts collaboration maps among the key authors who had at least five documents. The colours represent working groups, the size of the circle depicts the number of articles published by each author, and the lines represent the total strength of the co-authorship between authors. It shows 15 clusters of authors from different countries. Three clusters of researchers (yellow, green, and red clusters) include authors from Australian institutions, while the purple cluster includes authors from USA-based institutions.
Among those, there was only one cluster of consistent co-authorships between European and USA-based working groups. It consisted of the working group in red, which included Pim Cuijpers and Heleen Riper from the Vrije Universiteit Amsterdam and David Daniel Ebert from the Technische Universität München (TUM). Pim Cuijpers also collaborated with Vikram Patel from Harvard University, while John Torous from Harvard University collaborated with Sandra Bucci from the University of Manchester.

4.2. Journals

The most popular sources were JMIR Formative Research with 89 records, JMIR Mental Health with 72 records, and the Journal of Medical Internet Research with 59 records. JMIR Mental Health was ranked first in citations, with 1893, followed by the Journal of Medical Internet Research with 1472 citations, and PLOS ONE with 784 citations (Table 2).

4.3. Institutions

The most productive institutions were the University of Melbourne with 46 records, King’s College London with 41 records, and the University of Sydney with 34 records. The University of New South Wales was ranked first in citations, with 950, followed by the University of Melbourne with 800 citations, and King’s College London with 638 citations (Table 2).

4.4. Countries

The most productive countries were the USA with 391 records and 5737 citations, followed by Australia with 193 records and 3671 citations, and England with 163 records and 1955 citations (Table 2).
(C) Analysis of authors’ approaches to the acceptability of digital mental health-related interventions

4.5. Text Co-Occurrence Analysis

A co-occurrence analysis of terms found at least ten times across all documents’ titles and abstracts was conducted in order to obtain a broad overview of key areas of interest and to further inform the data analysis and synthesis strategy. A binary frequency was applied, which means that only the presence or absence of a term in a document was counted, not the number of times the term was used in each document. Of the 22,942 terms, 734 met the set threshold; a relevance score was then calculated for each, and the most relevant terms were selected. The default choice of VOSviewer is to select the 60% most relevant terms, which is what was selected in this study, with 440 terms included in the analysis.
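The binary counting rule described above can be illustrated with a short sketch; this is an assumption-laden toy example (the abstracts and tokenisation are invented, and VOSviewer’s actual term extraction is more sophisticated), but it shows how duplicates within one document collapse to a single count.

```python
from itertools import combinations
from collections import Counter

# Hypothetical toy abstracts; real input would be titles and abstracts.
abstracts = [
    "usability testing of a mobile mental health app",
    "acceptability and usability of a usability-focused web intervention",
    "acceptability of internet treatment for depression",
]

cooccurrence = Counter()
for text in abstracts:
    # Binary frequency: a set keeps each term at most once per document,
    # so repeated mentions within a document do not inflate the count.
    terms = set(text.split())
    for pair in combinations(sorted(terms), 2):
        cooccurrence[pair] += 1

# "acceptability" and "usability" appear together in only one document,
# so their binary co-occurrence count is 1.
print(cooccurrence[("acceptability", "usability")])
```

In VOSviewer, these pairwise counts form the co-occurrence network from which the clusters in Figure 3 are derived.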
Figure 3 shows three broad areas of interest. The largest cluster of items in red (n = 202) corresponds to intervention development and implementation. Within this cluster, the most frequent terms were “development” (n = 244), “user” (n = 176), “technology” (n = 156), and “implementation” (n = 142). Other terms in the same cluster were “perspective” (n = 77), “feature” (n = 99), “perception” (n = 77), “focus group” (n = 69), “knowledge” (n = 93), “application” (n = 93), and “recommendation” (n = 67).
The second cluster in green with 169 items corresponds to intervention outcomes. In the green cluster, the most frequent terms were “depression” (n = 322), “anxiety” (n = 249), “week” (n = 249), “baseline” (n = 195), and “internet” (n = 175). Other terms within the same cluster were “trial” (n = 193), “effect” (n = 184), “adherence” (n = 107), and “significant improvement” (n = 75).
Finally, the third and smallest cluster in blue with 71 items corresponds to intervention and sample characteristics. The most frequent terms within this cluster were “female” (n = 55), “respondent” (n = 47), and “characteristic” (n = 45). Other terms within the same cluster were “mental health support” (n = 36), “primary care” (n = 31), “male”, and “suicide” (n = 27).

4.6. Authors’ Keywords Co-Occurrence Analysis

Overall, three analyses of authors’ keywords were conducted. In the first, all authors’ keywords were included to obtain a broad view of authors’ key research interests. The most frequent keywords were “mental health” (n = 264), “depression” (n = 168), “mhealth” (n = 107), “anxiety” (n = 93), “mobile phone” (n = 76), and “acceptability” (n = 56). However, the network analysis (Figure 4) did not provide comprehensive information about the co-occurrence patterns of keywords connected with intervention acceptability. For this reason, two separate analyses of authors’ keywords were conducted. These analyses included manually identified and selected keywords within the VOSviewer system, with an occurrence frequency of at least two. Of 2287 keywords, 643 met that threshold. The first analysis reported on keywords that occurred at least twice and were associated with digital technologies and their application to the delivery of mental health interventions. Subsequently, those were mapped according to the documents’ publication years. The second analysis reported on keywords that were relevant to acceptability or usability, research design, and research context or population characteristics.

4.7. Digital Technology Application in the Delivery of Mental Health Intervention Studies

Of the 643 keywords, 115 relevant keywords were manually identified and included in the analysis. Table 3 shows relevant authors’ keywords with a frequency of at least 10. The keyword “mhealth” was ranked first (n = 107), followed by “mobile phone” (n = 76) and “internet” (n = 54).
Figure 5 shows authors’ keyword occurrence frequencies mapped across the documents’ average publication year. The earliest used keywords had low overall frequencies and included “smartphones” (n = 4, Avg. pub. year: 2016.50), “internet-based treatment” (n = 6, Avg. pub. year: 2017.33), and “m-health” (n = 5, Avg. pub. year: 2017.80). The keyword “internet” ranked third overall (n = 54) and had an average publication year in early 2018 (Avg. pub. year: 2018.17). Other keywords with an average publication year in 2018 included “telepsychiatry” (n = 10, Avg. pub. year: 2018.89), “e-health” (n = 19, Avg. pub. year: 2018.89), and “internet interventions” (n = 11, Avg. pub. year: 2018.82). Keywords relevant to the application of digital technology in mental health-related research with an average publication year in 2019 included “ehealth” (n = 42, Avg. pub. year: 2019.74), “web-based” (n = 14, Avg. pub. year: 2019.14), “internet-based” (n = 6, Avg. pub. year: 2019.17), “smartphone” (Avg. pub. year: 2019.78), “videoconferencing” (n = 7, Avg. pub. year: 2019.86), and “web-based intervention” (Avg. pub. year: 2019.88).
The keyword analysis shows that from 2020 onwards, interventions were more frequently described as mobile or digital interventions rather than online or internet-based interventions, a shift that coincided with advancements in technology and the need for digital delivery of interventions during the COVID-19 pandemic. The most frequently occurring keyword with an average publication year in 2020 was “mhealth” (n = 107, Avg. pub. year: 2020.82). Other keywords for the same year were “telehealth” (n = 38, Avg. pub. year: 2020.47), “technology” (n = 33, Avg. pub. year: 2020.12), “e-mental health” (n = 29, Avg. pub. year: 2020.00), “telemedicine” (n = 37, Avg. pub. year: 2020.08), “mobile apps” (n = 26, Avg. pub. year: 2020.73), “internet intervention” (n = 14, Avg. pub. year: 2020.43), and “chatbots” (n = 5, Avg. pub. year: 2020.40).
The most frequently occurring keyword with an average publication year in 2021 was “mobile phone” (n = 76, Avg. pub. year: 2021.34). Other keywords were “digital health” (n = 47, Avg. pub. year: 2021.57), “mobile health” (n = 46, Avg. pub. year: 2021.43), “app” (n = 12, Avg. pub. year: 2021.17), “online intervention” (n = 18, Avg. pub. year: 2021.06), “conversational agent” (n = 11, Avg. pub. year: 2021.55), “artificial intelligence” (n = 13, Avg. pub. year: 2021.85), “chatbot” (n = 10, Avg. pub. year: 2021.80), “gamification” (n = 5, Avg. pub. year: 2021.00), “digital technology” (n = 5, Avg. pub. year: 2021.00), and “serious games” (n = 4, Avg. pub. year: 2021.50). The most frequently occurring keywords with an average publication year in 2022 were “digital mental health” (n = 33, Avg. pub. year: 2022.21), followed by “digital intervention” (n = 30, Avg. pub. year: 2022.07), “vr” (n = 5, Avg. pub. year: 2022.80), and “biofeedback” (n = 4, Avg. pub. year: 2022.00).
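The “average publication year” metric used throughout this subsection is simply the mean publication year of the documents in which a keyword occurs. A minimal sketch, using hypothetical records (the keywords and years below are invented for illustration, not taken from the dataset):

```python
from collections import defaultdict

# Hypothetical bibliographic records: publication year plus author keywords.
records = [
    {"year": 2019, "keywords": ["mhealth", "usability"]},
    {"year": 2021, "keywords": ["mhealth"]},
    {"year": 2022, "keywords": ["mhealth", "digital mental health"]},
]

years_by_keyword = defaultdict(list)
for rec in records:
    for kw in rec["keywords"]:
        years_by_keyword[kw].append(rec["year"])

# Average publication year per keyword: mean of the years collected above.
avg_pub_year = {kw: sum(ys) / len(ys) for kw, ys in years_by_keyword.items()}
print(avg_pub_year["mhealth"])  # (2019 + 2021 + 2022) / 3 ≈ 2020.67
```

A fractional value such as 2020.67 therefore indicates that, on average, the documents using that keyword were published around mid-to-late 2020, which is how the values in Figure 5 should be read.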

4.8. Acceptability and Usability Studies of Digital Mental Health-Related Interventions

Authors’ approaches to intervention acceptability were examined by an analysis of the co-occurrence of keywords that (a) were related to assessments/approaches to intervention acceptability or usability, (b) referred to a study’s research design, and (c) referred to specific target population characteristics. Keywords related to mental health outcomes were not selected in that analysis as their bibliometric features were assessed as part of the bibliographical coupling. Overall, 183 relevant keywords were identified and included in the analysis. Table 4 shows relevant authors’ keywords with a frequency of at least five.
Figure 6 shows that author keywords were grouped into 10 clusters. The first cluster with 29 items had the keyword “COVID-19” (n = 45). This cluster did not include a keyword directly addressing acceptability or usability. It included relevant research designs such as “feasibility study” (n = 10) and “pilot study” (n = 9). Specific populations of interest were “health care workers” (n = 3), “healthcare provider” (n = 3), “nurses” (n = 3), and “African Americans” (n = 3).
The second cluster in green with 25 items had the keywords “veterans” (n = 21) and “college students” (n = 14). Keywords associated with acceptability or usability were “attitudes” (n = 9) and “barriers” (n = 5). Keywords related to research design approaches were “implementation science” (n = 4) and “clinical trials” (n = 6). Other populations of interest were “young adults” (n = 8), “children” (n = 8), “parent”, “military” (n = 4), and “India” (n = 3).
The third cluster in blue with 23 items had the keyword “adolescents” (n = 32). This cluster did not have a specific keyword directly referring to acceptability and usability but had the most relevant research designs, including “user-centered design” (n = 13), “co-design” (n = 11), and “coproduction” (n = 3). Populations of interest included “youth” (n = 21), “cystic fibrosis” (n = 4), “teenager” (n = 3), “family caregivers” (n = 3), and “LGBTIQQ plus” (n = 3).
The fourth cluster in yellow with 22 items had the keywords “development” (n = 8) and “usability testing” (n = 9). Other keywords relevant to acceptability or usability were “usage” (n = 3), “needs assessment” (n = 2), and “user” (n = 2). Relevant research design keywords were “user-centered development” (n = 2), “participatory action research” (n = 2), “stakeholder participation” (n = 3), and “focus group” (n = 3). Relevant populations of interest were “older adults” (n = 11) and “rural” (n = 5).
The fifth cluster with 19 items in purple had the keywords “acceptability” (n = 57), “feasibility” (n = 45), “usability” (n = 42), and “pilot study” (n = 9). Other keywords in this cluster directly relevant to acceptability were “technology acceptance model” (n = 4) and “uptake” (n = 4). Populations of interest included “caregivers” (n = 13), “veteran” (n = 5), and “Hispanic” (n = 3).
The sixth cluster with 18 items in light blue had the keyword “veterans” (n = 22). The only keyword relevant to acceptability or usability within this cluster was “attitudes” (n = 9), and the only one relevant to research design was “qualitative methods” (n = 2). Other populations of interest within this cluster were “pregnancy” (n = 9), “family” (n = 4), “breast cancer” (n = 5), and “school” (n = 3).
The seventh cluster with 12 items in orange had the keywords “qualitative research” (n = 20) and “implementation” (n = 24). Keywords that were relevant to acceptability were “acceptability of healthcare” (n = 4) and “attitude to computers” (n = 4), while keywords about populations of interest included “low- and middle- income countries” (n = 4), and “workplace” (n = 7).
The eighth cluster with 12 items in brown had the keywords “feasibility study” (n = 10) and “cultural adaptation” (n = 5). Another keyword relevant to research design was “participatory design” (n = 5), and populations of interest included “university students” (n = 10), “aboriginal” (n = 3), “indigenous” (n = 4), “first nations” (n = 3), “Indonesia” (n = 4), “refugees” (n = 4), and “South Africa” (n = 3).
The ninth cluster with 12 items in pink had the keyword “user experience” (n = 17). Some keywords relevant to intervention acceptability and its links with intervention engagement (n < 3) were “user feedback”, “user satisfaction”, “app usability”, “system usability”, and “treatment engagement”. Keywords relevant to research designs were “qualitative” (n = 14), “interview” (n = 2), “focus group” (n = 3), and “thematic analysis” (n = 4).
The tenth cluster with 10 items in fuchsia had the keyword “engagement” (n = 14). Keywords relevant to acceptability were “adherence” (n = 8), “uptake” (n = 4), “usability study” (n = 4), and “user engagement” (n = 4). It did not include specific keywords about the research design of the studies, while two populations of interest (n < 3) were “overweight” and “developing countries”.

4.9. Bibliographical Coupling

A bibliographic coupling analysis of the documents was conducted to examine the extent to which documents shared references, and thus to identify key drivers of publication interests. Table 5 shows the 10 most cited documents. The most cited paper was Fitzpatrick et al.’s (2017), with 665 citations.
Figure 7 offers a visualisation of the bibliographic coupling among documents that had at least 10 citations. Overall, the documents were clustered into eight groups.
The largest cluster in red had seventy-seven documents and included studies that reported on measures of acceptability of web-based psychological interventions, with mindfulness-based interventions frequently cited within this cluster. The most cited documents within this cluster were Huberty et al.’s (2019) “Efficacy of the Mindfulness Meditation Mobile App “Calm” to Reduce Stress Among College Students: Randomized Controlled Trial” with 163 citations and Levin et al.’s (2017) “Web-Based Acceptance and Commitment Therapy for Mental Health Problems in College Students: A Randomized Controlled Trial” with 124 citations.
The second cluster in green had seventy-seven documents and included studies that reported on the acceptability of digital mental health technologies primarily by service users and practitioners in interventions within primary care. The most cited documents within this cluster were Donaghy et al.’s (2019) “Acceptability, benefits, and challenges of video consulting: a qualitative study in primary care” with 242 citations, and Johnson et al.’s (2009) “Computerized ambulatory monitoring in psychiatry: a multi-site collaborative study of acceptability, compliance, and reactivity” with 94 citations.
The third cluster in blue with sixty-five documents included studies that often focused on the development and delivery of digital mental health-related interventions and reported on intervention acceptability and usability. The most cited documents within this group were Ben-Zeev et al.’s (2013) “Development and usability testing of FOCUS: A smartphone system for self-management of schizophrenia” with 189 citations and Comer et al.’s (2017) “Remotely delivering real-time parent training to the home: An initial randomized trial of Internet-delivered parent–child interaction therapy (I-PCIT)” with 126 citations.
The fourth cluster in yellow with forty-nine documents and the fifth cluster in purple with thirty-six documents included studies that reported on the acceptability of digital psychological interventions, primarily for the treatment of depression, generalised anxiety, and psychiatric disorders. The yellow cluster included more pilot studies, survey studies, and targeted outcomes that focused explicitly on perceived acceptability. The most cited papers in the yellow cluster were Musiat et al.’s (2014) “Understanding the acceptability of e-mental health—attitudes and expectations towards computerised self-help treatments for mental health problems” with 150 citations and Gun et al.’s (2011) “Acceptability of Internet Treatment of Anxiety and Depression” with 119 citations. On the other hand, the purple cluster included more randomised controlled trials, where acceptability was reported as a secondary outcome. The four most cited papers in this cluster were among the top 10 most cited papers. Those were Fitzpatrick et al.’s (2017) “Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial” with 655 citations, Titov et al.’s (2010) “Internet Treatment for Depression: A Randomized Controlled Trial Comparing Clinician vs. Technician Assistance” with 235 citations, Titov et al.’s (2013) “Improving Adherence and Clinical Outcomes in Self-Guided Internet Treatment for Anxiety and Depression: Randomised Controlled Trial” with 173 citations, and Robinson et al.’s (2010) “Internet Treatment for Generalized Anxiety Disorder: A Randomized Controlled Trial Comparing Clinician vs. Technician Assistance” with 162 citations.
The sixth cluster in light blue with twenty-one documents included pilot studies and survey studies that reported on the acceptability and utility of digital mental health tools. The most cited papers within this cluster were Hetrick et al.’s (2018) “Youth Codesign of a Mobile Phone App to Facilitate Self-Monitoring and Management of Mood Symptoms in Young People With Major Depression, Suicidal Ideation, and Self-Harm” with 64 citations and Rice et al.’s (2018) “Moderated online social therapy for depression relapse prevention in young people: pilot study of a ‘next generation’ online intervention” with 64 citations.
The seventh cluster in orange with twenty documents included studies that reported on attitudes and perspectives towards e-mental health. The most cited papers within this cluster were Apolinário-Hagen et al.’s (2018) “Public Attitudes Toward Guided Internet-Based Therapies: Web-Based Survey Study” with 77 citations and Apolinário-Hagen et al.’s (2017) “Current Views and Perspectives on E-Mental Health: An Exploratory Survey Study for Understanding Public Attitudes Toward Internet-Based Psychotherapy in Germany” with 49 citations.
Finally, the eighth cluster in brown with sixteen documents included development, pilot, and usability studies that reported on the acceptability of novel digital mental health tools. The most cited papers within this cluster were Prochaska et al.’s (2021) “A Therapeutic Relational Agent for Reducing Problematic Substance Use (Woebot): Development and Usability Study” with 49 citations and Suganuma et al.’s (2018) “An Embodied Conversational Agent for Unguided Internet-Based Cognitive Behavior Therapy in Preventative Mental Health: Feasibility and Acceptability Pilot Trial” with 48 citations.

5. Discussion

The purpose of this study was to explore the development of research in the acceptability of digital mental health-related interventions. The bibliometric analysis based on the WOS database covered 990 documents published by 5598 authors in 1672 institutions from 2008 to 2023. This is the first study to examine the development of research outputs claiming to address the acceptability of digital mental health-related interventions within their research procedures.
Bibliometric tools can provide insights into the most influential research outputs in a field and highlight emergent areas of focus (Agarwal et al. 2016). The analysis of several indicators, such as authors’ keywords, article-level metrics, and citation patterns, provided details on the key areas of interest and research priorities that have been the key drivers of knowledge production. At the same time, the visual network analyses of publication patterns illustrated the popularity and dominance of different approaches to intervention acceptability.
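The keyword co-occurrence maps underlying such visualisations rest on a simple count: two keywords are linked whenever they appear on the same publication, and the link weight is the number of publications they share. A minimal sketch of that computation follows; the example records and the `min_count` threshold are illustrative, not data from this study.

```python
from itertools import combinations
from collections import Counter

def keyword_cooccurrence(documents, min_count=1):
    """Count how often each pair of author keywords appears together.

    `documents` is a list of keyword lists, one per publication.
    Keywords are lower-cased and each pair is sorted, so (a, b) and
    (b, a) count as the same link, as in typical co-occurrence maps.
    """
    pairs = Counter()
    for keywords in documents:
        unique = sorted(set(k.lower() for k in keywords))
        for a, b in combinations(unique, 2):
            pairs[(a, b)] += 1
    # Keep only links meeting the minimum occurrence threshold
    return {pair: n for pair, n in pairs.items() if n >= min_count}

# Illustrative publication records (hypothetical, not from this study)
docs = [
    ["acceptability", "mHealth", "depression"],
    ["acceptability", "COVID-19", "mHealth"],
    ["usability", "mHealth"],
]
links = keyword_cooccurrence(docs, min_count=2)
```

Tools such as VOSviewer apply the same principle at scale before laying the resulting weighted network out visually.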

5.1. Publication Trends

Our study findings highlight the increase in publications reporting the acceptability of digital mental health-related interventions and show how technological developments and research contexts have shaped their evolution. Our study findings confirm the results of previous bibliometric analyses in technology-based treatments in psychology that showed an increase in popularity in the 2010s (Flujas-Contreras et al. 2023). Additionally, this study showed an increase in studies addressing intervention acceptability after 2019, which coincides with the increased focus on the importance of feasibility and pilot studies, also shown by the increase in the publication of guidelines on their conduct and role in intervention development and evaluation (Bowen et al. 2009; Lancaster and Thabane 2019; Skivington et al. 2021). Expectedly, publications focusing on digital mental health interventions increased after the onset of the COVID-19 pandemic, a trend that has also been reported in other bibliometric analyses and reviews of the literature (Ellis et al. 2021; Riboldi et al. 2023). However, there was no further increase in 2023, which may reflect the subsiding of pandemic pressures on healthcare systems across the world and the consequent return of services and research programmes to in-person intervention delivery (Mindsolent n.d.). Intervention acceptability is typically addressed in feasibility studies (Orsmond and Cohn 2015). For this reason, it is expected that the publication of studies reporting on the acceptability of interventions will follow the timeline of technological innovations that allowed the implementation of novel interventions with diverse modes of user engagement. The mapping of technology-related keywords across the studies’ average publication years demonstrated how the acceptability of mobile-based mental health interventions has been the dominant area of interest since 2020.
The research focus on app-delivered interventions has continued in later years, while intervention delivery via chatbots has also increased in popularity. A strength of mobile interventions is that they can be personalised to the needs of individual patients and use numerous engagement strategies. However, as shown by recent reviews, there are also numerous user-centered and intervention-specific parameters that influence user satisfaction and engagement with an intervention and the degree to which it is preferred over face-to-face delivery methods (Chan and Honey 2022; Gan et al. 2022). Our study results showed how research activity between 2020 and 2022 was shaped by research conducted in the context of the COVID-19 pandemic. The keyword “COVID-19” was the fifth most frequent keyword within all the documents selected since 2008. It represented unique research content due to the rapid development and implementation of numerous remotely delivered mental health-related interventions directed either toward healthcare professionals or service users (Dominguez-Rodriguez et al. 2022; Witteveen et al. 2022).

5.2. Visual Network Analyses: Key Findings

The results of the visual network analyses illustrated the key drivers of research activity that have shaped the research topic. A total of 990 research publications were published in 78 countries. In line with well-documented trends in e-mental health (Helha and Wang 2022; Zale et al. 2021), the main contributors and the authors of the most influential publications were based in high-income countries (namely the USA, Australia, England, Canada, and Germany), with China and India being the most productive among middle-income countries. The co-authorship analysis showed that several clusters of productive research groups are based in Australia, which may reflect specific regional research interests, such as a focus on overcoming barriers to the accessibility or effectiveness of digital mental health services (Balcombe and De Leo 2021). Moreover, the network analysis of co-authorship showed that only a few authors maintained collaborations with authors based in other countries.
A key finding of the visual network analyses conducted was that the most influential research interests and trends in citations were not always aligned with those that drove the highest volume of publications. For example, the results of both the text co-occurrence analysis and the authors’ keyword analysis showed that the largest cluster of documents reported studies focusing on intervention development and implementation. Such studies often targeted the needs of specific population groups (e.g., adolescents and veterans) and consequently adopted compatible research designs (e.g., user-centered development and participatory action research). On the other hand, the results of the studies’ bibliographical coupling showed that the most influential studies were those that reported on the efficacy of interventions, while citation patterns were defined by shared interests in specific intervention approaches (e.g., mindfulness-based interventions) and technology acceptance within primary health services. The only exception to this pattern appeared to be research addressing the acceptability of digital mental health-related interventions conducted in the context of the COVID-19 pandemic. Such observations may reflect differences in research priorities associated with research funding. Previous research has illustrated that the quantity and quality of publications are related to funding allocation (Ebadi and Schiffauerova 2015), while the increased prioritization by large funding bodies of funding allocation to studies focusing on technological applications to mental health has played a role in their growing popularity (Zale et al. 2021). Similarly, historical events such as the COVID-19 pandemic can drive the rapid adoption of technology (Zale et al. 2021), characterised by agility in developing and evaluating interventions, which in turn requires equally robust intervention development frameworks.
However, discussions even prior to the COVID-19 pandemic highlighted that constraints associated with more traditional approaches to intervention development essentially limit the potential for the real-world impact of digital mental health technologies (Balcombe and De Leo 2021; Torous and Haim 2018).
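The bibliographical coupling used above treats two documents as related when their reference lists overlap, with the link weight given by the number of shared references. A minimal sketch of this computation, using made-up document identifiers and reference keys rather than data from this study:

```python
def coupling_strength(refs_a, refs_b):
    """Bibliographic coupling strength: number of shared references."""
    return len(set(refs_a) & set(refs_b))

def coupling_links(documents):
    """Pairwise coupling strengths for a {doc_id: reference_list} mapping.

    Returns a dict of (doc_a, doc_b) -> strength; pairs with no shared
    references are omitted, as they form no link in the network.
    """
    ids = sorted(documents)
    links = {}
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            s = coupling_strength(documents[a], documents[b])
            if s:
                links[(a, b)] = s
    return links

# Illustrative reference lists (hypothetical keys, not study data)
docs = {
    "doc1": ["sekhon2017", "bowen2009", "titov2010"],
    "doc2": ["sekhon2017", "titov2010", "gun2011"],
    "doc3": ["fitzpatrick2017"],
}
links = coupling_links(docs)
```

Clustering algorithms are then applied to the resulting weighted network to produce groupings like the eight clusters reported in this study.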

5.3. Acceptability Approaches

Our study results demonstrated that the variation in authors’ approaches to acceptability is often associated with the stage of intervention development during which a certain study is conducted. Sekhon et al. (2017) define acceptability as “a multi-faceted construct that reflects the extent to which people delivering or receiving a healthcare intervention consider it to be appropriate, based on anticipated or experienced cognitive and emotional responses to the intervention” and distinguish between prospective acceptability (before participation in an intervention), concurrent acceptability (during participation), and retrospective acceptability (after participation). The text analysis of abstracts and titles showed that acceptability approaches could be distinguished into two broad categories: the development or implementation of an intervention, where the focus is on user experience and attitudes, and the assessment of preliminary efficacy, where the focus on intervention outcomes is such that intervention acceptability is a secondary outcome. The bibliographical coupling demonstrated that four of the eight clusters had a distinct focus on acceptability, whereas the other half were clustered around intended primary intervention outcomes (e.g., depression) or intervention approaches (e.g., mindfulness interventions). However, the most cited cluster comprised studies that reported acceptability assessed as part of randomised controlled trials. This is expected, as the RCT is the most suitable research design for demonstrating the effectiveness of a specific intervention. At the same time, however, within the overall design of a randomised controlled trial, acceptability can be conflated with satisfaction, with limited attention paid to the complexities of acceptability that are more closely associated with the sustained adoption of an intervention.
Recent reviews of the literature, for example, on the implementation and adoption of digital mental health care during the COVID-19 pandemic highlight the importance of addressing real-world parameters that can influence interventions’ acceptability to different stakeholders (Witteveen et al. 2022).

5.4. Limitations

This study has several limitations that need to be acknowledged. First, the article sampling consisted only of articles indexed in the Web of Science (WOS). Thus, articles published in other sources were not included in the sample. Furthermore, review papers and book chapters were excluded from this study and, consequently, their bibliometric features were not assessed. Although there were no language restrictions in the article selection, most of the documents were written in the English language, which may reflect an innate limitation of selecting documents from the WOS. Moreover, the search for documents stating “usability” or “acceptability” in their title, abstract, or keywords may mean that some documents that addressed the development of interventions may not have been captured if they did not use those terms. Finally, the use of publication metadata as a point of analysis of publication trends means that the analysis of trends can be impacted by potential discrepancies found within the source data (e.g., incorrect characters, misspellings of authors’ names, institutional affiliations, omissions of citations, etc.) (Pranckutė 2021).

6. Conclusions

Despite these limitations, this study contributes to the analysis of the evolution of the ways that intervention acceptability is understood, assessed, and prioritised in studies of digital mental health-related interventions. Furthermore, the use of the term “digital mental health-related interventions” allowed the inclusion of studies that reported interventions to support general mental well-being and were directed to any type of user. Finally, the comprehensive examination of authors’ keywords allowed us to map technology-related terms separately from intervention acceptability terms and thus gain a clearer view of the key drivers of publication interests.

Implications for Future Research

The results of this study demonstrated how the study of the acceptability of digital mental health-related interventions has evolved to encompass a large variety of parameters that transcend the different stages of intervention development and evaluation. Future research will need to identify the parameters of acceptability that are addressed in different studies and the degree to which the assessments undertaken adequately address acceptability, and eventually, explore its links with intervention adoption rates and real-world impact. For example, systematic reviews in high-, middle-, and low-income countries can explore the relevant evidence on interventions’ acceptability within those contexts and the barriers to their implementation. Moreover, future research will need to explore situations and contexts where digital mental health-related interventions may even exacerbate existing inequalities (Krukowski et al. 2024). Finally, both reviews of existing evidence and empirical research can inform the development of guidelines for intervention acceptability to be adequately reported even in cases where it serves as a secondary outcome to large randomised controlled trials.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review was not required for this study as it did not involve humans.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Adepoju, Paul. 2020. Africa turns to telemedicine to close mental health gap. The Lancet Digital Health 2: e571–e572. [Google Scholar] [CrossRef]
  2. Agarwal, Ashok, Damayanthi Durairajanayagam, Sindhuja Tatagari, Sandro C. Esteves, Avi Harlev, Ralf Henkel, Shubhadeep Roychoudhury, Sheryl Homa, Nicolás Garrido Puchalt, and Ranjith Ramasamy. 2016. Bibliometrics: Tracking research impact by selecting the appropriate metrics. Asian Journal of Andrology 18: 296. [Google Scholar] [CrossRef]
  3. Alam, Mirza Mohammad Didarul, Mohammad Zahedul Alam, Syed Abidur Rahman, and Seyedeh Khadijeh Taghizadeh. 2021. Factors influencing mHealth adoption and its impact on mental well-being during COVID-19 pandemic: A SEM-ANN approach. Journal of Biomedical Informatics 116: 103722. [Google Scholar] [CrossRef]
  4. Apolinário-Hagen, Jennifer, Mathias Harrer, Fanny Kählke, Lara Fritsche, Christel Salewski, and David Daniel Ebert. 2018. Public attitudes toward guided internet-based therapies: Web-based survey study. JMIR Mental Health 5: e10735. [Google Scholar] [CrossRef]
  5. Apolinário-Hagen, Jennifer, Viktor Vehreschild, and Ramez M. Alkoudmani. 2017. Current views and perspectives on e-mental health: An exploratory survey study for understanding public attitudes toward internet-based psychotherapy in Germany. JMIR Mental Health 4: e6375. [Google Scholar] [CrossRef]
  6. Arnold, Chelsea, John Farhall, Kristi-Ann Villagonzalo, Kriti Sharma, and Neil Thomas. 2021. Engagement with online psychosocial interventions for psychosis: A review and synthesis of relevant factors. Internet Interventions 25: 100411. [Google Scholar] [CrossRef] [PubMed]
  7. Balcombe, Luke, and Diego De Leo. 2021. Digital mental health challenges and the horizon ahead for solutions. JMIR Mental Health 8: e26811. [Google Scholar] [CrossRef] [PubMed]
  8. Banerjee, Debanjan, Bhavika Vajawat, and Prateek Varshney. 2021. Digital gaming interventions: A novel paradigm in mental health? Perspectives from India. International Review of Psychiatry 33: 435–41. [Google Scholar] [CrossRef] [PubMed]
  9. Ben-Zeev, Dror, Susan M. Kaiser, Christopher J. Brenner, Mark Begale, Jennifer Duffecy, and David C. Mohr. 2013. Development and usability testing of FOCUS: A smartphone system for self-management of schizophrenia. Psychiatric Rehabilitation Journal 36: 289. [Google Scholar] [CrossRef] [PubMed]
  10. Borghouts, Judith, Elizabeth Eikey, Gloria Mark, Cinthia De Leon, Stephen M. Schueller, Margaret Schneider, Nicole Stadnick, Kai Zheng, Dana Mukamel, and Dara H. Sorkin. 2021. Barriers to and facilitators of user engagement with digital mental health interventions: Systematic review. Journal of Medical Internet Research 23: e24387. [Google Scholar] [CrossRef] [PubMed]
  11. Bowen, Deborah J., Matthew Kreuter, Bonnie Spring, Ludmila Cofta-Woerpel, Laura Linnan, Diane Weiner, Suzanne Bakken, Cecilia Patrick Kaplan, Linda Squiers, and Cecilia Fabrizio. 2009. How we design feasibility studies. American Journal of Preventive Medicine 36: 452–57. [Google Scholar] [CrossRef]
  12. Caputo, Andrea, and Mariya Kargina. 2022. A user-friendly method to merge Scopus and Web of Science data during bibliometric analysis. Journal of Marketing Analytics 10: 82–88. [Google Scholar] [CrossRef]
  13. Chan, Amy Hai Yan, and Michelle L. L. Honey. 2022. User perceptions of mobile digital apps for mental health: Acceptability and usability-An integrative review. Journal of Psychiatric and Mental Health Nursing 29: 147–68. [Google Scholar] [CrossRef]
  14. Comer, Jonathan S., Jami M. Furr, Elizabeth M. Miguel, Christine E. Cooper-Vince, Aubrey L. Carpenter, R. Meredith Elkins, Caroline E. Kerns, Danielle Cornacchio, Tommy Chou, and Stefany Coxe. 2017. Remotely delivering real-time parent training to the home: An initial randomized trial of Internet-delivered parent–child interaction therapy (I-PCIT). Journal of Consulting and Clinical Psychology 85: 909. [Google Scholar] [CrossRef]
  15. De Veirman, Ann E. M., Viviane Thewissen, Matthijs G. Spruijt, and Catherine A. W. Bolman. 2022. Factors Associated With Intention and Use of e–Mental Health by Mental Health Counselors in General Practices: Web-Based Survey. JMIR Formative Research 6: e34754. [Google Scholar] [CrossRef] [PubMed]
  16. Dominguez-Rodriguez, Alejandro, Reyna Jazmín Martínez-Arriaga, Paulina Erika Herdoiza-Arroyo, Eduardo Bautista-Valerio, Anabel de la Rosa-Gómez, Rosa Olimpia Castellanos Vargas, Laura Lacomba-Trejo, Joaquín Mateu-Mollá, Miriam de Jesús Lupercio Ramírez, and Jairo Alejandro Figueroa González. 2022. E-health psychological intervention for COVID-19 healthcare workers: Protocol for its implementation and evaluation. International Journal of Environmental Research and Public Health 19: 12749. [Google Scholar] [CrossRef]
  17. Donaghy, Eddie, Helen Atherton, Victoria Hammersley, Hannah McNeilly, Annemieke Bikker, Lucy Robbins, John Campbell, and Brian McKinstry. 2019. Acceptability, benefits, and challenges of video consulting: A qualitative study in primary care. British Journal of General Practice 69: e586–e594. [Google Scholar] [CrossRef] [PubMed]
  18. Ebadi, Ashkan, and Andrea Schiffauerova. 2015. Bibliometric analysis of the impact of funding on scientific development of researchers. International Journal of Computer and Information Engineering 9: 1541–51. [Google Scholar]
  19. Ellis, Louise A., Isabelle Meulenbroeks, Kate Churruca, Chiara Pomare, Sarah Hatem, Reema Harrison, Yvonne Zurynski, and Jeffrey Braithwaite. 2021. The application of e-mental health in response to COVID-19: Scoping review and bibliometric analysis. JMIR Mental Health 8: e32948. [Google Scholar] [CrossRef]
  20. Fitzpatrick, Kathleen Kara, Alison Darcy, and Molly Vierhile. 2017. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health 4: e7785. [Google Scholar] [CrossRef]
  21. Flujas-Contreras, Juan M., Azucena García-Palacios, and Inmaculada Gómez. 2023. Technology in psychology: A bibliometric analysis of technology-based interventions in clinical and health psychology. Informatics for Health and Social Care 48: 47–67. [Google Scholar] [CrossRef]
  22. Gama, Ana, Maria J. Marques, João Victor Rocha, Sofia Azeredo-Lopes, Walaa Kinaan, Ana Sá Machado, and Sónia Dias. 2022. ‘I didn’t know where to go’: A mixed-methods approach to explore migrants’ perspectives of access and use of health services during the COVID-19 pandemic. International Journal of Environmental Research and Public Health 19: 13201. [Google Scholar] [CrossRef]
  23. Gan, Daniel Z. Q., Lauren McGillivray, Mark E. Larsen, Helen Christensen, and Michelle Torok. 2022. Technology-supported strategies for promoting user engagement with digital mental health interventions: A systematic review. Digital Health 8: 20552076221098268. [Google Scholar] [CrossRef]
  24. Gauthier-Beaupré, Amélie, and Sylvie Grosjean. 2023. Understanding acceptability of digital health technologies among francophone-speaking communities across the world: A metaethnographic study. Frontiers in Communication 8: 1230015. [Google Scholar] [CrossRef]
  25. Gega, Lina, Dina Jankovic, Pedro Saramago, David Marshall, Sarah Dawson, Sally Brabyn, Georgios F. Nikolaidis, Hollie Melton, Rachel Churchill, and Laura Bojke. 2022. Digital interventions in mental health: Evidence syntheses and economic modelling. Health Technology Assessment (Winchester, England) 26: 1. [Google Scholar] [CrossRef]
  26. Goodman, Ruth, Linda Tip, and Kate Cavanagh. 2021. There’s an app for that: Context, assumptions, possibilities and potential pitfalls in the use of digital technologies to address refugee mental health. Journal of Refugee Studies 34: 2252–74. [Google Scholar] [CrossRef]
  27. Gotadki, Rahul. 2024. Digital Mental Health Market Research Report Information by Component (Software and Services), by Disorder Type (Anxiety Disorder, Bipolar Disorder, Post-Traumatic Stress Disorder (PTSD), Substance Abuse Disorder, and Others), by Age Group (Children & Adolescents, Adult, and Geriatric), End User (Patients, Payers, and Providers) and by Region (North America, Europe, Asia-Pacific, and Rest of the World)—Forecast till 2032. Available online: https://www.marketresearchfuture.com/reports/digital-mental-health-market-11062 (accessed on 5 January 2024).
  28. Gun, Shih Ying, Nickolai Titov, and Gavin Andrews. 2011. Acceptability of Internet treatment of anxiety and depression. Australasian Psychiatry 19: 259–64. [Google Scholar] [CrossRef]
  29. Gusenbauer, Michael. 2022. Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases. Scientometrics 127: 2683–745. [Google Scholar] [CrossRef]
  30. Hekler, Eric B., Predrag Klasnja, William T. Riley, Matthew P. Buman, Jennifer Huberty, Daniel E. Rivera, and Cesar A. Martin. 2016. Agile science: Creating useful products for behavior change in the real world. Translational Behavioral Medicine 6: 317–28. [Google Scholar] [CrossRef]
  31. Helha, Fernandes-Nascimento Maria, and Yuan-Pang Wang. 2022. Trends in complementary and alternative medicine for the treatment of common mental disorders: A bibliometric analysis of two decades. Complementary Therapies in Clinical Practice 46: 101531. [Google Scholar] [CrossRef]
  32. Hernández-Torrano, Daniel, Laura Ibrayeva, Jason Sparks, Natalya Lim, Alessandra Clementi, Ainur Almukhambetova, Yerden Nurtayev, and Ainur Muratkyzy. 2020. Mental health and well-being of university students: A bibliometric mapping of the literature. Frontiers in Psychology 11: 1226. [Google Scholar] [CrossRef]
  33. Hetrick, Sarah Elisabeth, Jo Robinson, Eloise Burge, Ryan Blandon, Bianca Mobilio, Simon M. Rice, Magenta B. Simmons, Mario Alvarez-Jimenez, Simon Goodrich, and Christopher G. Davey. 2018. Youth codesign of a mobile phone app to facilitate self-monitoring and management of mood symptoms in young people with major depression, suicidal ideation, and self-harm. JMIR Mental Health 5: e9041. [Google Scholar] [CrossRef]
  34. Huberty, Jennifer, Jeni Green, Christine Glissmann, Linda Larkey, Megan Puzia, and Chong Lee. 2019. Efficacy of the mindfulness meditation mobile app “calm” to reduce stress among college students: Randomized controlled trial. JMIR mHealth and uHealth 7: e14273. [Google Scholar] [CrossRef]
  35. Hunter, John F., Lisa C. Walsh, Chi-Keung Chan, and Stephen M. Schueller. 2023. The good side of technology: How we can harness the positive potential of digital technology to maximize well-being. Frontiers in Psychology 14: 1304592. [Google Scholar] [CrossRef]
  36. Johnson, Elizabeth I., Olivier Grondin, Marion Barrault, Malika Faytout, Sylvia Helbig, Mathilde Husky, Eric L. Granholm, Catherine Loh, Louise Nadeau, and Hans-Ulrich Wittchen. 2009. Computerized ambulatory monitoring in psychiatry: A multi-site collaborative study of acceptability, compliance, and reactivity. International Journal of Methods in Psychiatric Research 18: 48–57. [Google Scholar] [CrossRef]
  37. Kaihlanen, Anu-Marja, Lotta Virtanen, Ulla Buchert, Nuriiar Safarov, Paula Valkonen, Laura Hietapakka, Iiris Hörhammer, Sari Kujala, Anne Kouvonen, and Tarja Heponiemi. 2022. Towards digital health equity-a qualitative study of the challenges experienced by vulnerable groups in using digital health services in the COVID-19 era. BMC Health Services Research 22: 188. [Google Scholar] [CrossRef]
  38. Krukowski, Rebecca A., Kathryn M. Ross, Max J. Western, Rosie Cooper, Heide Busse, Cynthia Forbes, Emmanuel Kuntsche, Anila Allmeta, Anabelle Macedo Silva, and Yetunde O. John-Akinola. 2024. Digital health interventions for all? Examining inclusivity across all stages of the digital health intervention research process. Trials 25: 98. [Google Scholar] [CrossRef]
  39. Lancaster, Gillian A., and Lehana Thabane. 2019. Guidelines for reporting non-randomised pilot and feasibility studies. Pilot and Feasibility Studies 5: 114. [Google Scholar] [CrossRef]
  40. Levin, Michael E., Jack A. Haeger, Benjamin G. Pierce, and Michael P. Twohig. 2017. Web-based acceptance and commitment therapy for mental health problems in college students: A randomized controlled trial. Behavior Modification 41: 141–62. [Google Scholar] [CrossRef]
  41. Lipschitz, Jessica M., Chelsea K. Pike, Timothy P. Hogan, Susan A. Murphy, and Katherine E. Burdick. 2023. The engagement problem: A review of engagement with digital mental health interventions and recommendations for a path forward. Current Treatment Options in Psychiatry 10: 119–35. [Google Scholar] [CrossRef]
  42. Lipschitz, Jessica M., Rachel Van Boxtel, John Torous, Joseph Firth, Julia G Lebovitz, Katherine E. Burdick, and Timothy P. Hogan. 2022. Digital mental health interventions for depression: Scoping review of user engagement. Journal of Medical Internet Research 24: e39204. [Google Scholar] [CrossRef]
  43. Lu, Zheng-An, Le Shi, Jian-Yu Que, Yong-Bo Zheng, Qian-Wen Wang, Wei-Jian Liu, Yue-Tong Huang, Xiao-Xing Liu, Kai Yuan, and Wei Yan. 2022. Accessibility to digital mental health services among the general public throughout COVID-19: Trajectories, influencing factors and association with long-term mental health symptoms. International Journal of Environmental Research and Public Health 19: 3593. [Google Scholar] [CrossRef]
  44. Mindsolent. n.d. Available online: https://www.solentmind.org.uk/news-events/news/phased-return-to-mental-health-face-to-face-services/ (accessed on 5 January 2024).
  45. Musiat, Peter, Philip Goldstone, and Nicholas Tarrier. 2014. Understanding the acceptability of e-mental health-attitudes and expectations towards computerised self-help treatments for mental health problems. BMC Psychiatry 14: 109. [Google Scholar] [CrossRef]
  46. Niel, Gilles, Fabrice Boyrie, and David Virieux. 2015. Chemical bibliographic databases: The influence of term indexing policies on topic searches. New Journal of Chemistry 39: 8807–17. [Google Scholar] [CrossRef]
  47. Nwosu, Adaora, Samantha Boardman, Mustafa M Husain, and P. Murali Doraiswamy. 2022. Digital therapeutics for mental health: Is attrition the Achilles heel? Frontiers in Psychiatry 13: 900615. [Google Scholar] [CrossRef] [PubMed]
  48. Orsmond, Gael I., and Ellen S. Cohn. 2015. The distinctive features of a feasibility study: Objectives and guiding questions. OTJR: Occupation, Participation and Health 35: 169–77. [Google Scholar] [CrossRef]
  49. Park, Susanna Y., Chloe Nicksic Sigmon, Debra Boeldt, and Chloe A. Nicksic Sigmon. 2022. A Framework for the Implementation of Digital Mental Health Interventions: The Importance of Feasibility and Acceptability Research. Cureus 14: e29329. [Google Scholar] [CrossRef]
  50. Perski, Olga, and Camille E. Short. 2021. Acceptability of digital health interventions: Embracing the complexity. Translational Behavioral Medicine 11: 1473–80. [Google Scholar] [CrossRef]
  51. Pranckutė, Raminta. 2021. Web of Science (WoS) and Scopus: The titans of bibliographic information in today’s academic world. Publications 9: 12. [Google Scholar] [CrossRef]
  52. Prochaska, Judith J., Erin A. Vogel, Amy Chieng, Matthew Kendra, Michael Baiocchi, Sarah Pajarito, and Athena Robinson. 2021. A therapeutic relational agent for reducing problematic substance use (Woebot): Development and usability study. Journal of Medical Internet Research 23: e24850. [Google Scholar] [CrossRef]
  53. Renfrew, Melanie Elise, Darren Peter Morton, Jason Kyle Morton, and Geraldine Przybylko. 2021. The influence of human support on the effectiveness of digital mental health promotion interventions for the general population. Frontiers in Psychology 12: 716106. [Google Scholar] [CrossRef] [PubMed]
  54. Riboldi, Ilaria, Daniele Cavaleri, Angela Calabrese, Chiara Alessandra Capogrosso, Susanna Piacenti, Francesco Bartoli, Cristina Crocamo, and Giuseppe Carrà. 2023. Digital mental health interventions for anxiety and depressive symptoms in university students during the COVID-19 pandemic: A systematic review of randomized controlled trials. Revista de Psiquiatria y Salud Mental 16: 47–58. [Google Scholar] [CrossRef] [PubMed]
  55. Rice, Simon, John Gleeson, Christopher Davey, Sarah Hetrick, Alexandra Parker, Reeva Lederman, Greg Wadley, Greg Murray, Helen Herrman, and Richard Chambers. 2018. Moderated online social therapy for depression relapse prevention in young people: Pilot study of a ‘next generation’ online intervention. Early Intervention in Psychiatry 12: 613–25. [Google Scholar] [CrossRef] [PubMed]
  56. Robinson, Emma, Nickolai Titov, Gavin Andrews, Karen McIntyre, Genevieve Schwencke, and Karen Solley. 2010. Internet treatment for generalized anxiety disorder: A randomized controlled trial comparing clinician vs. technician assistance. PLoS ONE 5: e10942. [Google Scholar] [CrossRef]
  57. Roland, Jonty, Emma Lawrance, Tom Insel, and Helen Christensen. 2020. The Digital Mental Health Revolution: Transforming Care through Innovation and Scale-Up. Available online: https://www.wish.org.qa/wp-content/uploads/2021/08/044E.pdf (accessed on 5 January 2024).
  58. Sawrikar, Vilas, and Kellie Mote. 2022. Technology acceptance and trust: Overlooked considerations in young people’s use of digital mental health interventions. Health Policy and Technology 11: 100686. [Google Scholar] [CrossRef]
  59. Sekhon, Mandeep, Martin Cartwright, and Jill J. Francis. 2017. Acceptability of healthcare interventions: An overview of reviews and development of a theoretical framework. BMC Health Services Research 17: 88. [Google Scholar] [CrossRef]
  60. Semwanga, Agnes Rwashana, Hasifah Kasujja Namatovu, Swaib Kyanda, Mark Kaawaase, and Abraham Magumba. 2021. An eHealth adoption framework for developing countries: A systematic review. Health Informatics - An International Journal (HIJ) 10: 1–16. [Google Scholar] [CrossRef]
  61. Skivington, Kathryn, Lynsay Matthews, Sharon Anne Simpson, Peter Craig, Janis Baird, Jane M. Blazeby, Kathleen Anne Boyd, Neil Craig, David P. French, and Emma McIntosh. 2021. A new framework for developing and evaluating complex interventions: Update of Medical Research Council guidance. BMJ 374: n2061. [Google Scholar] [CrossRef]
  62. Skute, Igors, Kasia Zalewska-Kurek, Isabella Hatak, and Petra de Weerd-Nederhof. 2019. Mapping the field: A bibliometric analysis of the literature on university–industry collaborations. The Journal of Technology Transfer 44: 916–47. [Google Scholar] [CrossRef]
  63. Suganuma, Shinichiro, Daisuke Sakamoto, and Haruhiko Shimoyama. 2018. An embodied conversational agent for unguided internet-based cognitive behavior therapy in preventative mental health: Feasibility and acceptability pilot trial. JMIR Mental Health 5: e10454. [Google Scholar] [CrossRef]
  64. The Scottish Government. 2021. NHS Recovery Plan 2021–2026. Available online: https://www.gov.scot/binaries/content/documents/govscot/publications/strategy-plan/2021/08/nhs-recovery-plan/documents/nhs-recovery-plan-2021-2026/nhs-recovery-plan-2021-2026/govscot%3Adocument/nhs-recovery-plan-2021-2026.pdf (accessed on 5 January 2024).
  65. Titov, Nickolai, Blake F. Dear, Luke Johnston, Carolyn Lorian, Judy Zou, Bethany Wootton, Jay Spence, Peter M McEvoy, and Ronald M. Rapee. 2013. Improving adherence and clinical outcomes in self-guided internet treatment for anxiety and depression: Randomised controlled trial. PLoS ONE 8: e62873. [Google Scholar] [CrossRef]
  66. Titov, Nickolai, Gavin Andrews, Matthew Davies, Karen McIntyre, Emma Robinson, and Karen Solley. 2010. Internet treatment for depression: A randomized controlled trial comparing clinician vs. technician assistance. PLoS ONE 5: e10939. [Google Scholar] [CrossRef]
  67. Torous, John, and Adam Haim. 2018. Dichotomies in the development and implementation of digital mental health tools. Psychiatric Services 69: 1204–6. [Google Scholar] [CrossRef]
  68. Witteveen, A. B., S. Young, P. Cuijpers, J. L. Ayuso-Mateos, C. Barbui, F. Bertolini, M. Cabello, C. Cadorin, N. Downes, and D. Franzoi. 2022. Remote mental health care interventions during the COVID-19 pandemic: An umbrella review. Behaviour Research and Therapy 159: 104226. [Google Scholar] [CrossRef]
  69. Zale, Arya, Meagan Lasecke, Katerina Baeza-Hernandez, Alanna Testerman, Shirin Aghakhani, Ricardo F. Muñoz, and Eduardo L. Bunge. 2021. Technology and psychotherapeutic interventions: Bibliometric analysis of the past four decades. Internet Interventions 25: 100425. [Google Scholar] [CrossRef]
Figure 1. Times cited and publications over time.
Figure 2. Authors’ collaboration maps.
Figure 3. Clusters of Text Co-occurrence in documents’ titles and abstracts.
Figure 4. Network analysis of all author keywords.
Figure 5. Technology keywords’ publication year mapping.
Figure 6. Acceptability research clusters.
Figure 7. Documents’ bibliographical coupling.
Table 1. Top 10 authors.
| Rank | Author | Documents | Country |
|---|---|---|---|
| 1 | Nickolai Titov | 14 | Australia |
| 2 | Blake F. Dear | 11 | Australia |
| 3 | David C. Mohr | 11 | USA |
| 4 | Mario Alvarez-Jimenez | 11 | Australia |
| 5 | Gerhard Andersson | 11 | Sweden |
| 6 | Helen Christensen | 10 | Australia |
| 7 | Pim Cuijpers | 9 | Vrije Universiteit Amsterdam |
| 8 | Dror Ben-Zeev | 8 | USA |
| 9 | Helen Riper | 8 | Vrije Universiteit Amsterdam |
| 10 | Gavin Andrews | 7 | Australia |
Table 2. Top ranking knowledge contributors.
| Ranking | Publication Title | Records | Citations | Affiliation | Records | Citations | Country | Records | Citations |
|---|---|---|---|---|---|---|---|---|---|
| 1 | JMIR Formative Research | 89 | 210 | University of Melbourne | 46 | 800 | USA | 391 | 5737 |
| 2 | JMIR Mental Health | 72 | 1893 | Kings College London | 41 | 638 | Australia | 193 | 3671 |
| 3 | Journal of Medical Internet Research | 59 | 1472 | University of Sydney | 34 | 492 | England | 163 | 1955 |
| 4 | Internet Interventions-The Application of Information Technology in Mental and Behavioural Health | 39 | 378 | University of Washington | 31 | 506 | Germany | 71 | 805 |
| 5 | International Journal of Environmental Research and Public Health | 28 | 240 | University of New South Wales | 29 | 950 | Canada | 70 | 475 |
| 6 | JMIR mHealth and uHealth | 24 | 617 | Stanford University | 24 | 252 | Netherlands | 67 | 880 |
| 7 | Frontiers in Psychiatry | 22 | 266 | Northwestern University | 22 | 381 | Sweden | 39 | 461 |
| 8 | Mindfulness | 14 | 116 | Monash University | 22 | 327 | Spain | 37 | 400 |
| 9 | Journal of Affective Disorders | 12 | 327 | Vrije University of Amsterdam | 21 | 244 | People's Republic of China | 26 | 206 |
| 10 | BMC Psychiatry | 12 | 317 | University of Oxford | 20 | 161 | New Zealand | 22 | 289 |
Table 3. Frequencies of technology-relevant keywords.
| Keyword | Occurrences | Keyword | Occurrences |
|---|---|---|---|
| mhealth | 107 | e-health | 19 |
| mobile phone | 76 | online intervention | 18 |
| internet | 54 | web-based intervention | 16 |
| digital health | 47 | mobile applications | 15 |
| mobile health | 45 | internet intervention | 14 |
| ehealth | 42 | internet-based intervention | 14 |
| telehealth | 38 | web-based | 14 |
| telemedicine | 37 | artificial intelligence | 13 |
| digital mental health | 33 | app | 12 |
| technology | 33 | digital | 12 |
| smartphone | 32 | conversational agent | 11 |
| digital intervention | 30 | icbt | 11 |
| e-mental health | 29 | internet interventions | 11 |
| online | 27 | chatbot | 10 |
| mobile apps | 26 | | |
| virtual reality | 22 | | |
| mobile app | 21 | | |
Table 4. Frequencies of authors’ keywords relevant to intervention acceptability research.
| Keyword | Occurrences | Keyword | Occurrences | Keyword | Occurrences |
|---|---|---|---|---|---|
| Acceptability | 56 | Social media | 11 | Workplace | 7 |
| Feasibility | 45 | Survey | 11 | Clinical trial | 6 |
| COVID-19 | 45 | Young adult | 11 | Mixed methods | 6 |
| Usability | 42 | Feasibility study | 10 | Qualitative evaluation | 6 |
| Adolescent | 32 | University students | 10 | Qualitative study | 6 |
| Adolescents | 32 | Attitudes | 9 | RCT | 6 |
| Implementation | 24 | Pandemic | 9 | Barriers | 5 |
| Veterans | 21 | Pilot study | 9 | Breast cancer | 5 |
| Youth | 21 | Pregnancy | 9 | Chronic illness | 5 |
| Qualitative research | 20 | Usability testing | 9 | COVID-19 pandemic | 5 |
| Suicide | 16 | Acceptance | 8 | Cultural adaptation | 5 |
| Young people | 15 | Adherence | 8 | Homelessness | 5 |
| College students | 14 | Children | 8 | Nurses | 5 |
| Engagement | 14 | Development | 8 | Participatory design | 5 |
| Qualitative | 14 | HIV | 8 | Perception | 5 |
| Caregivers | 13 | Women | 8 | Perinatal | 5 |
| User experience | 13 | Young adults | 8 | Postpartum period | 5 |
| User-centered design | 13 | Child | 7 | Rural | 5 |
| Co-design | 11 | Dementia | 7 | User centered design | 5 |
| Older adults | 11 | Design | 7 | Veteran | 5 |
| Parents | 11 | Students | 7 | | |
Table 5. Top 10 most cited publications.
| Ranking | Document | Title | Citations |
|---|---|---|---|
| 1 | Fitzpatrick et al. (2017) | Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial | 655 |
| 2 | Donaghy et al. (2019) | Acceptability, benefits, and challenges of video consulting: a qualitative study in primary care | 242 |
| 3 | Titov et al. (2010) | Internet Treatment for Depression: A Randomized Controlled Trial Comparing Clinician vs. Technician Assistance | 235 |
| 4 | Ben-Zeev et al. (2013) | Development and usability testing of FOCUS: A smartphone system for self-management of schizophrenia | 189 |
| 5 | Titov et al. (2013) | Improving Adherence and Clinical Outcomes in Self-Guided Internet Treatment for Anxiety and Depression: Randomised Controlled Trial | 173 |
| 6 | Huberty et al. (2019) | Efficacy of the Mindfulness Meditation Mobile App "Calm" to Reduce Stress Among College Students: Randomized Controlled Trial | 163 |
| 7 | Robinson et al. (2010) | Internet Treatment for Generalized Anxiety Disorder: A Randomized Controlled Trial Comparing Clinician vs. Technician Assistance | 162 |
| 8 | Musiat et al. (2014) | Understanding the acceptability of e-mental health—attitudes and expectations towards computerised self-help treatments for mental health problems | 150 |
| 9 | Comer et al. (2017) | Remotely delivering real-time parent training to the home: An initial randomized trial of Internet-delivered parent–child interaction therapy (I-PCIT) | 126 |
| 10 | Levin et al. (2017) | Web-Based Acceptance and Commitment Therapy for Mental Health Problems in College Students: A Randomized Controlled Trial | 124 |