Article

The Electronic Mental Wellness Tool as a Self-Administered Brief Screening Instrument for Mental Disorders in the General Spanish Population during the Post-COVID-19 Era

by Ismael Martinez-Nicolas 1,2, Cale Basaraba 3, David Delgado-Gomez 4,5, Olatz Lopez-Fernandez 6,7,8,9,*, Enrique Baca-Garcia 6,10,11,12,13,14,15,16,* and Milton L. Wainberg 3

1 Faculty of Life Sciences, Catholic University of Murcia (UCAM), 30107 Guadalupe, Spain
2 Sistema Español de Notificación en Seguridad en Anestesia y Reanimación (SENSAR), IDEhA Simulation Centre, Fundación Alcorcon University Hospital, 28922 Alcorcon, Spain
3 New York State Psychiatric Institute, Columbia University, New York, NY 10024, USA
4 Department of Statistics, University Carlos III of Madrid, 28903 Getafe, Spain
5 Santander Big Data Institute, Universidad Carlos III de Madrid, 28903 Getafe, Spain
6 Department of Psychiatry, University Hospital Jimenez Diaz Foundation, 28040 Madrid, Spain
7 Faculty of Psychology, Madrid Complutense University, 28049 Madrid, Spain
8 Faculty of Psychology, Francisco de Vitoria University, 28049 Madrid, Spain
9 Faculty of Psychology, Cardenal Cisneros Centro de Estudios Universitarios, 28223 Madrid, Spain
10 Department of Psychiatry, University Hospital Rey Juan Carlos, 28933 Móstoles, Spain
11 Department of Psychiatry, General University Hospital of Villalba, 28400 Collado Villalba, Spain
12 Department of Psychiatry, University Hospital Infanta Elena, 28342 Valdemoro, Spain
13 Department of Psychiatry, Madrid Autonomous University, 28049 Madrid, Spain
14 CIBERSAM (Centro de Investigacion en Salud Mental), Carlos III Institute of Health, 28029 Madrid, Spain
15 Faculty of Psychology, Universidad Catolica del Maule, Talca 3605, Chile
16 Department of Psychiatry, Centre Hospitalier Universitaire de Nîmes, 30900 Nîmes, France
* Authors to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2023, 20(4), 3204; https://doi.org/10.3390/ijerph20043204
Submission received: 23 December 2022 / Revised: 29 January 2023 / Accepted: 3 February 2023 / Published: 11 February 2023
(This article belongs to the Special Issue Post-COVID-19 Phase Becomes the New Reality)

Abstract

(1) Background: In the “post-COVID-19 era”, there is a need to properly assess and address the extent of the pandemic’s well-established mental health collateral damage. The “Electronic Mental Wellness Tool” (E-mwTool) is a 13-item validated stepped-care or stratified management instrument that aims to capture, with high sensitivity, individuals with mental disorders in order to determine the need for mental health care. This study validated the E-mwTool in a Spanish-speaking population. (2) Methods: This was a cross-sectional validation study using the Mini International Neuropsychiatric Interview as the criterion standard in a sample of 433 participants. (3) Results: About 72% of the sample had a psychiatric disorder, and 67% had a common mental disorder. Severe mental disorders, alcohol use disorders, substance use disorders, and suicide risk had much lower prevalence rates (6.7%, 6.2%, 3.2%, and 6.2%, respectively). The first three items performed excellently in identifying any mental health disorder, with a sensitivity of 0.97. Ten additional items classified participants with common mental disorders, severe mental disorders, substance use disorders, and suicide risk. (4) Conclusions: The E-mwTool had high sensitivity in identifying common mental disorders, alcohol and substance use disorders, and suicide risk. However, its sensitivity in detecting severe mental disorders, which had a low prevalence in the sample, was low. This Spanish version may be useful for detecting patients at risk of mental health burden at the front line of primary and secondary care, facilitating help-seeking and referral by their physicians.

1. Introduction

Mental disorders have been identified as the leading global causes of years lived with disability [1]. Moreover, people with mental health problems around the world have limited access to psychological care [2,3,4]. Developing countries face vast challenges in financing care for these conditions and in ensuring the availability of mental health services [5,6]. In low- and middle-income countries, more than 75% of people identified with serious anxiety, mood, impulse control, or substance use disorders did not receive any traditional or digital care [7,8,9]. Even developed countries such as the United States (U.S.) face mental health access problems, with nearly 40% of the population residing in areas where no mental health professionals are available [10]. For these reasons, there is an urgent need to implement screening procedures in health centers to detect possible mental health burden in attending patients and to facilitate their referral to psychological or psychiatric services.
Internet-based medical care and telehealth have profoundly changed how physical and mental health care is accessed and sought, yet only in settings where the proper infrastructure exists [11]. Similarly, digital technology is increasingly being used to address the mental health access crisis worldwide [12,13], albeit mostly focused on depression and anxiety symptoms [9,14,15]. Telehealth should be integrated with public health strategies to narrow the large mental health treatment gap, increase access, and enhance the ability of community medical staff to identify and manage psychological problems, especially after the COVID-19 pandemic [2,16,17,18,19,20].
The underrated importance of mental health in public health is now highlighted by the impact of the COVID-19 pandemic [21]. During the pandemic, many factors were identified as contributors to widespread emotional distress in the general population, leading both to a higher prevalence of mental health and substance use symptoms and disorders globally and to recurrences or worsening of symptoms among people with pre-existing mental disorders [22]. Studies have shown higher levels of depression, anxiety, and sleep disturbances among patients during strict lockdown measures in many countries that had to flexibly manage hospital practices during the pandemic [23,24,25,26], including Spain [27,28]. Therefore, in this “post-COVID-19 era”, it is necessary to properly assess and address the extent of this mental health collateral damage and to use mobile health (mHealth) tools to cover the reported gap in mental health services.
At this preliminary stage of emerging mHealth tools after the outbreak of the COVID-19 pandemic, there is a need for specific mHealth screening instruments that can sensitively detect patients attending healthcare settings with potential mental health problems. These problems can go unnoticed in consultations with physicians or other health professionals who are not psychiatrists or clinical psychologists.
One such advance, which covers a gap in current screening instruments, is the “Electronic Mental Wellness Tool” (E-mwTool) [29], which was developed and validated in a low-income country, Mozambique [2,30]. It has therefore not yet been tested in developed countries or in languages other than Portuguese. It is a brief screening and triage tool that could provide a substantial benefit to healthcare systems in low-, middle-, and high-income countries.
The original E-mwTool is a 12-item instrument designed as a stepped- or stratified-care management approach to rigorously improve access to care. The E-mwTool supports nonpsychiatric specialists in primary and secondary care settings, such as general practitioners (GPs) and specialist physicians, as well as other allied health professionals in community settings, in screening patients for mental disorders [29]. In the original setting, the first three items identified any mental disorder with excellent sensitivity (94%), and nine additional items further classified individuals into four treatment categories (i.e., common mental disorders, severe mental disorders, substance use disorders, and suicide risk) with high specificity (63–93%) [29].
However, the original E-mwTool does not include an item screening for substance use disorder because of the low prevalence of these disorders in the sample used for its development. Accordingly, we added a single-item measure of illicit drug use and nonmedical use of prescription drugs [31] to facilitate the detection of addiction in healthcare settings. As a result, we examined the performance of an updated E-mwTool comprising 13 items.
Alongside this screening tool, five internationally validated, self-administered screening tests were employed: the single-question screening test for drug use [31], the Alcohol Use Disorders Identification Test (AUDIT-3 [32]), the Columbia-Suicide Severity Rating Scale (C-SSRS [33,34]), the Generalized Anxiety Disorder-7 (GAD-7 [35]), and the Patient Health Questionnaire-9 (PHQ-9 [36]). Altogether, the 13-item Spanish E-mwTool and these self-administered questionnaires could support the identification of mental disorders and guide patient referral from physicians to specialized mental healthcare units.
Therefore, the primary purpose of the current study was to examine the validity of the 13-item E-mwTool against the Mini International Neuropsychiatric Interview (MINI-Plus) diagnoses as the criterion standard in a Spanish population referred for psychiatric consultation. A secondary aim was to examine the performance of the aforementioned five self-administered questionnaires in a Spanish population to inform further use of these screening tools in mHealth settings.

2. Materials and Methods

2.1. Study Design

We conducted a cross-sectional validation study on mental health patients from Madrid (Spain). This paper is compliant with the Standards for the Reporting of Diagnostic Accuracy Studies (STARD) statement [37].

2.2. Participants and Procedure

Participants were recruited over a period of six months from primary and secondary care ambulatory services in the “Hospital Universitario Fundación Jiménez Díaz” (HUFJD) healthcare network. This network comprises ambulatory care centers and a general hospital with a catchment area of 300,000 people.
Eligible participants were required to meet two criteria: (a) being older than 18 years old and (b) presenting themselves for a psychiatric consultation, either self-referred or referred by a GP. Consecutive sampling was employed, as we intended to screen mental health patients in the routine care workflow of psychiatric units.
Recruitment consisted of inviting the patients referred by their GP who attended consultation at the psychiatric units of the HUFJD, either during their first visit or during a subsequent visit (i.e., at the first instance at which the E-mwTool could be applied). A clinician (a clinical psychologist or a psychiatrist) informed eligible participants about the study. After obtaining informed consent, participants completed the E-mwTool and the other screening tools on their smartphones. The MINI-Plus was then administered by the clinician during the same visit. Clinical data were recorded during participants’ routine healthcare visits.
Ethical approval was received from the “Instituto de Investigación Sanitaria de la Fundación Jiménez Díaz” (IIS-FJD: code ER1_PIC 76-2013_FJD). Informed consent was obtained from each participant before recruitment.

2.3. Materials and Measures

Between June and December 2019, psychiatrists received or telephoned patients as part of their routine visits, during which research psychologists conducted MINI-Plus interviews. The MINI-Plus is a structured diagnostic interview that has been widely adopted and can be administered in approximately 15 min [38,39,40]. It has been used as a reference standard across many contexts [41].
Participants were provided with a mobile app with a built-in questionnaire, the E-mwTool, to screen their mental health and classify them into treatment categories, as well as five self-administered questionnaires: the single-item drug use screening test [31], the AUDIT-3 [32], the C-SSRS [33,34], the GAD-7 [35], and the PHQ-9 [36]. We also evaluated the performance of the original E-mwTool and the AUDIT-3 with two different response options and cutoffs.
The E-mwTool is a validated instrument that was developed by identifying a small number of items from a battery of mental health screeners via least absolute shrinkage and selection operator (LASSO) regression. The items that best predicted the presence of a wide range of mental disorders, and that best classified individuals into treatment-oriented disorder categories, were selected [29]. This brief, two-step, 12-item instrument screens for severe mental disorders (including psychosis and mania), common mental disorders (including depression, anxiety, and post-traumatic stress disorder (PTSD)), alcohol use disorders, and suicide risk (see Table 1).
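For readers interested in the mechanics of this item-selection step, the following minimal R sketch illustrates LASSO-based selection with the glmnet package on simulated item responses; the variable names, data, and package choice are illustrative assumptions, not the procedure or data used to develop the E-mwTool.

```r
# Illustrative sketch only (simulated data, assumed glmnet workflow), not the
# E-mwTool development code: select screener items that predict a diagnosis.
library(glmnet)

set.seed(1)
items     <- matrix(rbinom(200 * 40, 1, 0.3), nrow = 200)  # hypothetical yes/no item responses
diagnosis <- rbinom(200, 1, 0.4)                           # hypothetical reference-standard diagnosis

# alpha = 1 gives the LASSO penalty; cross-validation chooses the shrinkage parameter
cv_fit <- cv.glmnet(x = items, y = diagnosis, family = "binomial", alpha = 1)

# Items with nonzero coefficients at the chosen lambda are the retained candidates
coefs          <- coef(cv_fit, s = "lambda.1se")
nonzero        <- rownames(coefs)[as.vector(coefs) != 0]
selected_items <- setdiff(nonzero, "(Intercept)")
selected_items
```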
The original E-mwTool does not have an item dedicated to drug use, because the sample used to develop the tool had a very low prevalence of substance use disorders. Therefore, in this sample, we explored if a 13th item focused on illicit drug use, taken from a single-item drug use screening test with high sensitivity and specificity [31], would improve screening for substance use disorders when added to the E-mwTool.

2.4. Analytical Strategy

After anonymization, all analyses were performed by an independent investigator (C.B.). We first used descriptive statistics to describe baseline characteristics, sum scores of the screening instruments, and the prevalence of psychiatric disorders among study participants. To ascertain the predictive performance of the E-mwTool, we calculated sensitivity and specificity when compared to MINI-Plus diagnoses for any mental disorder and each of the four disorder categories (health dimensions in Table 1).
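As a minimal illustration of these calculations (hypothetical vectors, not the study’s analysis code), sensitivity and specificity against a reference standard can be computed in base R as follows, with an exact binomial 95% confidence interval assumed for each point estimate.

```r
# Minimal base-R sketch with hypothetical data: sensitivity and specificity of a
# binary screener result against reference-standard (e.g., MINI-Plus) diagnoses.
sens_spec <- function(screen, reference) {
  tp <- sum(screen == 1 & reference == 1)  # true positives
  fn <- sum(screen == 0 & reference == 1)  # false negatives
  tn <- sum(screen == 0 & reference == 0)  # true negatives
  fp <- sum(screen == 1 & reference == 0)  # false positives
  list(
    sensitivity    = tp / (tp + fn),
    sensitivity_ci = binom.test(tp, tp + fn)$conf.int,  # exact 95% CI
    specificity    = tn / (tn + fp),
    specificity_ci = binom.test(tn, tn + fp)$conf.int
  )
}

# Hypothetical example: 1 = positive, 0 = negative
screen    <- c(1, 1, 0, 1, 0, 1, 0, 0, 1, 1)
reference <- c(1, 0, 0, 1, 0, 1, 1, 0, 1, 0)
sens_spec(screen, reference)
```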
For the other screening instruments, we constructed nonparametric receiver operating characteristic (ROC) curves to examine the performance of each scale over a range of sum score cutoffs. We then calculated the area under the ROC curve (AUC), a global measure of the discriminating ability of a test [42], which ranges from 0.5 (no discrimination) to 1.0 (perfect discrimination). An optimal empirical cutoff point was then identified via the Youden (J) index [42]. The J index combines the sensitivity and specificity of a cutoff into a single value, defined as J = Sensitivity + Specificity − 1. It ranges from −1 to 1, with greater values indicating a better combination of sensitivity and specificity at the cutoff point.
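The sketch below shows one way such an analysis could be run in R with the pROC package on simulated data; the package, the simulated scores, and the exact calls are assumptions for illustration, not the authors’ scripts.

```r
# Illustrative sketch with simulated data (assumed pROC workflow): nonparametric
# ROC curve, AUC with 95% CI, and the Youden-optimal sum score cutoff.
library(pROC)

set.seed(2)
diagnosis <- rbinom(300, 1, 0.5)                     # hypothetical MINI-Plus diagnosis (1 = present)
sum_score <- rpois(300, lambda = 5) + 4 * diagnosis  # hypothetical questionnaire sum score

roc_obj <- roc(response = diagnosis, predictor = sum_score)  # nonparametric ROC curve
auc(roc_obj)      # area under the curve
ci.auc(roc_obj)   # 95% confidence interval for the AUC

# Youden's J = sensitivity + specificity - 1; x = "best" returns the cutoff maximizing J
coords(roc_obj, x = "best", best.method = "youden",
       ret = c("threshold", "sensitivity", "specificity"))
```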
Point estimates for sensitivity, specificity, and AUC are presented along with their respective 95% confidence intervals. All analyses were performed on the set of participants with no missing MINI diagnoses or missing responses to the tool being evaluated. All statistical analyses were performed by using R version 4.0.5 [43].

3. Results

3.1. Population Characteristics and Prevalence of Mental Health Disorders

Table 2 presents the characteristics of the study population. A total of 433 participants completed all E-mwTool items and had complete MINI diagnostic data. Females (279, 64.4%) represented most of the study population, which had a mean age of 49.3 years (standard deviation (SD) 16.3). According to the MINI-Plus, 313 (72.3%) participants had a mental health condition, with 58 participants (13.4%) having more than one diagnosis category. Common conditions were highly prevalent in our sample, whereas severe conditions and alcohol/drug disorders were relatively scarce (under 7%). Major depression, as both depressive episodes and depressive disorder, was the most prevalent condition, and generalized anxiety disorder (107, 24.7%) was also common in this sample.

3.2. Discriminating Capacity of E-mwTool

Table 3 shows the sensitivity and specificity of the 13-item E-mwTool for predicting MINI-Plus diagnoses. The first three items, which serve as a screen for any disorder, showed excellent sensitivity of 0.97 (95% CI = 0.95–0.99). Sensitivity and specificity for detecting any common mental disorder were 0.96 (95% CI = 0.93–0.98) and 0.31 (95% CI = 0.24–0.39), respectively. Similarly, the E-mwTool showed high sensitivity for almost every common mental disorder, ranging from 0.90 (95% CI = 0.56–1.00) for social anxiety disorder to 1.00 (95% CI = 0.92–1.00) for panic disorder. As expected, specificity for these common disorders was low, ranging from 0.13 (95% CI = 0.11–0.17) for social anxiety disorder to 0.19 (95% CI = 0.14–0.24) for major depressive episodes.
The E-mwTool showed excellent specificity (0.85, 95% CI = 0.82–0.89) for any severe mental disorder, but low sensitivity (0.21, 95% CI = 0.08–0.40). The E-mwTool performed well for suicide risk, with a sensitivity of 0.93 (95% CI = 0.76–0.99) and a specificity of 0.60 (95% CI = 0.55–0.65); sensitivity was 0.86 (95% CI = 0.57–0.98) for alcohol use disorder and 0.73 (95% CI = 0.45–0.92) for substance use disorder. The complete description of the performance of the E-mwTool across different cutoffs for the AUDIT item is available in Supplementary File S1 online.

3.3. Discriminating Capacity and Optimal Cutoff Points of Self-Administered Questionnaires

Results from ROC analyses showed good discriminatory power for all the self-administered questionnaires. The PHQ-9 had an AUC of 0.78 (95% CI = 0.74–0.82) for major depression (depressive episodes or depressive disorder) diagnosis, but also had a strong performance for any common mental disorder (AUC = 0.75; 95% CI = 0.70–0.80) and any disorder (i.e., severe mental disorders, substance/alcohol use disorders, suicide risk; AUC = 0.71; 95% CI = 0.66–0.77). The optimal cutoff for the PHQ-9 predicting MINI depression according to Youden’s J was ≥10, with a sensitivity of 0.82 and a specificity of 0.58.
The GAD-7 had an AUC of 0.69 (95% CI = 0.65–0.74) for any anxiety-related disorder (social anxiety disorder, generalized anxiety disorder, panic, agoraphobia, OCD, or PTSD) and had similar results regarding common conditions (AUC = 0.73; 95% CI = 0.68–0.79) or any disorder (AUC = 0.69; 95% CI = 0.64–0.75). The optimal cutoff for the GAD-7 predicting MINI anxiety according to Youden’s J was ≥8, with a sensitivity of 0.87 and a specificity of 0.42.
The C-SSRS also had good discriminating power for suicide risk (AUC = 0.79; 95% CI = 0.72–0.86), with an optimal cutoff of ≥”Low Risk” according to Youden’s J, yielding a sensitivity of 0.96 and a specificity of 0.59. The AUDIT-3 had a greater AUC (0.90; 95% CI = 0.82–0.97) for alcohol abuse, which decreased to 0.76 (95% CI = 0.65–0.86) when applied as a predictor of either alcohol or substance use disorder. For predicting alcohol use disorder, the optimal cutoff according to Youden’s J was ≥6, with a sensitivity of 0.71 and a specificity of 0.95. The single-question screening test for drug use in primary care performed well for substance use (AUC = 0.81; 95% CI = 0.72–0.90). Among possible cutoffs for this single-item measure, a cutoff of ≥1 maximized combined sensitivity and specificity (sensitivity = 0.67, specificity = 0.92).
A complete description of the sensitivity and specificity of each scale at every cutoff point, including the optimal cutoff points based on the Youden index, is available in Supplementary File S2.

4. Discussion

The primary aim of the present study was to examine the validity of the self-administered E-mwTool against MINI-Plus diagnoses (the criterion standard) in a general Spanish clinical population at the HUFJD. It was hypothesized that the E-mwTool could be a valid screening option based on its performance in screening a broad range of disorder categories.
In this sample, collected during the COVID-19 pandemic, the majority of individuals were diagnosed with a common mental health condition. The most frequent diagnoses were major depression and generalized anxiety disorder, which coincides with what other studies reported during the pandemic [44]. The performance of the E-mwTool in screening for these disorders was strong, with a sensitivity of 0.96 for common mental disorders overall. Similarly, the sensitivity of the tool for suicide risk and alcohol/substance use disorders was high (0.93 and 0.79, respectively). The strong sensitivity for these conditions is especially relevant, as the need to detect suicide risk has grown alongside the increase in suicide during and after the pandemic crisis [45,46].
However, for severe mental health conditions in general, the sensitivity of the E-mwTool was very low for the disorders studied (e.g., bipolar disorders I and II, hypomania, and manic episodes, except those linked to psychotic symptoms), though it is important to note that only a few patients were positively diagnosed with these types of disorders.
While these findings support that the newly adapted E-mwTool has an excellent capacity to identify most mental disorders in this Spanish clinical population with high sensitivity, the E-mwTool did not excel at generating a negative result for patients who do not present with mental health conditions (i.e., specificity, or the “true negative rate”). Generally, the higher the sensitivity, the lower the specificity, as these performance metrics are interdependent [47]. In a screening context, sensitivity is the priority, and in this study the E-mwTool excelled at detecting any psychiatric disorder, common mental disorders, alcohol use disorder, and suicide risk.
The E-mwTool was developed for purposes similar to those of other short screening tools, such as the Psychiatric Diagnostic Screening Questionnaire (PDSQ) [48,49]. These tools are diagnostic aids designed for use in clinical practice with psychiatric outpatients presenting for diagnosis and treatment, and they can make initial prediagnostic evaluations more efficient. According to Zimmerman and Mattia (2001) [49,50], it is crucial that a diagnostic aid have good sensitivity so as to detect more cases, at least in the initial stage of the clinical process (i.e., screening, assessment, and intervention). Like the PDSQ, the E-mwTool has been validated and compared with other scales; more importantly, it has digitized the administration of the screening test through a mobile health application.
The present findings are quite similar to those of previous E-mwTool studies. In the first sample [2], when the tool was developed as an mHealth app and applied in primary care settings in Mozambique, sensitivity for common mental disorders was excellent (94%, vs. 96% in the present study), while the subsequent ten items measuring severe disorders, substance use, and suicide risk had fair to excellent specificity (63–93%, vs. 60–85% in the present study). Similarly, the second study [2] tested the tool with a view to creating a toolkit to help screen for and identify these disorders in community samples during annual medical visits, in support of the healthcare system of this African country [29].
Thus, our results suggest that this new tool can be used as a screener to support populations at risk of a psychiatric disorder. Specifically, it would be useful as a screener administered before visiting the outpatient clinic to support the prediagnosis of a mental health problem. There is already a need, highlighted during the COVID-19 pandemic, to support people at high risk of psychological morbidity by enhancing the awareness and diagnosis of mental disorders through online and smartphone technologies, especially in primary care and emergency units, beyond psychiatric units [51].
A recent systematic review showed that the negative psychological impact of the pandemic was significant and can be associated with an increase in internet searches related to fear, anxiety, and depression [52]. Indeed, during the pandemic, the reduced availability of mental health services was associated with an increased mental health burden, potentially related to an escalating number of untreated people with mental health problems. An increase in the number of users of mental health apps was also reported during the COVID-19 pandemic compared with before it. Male adults, who are less likely to access mental health services, were more likely to launch mental health apps than females [53]. Importantly, in spite of the increase in usage, user engagement remained minimal. These findings support the use of an easy-to-use and brief mHealth screening app, such as the E-mwTool, to efficiently and rapidly prescreen the potential need for mental health treatment, either via self-report or during a visit to the GP. Engagement and adherence to mental health treatment would then require referral to appropriate mental health services.
Initiatives such as the E-mwTool can help mitigate the consequences of how poorly prepared health systems were to continue delivering psychiatric services worldwide during the pandemic, at a moment when citizens needed this type of attention most, and can help prevent the risks of severe outcomes such as addiction and suicide [54]. Interestingly, while social distancing measures were adopted to avoid infections, the resulting reduction in human contact appears to have had an adverse effect on suicide risk during the pandemic, and strategies to reduce barriers to mental health treatment are paramount in these times [55].
The secondary goal of this study was to evaluate the five self-administered questionnaires in a Spanish population. Evaluating these screeners across possible sum score cutoffs allows clinicians to adapt their use to different scenarios, maximizing sensitivity or specificity or balancing the two; this is essential for making good clinical decisions on the basis of proposed screening tests, as in medical practice the decision threshold is crucial for usefulness [48]. Overall, the five self-administered questionnaires showed good discriminatory power for major depression, anxiety-related disorders, and the remaining common and severe conditions, with AUC values ranging from approximately 0.69 to 0.90. These findings support the strong performance of the short versions of these questionnaires [33,34,35,36] and validate these tools in the Spanish language for psychiatric outpatients during the COVID-19 pandemic.
While some of the optimal cutoffs for these screening tools determined by Youden’s J (combined sensitivity and specificity) are similar to those commonly used, other results suggest possible changes to common practice. In past HUFJD psychiatric settings, a cutoff of ≥10 was used for both the PHQ-9 and the GAD-7 to screen for depression and anxiety, respectively. The results from this analysis suggest that the cutoff of ≥10 does maximize combined sensitivity and specificity for the PHQ-9, but that a lower cutoff of ≥8 for the GAD-7 provides higher sensitivity with only a small decrease in specificity. The C-SSRS cutoff of ≥”Low Risk”, which is often used in HUFJD screening for suicide risk, was supported by our data. However, the optimal AUDIT-3 cutoff of ≥6 based on this analysis is slightly lower than the cutoff of ≥7 used in HUFJD screening practice. Lowering the commonly used cutoff to ≥6 may detect more individuals with alcohol use disorder while minimally increasing false positives. Providing this information to clinicians allows them to choose an appropriate cutoff given the needs of their patient population and their resources [48].
Regarding the limitations of this study, the sample could not be randomized due to the ethics procedure established by the IIS-FJD at the HUFJD. As a result, the order in which the E-mwTool and the MINI-Plus were administered was not counterbalanced (all individuals first answered the E-mwTool, and then clinicians administered the MINI-Plus); therefore, there was no opportunity to evaluate whether the order had an effect. However, the E-mwTool was intended as a screening measure, which is why it was administered before the MINI diagnostic process. Secondly, the representativeness of the sample should be interpreted cautiously, as a snowball sampling approach was used; future replications with larger samples and other sampling strategies could provide greater generalizability of the findings. Thirdly, the psychiatric outpatient clinic is a private setting, which during pandemic times most commonly served individuals with mood and anxiety disorders and saw few individuals with severe disorders. Finally, the mHealth app was administered to adults who were, on average, in late middle age, so it would have been optimal to perform a digital competency test to ensure their capacity to use the app; however, all were included and ultimately managed to use the app without difficulty.
Our intended target population was suspected cases of mental health conditions and related disorders that may be seen in a GP (i.e., primary care physician) visit in an urban, high-income country context. The sample was recruited at secondary care centers when patients were already referred by GPs, so relevant variables that could have biased the proportion and distribution of mental health disorders in our sample could not be measured. For example, race; religion; socioeconomic status; institutional racism; availability, accessibility, and quality of primary healthcare; attrition in referral; and distressing COVID-19 circumstances have not been analyzed. Our study findings may not generalize to the broader population of suspected cases of mental health disorders seen by urban primary and secondary care teams in high-income countries.

5. Conclusions

In conclusion, this first examination of the E-mwTool in a Spanish sample during the pandemic indicates that it may be a promising tool that can be beneficial under time constraints and COVID-19 measures (such as social distancing). Its screening generally works well in detecting any psychiatric disorder, common psychiatric disorders, suicide risk, and alcohol/substance use disorders. However, as currently constructed, it has low sensitivity for severe disorders. Additionally, the tool has low overall specificity for most disorders, which should inform its use as a first-step screener rather than a diagnostic tool. Overall, compared with the MINI-Plus, the new E-mwTool and the short versions of the self-administered questionnaires detect patients with psychiatric disorders quite well in a rapid and easy format at the front line of primary care. Given the psychological barriers to the diagnosis of mental disorders among many adults attending their GPs, the E-mwTool might be a good first step in facilitating subsequent help-seeking among people in the general population who are reluctant to engage in other types of screening.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/ijerph20043204/s1, Supplementary File S1: Performance of E-mwTool for substance/alcohol use at varied AUDIT item 1 cutoffs; Supplementary File S2: Optimal cutoff points and summary of performance of self-administered questionnaires across all cutoff points.

Author Contributions

Conceptualization and methodology, I.M.-N. and E.B.-G.; writing—original draft preparation, all co-authors; writing—review and editing, O.L.-F., C.B. and M.L.W.; supervision, E.B.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board (CEIC at HUFJD).

Informed Consent Statement

Written informed consent has been obtained from the patients to publish this paper.

Data Availability Statement

Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Vigo, D.; Thornicroft, G.; Atun, R. Estimating the true global burden of mental illness. Lancet Psychiatry 2016, 3, 171–178. [Google Scholar] [CrossRef]
  2. Wainberg, M.L.; Gouveia, M.L.; Stockton, M.A.; Feliciano, P.; Suleman, A.; Mootz, J.J.; Mello, M.; Salem, A.F.; Greene, M.C.; Bezuidenhout, C.; et al. Technology and implementation science to forge the future of evidence-based psychotherapies: The PRIDE scale-up study. Evid. Based Ment. Health 2020, 24, 19–24. [Google Scholar] [CrossRef]
  3. Moreno, C.; Wykes, T.; Galderisi, S.; Nordentoft, M.; Crossley, N.; Jones, N.; Cannon, M.; Correll, C.U.; Byrne, L.; Carr, S.; et al. How mental health care should change as a consequence of the COVID-19 pandemic. Lancet Psychiatry 2020, 7, 813–824. [Google Scholar] [CrossRef]
  4. Otu, A.; Charles, C.H.; Yaya, S. Mental health and psychosocial well-being during the COVID-19 pandemic: The invisible elephant in the room. Int. J. Ment. Health Syst. 2020, 14, 38. [Google Scholar] [CrossRef]
  5. Andrade, L.H.; Alonso, J.; Mneimneh, Z.; Wells, J.E.; Al-Hamzawi, A.; Borges, G.; Bromet, E.; Bruffaerts, R.; de Girolamo, G.; de Graaf, R.; et al. Barriers to mental health treatment: Results from the WHO World Mental Health surveys. Psychol. Med. 2013, 44, 1303–1317. [Google Scholar] [CrossRef]
  6. Chisholm, D.; Docrat, S.; Abdulmalik, J.; Alem, A.; Gureje, O.; Gurung, D.; Hanlon, C.; Jordans, M.J.D.; Kangere, S.; Kigozi, F.; et al. Mental health financing challenges, opportunities and strategies in low- and middle-income countries: Findings from the Emerald project. BJPsych Open 2019, 5, e68. [Google Scholar] [CrossRef]
  7. Patel, V.; Prince, M. Global Mental Health: A new global health field comes of age. JAMA 2010, 303, 1976–1977. [Google Scholar] [CrossRef]
  8. Barbui, C.; Purgato, M.; Abdulmalik, J.; Acarturk, C.; Eaton, J.; Gastaldon, C.; Gureje, O.; Hanlon, C.; Jordans, M.; Lund, C.; et al. Efficacy of psychosocial interventions for mental health outcomes in low-income and middle-income countries: An umbrella review. Lancet Psychiatry 2020, 7, 162–172. [Google Scholar] [CrossRef]
  9. Fu, Z.; Burger, H.; Arjadi, R.; Bockting, C.L.H. Effectiveness of digital psychological interventions for mental health problems in low-income and middle-income countries: A systematic review and meta-analysis. Lancet Psychiatry 2020, 7, 851–864. [Google Scholar] [CrossRef]
  10. Anjum, S.; Ullah, R.; Rana, M.S.; Khan, H.A.; Memon, F.S.; Ahmed, Y.; Jabeen, S.; Faryal, R. COVID-19 pandemic: A serious threat for public mental health globally. Psychiatr. Danub. 2020, 32, 245–250. [Google Scholar] [CrossRef]
  11. Zhou, X.; Snoswell, C.L.; Harding, L.E.; Bambling, M.; Edirippulige, S.; Bai, X.; Smith, A.C. The Role of Telehealth in Reducing the Mental Health Burden from COVID-19. Telemed. J. e-Health 2020, 26, 377–379. [Google Scholar] [CrossRef]
  12. Bucci, S.; Schwannauer, M.; Berry, N. The digital revolution and its impact on mental health care. Psychol. Psychother. Theory Res. Pract. 2019, 92, 277–297. [Google Scholar] [CrossRef] [PubMed]
  13. Rodríguez, S.; Valle, A.; Piñeiro, I.; González-Suárez, R.; Díaz, F.M.; Vieites, T. COVID-19 Lockdown: Key Factors in Citizens’ Stress. Front. Psychol. 2021, 12, 666891. [Google Scholar] [CrossRef]
  14. Arenas-Castañeda, P.E.; Bisquert, F.A.; Martinez-Nicolas, I.; Espíndola, L.A.C.; Barahona, I.; Maya-Hernández, C.; Hernández, M.M.L.; Mirón, P.C.M.; Barrera, D.G.A.; Aguilar, E.T.; et al. Universal mental health screening with a focus on suicidal behaviour using smartphones in a Mexican rural community: Protocol for the SMART-SCREEN population-based survey. BMJ Open 2020, 10, e035041. [Google Scholar] [CrossRef] [PubMed]
  15. Kaonga, N.N.; Morgan, J. Common themes and emerging trends for the use of technology to support mental health and psychosocial well-being in limited resource settings: A review of the literature. Psychiatry Res. 2019, 281, 112594. [Google Scholar] [CrossRef]
  16. Norbury, A.; Liu, S.H.; Campaña-Montes, J.J.; Romero-Medrano, L.; Barrigón, M.L.; Smith, E.; Aroca, F.; Artés-Rodríguez, A.; Baca-García, E.; Berrouiguet, S.; et al. Social media and smartphone app use predicts maintenance of physical activity during Covid-19 enforced isolation in psychiatric outpatients. Mol. Psychiatry 2020, 26, 3920–3930. [Google Scholar] [CrossRef] [PubMed]
  17. Rahman, A.; Naslund, J.A.; Betancourt, T.S.; Black, C.J.; Bhan, A.; Byansi, W.; Chen, H.; Gaynes, B.N.; Restrepo, C.G.; Gouveia, L.; et al. The NIMH global mental health research community and COVID-19. Lancet Psychiatry 2020, 7, 834–836. [Google Scholar] [CrossRef]
  18. Ren, F.-F.; Guo, R.-J. PUBLIC MENTAL HEALTH IN POST-COVID-19 ERA. Psychiatr. Danub. 2020, 32, 251–255. [Google Scholar] [CrossRef]
  19. Gruber, J.; Prinstein, M.J.; Clark, L.A.; Rottenberg, J.; Abramowitz, J.S.; Albano, A.M.; Aldao, A.; Borelli, J.L.; Chung, T.; Davila, J.; et al. Mental health and clinical psychological science in the time of COVID-19: Challenges, opportunities, and a call to action. Am. Psychol. 2021, 76, 409–426. [Google Scholar] [CrossRef]
  20. Montoya, M.I.; Kogan, C.S.; Rebello, T.J.; Sadowska, K.; Garcia-Pacheco, J.A.; Khoury, B.; Kulygina, M.; Matsumoto, C.; Robles, R.; Huang, J.; et al. An international survey examining the impact of the COVID-19 pandemic on telehealth use among mental health professionals. J. Psychiatr. Res. 2022, 148, 188–196. [Google Scholar] [CrossRef]
  21. Xiong, J.; Lipsitz, O.; Nasri, F.; Lui, L.M.W.; Gill, H.; Phan, L.; Chen-Li, D.; Iacobucci, M.; Ho, R.; Majeed, A.; et al. Impact of COVID-19 pandemic on mental health in the general population: A systematic review. J. Affect. Disord. 2020, 277, 55–64. [Google Scholar] [CrossRef]
  22. Vieta, E.; Pérez, V.; Arango, C. Psychiatry in the aftermath of COVID-19. Rev. Psiquiatr. Salud Ment. 2020, 13, 105–110. [Google Scholar] [CrossRef] [PubMed]
  23. Asmundson, G.J.; Paluszek, M.M.; Landry, C.A.; Rachor, G.S.; McKay, D.; Taylor, S. Do pre-existing anxiety-related and mood disorders differentially impact COVID-19 stress responses and coping? J. Anxiety Disord. 2020, 74, 102271. [Google Scholar] [CrossRef] [PubMed]
  24. Bäuerle, A.; Steinbach, J.; Schweda, A.; Beckord, J.; Hetkamp, M.; Weismüller, B.; Kohler, H.; Musche, V.; Dörrie, N.; Teufel, M.; et al. Mental Health Burden of the COVID-19 Outbreak in Germany: Predictors of Mental Health Impairment. J. Prim. Care Community Health 2020, 11, 2150132720953682. [Google Scholar] [CrossRef]
  25. Hao, F.; Tan, W.; Jiang, L.; Zhang, L.; Zhao, X.; Zou, Y.; Hu, Y.; Luo, X.; Jiang, X.; McIntyre, R.S.; et al. Do psychiatric patients experience more psychiatric symptoms during COVID-19 pandemic and lockdown? A case-control study with service and research implications for immunopsychiatry. Brain Behav. Immun. 2020, 87, 100–106. [Google Scholar] [CrossRef]
  26. Van Rheenen, T.E.; Meyer, D.; Neill, E.; Phillipou, A.; Tan, E.J.; Toh, W.L.; Rossell, S.L. Mental health status of individuals with a mood-disorder during the COVID-19 pandemic in Australia: Initial results from the COLLATE project. J. Affect. Disord. 2020, 275, 69–77. [Google Scholar] [CrossRef]
  27. González-Blanco, L.; Santo, F.D.; García-Álvarez, L.; de la Fuente-Tomás, L.; Lacasa, C.M.; Paniagua, G.; Sáiz, P.A.; García-Portilla, M.P.; Bobes, J. COVID-19 lockdown in people with severe mental disorders in Spain: Do they have a specific psychological reaction compared with other mental disorders and healthy controls? Schizophr. Res. 2020, 223, 192–198. [Google Scholar] [CrossRef] [PubMed]
  28. Guadalajara, H.; Palazón, Á.; Lopez-Fernandez, O.; Esteban-Flores, P.; Garcia, J.; Gutiérrez-Misis, A.; Baca-García, E.; Garcia-Olmo, D. Towards an Open Medical School without Checkerboards during the COVID-19 Pandemic: How to Flexibly Self-Manage General Surgery Practices in Hospitals? Healthcare 2021, 9, 743. [Google Scholar] [CrossRef]
  29. Lovero, K.L.; Basaraba, C.; Khan, S.; Suleman, A.; Mabunda, D.; Feliciano, P.; dos Santos, P.; Fumo, W.; Mandlate, F.; Greene, M.C.; et al. Brief Screening Tool for Stepped-Care Management of Mental and Substance Use Disorders. Psychiatr. Serv. 2021, 72, 891–897. [Google Scholar] [CrossRef]
  30. Wainberg, M.L.; Lovero, K.L.; Duarte, C.S.; Salem, A.F.; Mello, M.; Bezuidenhout, C.; Mootz, J.; Feliciano, P.; Suleman, A.; dos Santos, P.F.; et al. Partnerships in Research to Implement and Disseminate Sustainable and Scalable Evidence-Based Practices (PRIDE) in Mozambique. Psychiatr. Serv. 2021, 72, 802–811. [Google Scholar] [CrossRef] [PubMed]
  31. Smith, P.C.; Schmidt, S.M.; Allensworth-Davies, D.; Saitz, R. A Single-Question Screening Test for Drug Use in Primary Care. Arch. Intern. Med. 2010, 170, 1155–1160. [Google Scholar] [CrossRef] [PubMed]
  32. Bush, K.; Kivlahan, D.R.; McDonell, M.B.; Fihn, S.D.; Bradley, K.A. The AUDIT Alcohol Consumption Questions (AUDIT-C). An Effective Brief Screening Test for Problem Drinking. Ambulatory Care Quality Improvement Project (ACQUIP). Alcohol Use Disorders Identification Test. Arch. Intern. Med. 1998, 158, 1789–1795. [Google Scholar] [CrossRef]
  33. Mundt, J.C.; Greist, J.H.; Jefferson, J.W.; Federico, M.; Mann, J.J.; Posner, K. Prediction of Suicidal Behavior in Clinical Research by Lifetime Suicidal Ideation and Behavior Ascertained by the Electronic Columbia-Suicide Severity Rating Scale. J. Clin. Psychiatry 2013, 74, 887–893. [Google Scholar] [CrossRef] [PubMed]
  34. Posner, K.; Brown, G.K.; Stanley, B.; Brent, D.A.; Yershova, K.V.; Oquendo, M.A.; Currier, G.W.; Melvin, G.; Greenhill, L.; Shen, S.; et al. The Columbia–Suicide Severity Rating Scale: Initial Validity and Internal Consistency Findings From Three Multisite Studies With Adolescents and Adults. Am. J. Psychiatry 2011, 168, 1266–1277. [Google Scholar] [CrossRef]
  35. Spitzer, R.L.; Kroenke, K.; Williams, J.B.W.; Löwe, B. A Brief Measure for Assessing Generalized Anxiety Disorder: The GAD-7. Arch. Intern. Med. 2006, 166, 1092–1097. [Google Scholar] [CrossRef]
  36. Kroenke, K.; Spitzer, R.L.; Williams, J.B. The PHQ-9: Validity of a brief depression severity measure. J. Gen. Intern. Med. 2001, 16, 606–613. [Google Scholar] [CrossRef] [PubMed]
  37. Cohen, J.F.; Korevaar, D.A.; Altman, D.G.; Bruns, D.E.; Gatsonis, C.A.; Hooft, L.; Irwig, L.; Levine, D.; Reitsma, J.B.; de Vet, H.C.W.; et al. STARD 2015 guidelines for reporting diagnostic accuracy studies: Explanation and elaboration. BMJ Open 2016, 6, e012799. [Google Scholar] [CrossRef]
  38. Lecrubier, Y.; Sheehan, D.V.; Weiller, E.; Amorim, P.; Bonora, I.; Sheehan, K.H.; Janavs, J.; Dunbar, G. The Mini International Neuropsychiatric Interview (MINI). A short diagnostic structured interview: Reliability and validity according to the CIDI. Eur. Psychiatry 1997, 12, 224–231. [Google Scholar] [CrossRef]
  39. Sheehan, D.; Lecrubier, Y.; Sheehan, K.H.; Janavs, J.; Weiller, E.; Keskiner, A.; Schinka, J.; Knapp, E.; Sheehan, M.; Dunbar, G. The validity of the Mini International Neuropsychiatric Interview (MINI) according to the SCID-P and its reliability. Eur. Psychiatry 1997, 12, 232–241. [Google Scholar] [CrossRef]
  40. Sheehan, D.V.; Lecrubier, Y.; Sheehan, K.H.; Amorim, P.; Janavs, J.; Weiller, E.; Hergueta, T.; Baker, R.; Dunbar, G.C. The Mini-International Neuropsychiatric Interview (M.I.N.I.): The development and validation of a structured diagnostic psychiatric interview for DSM-IV and ICD-10. J. Clin. Psychiatry 1998, 59 (Suppl. S2), 22–33. Available online: http://www.ncbi.nlm.nih.gov/pubmed/9881538 (accessed on 29 January 2023).
  41. Ali, G.-C.; Ryan, G.; De Silva, M.J. Validated Screening Tools for Common Mental Disorders in Low and Middle Income Countries: A Systematic Review. PLoS ONE 2016, 11, e0156939. [Google Scholar] [CrossRef] [PubMed]
  42. Hajian-Tilaki, K. Receiver Operating Characteristic (ROC) Curve Analysis for Medical Diagnostic Test Evaluation. Casp. J. Intern. Med. 2013, 4, 627–635. [Google Scholar]
  43. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2019; Available online: https://www.R-project.org/ (accessed on 29 January 2023).
  44. Fitzpatrick, K.M.; Harris, C.; Drawve, G. Fear of COVID-19 and the mental health consequences in America. Psychol. Trauma Theory Res. Pract. Policy 2020, 12 (Suppl. S1), S17–S21. [Google Scholar] [CrossRef] [PubMed]
  45. Sher, L. Psychiatric disorders and suicide in the COVID-19 era. QJM Int. J. Med. 2020, 113, 527–528. [Google Scholar] [CrossRef] [PubMed]
  46. Sher, L. The impact of the COVID-19 pandemic on suicide rates. QJM Int. J. Med. 2020, 113, 707–712. [Google Scholar] [CrossRef]
  47. Lalkhen, A.; McCluskey, A. Clinical tests: Sensitivity and specificity. Contin. Educ. Anaesth. Crit. Care Pain 2008, 8, 221–223. [Google Scholar] [CrossRef]
  48. Irwin, R.J.; Irwin, T.C. A principled approach to setting optimal diagnostic thresholds: Where ROC and indifference curves meet. Eur. J. Intern. Med. 2011, 22, 230–234. [Google Scholar] [CrossRef]
  49. Zimmerman, M.; Mattia, J.I. The psychiatric diagnostic screening questionnaire: Development, reliability and validity. Compr. Psychiatry 2001, 42, 175–189. [Google Scholar] [CrossRef]
  50. Zimmerman, M.; Mattia, J.I. A Self-Report Scale to Help Make Psychiatric Diagnoses. Arch. Gen. Psychiatry 2001, 58, 787–794. [Google Scholar] [CrossRef]
  51. Cullen, W.; Gulati, G.; Kelly, B.D. Mental health in the COVID-19 pandemic. QJM Int. J. Med. 2020, 113, 311–312. [Google Scholar] [CrossRef]
  52. Gianfredi, V.; Provenzano, S.; Santangelo, O.E. What can internet users’ behaviours reveal about the mental health impacts of the COVID-19 pandemic? A systematic review. Public Health 2021, 198, 44–52. [Google Scholar] [CrossRef]
  53. Aziz, M.; Erbad, A.; Almourad, M.B.; Altuwairiqi, M.; McAlaney, J.; Ali, R. Did Usage of Mental Health Apps Change during COVID-19? A Comparative Study Based on an Objective Recording of Usage Data and Demographics. Life 2022, 12, 1266. [Google Scholar] [CrossRef]
  54. Reger, M.A.; Stanley, I.H.; Joiner, T.E. Suicide Mortality and Coronavirus Disease 2019—A Perfect Storm? JAMA Psychiatry 2020, 77, 1093–1094. [Google Scholar] [CrossRef]
  55. Wainberg, M.L.; Scorza, P.; Shultz, J.M.; Helpman, L.; Mootz, J.J.; Johnson, K.A.; Neria, Y.; Bradford, J.-M.E.; Oquendo, M.A.; Arbuckle, M.R. Challenges and Opportunities in Global Mental Health: A Research-to-Practice Perspective. Curr. Psychiatry Rep. 2017, 19, 28. [Google Scholar] [CrossRef]
Table 1. Health dimensions and the number of questions of the E-mwTool and self-administered questionnaires.

Questionnaire | Health Dimension | Number of Questions
Electronic Mental Wellness Tool (E-mwTool) | Common mental disorder | 3
 | Severe mental disorder | 4
 | Suicide risk | 3
 | Alcohol use disorder | 2
 | Illegal drug use or nonprescribed medication | 1
Alcohol Use Disorders Identification Test, short version (AUDIT-3) | Alcohol use | 3
Columbia-Suicide Severity Rating Scale (C-SSRS) | Suicide behavior | 7
Generalized Anxiety Disorder-7 (GAD-7) | Anxiety | 7
Patient Health Questionnaire-9 (PHQ-9) | Depression | 9
Table 2. Descriptive characteristics of participants (n = 433).

Variable | n (%) or Mean (SD)
Age | 49.3 (16.3)
Gender |
  Female | 279 (64.4%)
  Male | 153 (35.3%)
  Transgender | 1 (0.2%)
Country |
  Spain | 358 (86.3%)
  Other | 57 (13.7%)
Concurrent conditions |
  Patients with No Diagnosis | 120 (27.7%)
  Patients with 1 Diagnosis Category | 255 (58.9%)
  Patients with >1 Diagnosis Category | 58 (13.4%)
Diagnosis Categories |
  MINI-Plus Common | 288 (66.5%)
    Major Depressive Episode | 151 (34.9%)
    Major Depressive Disorder | 91 (21.0%)
    Panic Disorder | 45 (10.4%)
    Agoraphobia | 25 (5.8%)
    Social Anxiety Disorder | 10 (2.3%)
    OCD | 16 (3.7%)
    PTSD | 35 (8.1%)
    Generalized Anxiety Disorder | 107 (24.7%)
  MINI-Plus Severe | 29 (6.7%)
    Manic Episode | 0 (0%)
    Hypomania | 2 (0.5%)
    Bipolar Type I | 7 (1.6%)
    Bipolar Type I w/Psychotic Symptoms | 1 (0.2%)
    Bipolar Type II | 3 (0.7%)
    Psychotic Disorder | 17 (3.9%)
    MDD w/Psychotic Symptoms | 1 (0.2%)
  Suicide risk | 27 (6.2%)
  Alcohol/Substance | 29 (6.7%)
    Alcohol use disorder | 14 (3.2%)
    Substance use disorder | 15 (3.5%)
Abbreviations: OCD = obsessive–compulsive disorder; PTSD = post-traumatic stress disorder; MDD = major depressive disorder.
Table 3. Performance of the 13-item E-mwTool in identification and classification of participants with mental disorders (n = 433).

Disorder | N | % | Sensitivity | 95% CI | Specificity | 95% CI
Common mental disorders | 288 | 67 | 0.96 | 0.93–0.98 | 0.31 | 0.24–0.39
  Agoraphobia | 25 | 5.8 | 1.00 | 0.86–1.00 | 0.14 | 0.11–0.18
  Generalized Anxiety Disorder | 107 | 24.7 | 0.97 | 0.92–0.99 | 0.17 | 0.13–0.21
  Major Depressive Disorder | 91 | 21 | 0.98 | 0.92–1.00 | 0.16 | 0.12–0.20
  Major Depressive Episode | 151 | 34.9 | 0.97 | 0.93–0.99 | 0.19 | 0.14–0.24
  Obsessive–Compulsive Disorder | 16 | 3.7 | 1.00 | 0.79–1.00 | 0.14 | 0.10–0.17
  Panic Disorder | 45 | 10.4 | 1.00 | 0.92–1.00 | 0.15 | 0.11–0.19
  Post-traumatic Stress Disorder | 35 | 8.1 | 0.91 | 0.77–0.98 | 0.14 | 0.10–0.17
  Social Anxiety Disorder | 10 | 2.3 | 0.90 | 0.56–1.00 | 0.13 | 0.10–0.17
Severe mental disorders | 29 | 6.7 | 0.21 | 0.08–0.40 | 0.85 | 0.82–0.89
  Bipolar Type I | 7 | 1.6 | 0.14 | 0.00–0.58 | 0.85 | 0.81–0.88
  Bipolar Type I w/Psychotic Symptoms | 1 | 0.2 | 1.00 | 0.03–1.00 | 0.85 | 0.82–0.88
  Bipolar Type II | 3 | 0.7 | 0.33 | 0.01–0.91 | 0.85 | 0.81–0.88
  Hypomania | 2 | 0.5 | 0.00 | 0.00–0.84 | 0.85 | 0.81–0.88
  Manic Episode | 0 | 0.0 | – | – | – | –
  MDD w/Psychotic Symptoms | 1 | 0.2 | 1.00 | 0.03–1.00 | 0.85 | 0.82–0.88
  Psychotic Disorder | 17 | 3.9 | 0.18 | 0.04–0.43 | 0.85 | 0.81–0.88
Substance/Alcohol Use Disorders | 29 | 6.2 | 0.79 | 0.60–0.92 | 0.77 | 0.73–0.81
  Alcohol | 14 | 6.2 | 0.86 | 0.57–0.98 | 0.77 | 0.73–0.81
  Substance | 15 | 3.2 | 0.73 | 0.45–0.92 | 0.67 | 0.62–0.71
Suicide Risk | 27 | 6.2 | 0.93 | 0.76–0.99 | 0.60 | 0.55–0.65
Abbreviations: OCD = obsessive–compulsive disorder; PTSD = post-traumatic stress disorder; MDD = major depressive disorder.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
