Article

A Psychometric Tool for Evaluating Executive Functions in Parkinson’s Disease

1 IRCCS Fondazione Don Carlo Gnocchi ONLUS, 20148 Milan, Italy
2 Department of Psychology, Università Cattolica del Sacro Cuore, 20123 Milan, Italy
3 Applied Technology for Neuro-Psychology Lab, Istituto Auxologico Italiano, Istituto di Ricovero e Cura a Carattere Scientifico, 20149 Milan, Italy
4 Faculty of Psychology, eCampus University, 22060 Novedrate, Italy
5 Department of Psychology, Università degli Studi di Torino, 10124 Turin, Italy
* Author to whom correspondence should be addressed.
J. Clin. Med. 2022, 11(5), 1153; https://doi.org/10.3390/jcm11051153
Submission received: 10 January 2022 / Revised: 14 February 2022 / Accepted: 17 February 2022 / Published: 22 February 2022
(This article belongs to the Special Issue Virtual Reality Therapy: Emerging Topics and Future Challenges)

Abstract

Recently, there has been increasing interest in using 360° virtual-reality video for an ecologically valid assessment of executive functioning in neurologic populations. In this framework, we developed the EXecutive-functions Innovative Tool (EXIT 360°), an original 360°-based instrument for a multicomponent, ecologically valid evaluation of executive functioning in Parkinson’s Disease (PD). This work aimed to test the usability and user experience of EXIT 360° in patients with PD (PwPD). Twenty-seven PwPD and twenty-seven healthy controls underwent an evaluation that involved: (1) a usability assessment with the System Usability Scale and (2) an evaluation of user experience with the ICT—Sense of Presence Inventory and the User Experience Questionnaire. Results showed a satisfactory level of usability for patients (mean = 76.94 ± 9.18) and controls (mean = 80 ± 11.22), with good scores for usability and learnability. Regarding user experience, patients reported a positive overall impression of the tool, rating it as attractive, enjoyable, activating, and fun. Moreover, EXIT 360° showed good pragmatic quality (e.g., efficient, fast, clear) and hedonic quality (e.g., exciting, interesting, and creative). Finally, PwPD considered EXIT 360° an original tool with high ecological validity (mean = 4.29 ± 0.61), spatial presence (mean = 3.11 ± 0.83), and engagement (mean = 3.43 ± 0.54), without relevant adverse effects. Technological expertise had no impact on performance. Overall, EXIT 360° appears to be a usable, easy-to-learn, engaging, and innovative instrument for PD. Further studies will be conducted to examine its efficacy in distinguishing between healthy subjects and patients with executive dysfunction.

1. Introduction

Over the years, virtual reality (VR)-based tools have emerged as a promising solution in neuropsychological assessment, providing an ecological evaluation capable of detecting everyday cognitive impairments [1,2,3]. Specifically, several studies have shown the feasibility, acceptability, and efficacy of VR-based tools in the early assessment and rehabilitation of executive dysfunctions (ED) in many neurologic pathologies, such as Parkinson’s disease (PD) [4,5,6,7].
ED constitutes a typical non-motor symptom of PD from the early stages of the disease [8,9,10,11], with a negative impact on everyday functioning and quality of life [12,13,14,15]. Specifically, patients with PD show impairments in planning, attention, working memory, set-shifting, dual-task performance, inhibitory control, and decision making, along with compromised social–cognition abilities [5,16,17]. Thus, patients have trouble with many essential goal-directed daily activities, with repercussions on everyday functioning (e.g., preparing meals, managing money, shopping, and working) [6,14,18,19]. Moreover, a growing number of longitudinal studies suggest that early ED is predictive of the subsequent development of PD dementia [11,20,21]. Therefore, the early identification of executive impairments could single out individuals with PD at risk of developing dementia, providing the opportunity for timely neurorehabilitation interventions [6,22]. In this framework, an early and ecologically valid evaluation of daily executive impairments appears crucial for achieving good disease management. VR-based instruments that allow different everyday tasks to be carried out in ecologically valid yet controlled environments [23,24,25,26] therefore appear to be a promising solution for the early evaluation of ED.
A preliminary study conducted with a virtual supermarket revealed deficits in planning and in the switching mechanism required to process a large volume of information in parallel [27,28]. VR made it possible to describe planning alterations by testing “pure” mental sequences without interference from possible motor deficits.
In the following years, the virtual version of the Multiple Errands Test (VMET) allowed for the detection of several executive disorders typical of patients with PD [6,29,30]. In a preliminary study, Raspelli and colleagues confirmed the presence of planning deficits and showed impairments in problem solving, set shifting, and sustained attention (few strategies and frequent perseveration) [29]. A subsequent study by Albani and collaborators also showed deficits in decision making in individuals with PD (more errors and fewer effective strategies than controls) [30]. Similar results were found by Cipresso and colleagues, who showed impairments in cognitive flexibility in PD patients with normal cognition [6]. That study demonstrated that an evaluation in a real-life context provides a more accurate estimate of the patient’s impairment, which remains hidden in traditional measures: patients with PD differed from healthy controls in VMET performance but not in the conventional neuropsychological assessment of executive functions (EFs). Therefore, a more ecologically valid evaluation of EFs leads to better detection of subtle deficits from the early stages of PD. In recent years, some authors have exploited 360° technology for the evaluation of executive functioning in PD [25]. To date, 360° video appears to be a promising interactive technology for creating immersive virtual-reality applications at low cost [31]. Implementing neuropsychological tests in 360° environments is an ongoing challenge; in this framework, Serino and colleagues developed and validated a 360° version of a paper-and-pencil test for EFs known as the Picture Interpretation Test (PIT) [25]. Results showed the efficacy of PIT 360° as a highly sensitive ecological tool for detecting deficits in active visual perception from the early stages of PD, whereas the traditional neuropsychological test for executive functioning did not differentiate between patients and healthy controls.
In light of these interesting results, Borgnis and colleagues (2021) developed a 360° instrument for an ecologically valid and multicomponent evaluation of executive functioning: the EXecutive-functions Innovative Tool 360° (EXIT 360°) [32]. In EXIT 360°, participants are engaged in a “game for health” delivered via a standard mobile-powered VR headset: they are immersed in 360° domestic environments and have to perform seven everyday subtasks developed to evaluate many components of executive functioning simultaneously and quickly. After developing EXIT 360°, the same authors conducted two preliminary studies in healthy controls to assess the convergent validity and usability of EXIT 360° [33]. The first study showed significant correlations between the EXIT 360° score and several standardized neuropsychological tests for EFs (Borgnis et al., submitted); therefore, EXIT 360° can be considered a 360°-based tool able to assess several components of executive functioning (Borgnis et al., submitted). The second study showed promising results regarding the usability, user experience, and engagement of EXIT 360° [33]. First, participants formed a positive global opinion of the instrument, judging it usable, easy to learn, clear, enjoyable, attractive, and friendly. Second, EXIT 360° appeared to be an efficient and fast tool, with excellent hedonic quality in terms of stimulation (exciting and interesting) and originality. Moreover, EXIT 360° also seemed to be an engaging and challenging device with high spatial presence, excellent ecological validity, and negligible adverse effects. Finally, data on healthy controls showed that demographic characteristics and technological expertise had no impact on performance.
Several studies have shown the need to consider the evaluation of “usability” and “user experience” as crucial elements in developing technological tools [34,35,36,37]. Sauer and colleagues recently introduced a higher-level concept, “interaction experience,” that integrates these two critical aspects, providing major benefits to users and improving their experience with technological instruments [35]. Usability assessment captures the “degree to which a subject can use a system to achieve specific goals effectively, efficiently, and satisfactorily within a well-defined context of use” [38]. Its evaluation therefore allows any technical difficulties affecting subjects’ performance to be understood. Moreover, cybersickness (e.g., nausea, vertigo, dizziness, headache, sweating) can result in unpleasant experiences, impacting users’ performance and significantly decreasing the validity of test results. For this reason, work on user experience appears crucial. Previous studies have underlined the importance of improving the user experience in virtual environments by working on five domains of digital content development: sense of presence, sense of realism, engagement, enjoyment, and side effects [39]. Focusing on enjoyment and attractiveness increases users’ motivation and participation and reduces the anxiety typical of neuropsychological evaluation. Finally, another critical aspect of evaluation regards technological expertise, above all in older adults, since poor performance could be due to insufficient technological expertise [40].
Despite the evident importance of paying attention to usability and user experience, only one study has evaluated the usability of VR-based instruments for the assessment of neurocognitive abilities in Parkinson’s disease. Pedroli and colleagues conducted a preliminary study involving 21 healthy control subjects and three patients with PD in an evaluation with the VMET. Healthy participants rated the usability of the VMET as good, whereas the ratings of the patients with PD indicated that the VMET would need substantial improvements to be considered usable. Moreover, the results showed that a thorough training phase before the test is crucial for applying the virtual protocol to PD patients [37].
This work aimed to test the usability and user experience of EXIT 360° in patients with PD.

2. Materials and Methods

2.1. Participants

Twenty-seven patients with PD (PwPD) (M:F = 11:16) and 27 healthy control subjects (HC) (M:F = 11:16) matched for age and education were consecutively recruited at IRCCS Fondazione Don Carlo Gnocchi ONLUS in Milan. All participants had to meet specific inclusion criteria: (a) age between 18 and 90 years; (b) education ≥5 years (primary school); (c) absence of cognitive impairment as determined by the Montreal Cognitive Assessment test [41] (MoCA score ≥17.54, the cut-off of normality), corrected for age and years of education according to Italian normative data [42]; and (d) ability to provide written, signed informed consent. Moreover, PwPD had to meet the following inclusion criteria: (a) clinically established or probable Parkinson’s disease according to Movement Disorder Society (MDS) criteria [43]; (b) mild to moderate disease staging, with scores <3 on the Hoehn and Yahr scale; and (c) deficits in EFs confirmed by documented neurological and/or neuropsychological evaluation. Exclusion criteria for all subjects were: (a) severe hearing or visual impairments; (b) major systemic, psychiatric, or other neurological illnesses; and (c) overt visual hallucinations or vertigo.
The study was approved by the “Fondazione Don Carlo Gnocchi-Milan” Ethics Committee on 7 April 2021, project identification code 09_07/04/2021. The neuropsychologist provided all participants with a complete explanation of the purpose and risks of the study before they signed written informed consent, in accordance with the revised Declaration of Helsinki [44].

2.2. Procedure

All participants underwent a one-session evaluation at IRCCS Fondazione Don Carlo Gnocchi ONLUS in Milan that involved three main phases: (a) pre-task evaluation; (b) EXIT 360° session; and (c) post-task evaluation [45].
In the pre-task evaluation (a), all participants underwent an assessment of their global cognitive profile with the MoCA test, a sensitive screening tool able to exclude the presence of cognitive impairment [41,42]. Moreover, the psychologist evaluated their executive functioning with the Frontal Assessment Battery (FAB), a traditional standardized paper-and-pencil test specific to EFs [46,47]. After that, the psychologist collected participants’ socio-demographic data (e.g., age, gender, education level) and technological expertise through an ad hoc questionnaire in which they rated their perceived familiarity and competence with several technologies: tablet, smartphone, computer, and the Internet. Specifically, the questionnaire used a 5-point scale (from “never” to “every day”) to evaluate “How often, in the last year, did you use...” and a 5-point scale (from “nothing” to “much”) to investigate “How competent do you feel in using...”.
After the preliminary screening, all participants underwent an evaluation session with EXIT 360°. The psychologist started the administration by inviting participants to sit on a swivel chair and wear the mobile-powered headset. Before wearing the headset, participants received general instructions about the task: “You will now wear a headset. Inside this viewer, you will see some 360° rooms of a house. To visualize the whole environment, I ask you to turn on yourself; you are sitting on a swivel chair for this reason. Within these environments, you will be asked to perform some tasks.”
After that, participants went through an initial phase to familiarize themselves with the device and the virtual environment and to check for any adverse effects (e.g., dizziness, nausea). Participants were immersed in a neutral 360° living room, exploring the setting and finding specific objects. At the end of this preliminary phase, participants had to report any negative effects. If they did not report side effects, they were immersed in another 360° living room and started the actual experimental session, hearing the following instruction: “You are about to enter a house. Your goal is to get out of this house in the shortest time possible. To exit, you will have to complete a path and a series of tasks that you will encounter along your way. Are you ready to start?”. All instructions were provided in a standardized way, as they had been previously recorded and embedded within the virtual environments.
During the evaluation, the subjects were engaged in several domestic 360° environments, explorable through head movements as in real-life situations [48]. In these environments, participants had to perform seven everyday subtasks of increasing complexity, designed to tap and evaluate different EFs (for a detailed description, see [32]).
Briefly, “Let’s Start” requires subjects to observe a map and choose the path that allows them to reach the “finish” in the shortest possible time. In the second subtask, “Unlock the Door”, participants have to open a door by choosing between a key, a telephone, and a drill. The task “Choose the Person” requires the participant to explore a living room and select a specific person according to a particular instruction. In task 4, “Turn on the Light”, the subjects are immersed in a dark room because “the power went out,” and they must choose the object that allows them to continue the journey. In the following task (“Where are the Objects?”), participants have to identify the pieces of furniture on which four specific objects (i.e., a telephone, a lamp, a teddy bear, and a blanket) are placed in a bedroom (Figure 1).
In task 6, subjects must solve a rebus. Finally, in the last task, “Create the Sequence”, they must memorize a sequence of numbers and report them in reverse order.
EXIT 360° was designed to allow participants to respond to each task by choosing among three or more alternatives simply by moving their head and keeping a small white dot, visible in the headset, positioned on the chosen answer for a few seconds (Figure 2). The answer is then selected automatically, so participants do not have to learn to use complex controllers.
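As a minimal illustration of this dwell-based selection mechanism, the Python sketch below selects an answer once the gaze dot has rested on the same region for a fixed time. The dwell duration, function name, and gaze-sample format are illustrative assumptions, not the actual EXIT 360° implementation.

```python
# Illustrative dwell-based selection: an answer is chosen when the gaze dot
# stays on the same answer region for `dwell_seconds` in a row.
DWELL_SECONDS = 2.0  # assumed dwell time; the real threshold is not reported


def select_by_dwell(gaze_samples, dwell_seconds=DWELL_SECONDS):
    """gaze_samples: iterable of (timestamp, answer_id or None) pairs, where
    answer_id is the answer region currently under the white dot."""
    current, since = None, None
    for timestamp, answer_id in gaze_samples:
        if answer_id != current:            # the dot moved to a different region
            current, since = answer_id, timestamp
        elif answer_id is not None and timestamp - since >= dwell_seconds:
            return answer_id                # held long enough: auto-select
    return None                             # no selection made


# Example: the dot rests on the answer "key" from t = 1.0 s to t = 3.5 s
samples = [(0.0, None), (1.0, "key"), (2.0, "key"), (3.5, "key")]
print(select_by_dwell(samples))  # -> "key"
```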
Participants had to perform all seven subtasks, obtaining one point for a wrong answer and two points for a correct one; usability and user experience were then evaluated for the task as a whole. Overall, EXIT 360° allowed for the collection of a Total Score (range 7–14) and a Total Reaction Time (i.e., the time in seconds from the examiner’s instruction until the participant provided the last correct answer).
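A minimal sketch of this scoring rule follows; the function and variable names are assumptions made for illustration.

```python
def exit360_total_score(subtask_correct):
    """subtask_correct: list of 7 booleans, one per subtask (True = correct).
    Each correct subtask is worth 2 points, each wrong one 1 point (range 7-14)."""
    assert len(subtask_correct) == 7
    return sum(2 if correct else 1 for correct in subtask_correct)


def exit360_total_reaction_time(instruction_time_s, last_correct_answer_time_s):
    """Seconds from the examiner's instruction to the last correct answer."""
    return last_correct_answer_time_s - instruction_time_s


# Example: five correct and two wrong subtasks -> Total Score = 12
print(exit360_total_score([True, True, False, True, True, False, True]))  # 12
```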
After the EXIT 360° session, participants underwent an evaluation of usability and user-experience quality. For the usability assessment, we used the System Usability Scale (SUS), a short questionnaire of 10 items rated on a 5-point scale from “completely disagree” to “strongly agree”, widely used to evaluate the overall usability of technological instruments [49,50,51]. Furthermore, the SUS allowed for the evaluation of two main aspects that can affect the user experience: usability (ease of using the system) and learnability (ease of learning to use the system) [51]. Table 1 shows the questionnaires and scales used to evaluate the user experience.
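The SUS is scored with Brooke’s standard rule: odd items contribute their rating minus one, even items contribute five minus their rating, and the sum is multiplied by 2.5 to give a 0–100 score, conventionally judged against an acceptability cut-off of 68. The sketch below applies that rule; the function name and example ratings are illustrative.

```python
def sus_score(ratings):
    """ratings: list of 10 responses on a 1-5 scale (SUS items 1-10, in order)."""
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)
    odd = sum(r - 1 for r in ratings[0::2])   # items 1, 3, 5, 7, 9
    even = sum(5 - r for r in ratings[1::2])  # items 2, 4, 6, 8, 10
    return (odd + even) * 2.5                 # 0-100 scale


# Example: a fairly positive response pattern
example = [4, 2, 4, 2, 5, 1, 4, 2, 4, 2]
score = sus_score(example)
print(score, "above cut-off" if score >= 68 else "below cut-off")  # 80.0 above cut-off
```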

2.3. Statistical Analysis

Descriptive statistics included frequencies and percentages, median and interquartile range (IQR) for categorical and ordinal variables, and mean and standard deviation (SD) for continuous measures. The normality of the data distribution was assessed using the Kolmogorov–Smirnov test. An independent-samples t test (parametric or non-parametric according to the distribution of the variables) and the Chi-square test were conducted to verify possible differences between the pathological group and the healthy controls in the main demographic and clinical characteristics and in technological expertise. Moreover, Pearson’s correlation was applied to examine the relationships among usability scores, user experience, and technological expertise, while an independent-samples t test was conducted to evaluate any significant differences between groups on the same variables. All statistical analyses were performed using Jamovi 1.6.7 software. A threshold of p < 0.05 was considered statistically significant.
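All analyses were run in Jamovi; purely as an illustration of the pipeline described above, the sketch below shows how equivalent tests could be computed with SciPy. The data frame, column names, and values are hypothetical and are not the study data.

```python
import pandas as pd
from scipy import stats

# Hypothetical data: one row per participant, with group label and scores
# (the study had 27 participants per group).
df = pd.DataFrame({
    "group": ["PwPD"] * 3 + ["HC"] * 3,
    "sex":   ["F", "M", "F", "F", "M", "F"],
    "sus":   [72.5, 80.0, 75.0, 85.0, 77.5, 90.0],
    "age":   [68, 71, 62, 65, 70, 58],
})
pwpd, hc = df[df.group == "PwPD"], df[df.group == "HC"]

# Normality check (Kolmogorov-Smirnov against a fitted normal distribution)
ks = stats.kstest(df.sus, "norm", args=(df.sus.mean(), df.sus.std(ddof=1)))

# Independent-samples comparison: t test if normality holds, Mann-Whitney U otherwise
if ks.pvalue > 0.05:
    comparison = stats.ttest_ind(pwpd.sus, hc.sus)
else:
    comparison = stats.mannwhitneyu(pwpd.sus, hc.sus)

# Chi-square test for categorical variables (e.g., sex by group)
chi2_result = stats.chi2_contingency(pd.crosstab(df.group, df.sex))

# Pearson's correlation (e.g., SUS score vs. age)
r, p = stats.pearsonr(df.sus, df.age)

print(ks.pvalue, comparison.pvalue, chi2_result[1], r, p)
```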

3. Results

3.1. Participants

Table 2 reports the demographic and clinical characteristics of the whole sample (N = 54), divided into two groups. The PwPD group (n = 27) was predominantly female (M:F = 11:16), with a mean age of 68.2 years (SD = 9, range = 53–84) and a median education of 13 years (IQR = 5, range 5–18). The HC group was also predominantly female (M:F = 11:16), with a mean age of 66.4 years (SD = 10.5, range = 48–88) and a median education of 13 years (IQR = 5, range 5–18). The comparison between PwPD and HC showed no significant differences in any of the main demographic and clinical characteristics. All participants included in the study showed no cognitive impairment (cut-off of normality: MoCA score ≥ 17.54).

3.2. Technological Expertise

The ad hoc five-point scale evaluating perceived familiarity with technologies showed similar results for patients and healthy controls, with mean scores of 3.15 ± 0.89 and 3.14 ± 1.12, respectively (i.e., participants used the technology about once a week). Figure 3 shows the percentages relating to familiarity with the technologies for each group.
Moreover, the mean score on the ad hoc five-point competence questionnaire was 2.68 ± 1.01 for PwPD, indicating a score close to “little”. HC obtained a mean score of 3.04 ± 0.98 (i.e., neither enough nor little).
Figure 4 shows the percentages relating to the self-reported competence in using several technologies for each group. Overall, only 7.4% of PwPD and 18.5% of HC showed a good (≥4—enough or much) competence with technology.
Analysis of possible differences between groups showed no significant differences in levels of competence (t(52) = −1.377; p = 0.174) or familiarity (t(52) = 0.045; p = 0.964) with technologies.

3.3. Neuropsychological Evaluation

Preliminary analyses showed that EXIT 360° can be considered an effective tool for discriminating between PwPD and HC. Table 3 shows significant differences between the two groups in the Total EXIT Score (t(52) = −4.95; p < 0.001) and Total Reaction Time (t(52) = 7.12; p < 0.001). Specifically, HC obtained a higher total score (mean = 12.3 ± 1.07) and completed the test in less time (mean = 457.3 ± 73.60 s). Furthermore, a significant difference also emerged in the FAB score (p = 0.006), with HC achieving a higher performance (17.46 ± 1.003).

3.4. EXIT 360°: Usability

Figure 5 shows the mean SUS usability score for both groups. The comparison between PwPD and HC showed no significant difference in usability scores (t(52) = −1.09; p = 0.279). PwPD provided a mean score of 76.94 ± 9.18, while HC showed a mean score of 80 ± 11.22. Both scores indicate a satisfactory level of usability, according to the scale’s acceptability ranges (cut-off = 68) and adjective ratings (between “good” and “excellent”).
Specifically, according to the cut-off score (cut-off = 68), more than 74% of PwPD and 92% of HC scored above the cut-off. In addition, according to the adjective ratings, 29.6% of PwPD evaluated EXIT 360° as “OK”, 59.3% as “good”, 7.4% as “excellent”, and 3.7% as “best imaginable” [51,58]. Among HC, 3.7% of subjects evaluated EXIT 360° as “OK”, 55.6% as “good”, 18.5% as “excellent”, and 18.5% as “best imaginable”, with only one participant showing a low score. Finally, participants provided good and promising scores for the two main aspects affecting the user experience. The comparison between groups showed no significant differences in usability (t(52) = −1.96; p = 0.055) or learnability (t(52) = 1.89; p = 0.064). Specifically, PwPD showed a mean score of 2.98 ± 0.47 for usability (vs. 3.24 ± 0.50 in HC) and 3.37 ± 0.63 for learnability (vs. 3.06 ± 0.59 in HC). Only 11.1% of patients and 3.7% of controls showed low scores (<2.5) for usability, and only 3.7% of each group for learnability.

3.5. EXIT 360°: User Experience

The first item of the Flow Short Scale showed a high perceived level of skill in performing EXIT 360° for both PwPD and HC (median = 5, IQR = 4–5), with no significant difference between groups (U = 363; p = 0.978). In addition, the other two items allowed for evaluation of the level of challenge of EXIT 360° relative to participants’ own abilities, which both groups rated as balanced/appropriate (median = 3; IQR = 3), with no significant difference (U = 326; p = 0.191).
Table 4 shows that the enjoyment subscale of the IMI obtained high scores (≥4) on all items, with no significant differences between the two groups.
Specifically, Figure 6 shows the percentages for all four items of the IMI enjoyment subscale, comparing the two groups. The figure shows that both PwPD and HC considered EXIT 360° activating, fun, and enjoyable. No participant evaluated EXIT 360° as boring.
Table 5 shows good scores in all ICT—SOPI dimensions. The comparison between PwPD and HC showed a significant difference only in the domain “engagement” (t(52) = −3.44; p < 0.05).
As regards the domain “negative effects”, only a few participants (three PwPD and three HC) reported minor adverse effects (score < 3), such as vertigo or nausea.
Figure 7 shows participants’ good and promising scores in the domains “spatial presence” and “ecological validity”, divided by group. First, most participants (70.4% of PwPD and 88.9% of HC) showed good scores in terms of spatial presence (≥3; e.g., “I felt I could interact with the environment shown”). In addition, all healthy control subjects and 92.6% of patients indicated that EXIT 360° had good ecological validity (“I had the feeling that the environment shown was part of the real world”), with most participants (96.3% and 85.2%, respectively) providing high scores (≥4).
Finally, although the two groups differed significantly in the domain “engagement”, most participants indicated a good level of engagement while performing EXIT 360° (≥3; e.g., “I would have liked the experience to continue”), with only six patients and one control showing low scores.
The UEQ showed positive evaluations (>0.8) on all 26 items in both groups. Figure 8 shows the good scores obtained by PwPD on all UEQ scales, according to the questionnaire’s score range (from −3, horribly bad, to +3, extremely good).
Table 6 shows in detail the high mean scores of all scales (for PwPD), together with their good internal-consistency values (alpha coefficient > 0.7) [59].
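The internal consistency reported here is Cronbach’s alpha [59], computed as alpha = k/(k − 1) * (1 − sum of item variances / variance of the total score) for a scale with k items. The brief sketch below illustrates the computation with hypothetical responses.

```python
import numpy as np


def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one UEQ scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)


# Hypothetical responses of five participants to a four-item scale
responses = [
    [2, 1, 2, 2],
    [3, 3, 2, 3],
    [1, 1, 1, 2],
    [3, 2, 3, 3],
    [2, 2, 2, 1],
]
print(round(cronbach_alpha(responses), 2))  # -> 0.85, above the 0.7 threshold
```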
Table 7 shows good scores on all UEQ scales and on the two main dimensions (pragmatic and hedonic quality) for patients and healthy subjects, including the comparison between the two groups. Specifically, the UEQ scales can be grouped into pragmatic quality (perspicuity, efficiency, dependability) and hedonic quality (non-task-related quality aspects: stimulation and originality).
Finally, the mean of each UEQ scale for PwPD was compared with values from a benchmark dataset (containing data from 20,190 persons across 452 studies) [60]. Results showed that the Stimulation and Novelty scales obtained an excellent evaluation, falling in the range of the 10% best results (Figure 9). Moreover, the Attractiveness, Perspicuity, Efficiency, and Dependability scales obtained a good evaluation, that is, “10% of results better, 75% of results worse.”
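To illustrate how a scale mean is mapped onto the UEQ benchmark categories, the sketch below classifies a mean against per-scale interval boundaries. The numeric thresholds are placeholders chosen for illustration only; the actual benchmark intervals are those published by Schrepp and colleagues [60].

```python
# Each scale has boundaries separating the benchmark categories
# (bad / below average / above average / good / excellent).
# NOTE: these thresholds are placeholders, NOT the published UEQ benchmark values.
PLACEHOLDER_BENCHMARK = {
    "Stimulation": [0.5, 1.0, 1.5, 2.0],
    "Novelty":     [0.2, 0.7, 1.1, 1.6],
}
CATEGORIES = ["bad", "below average", "above average", "good", "excellent"]


def benchmark_category(scale, mean, benchmark=PLACEHOLDER_BENCHMARK):
    """Return the benchmark category for a UEQ scale mean."""
    for boundary, category in zip(benchmark[scale], CATEGORIES):
        if mean < boundary:
            return category
    return CATEGORIES[-1]  # above the highest boundary -> "excellent"


# Example with the PwPD means reported in Table 6 (placeholder thresholds)
print(benchmark_category("Stimulation", 2.028))  # -> "excellent"
print(benchmark_category("Novelty", 2.056))      # -> "excellent"
```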

3.6. Correlation

Overall, Pearson’s correlation showed no significant linear correlation between the SUS total score and education (r = 0.078; p = 0.576), whereas a significant negative correlation emerged with age (r = −0.401; p < 0.05). Moreover, Pearson’s correlation underlined the absence of a significant correlation between the SUS total score and technological expertise measured by the ad hoc competence questionnaire, both for patients (r = 0.340; p = 0.082) and controls (r = 0.244; p = 0.221). As regards the relationship between usability and user experience, patients showed no linear correlation between the SUS total score and three ICT—SOPI domains: spatial presence (r = 0.293; p = 0.138), engagement (r = 0.361; p = 0.064), and ecological validity (r = 0.282; p = 0.154). HC obtained similar results, except for ecological validity, which was significantly correlated with the usability score (r = 0.422; p = 0.028). Moreover, the data showed a significant negative correlation between the SUS total score and the ICT—SOPI negative effects domain only in PwPD (r = −0.325; p < 0.05). Finally, the statistical analysis showed no significant correlations between usability and the three dimensions of the UEQ: attractiveness (r = 0.168; p = 0.224), pragmatic quality (r = 0.196; p = 0.157), and hedonic quality (r = 0.250; p = 0.069).

4. Discussion

Executive dysfunction represents a typical non-motor symptom of PD, impacting everyday functioning and quality of life from the early stages of the disease course [5,7,13,14,15]. Several studies have shown the feasibility, acceptability, and efficacy of VR-based tools for an early and ecologically valid evaluation of ED in many neurologic pathologies [4,5,6,25,26]. In this framework, we developed EXIT 360°, which fits into the ongoing transformation of traditional neuropsychological assessment [2,23,24]. EXIT 360° aims to be an original 360°-based instrument for a multicomponent, ecologically valid evaluation of executive functioning in Parkinson’s disease [32]. EXIT 360° has passed the first validation steps toward becoming a valuable and standardized instrument for assessing EFs, showing excellent convergent validity and promising usability and user-experience results in a healthy sample [33]. These results are of interest because many studies have demonstrated the need to consider the evaluation of “usability” and “user experience” as crucial elements in developing technological tools [34,35,36,37]. Their evaluation allows for understanding any technical (e.g., technological expertise) or clinical (e.g., side effects or motivation) elements that could affect users’ performance and significantly decrease the validity of test results [36,38,39,40]. Therefore, we focused on evaluating usability and user experience in a sample of patients with PD.
Our work involved twenty-seven patients with PD, matched with twenty-seven healthy control subjects. All subjects involved in the study met the inclusion criteria and successfully carried out EXIT 360° without relevant adverse effects, in line with previous studies using 360°-based instruments [25]. At baseline, the whole sample showed low technological expertise in terms of both perceived familiarity (i.e., participants used the technology about once a week) and competence with technology. Only 7.4% of PwPD and 18.5% of HC showed good competence with technology. Overall, despite the low level of familiarity and competence with the technologies, all subjects were able to complete the test successfully. Therefore, EXIT 360° appears to be a promising tool that clinicians could use even with patients without extensive technological experience.
Regarding the usability evaluation, data showed a good usability score, as measured by the SUS, for both groups, without a significant difference. PwPD provided a mean score of 76.94 ± 9.18, while HC showed a mean score of 80 ± 11.22. Both scores indicate a satisfactory level of usability, according to the scale’s acceptability ranges (cut-off = 68) and adjective ratings (between “good” and “excellent”) [49]. Specifically, according to the adjective ratings, 29.6% of patients evaluated EXIT 360° as “OK”, 59.3% as “good”, 7.4% as “excellent”, and 3.7% as “best imaginable” [58]. Moreover, all participants provided good scores for usability and learnability, indicating that EXIT 360° can be considered an easy-to-use and easy-to-learn technological tool [51] for these patients. These promising usability results allow us to conclude that EXIT 360° showed high effectiveness (i.e., the possibility for users to achieve their goals), efficiency (i.e., the effort users need to reach the aim), and satisfaction (i.e., users’ thoughts about their interaction with the system) [35,38,50]. It is therefore reasonable to conclude that a subject’s low performance does not depend on technological problems. The usability result was not influenced by education level or by technological expertise as measured by the ad hoc competence questionnaire. Only age showed a negative correlation with the usability score; however, older people (both patients and controls) were able to complete the evaluation with some instructions. Overall, according to these results, no adaptation of our system appears necessary. These findings are interesting and promising compared with the only previous usability study of VR-based instruments in PD [36], in which healthy participants rated the usability of the tool as good, while the ratings of PwPD indicated that the instrument would need improvements to be considered usable.
In addition to the good usability level, EXIT 360° showed promising results in terms of user experience in both groups. First, patients with PD showed a high perceived level of skill in performing EXIT 360° and evaluated the EXIT 360° subtasks as balanced/appropriate with respect to their abilities. Second, they expressed a positive general impression of EXIT 360°, considering it attractive (e.g., pleasant, pleasing, friendly, and enjoyable), activating, fun, and not boring. Moreover, EXIT 360° demonstrated good pragmatic quality, as it appeared: (1) efficient, fast, practical, and organized (efficiency); (2) understandable, easy to learn, and straightforward (perspicuity); and (3) predictable, supportive, and secure (dependability). The two groups obtained similar scores on these variables, except for dependability, on which patients obtained a lower score. However, this domain was influenced by the item “meets expectations”: patients reported having had negative expectations, shaped by traditionally long and complex evaluations, and claimed to be pleasantly surprised. In addition, EXIT 360° showed excellent hedonic quality in terms of stimulation (valuable, exciting, interesting, and motivating) and novelty (creative, innovative, inventive). Interestingly, the Stimulation and Novelty scales were evaluated better than the existing benchmark values [60]. Finally, all participants considered EXIT 360° an engaging and challenging tool with good spatial presence (“I felt I could interact with the environment shown”) and excellent ecological validity (“I had the feeling that the environment shown was part of the real world”), with most participants (96.3% of controls and 85.2% of patients) providing high scores. Moreover, although the two groups differed significantly in the “engagement” domain, most participants indicated a good level of engagement while performing EXIT 360° (e.g., “I would have liked the experience to continue”), with only six patients and one control showing low scores. However, these six patients stated that the evaluation had an appropriate duration but that they would have had no problem continuing. As regards adverse effects, only three PwPD and three HC reported negligible adverse effects, such as vertigo or nausea. These results appear promising in light of previous literature supporting the importance of sense of presence/realism, engagement, enjoyment, and side effects in digital content development [39,61].
Overall, the present results are promising in terms of the usability and user experience of EXIT 360° in patients with PD, in line with the results obtained in the previous study on healthy controls. Briefly, EXIT 360° allowed for an ecologically valid evaluation without relevant sickness symptoms (e.g., dizziness, headache, and nausea), which lead to unpleasant experiences for patients, impacting their performance and significantly decreasing the validity of test results [23]. Moreover, our results showed that technological expertise does not affect EXIT 360° performance [40]. In addition, the high levels of enjoyment, engagement, and attractiveness of EXIT 360° help increase users’ motivation and participation and decrease the anxiety typical of neuropsychological evaluation. These results support the higher-level concept of interaction experience proposed by Sauer and colleagues, in which usability and user experience are considered key elements for providing greater benefits in the use of technological tools [35].

Limitations and Future Perspectives

The present work has some limitations. First, the head-mounted display used was entry level: the 360° mobile-powered devices currently available on the market have higher quality (e.g., Oculus Quest), which would improve the quality of the 360° images and provide a better, more realistic experience. Second, possible misdiagnosis with syndromes closely resembling PD (e.g., the Parkinsonism variant of Progressive Supranuclear Palsy (PSP-P)) could have an impact on enrollment and results. Possible misdiagnosis is an important concern even for experienced movement-disorder specialists, particularly in the early stage of the disease, when several clinical manifestations can overlap [62,63]. To minimize possible misdiagnosis in this study, patients were enrolled by an experienced neurologist of the Parkinson Center of the Fondazione Don Carlo Gnocchi, according to the MDS criteria for clinically established or probable PD. Finally, to date, EXIT 360° should be considered a promising prototype that needs further validation steps to become a valid and standardized instrument for assessing EFs. For this reason, it will be necessary to further examine its effectiveness in discriminating between healthy control subjects and patients with executive dysfunctions.

5. Conclusions

The present results are promising and interesting in terms of the usability and user experience of EXIT 360° in patients with PD. Overall, patients provided a positive global impression of the instrument, evaluating it as usable, easy to learn, understandable, enjoyable, attractive, and friendly. Moreover, EXIT 360° is an efficient, fast, and organized tool, with excellent hedonic quality in terms of stimulation (exciting and interesting) and novelty. Finally, EXIT 360° also appeared to be an engaging and challenging tool with good spatial presence, excellent ecological validity, and negligible adverse effects. Technological expertise did not influence these encouraging results. Further studies will have to be conducted to examine the efficacy of EXIT 360° in discriminating between healthy control subjects and patients with executive dysfunctions.

Author Contributions

F.B. (Francesca Borgnis), F.B. (Francesca Baglio), and P.C. conceived and designed the experiments. F.B. (Francesca Borgnis) performed the experiment and collected data. F.B. (Francesca Borgnis) and P.C. developed the new 360° EXecutive-functions Innovative tool and conducted the statistical analysis. M.M. recruited patients with Parkinson’s disease. E.P. and F.R. supervised the methods and results. F.B. (Francesca Borgnis) wrote the first manuscript under the final supervision of F.B. (Francesca Baglio), P.C. and G.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by “Fondazione Don Carlo Gnocchi—Milan” Ethics Committee on 7 April 2021, project identification code 09_07/04/2021.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent was obtained from the participants to publish this paper.

Data Availability Statement

Data can be obtained upon reasonable request to the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

ED: Executive Dysfunctions
EFs: Executive Functions
EXIT 360°: EXecutive-functions Innovative Tool 360°
F: Female
FAB: Frontal Assessment Battery
HC: Healthy Control
ICT—SOPI: ICT—Sense of Presence Inventory
IMI: Intrinsic Motivation Inventory
IQR: Interquartile Range
M: Male
MDS: Movement Disorder Society
MoCA: Montreal Cognitive Assessment Test
N: Number
PD: Parkinson’s Disease
PIT: Picture Interpretation Test
PwPD: Patients with Parkinson’s Disease
SD: Standard Deviation
SUS: System Usability Scale
UEQ: User Experience Questionnaire
VMET: Virtual Multiple Errands Test
VR: Virtual Reality

References

  1. Neguț, A. Cognitive assessment and rehabilitation in virtual reality: Theoretical review and practical implications. Rom. J. Appl. Psychol. 2014, 16, 1–7. [Google Scholar]
  2. Bohil, C.J.; Alicea, B.; Biocca, F.A. Virtual reality in neuroscience research and therapy. Nat. Rev. Neurosci. 2011, 12, 752–762. [Google Scholar] [CrossRef]
  3. Neguț, A.; Matu, S.-A.; Sava, F.A.; David, D. Virtual reality measures in neuropsychological assessment: A meta-analytic review. Clin. Neuropsychol. 2016, 30, 165–184. [Google Scholar] [CrossRef]
  4. Camacho-Conde, J.A.; Climent, G. Attentional profile of adolescents with ADHD in virtual-reality dual execution tasks: A pilot study. Appl. Neuropsychol. Child. 2020, 11, 1–10. [Google Scholar] [CrossRef]
  5. Dahdah, M.N.; Bennett, M.; Prajapati, P.; Parsons, T.D.; Sullivan, E.; Driver, S. Application of virtual environments in a multi-disciplinary day neurorehabilitation program to improve executive functioning using the Stroop task. NeuroRehabilitation 2017, 41, 721–734. [Google Scholar] [CrossRef]
  6. Cipresso, P.; Albani, G.; Serino, S.; Pedroli, E.; Pallavicini, F.; Mauro, A.; Riva, G. Virtual multiple errands test (VMET): A virtual reality-based tool to detect early executive functions deficit in parkinson’s disease. Front. Behav. Neurosci. 2014, 8, 1–11. [Google Scholar] [CrossRef] [Green Version]
  7. Isernia, S.; Di Tella, S.; Pagliari, C.; Jonsdottir, J.; Castiglioni, C.; Gindri, P.; Salza, M.; Gramigna, C.; Palumbo, G.; Molteni, F.; et al. Effects of an innovative telerehabilitation intervention for people with Parkinson’s disease on quality of life, motor, and non-motor abilities. Front. Neurol. 2020, 11, 846. [Google Scholar] [CrossRef]
  8. Kudlicka, A.; Clare, L.; Hindle, J.V. Executive functions in Parkinson’s disease: Systematic review and meta-analysis. Mov. Disord. 2011, 26, 2305–2315. [Google Scholar] [CrossRef]
  9. Aarsland, D.; Creese, B.; Politis, M.; Chaudhuri, K.R.; Weintraub, D.; Ballard, C. Cognitive decline in Parkinson disease. Nat. Rev. Neurol. 2017, 13, 217–231. [Google Scholar] [CrossRef] [Green Version]
  10. Fengler, S.; Liepelt-Scarfone, I.; Brockmann, K.; Schäffer, E.; Berg, D.; Kalbe, E. Cognitive changes in prodromal Parkinson’s disease: A review. Mov. Disord. 2017, 32, 1655–1666. [Google Scholar] [CrossRef]
  11. Fang, C.; Lv, L.; Mao, S.; Dong, H.; Liu, B. Cognition Deficits in Parkinson’s Disease: Mechanisms and Treatment. Park Dis. 2020, 2020, 2076942. [Google Scholar] [CrossRef]
  12. Chan, R.C.K.; Shum, D.; Toulopoulou, T.; Chen, E.Y.H. Assessment of executive functions: Review of instruments and identification of critical issues. Arch. Clin. Neuropsychol. 2008, 23, 201–216. [Google Scholar] [CrossRef] [Green Version]
  13. Leroi, I.; McDonald, K.; Pantula, H.; Harbishettar, V. Cognitive impairment in Parkinson disease: Impact on quality of life, disability, and caregiver burden. J. Geriatr. Psychiatry Neurol. 2012, 25, 208–214. [Google Scholar] [CrossRef]
  14. Lawson, R.A.; Yarnall, A.J.; Duncan, G.W.; Breen, D.P.; Khoo, T.K.; Williams-Gray, C.H.; Barker, R.A.; Collerton, D.; Taylor, J.P.; Burn, D.J.; et al. Cognitive decline and quality of life in incident Parkinson’s disease: The role of attention. Parkinsonism Relat. Disord. 2016, 27, 47–53. [Google Scholar] [CrossRef] [Green Version]
  15. Barone, P.; Erro, R.; Picillo, M. Quality of life and nonmotor symptoms in Parkinson’s disease. In International Review of Neurobiology; Elsevier: Amsterdam, The Netherlands, 2017; pp. 499–516. [Google Scholar]
  16. Diamond, A. Executive functions. Annu. Rev. Psychol. 2013, 64, 135–168. [Google Scholar] [CrossRef] [Green Version]
  17. Dirnberger, G.; Jahanshahi, M. Executive dysfunction in Parkinson’s disease: A review. J. Neuropsychol. 2013, 7, 193–224. [Google Scholar] [CrossRef]
  18. Josman, N.; Schenirderman, A.E.; Klinger, E.; Shevil, E. Using virtual reality to evaluate executive functioning among persons with schizophrenia: A validity study. Schizophr. Res. 2009, 115, 270–277. [Google Scholar] [CrossRef]
  19. Levine, B.; Stuss, D.T.; Winocur, G.; Binns, M.A.; Fahy, L.; Mandic, M.; Bridges, K.; Robertson, I.H. Cognitive rehabilitation in the elderly: Effects on strategic behavior in relation to goal management. J. Int. Neuropsychol. Soc. JINS 2007, 13, 143. [Google Scholar] [CrossRef]
  20. Azuma, T.; Cruz, R.F.; Bayles, K.A.; Tomoeda, C.K.; Montgomery, E.B., Jr. A longitudinal study of neuropsychological change in individuals with Parkinson’s disease. Int. J. Geriatr. Psychiatry 2003, 18, 1043–1049. [Google Scholar] [CrossRef]
  21. Janvin, C.C.; Aarsland, D.; Larsen, J.P. Cognitive predictors of dementia in Parkinson’s disease: A community-based, 4-year longitudinal study. J. Geriatr. Psychiatry Neurol. 2005, 18, 149–154. [Google Scholar] [CrossRef]
  22. Serino, S.; Pedroli, E.; Cipresso, P.; Pallavicini, F.; Albani, G.; Mauro, A.; Riva, G. The role of virtual reality in neuropsychology: The virtual multiple errands test for the assessment of executive functions in Parkinson’s disease. Intell. Syst. Ref. Libr. 2014, 68, 257–274. [Google Scholar]
  23. Armstrong, C.M.; Reger, G.M.; Edwards, J.; Rizzo, A.A.; Courtney, C.G.; Parsons, T.D. Validity of the Virtual Reality Stroop Task (VRST) in active duty military. J. Clin. Exp. Neuropsychol. 2013, 35, 113–123. [Google Scholar] [CrossRef] [PubMed]
  24. Parsons, T.D. Virtual reality for enhanced ecological validity and experimental control in the clinical, affective and social neurosciences. Front. Hum. Neurosci. 2015, 9, 1–19. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Serino, S.; Baglio, F.; Rossetto, F.; Realdon, O.; Cipresso, P.; Parsons, T.D.; Cappellini, G.; Mantovani, F.; De Leo, G.; Nemni, R.; et al. Picture Interpretation Test (PIT) 360°: An Innovative Measure of Executive Functions. Sci. Rep. 2017, 7, 1–10. [Google Scholar]
  26. Realdon, O.; Serino, S.; Savazzi, F.; Rossetto, F.; Cipresso, P.; Parsons, T.D.; Cappellini, G.; Mantovani, F.; Mendozzi, L.; Nemni, R.; et al. An ecological measure to screen executive functioning in MS: The Picture Interpretation Test (PIT) 360°. Sci. Rep. 2019, 9, 1–8. [Google Scholar] [CrossRef] [Green Version]
  27. Klinger, E.; Chemin, I.; Lebreton, S.; Marie, R.M. A virtual supermarket to assess cognitive planning. Annu. Rev. Cyber. Therapy Telemed. 2004, 2, 49–57. [Google Scholar]
  28. Klinger, E.; Chemin, I.; Lebreton, S.; Marié, R.-M. Virtual action planning in Parkinson’s disease: A control study. Cyberpsychol. Behav. 2006, 9, 342–347. [Google Scholar] [CrossRef]
  29. Raspelli, S.; Carelli, L.; Morganti, F.; Albani, G.; Pignatti, R.; Mauro, A.; Poletti, B.; Corra, B.; Silani, V.; Riva, G. A neuro vr-based version of the multiple errands test for the assessment of executive functions: A possible approach. J. Cyber. Ther. Rehabil. 2009, 2, 299–314. [Google Scholar]
  30. Wiederhold, B.K.; Riva, G. Annual Review of Cybertherapy and Telemedicine 2010—Advanced Technologies in Behavioral, Social and Neurosciences. Stud. Health Technol. Inform. 2010, 154, 92. [Google Scholar]
  31. Violante, M.G.; Vezzetti, E.; Piazzolla, P. Interactive virtual technologies in engineering education: Why not 360° videos? Int. J. Interact. Des. Manuf. 2019, 13, 729–742. [Google Scholar] [CrossRef]
  32. Borgnis, F.; Baglio, F.; Pedroli, E.; Rossetto, F.; Riva, G.; Cipresso, P. A Simple and Effective Way to Study Executive Functions by Using 360° Videos. Front. Neurosci. 2021, 15, 296. [Google Scholar] [CrossRef] [PubMed]
  33. Borgnis, F.; Baglio, F.; Pedroli, E.; Rossetto, F.; Isernia, S.; Uccellatore, L.; Riva, G.; Cipresso, P. EXecutive-Functions Innovative Tool (EXIT 360°): A Usability and User Experience Study of an Original 360°-Based Assessment Instrument. Sensors 2021, 21, 5867. [Google Scholar] [CrossRef] [PubMed]
  34. Tuena, C.; Pedroli, E.; Trimarchi, P.D.; Gallucci, A.; Chiappini, M.; Goulene, K.; Gaggioli, A.; Riva, G.; Lattanzio, F.; Giunco, F.; et al. Usability issues of clinical and research applications of virtual reality in older people: A systematic review. Front. Hum. Neurosci. 2020, 14, 93. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  35. Sauer, J.; Sonderegger, A.; Schmutz, S. Usability, user experience and accessibility: Towards an integrative model. Ergonomics 2020, 63, 1207–1220. [Google Scholar] [CrossRef]
  36. Pedroli, E.; Cipresso, P.; Serino, S.; Albani, G.; Riva, G. A Virtual Reality Test for the assessment of cognitive deficits: Usability and perspectives. In Proceedings of the 2013 7th International Conference on Pervasive Computing Technologies for Healthcare and Workshops, Venice, Italy, 5–8 May 2013; Volume 2013, pp. 453–458. [Google Scholar]
  37. Pedroli, E.; Greci, L.; Colombo, D.; Serino, S.; Cipresso, P.; Arlati, S.; Mondellini, M.; Boilini, L.; Giussani, V.; Goulene, K.; et al. Characteristics, usability, and users experience of a system combining cognitive and physical therapy in a virtual environment: Positive bike. Sensors 2018, 18, 2343. [Google Scholar] [CrossRef] [Green Version]
  38. ISO 9241-11. Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs)—Part 11: Guidance on Usability; International Organization for Standardization: Geneva, Switzerland, 1998. [Google Scholar]
  39. Schultheis, M.T.; Rizzo, A.A. The application of virtual reality technology in rehabilitation. Rehabil. Psychol. 2001, 46, 296. [Google Scholar] [CrossRef]
  40. Parsons, T.D.; Phillips, A.S. Virtual reality for psychological assessment in clinical practice. Pract. Innov. 2016, 1, 197–217. [Google Scholar] [CrossRef] [Green Version]
  41. Nasreddine, Z.S.; Phillips, N.A.; Bédirian, V.; Charbonneau, S.; Whitehead, V.; Collin, I.; Cummings, J.L.; Chertkow, H. The Montreal Cognitive Assessment, MoCA: A brief screening tool for mild cognitive impairment. J. Am. Geriatr. Soc. 2005, 53, 695–699. [Google Scholar] [CrossRef]
  42. Santangelo, G.; Siciliano, M.; Pedone, R.; Vitale, C.; Falco, F.; Bisogno, R.; Siano, P.; Barone, P.; Grossi, D.; Santangelo, F.; et al. Normative data for the Montreal Cognitive Assessment in an Italian population sample. Neurol. Sci. 2015, 36, 585–591. [Google Scholar] [CrossRef] [Green Version]
  43. Postuma, R.B.; Berg, D.; Stern, M.; Poewe, W.; Olanow, C.W.; Oertel, W.; Obeso, J.; Marek, K.; Litvan, I.; Lang, A.E.; et al. MDS clinical diagnostic criteria for Parkinson’s disease. Mov. Disord. 2015, 30, 1591–1601. [Google Scholar] [CrossRef] [PubMed]
  44. World Medical Association. Declaration of Helsinki: Ethical principles for medical research involving human subjects. JAMA 2013, 310, 2191–2194. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Borgnis, F.; Baglio, F.; Pedroli, E.; Rossetto, F.; Meloni, M.; Riva, G.; Cipresso, P. EXIT 360—executive-functions innovative tool 360—A simple and effective way to study executive functions in parkinson’s disease by using 360 videos. Appl. Sci. 2021, 11, 6791. [Google Scholar] [CrossRef]
  46. Appollonio, I.; Leone, M.; Isella, V.; Piamarta, F.; Consoli, T.; Villa, M.L.; Forapani, E.; Russo, A.; Nichelli, P. The Frontal Assessment Battery (FAB): Normative values in an Italian population sample. Neurol. Sci. 2005, 26, 108–116. [Google Scholar] [CrossRef]
  47. Dubois, B.; Slachevsky, A.; Litvan, I.; Pillon, B. The FAB: A frontal assessment battery at bedside. Neurology 2000, 55, 1621–1626. [Google Scholar] [CrossRef] [Green Version]
  48. Serino, S.; Repetto, C. New trends in episodic memory assessment: Immersive 360 ecological videos. Front. Psychol. 2018, 9, 1878. [Google Scholar] [CrossRef] [Green Version]
  49. Brooke, J. System Usability Scale (SUS): A Quick-and-Dirty Method of System Evaluation User Information; Digit Equip Co Ltd.: Reading, UK, 1986; p. 43. [Google Scholar]
  50. Brooke, J. SUS-A quick and dirty usability scale. Usability Eval. Ind. 1996, 189, 4–7. [Google Scholar]
  51. Lewis, J.R.; Sauro, J. The factor structure of the system usability scale. In International Conference on Human Centered Design; Springer: Berlin/Heidelberg, Germany, 2009; pp. 94–103. [Google Scholar]
  52. Laugwitz, B.; Held, T.; Schrepp, M. Construction and evaluation of a user experience questionnaire. In Symposium of the Austrian HCI and Usability Engineering Group; Springer: Berlin/Heidelberg, Germany, 2008; pp. 63–76. [Google Scholar]
  53. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Applying the user experience questionnaire (UEQ) in different evaluation scenarios. In International Conference of Design, User Experience, and Usability; Springer: Berlin/Heidelberg, Germany, 2014; pp. 383–392. [Google Scholar]
  54. Schrepp, M.; Thomaschewski, J. Design and Validation of a Framework for the Creation of User Experience Questionnaires. Int. J. Interact. Multimed. Artif. Intell. 2019, 5, 88–95. [Google Scholar] [CrossRef]
  55. Lessiter, J.; Freeman, J.; Keogh, E.; Davidoff, J. A cross-media presence questionnaire: The ITC-Sense of Presence Inventory. Presence Teleoperators Virtual Environ. 2001, 10, 282–297. [Google Scholar] [CrossRef] [Green Version]
  56. Engeser, S.; Rheinberg, F.; Vollmeyer, R.; Bischoff, J. Motivation, flow-Erleben und Lernleistung in universitären Lernsettings. Zeitschrift für Pädagogische Psychol. 2005, 19, 159–172. [Google Scholar] [CrossRef]
  57. Deci, E.L.; Eghrari, H.; Patrick, B.C.; Leone, D.R. Facilitating internalization: The self-determination theory perspective. J. Pers. 1994, 62, 119–142. [Google Scholar] [CrossRef] [PubMed]
  58. Bangor, A.; Kortum, P.T.; Miller, J.T. An empirical evaluation of the system usability scale. Intl. J. Hum.–Computer Interact. 2008, 24, 574–594. [Google Scholar] [CrossRef]
  59. Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16, 297–334. [Google Scholar] [CrossRef] [Green Version]
  60. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Construction of a Benchmark for the User Experience Questionnaire (UEQ). Int. J. Interact. Multim. Artif. Intell. 2017, 4, 40–44. [Google Scholar] [CrossRef] [Green Version]
  61. Aubin, G.; Béliveau, M.F.; Klinger, E. An exploration of the ecological validity of the Virtual Action Planning–Supermarket (VAP-S) with people with schizophrenia. Neuropsychol. Rehabil. 2018, 28, 689–708. [Google Scholar] [CrossRef]
  62. Alster, P.; Madetko, N.; Koziorowski, D.; Friedman, A. Progressive Supranuclear Palsy-Parkinsonism Predominant (PSP-P)-A Clinical Challenge at the Boundaries of PSP and Parkinson’s Disease (PD). Front. Neurol. 2020, 11, 180. [Google Scholar] [CrossRef] [PubMed]
  63. Necpál, J.; Borsek, M.; Jeleňová, B. “Parkinson’s disease” on the way to progressive supranuclear palsy: A review on PSP-parkinsonism. Neurol. Sci. 2021, 42, 4927–4936. [Google Scholar] [CrossRef]
Figure 1. A representation of a 360° environment that participants see in the headset (here, the image is represented in anamorphic format).
Figure 2. The representation of the small white dot that participants see in the headset. To respond, participants must move their heads and position the dot on the answer for a few seconds and the answer will be selected automatically.
Figure 3. Percentages relating to familiarity with the technologies for each group. PD, Parkinson’s disease; HC, healthy control. 1 = never; 2 = once a month or more rarely; 3 = once a week; 4 = every 2/3 days; 5 = every day.
Figure 4. Percentages relating to the self-reported competence in using several technologies for each group. PD, Parkinson’s disease; HC, healthy control.
Figure 5. A graphic representation of the SUS score. PD, Parkinson’s disease; HC, healthy control.
Figure 6. Graphic representation of the Intrinsic Motivation Inventory domains, comparing patients and controls.
Figure 7. Graphic representation of two ICT—SOPI dimensions. The orange lines indicate a neutral score.
Figure 8. Graphic representation of scores of the six UEQ scales.
Figure 9. Comparison between means of each UEQ scale and values from a benchmark dataset.
Table 1. Questionnaires and scales used to evaluate the user experience.

Scale: User Experience Questionnaire (UEQ) [52,53,54]
Aim: 1. attractiveness (overall impression of the product); 2. perspicuity (ease of learning how to use the product); 3. efficiency (user’s effort to solve tasks); 4. dependability (feeling of control over the interaction); 5. stimulation (motivation to use the product); 6. novelty (innovativeness and creativity of the product)
Characteristics: a 26-item scale (semantic differential: each item consists of two opposite adjectives, e.g., boring vs. exciting) that allows for calculation of the six domains

Scale: ICT—Sense of Presence Inventory (ICT—SOPI) [55]
Aim: 1. spatial/physical presence (the feeling of being in a physical space in the virtual environment and having control over it); 2. engagement (the tendency to feel psychologically and pleasantly involved in the virtual environment); 3. ecological validity (the tendency to perceive the virtual environment as real); 4. negative effects (adverse psychological reactions)
Characteristics: a 44-item scale rated on a 5-point scale from 1 (“strongly disagree”) to 5 (“strongly agree”). The ICT—SOPI is divided into thoughts and feelings after experiencing the environment (Part A) and while experiencing the environment (Part B). Items are grouped into four dimensions, each computed as the mean of the items contributing to that factor.

Scale: Flow Short Scale (three items) [56]
Aim: perceived level of abilities in coping with the task, of challenge, and of challenge–skill balance
Characteristics: 5-point scale, from low to high

Scale: Intrinsic Motivation Inventory (enjoyment subscale, four items) [57]
Aim: participants’ appreciation of the proposed activity (i.e., boring, pleasant, fun, and activating)
Characteristics: 5-point scale, from low to high

The scores of the item “boring” were reversed to align with the remaining items; therefore, across the whole scale, a low item value reflects a negative perception of the experience with EXIT 360°.
Table 2. Demographic and clinical characteristics of the whole sample.

Variable | PwPD (N = 27) | HC (N = 27) | Group Comparison (p-Value)
Age (years, mean (SD)) | 68.2 (9) | 66.4 (10.5) | 0.507
Sex (M:F) | 11:16 | 11:16 | 1.000
Education (years, median (IQR)) | 13 (5) | 13 (5) | 0.740
MoCA raw score (mean (SD)) | 25.4 (3.12) | 26.3 (2.25) | 0.235
MoCA adjusted score (mean (SD)) | 25.3 (2.25) | 26.0 (2.53) | 0.246

M, male; F, female; SD, standard deviation; IQR, interquartile range; N, number; MoCA, Montreal Cognitive Assessment; PwPD, patients with Parkinson’s disease; HC, healthy controls.
Table 3. Comparison of scores at EXIT 360°.

Measure | PwPD (Mean ± SD) | HC (Mean ± SD) | Group Comparison (p-Value)
Total EXIT Score | 10.5 ± 1.58 | 12.3 ± 1.07 | <0.001
Total Reaction Time (s) | 716.4 ± 174.19 | 457.3 ± 73.60 | <0.001
FAB | 15.94 ± 2.33 | 17.46 ± 1.003 | 0.006

SD, standard deviation; PwPD, patients with Parkinson’s disease; HC, healthy controls; FAB, Frontal Assessment Battery (in bold, statistically significant value).
Table 4. Comparison of scores on the enjoyment subscale of the IMI.

Item | PwPD (Median (IQR)) | HC (Median (IQR)) | Group Comparison (p-Value)
Boring | 5 (5) | 5 (5) | 0.107
Enjoyable | 4 (4–5) | 5 (4–5) | 0.113
Activating | 5 (5) | 5 (5) | 0.28
Fun | 4 (3.5) | 4 (4–4.5) | 0.81

IQR, interquartile range; PwPD, patients with Parkinson’s disease; HC, healthy controls.
Table 5. Comparison of scores in ICT—SOPI dimensions.

Dimension | PwPD (Mean ± SD) | HC (Mean ± SD) | Group Comparison (p-Value)
Spatial Presence | 3.11 ± 0.83 | 3.47 ± 0.48 | 0.054
Engagement | 3.43 ± 0.54 | 3.9 ± 0.47 | 0.001
Ecological Validity | 4.29 ± 0.61 | 4.49 ± 0.37 | 0.149
Negative Effects | 1.29 ± 0.42 | 1.2 ± 0.26 | 0.361

SD, standard deviation; PwPD, patients with Parkinson’s disease; HC, healthy controls (in bold, statistically significant value).
Table 6. Scores of the six UEQ scales. SD, standard deviation.

Scale | Mean | SD | Confidence Interval | Alpha-Coefficient
Attractiveness | 1.593 | 0.846 | 1.273–1.912 | 0.89
Perspicuity | 1.852 | 0.655 | 1.605–2.099 | 0.81
Efficiency | 1.694 | 0.681 | 1.438–1.951 | 0.72
Dependability | 1.630 | 0.695 | 1.368–1.892 | 0.78
Stimulation | 2.028 | 0.776 | 1.735–2.321 | 0.79
Novelty | 2.056 | 0.853 | 1.734–2.377 | 0.93
Table 7. Comparison of scores in UEQ scales and dimensions.

Scale/Dimension | PwPD (Mean (SD)) | HC (Mean (SD)) | Group Comparison (p-Value)
Attractiveness | 1.59 (0.85) | 1.81 (1.13) | 0.430
Perspicuity | 1.85 (0.66) | 2.01 (0.75) | 0.416
Efficiency | 1.69 (0.68) | 1.73 (0.84) | 0.859
Dependability | 1.63 (0.70) | 2.14 (0.86) | 0.020
Stimulation | 2.03 (0.78) | 2.08 (0.98) | 0.818
Novelty | 2.06 (0.85) | 2.46 (0.68) | 0.058
Pragmatic Quality | 1.73 (0.59) | 1.96 (0.74) | 0.204
Hedonic Quality | 2.04 (0.72) | 2.27 (0.82) | 0.275

SD, standard deviation; PwPD, patients with Parkinson’s disease; HC, healthy controls (in bold, statistically significant value).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

