Systematic Review

Patient-Oriented Smart Applications to Support the Diagnosis, Rehabilitation, and Care of Patients with Parkinson’s: An Umbrella Review

1 Research Unit in Design and Communication—UNIDCOM, Science and Technology School, University of Trás-os-Montes and Alto Douro, 5000-801 Vila Real, Portugal
2 Institute for Systems and Computer Engineering, Technology and Science—INESC-TEC, Science and Technology School, University of Trás-os-Montes and Alto Douro, 5000-801 Vila Real, Portugal
3 Institute of Electronics and Informatics Engineering of Aveiro—IEETA, Department of Medical Sciences, University of Aveiro, 3810-193 Aveiro, Portugal
4 Center for Health Technology and Services Research—CINTESIS@RISE, School of Health Sciences, University of Aveiro, 3810-193 Aveiro, Portugal
* Author to whom correspondence should be addressed.
Future Internet 2025, 17(8), 376; https://doi.org/10.3390/fi17080376
Submission received: 25 June 2025 / Revised: 7 August 2025 / Accepted: 17 August 2025 / Published: 19 August 2025

Abstract

This umbrella review aimed to identify, analyze, and synthesize the results of existing literature reviews related to patient-oriented smart applications to support healthcare provision for patients with Parkinson’s. An electronic search was conducted on Scopus, Web of Science, and PubMed, and, after screening using predefined eligibility criteria, 85 reviews were included in the umbrella review. The included studies reported on smart applications integrating wearable devices, smartphones, serious computerized games, and other technologies (e.g., ambient intelligence, computer-based objective assessments, or online platforms) to support the diagnosis and monitoring of patients with Parkinson’s, improve physical and cognitive rehabilitation, and support disease management. Numerous smart applications are potentially useful for patients with Parkinson’s, although their full clinical potential has not yet been demonstrated, because the quality of their clinical assessments, as well as aspects related to their acceptability and compliance with requirements from regulatory bodies, has not yet been adequately studied. Future research requires more aligned methods and procedures for experimental assessments, as well as collaborative efforts to avoid duplicated work and promote advances on the topic.

1. Introduction

Due to the increasing number and proportion of older adults, neurological disorders are a primary cause of disability globally, with Parkinson’s disease emerging as the fastest growing among them [1,2]. Estimates indicate that, by 2040, the number of patients with Parkinson’s (PwP) worldwide will range from 12 to 17 million [3]. This increasing prevalence of Parkinson’s is associated with a rising burden for patients, families, and society, which demands the optimization of healthcare provision to improve health outcomes and reduce economic costs [2].
Currently, no pharmacological tools are available to modify the progression of Parkinson’s disease, and dopamine replacement therapy, most notably levodopa, is the gold-standard treatment for alleviating the disease’s symptoms [4,5]. Considering this pharmacological limitation, the scientific community is researching new drugs with potential disease-modifying properties to decelerate or halt the disease’s progression [4]. Future disease-modifying therapies might emerge from these research efforts, but symptomatic treatment is likely to remain the foundation of Parkinson’s disease management for many years to come [4].
Other lines of research seek to optimize the diagnosis of Parkinson’s disease and care provision for PwP—specifically, by using digital technologies. In this respect, digital technologies can help clinicians to improve diagnosis (e.g., with support from medical imaging modalities [6]), particularly by employing machine learning and deep learning models [7,8,9,10,11], or to facilitate the screening of large populations to identify individuals requiring additional clinical evaluations. Moreover, there is an active research community developing patient-oriented smart applications to support the diagnosis, rehabilitation, and care of PwP in their home environments to increase their comfort and quality of life [4].
This umbrella review, carried out using systematic methods [12], aimed to identify, analyze, and synthesize existing literature reviews related to patient-oriented smart applications to support healthcare provision for PwP. To the best of the authors’ knowledge, this is the first tertiary study on the topic and might guide researchers in developing future patient-oriented smart applications and give patients and practitioners a better understanding of the relevant clinical benefits of these applications.

2. Materials and Methods

This umbrella review was registered in the International Prospective Register of Systematic Reviews (PROSPERO) on 4 April 2025 (Registration CRD420251019601), and its planning and conduct were guided by the Joanna Briggs Institute (JBI) guidelines [13]. The report followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [14].

2.1. Objective and Research Questions

This umbrella review aimed to analyze scientific literature reviews focused on the use of patient-oriented smart applications to support the diagnosis, rehabilitation, and care of PwP (as diagnosed using any recognized diagnostic criteria), and to identify patient-relevant clinical benefits. This objective was informed by the following research questions:
  • RQ1: What measurement and interaction technologies are being used to develop patient-oriented smart applications to support healthcare provision for PwP?
  • RQ2: What types of patient-oriented smart applications are being proposed?
  • RQ3: What is the effectiveness of the proposed smart applications?
  • RQ4: What limitations and open issues of current research were identified by the existing reviews?
RQ1 and RQ2 aimed to provide a comprehensive overview of current research interests as well as the types of smart technologies and applications being used, while RQ3 aimed to identify the clinical implications of the existing evidence. Finally, RQ4 aimed to synthesize the current limitations and future trends in the research on the topic.

2.2. Literature Search

The online databases chosen for the umbrella review were Scopus, Web of Science, and PubMed. These databases were selected due to (i) their broad coverage, (ii) the frequency of their content updates, (iii) the availability of the full text of the indexed references, (iv) the accuracy of the results obtained by automatic searches, and (v) the versatility of the mechanisms used to export the results of the automatic searches.
The search query comprised three major components: (i) Parkinson’s disease, (ii) smart application, and (iii) review. For each component, relevant terms were identified:
  • Parkinson’s disease: Parkinson, neurodegenerative disorder, neurological disorder.
  • Smart applications: smart, remote, computerized, ehealth, digital, mobile, sensor, biosensor, wearable, artificial intelligence, machine learning, deep learning, information technologies, and communication technologies.
  • Review: review, secondary study, and systematic mapping.
The relevant terms of each component were merged using the Boolean connector OR, and the resulting expressions were combined with the Boolean connector AND. Once defined, the search query was instantiated in the syntax of each database, together with the indication that the automatic search should consider the title and abstract of the indexed references. The search query did not include time limits or restrictions in terms of the types of articles (e.g., review). Table 1 presents the instantiation of the general combination of keywords for the selected databases.
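The query construction described above (OR within each component, AND across components) can be sketched as follows. This is an illustrative sketch, not the authors' actual tooling: the helper function and the abbreviated term lists are assumptions, and each database still required its own syntax instantiation.

```python
def or_group(terms):
    """Join one component's terms with OR, quoting multi-word phrases."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

# Abbreviated term lists for the three components described in the text.
parkinson = ["parkinson", "neurodegenerative disorder", "neurological disorder"]
smart_app = ["smart", "remote", "wearable", "machine learning"]  # abbreviated
review = ["review", "systematic mapping"]

# Components are combined with AND; the search targets title and abstract.
query = " AND ".join(or_group(c) for c in [parkinson, smart_app, review])
print(query)
```

The resulting string is the general expression that Table 1 then adapts to the title/abstract search syntax of Scopus, Web of Science, and PubMed.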

2.3. Inclusion and Exclusion Criteria

Studies were included in the umbrella review if they met the eligibility criteria presented in Table 2.
Although there are guidelines on the different forms of literature reviews (e.g., systematic reviews or scoping reviews) and how they should be conducted and reported, inconsistencies in concepts and methodological procedures are often identified. For instance, concepts such as reviews, systematic reviews, scoping reviews, comprehensive reviews, integrative reviews, or narrative reviews are sometimes used indiscriminately, and, in terms of reporting, several non-conformities may prevent results from being reproduced. Thus, based on questions 1 and 3 of the first version of A MeaSurement Tool to Assess systematic Reviews (AMSTAR) checklist [15], the IC2 criterion was introduced. The remaining criteria were derived from the objective of this umbrella review.

2.4. Identification Process

After the automatic search in the three online databases, all the references retrieved were imported into a Microsoft Excel spreadsheet. Then, the identification process included four steps. Duplicate articles and articles without authors or abstracts were excluded during the first step. In turn, during the second step, articles were excluded based on the analysis of their titles and abstracts. In the third step, the full texts of the articles not excluded in the first two steps were analyzed to verify their conformity with the eligibility criteria. Finally, the fourth step included a backward search (i.e., the references of the articles included were analyzed to identify additional potentially relevant studies).
In all four steps, all the articles were independently analyzed by at least two authors.
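The first step of the identification process (removing duplicates and records without authors or abstracts) can be sketched in a few lines. The actual screening was done in a Microsoft Excel spreadsheet; the record fields below (`doi`, `title`, `authors`, `abstract`) and the sample data are assumptions for illustration only.

```python
# Hypothetical exported records: one duplicate pair and one incomplete entry.
records = [
    {"doi": "10.1/a", "title": "Wearables for PD", "authors": "Smith et al.", "abstract": "..."},
    {"doi": "10.1/a", "title": "Wearables for PD", "authors": "Smith et al.", "abstract": "..."},
    {"doi": None, "title": "Untitled note", "authors": None, "abstract": None},
]

seen, step1 = set(), []
for rec in records:
    key = (rec["doi"], rec["title"].lower())
    if key in seen:
        continue  # duplicate article: exclude
    seen.add(key)
    if rec["authors"] and rec["abstract"]:
        step1.append(rec)  # keep only records with authors and an abstract

print(len(step1))  # records proceeding to title and abstract screening
```

Steps two to four (title/abstract screening, full-text analysis, and backward search) involve human judgment against the eligibility criteria and are not automatable in this way.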

2.5. Quality Assessment

The AMSTAR 2 [16] was used to assess the methodological quality of the studies included in the umbrella review. Each study was randomly assigned to two authors who performed their assessment independently. Disagreements were resolved by consensus.

2.6. Data Extraction

Customized Microsoft Excel forms were used for data extraction, which were piloted using three studies. These forms were developed to extract all relevant information about the included reviews to assess their quality and answer the research questions of the umbrella review.
Considering the quality assessment, authors and publication dates were extracted together with data to answer the sixteen questions on the AMSTAR 2 checklist. Moreover, the following data were extracted to answer RQ1 to RQ3: (i) research objectives; (ii) primary studies included; (iii) experimental designs of the primary studies; (iv) measurement and interaction technologies; (v) types of patient-oriented smart applications; (vi) interventions and measured outcomes; (vii) gold-standard instruments used as comparators; (viii) body location of the wearable devices; and (ix) effectiveness of the proposed applications. Finally, to answer RQ4, key paragraphs from the discussion and conclusions of each review were extracted.
The data extraction of each article was manually completed by one author and checked by a second author.

2.7. Data Synthesis

The different measurement and interaction technologies described in the studies included were identified and classified according to four classes: (i) wearable devices; (ii) smartphones; (iii) serious computerized games; and (iv) other technologies. Within these categories, the subgroup of studies that applied machine learning and deep learning was also identified.
In terms of application areas, the smart applications analyzed by the studies were classified according to their purpose: patients’ diagnosis and monitoring based on motor and non-motor clinical variables of Parkinson’s disease, rehabilitation (i.e., physical and cognitive rehabilitation), and disease management.
For each area of application, the following information was synthesized: research objectives; experimental designs of the respective primary studies, including type of study, settings, number and characteristics of the participants, and clinical instruments used as comparators; technological characteristics of smart applications; and assessment results. Moreover, concerning diagnosis and monitoring based on motor and non-motor clinical variables, the clinical variables, respective measurement devices, and body locations of wearable devices were also synthesized.
Finally, a thematic analysis of key points from the discussions and conclusions presented by the reviews was conducted to identify current challenges and open issues on the topic. After the extraction of key paragraphs from the discussion and conclusions of each review, two authors independently created a preliminary list of categories. This categorization was refined through further analyses, and the final list of categories was discussed as a group.

3. Results

3.1. Identification of Studies

The search on the selected databases was performed in February 2025, and 6856 references were retrieved. Figure 1 presents the PRISMA flowchart [14] of the identification process.
During the title and abstract screening, 3729 references were excluded according to the inclusion and exclusion criteria. The exclusion reasons included: (i) references not related to patient-oriented smart applications to support the diagnosis, rehabilitation, and care of PwP (e.g., [17,18]); (ii) references targeting pathologies other than Parkinson’s (e.g., [19,20]); (iii) references focusing on clinician-oriented smart applications (e.g., [21,22]); (iv) references not reporting reviews (e.g., [23,24]); and (v) references reporting reviews not supported by reproducible methods (e.g., [25,26]). During the full-text analysis, 71 references were excluded because they reported on (i) reviews that, although including studies targeting PwP, targeted other health conditions in more than 70% of their primary studies (e.g., [27,28,29]); (ii) clinician-oriented smart applications (e.g., [30,31,32,33]); and (iii) reviews that were not reproducible (e.g., [34,35,36]). Therefore, after the identification process, 85 studies [37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121] were included in the review.

3.2. Characteristics of Included Studies

According to Figure 2, the studies were published between 2014 (i.e., one study [37]) and the first months of 2025 (i.e., four studies [118,119,120,121]), and more than 75% of the studies were published since 2019.
Sixty-two reviews [38,39,40,41,42,43,44,46,47,48,49,50,51,52,53,54,55,56,58,59,60,62,64,65,66,67,68,69,70,73,74,76,77,81,82,83,87,88,89,90,91,92,94,95,96,97,99,100,101,102,103,104,105,107,109,111,114,115,116,118,119,120,121] specifically targeted PwP. Another 14 reviews [37,45,61,71,72,75,79,80,84,89,93,108,110,112] included primary studies targeting PwP and patients with other neurological disorders. Finally, nine reviews [57,63,78,85,86,98,106,113,117] included primary studies with patients with neurological disorders, including PwP, and other health conditions (e.g., musculoskeletal disorders [86,113] or healthy older adults [106]). By identifying all the references of the primary studies of the reviews and eliminating duplicates and studies that did not include PwP, a total of 1867 different primary studies were identified. From those studies, 14 [122,123,124,125,126,127,128,129,130,131,132,133,134,135] were analyzed in at least 10% of the reviews.
Overall, only eight included studies [65,71,77,78,80,99,106,115] performed meta-analyses. All the other studies conducted only a qualitative synthesis, either because this was their objective or because the small number or heterogeneity of the available primary studies precluded meta-analysis.

3.3. Quality Assessment

Figure 3 presents the results of the quality assessment of the included reviews according to AMSTAR 2. The AMSTAR items with the highest scores were Q1 (i.e., research questions and inclusion criteria) and Q16 (i.e., report of potential sources of conflict of interest, including funding sources), while the items with the lowest scores were related to the report on the sources of funding for the studies included in the review (Q10), the list of excluded studies and exclusion justifications (Q7), a satisfactory technique for assessing the risk of bias (Q9), duplicate data extraction (Q6), and duplicate study selection (Q5). Moreover, items Q2 (review methods established a priori), Q3 (explanation of the selection of the studies’ designs), and Q4 (comprehensive literature search strategy) had a high percentage of partial yes when compared with yes, due to the absence of an explicit statement that the respective review protocol was registered (Q2), an explanation of the type of primary studies’ designs included in the review (Q3), and complementary search strategies, such as backward search, in addition to the databases’ queries (Q4).
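Per-item rating distributions of the kind summarized above can be tabulated with a simple tally. The ratings below are invented for illustration; they are not the actual AMSTAR 2 results shown in Figure 3.

```python
from collections import Counter

# Hypothetical Q2 ratings for five reviews (yes / partial yes / no).
q2_ratings = ["yes", "partial yes", "partial yes", "no", "partial yes"]

counts = Counter(q2_ratings)
share = {r: counts[r] * 100 / len(q2_ratings) for r in ("yes", "partial yes", "no")}
print(share)  # percentage of reviews per rating for item Q2
```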

3.4. Measurement and Interaction Technologies

Figure 4 presents the types of measurement and interaction technologies reported in the reviews included. A significant number of them (i.e., 65 reviews) analyzed the application of wearable technologies, including biomechanical wearable devices (e.g., inertial measurement units, accelerometers, gyroscopes, magnetometers, force-sensitive insoles, or custom-made gloves) and physiological wearable devices (e.g., electromyography, electroencephalography, electrocardiography, actigraphy and polysomnography devices, or devices to measure galvanic skin response). In turn, 26 reviews analyzed the use of smartphones, and 12 others analyzed the use of serious computerized games. Finally, other technologies, such as ambient intelligence (e.g., force platforms or motion capture systems), computer-based objective assessments, or online platforms, were identified in 16 studies. Moreover, 18 studies reported the simultaneous use of different types of measurement and interaction technologies. For instance, Bansal et al. [97], Taniguchi et al. [101], Jeyasingh-Jacob et al. [110], and Silva-Batista et al. [114] reported the use of wearable devices and other technologies, while Tuena et al. [98] reported the integration of serious computerized games and wearable devices.
Among all the studies included, five [81,86,90,109,116] specifically expressed in their titles the application of machine learning or deep learning techniques, and other reviews (i.e., [37,40,43,55,64,74,78,91,92]) reported the use of machine and deep learning techniques. Several techniques were identified, including adaptive boosting, decision tree, extra gradient booster, gradient booster, hidden Markov model, linear discriminant analysis, linear regression, logistic regression, naïve Bayes, random forest, support vector machines, artificial neural networks, convolutional neural networks, deep autoencoder, multilayer perceptron, neural networks, recurrent neural networks, and temporal convolutional networks. However, according to the reviews, the primary studies generally did not provide sufficient details on the implementation and validation of the machine learning and deep learning models.
These machine learning and deep learning models were mainly developed to process spatiotemporal gait parameters to assess movement impairments (e.g., [55,74,81,86,116]), analyze diverse kinematic data collected by a broad range of wearable devices to assess functional activities such as sitting and standing (e.g., [37]), or process kinematic and keystroke timing data to discriminate voluntary and involuntary movements, such as those resulting from tremor (e.g., [40,43,64,78,92]). Specifically, Pardoel et al. [55] performed a comparative analysis of the performance of different techniques applied to freezing-of-gait detection using wearable devices.

3.5. Application Areas

Most reviews reported smart applications to support the diagnosis and monitoring of PwP, considering both motor and non-motor clinical variables. Moreover, the studies reported smart applications to improve the rehabilitation of PwP, in terms of physical and cognitive rehabilitation, as well as smart applications to help patients improve their disease management.
Therefore, five major application areas were identified (Figure 5): (i) diagnosis and monitoring based on motor clinical variables, 57 studies; (ii) diagnosis and monitoring based on non-motor clinical variables, 15 studies; (iii) physical rehabilitation, 18 studies; (iv) cognitive rehabilitation, 8 studies; and (v) disease management, 9 studies. By comparing the application areas with measurement and interaction technologies, it is possible to conclude that wearable devices and smartphones are prominent in patients’ diagnosis and monitoring based on motor and non-motor clinical variables, physical rehabilitation, and disease management, while, for cognitive rehabilitation, most studies applied computerized serious games. In turn, computerized serious games are not relevant to the other application areas, and technologies such as online platforms and videoconferencing are particularly relevant to disease management.

3.5.1. Patients’ Diagnosis and Monitoring Based on Motor Clinical Variables

The reviews addressing Parkinson’s motor clinical variables reported the use of the following types of measurement devices (Figure 6): (i) wearable devices that collect biomechanical data, 44 studies; (ii) wearable devices that collect physiological data, 18 studies; (iii) smartphones, 16 studies; and (iv) other technologies such as computer-based objective measures or online platforms, 12 studies.
Although some primary studies used single wearable devices, in most cases the strategy was to use a combination of multiple devices. In this respect, the number of wearable devices used by the proposed smart applications to measure motor clinical variables varied considerably. For instance, Brognara et al. [47] reported that the number of wearable devices combined varied from two to eight, Giannakopoulou et al. [81] found that the same number varied from two to fourteen, and Polvorinos-Fernández et al. [105] concluded that 23% of the studies they analyzed used more than ten wearable devices.
Some of the reviews did not report on the body location of the wearable devices. Considering the available information, Figure 7 presents the distribution of studies according to the body location of the wearable devices being reported.
Table 3 identifies the Parkinson’s motor clinical variables considered as well as the measurement devices used and respective studies, while Figure 8 presents the frequency of these variables among the reviews included.
The proposed smart applications related to the measurement of motor clinical variables had two main purposes: (i) to discriminate PwP; or (ii) to track the disease progression for those patients already diagnosed with Parkinson’s disease. In terms of the assessment of the smart applications, for the first purpose, the objective was to verify the smart applications’ capacity to discriminate PwP from healthy controls or patients with other chronic conditions (i.e., accuracy studies), while, for the second purpose, the objective was to verify the smart applications’ clinical validity by comparing their outcomes to the outcomes of gold-standard clinical instruments, such as objective clinical assessments or standardized clinical assessment scales. Among the reviews, the Unified Parkinson’s Disease Rating Scale (UPDRS) and the Movement Disorder Society-sponsored revision of the UPDRS (MDS-UPDRS) emerged as the most frequently used standardized clinical assessment scales as comparators: the first scale was reported by 30 reviews [37,38,40,41,42,43,44,45,47,48,49,50,54,59,60,64,73,76,81,84,88,89,90,92,94,97,101,105,116,120], and the second scale was reported by 18 reviews [38,40,41,42,54,59,68,76,81,84,88,89,90,94,105,113,116,120]. The Hoehn and Yahr scale was the third most used clinical assessment scale and was reported by five reviews [38,42,47,89,101].
The experimental designs of the primary studies were quite heterogeneous in terms of methods (e.g., case studies, cross-sectional, case–control, or randomized and non-randomized studies) and in terms of the number and characteristics of the participants (e.g., disease severity or duration). Specifically, the small number of participants appears to be a methodological drawback of most primary studies. For instance, Hubble et al. [38] reported that, among the primary studies they reviewed, a significant percentage included fewer than 15 participants in the intervention or control groups.
Eight studies [59,84,85,88,90,100,102,103] only reviewed primary studies that assessed smart applications in home settings, simulated home environments, or free-living environments. Considering the remaining reviews targeting Parkinson’s motor clinical variables, all of them included a small percentage of studies performing long-term data collection in home settings or free-living environments. Therefore, most of the primary studies only collected data in research labs or clinical settings over short periods.
When collecting data in clinical settings, patients perform constrained tasks, such as standardized tests (e.g., the Timed Up and Go test for assessing gait, the Romberg test for assessing balance, or tasks from the MDS-UPDRS for assessing other motor clinical variables), but, in general, the tasks patients performed were poorly specified by the primary studies. Measuring motor clinical variables in home or free-living settings requires devices that can capture motor parameters during unconstrained daily activities, which introduces added levels of complexity that need further research.
Most of the studies reviewed gave a narrative synthesis of the primary studies, and a few of them presented quantitative results. Table 4 presents the assessment results of smart applications considering the information provided by the reviews.
In terms of the measurement of fine motor impairments, the meta-analysis performed by Alfalahi et al. [78] analyzed 29 independent models supported by machine learning or deep learning techniques to process keystroke timing data, concluding that keystroke dynamics presents good accuracy, sensitivity, and specificity to discriminate and classify PwP. The classification methods based on keystrokes presented a pooled AUC of 0.85 (95% confidence interval (CI) 0.83–0.88; I2 = 94.04%) and a pooled accuracy of 0.82 (95% CI 0.78–0.86; I2 = 71.55%), with a pooled sensitivity of 0.86 (95% CI 0.82–0.90; I2 = 79.49%) and a pooled specificity of 0.83 (95% CI 0.79–0.87; I2 = 83.45%). These results were corroborated by other studies [76,84,105] that reported statistically significant correlations between finger tapping and traditional clinical metrics used to diagnose and monitor disease progression.
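Pooled estimates with 95% CIs and I2 heterogeneity values of the kind reported above can be computed by inverse-variance pooling. The sketch below uses invented study data (not Alfalahi et al.'s) and a simple fixed-effect pooling; it illustrates the statistics, not the authors' meta-analytic method.

```python
import math

# (estimate, variance) per hypothetical study
studies = [(0.80, 0.0002), (0.85, 0.0001), (0.78, 0.0003)]

# Inverse-variance weights and pooled estimate
weights = [1 / v for _, v in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)

# Cochran's Q and I^2 = max(0, (Q - df) / Q) * 100
q = sum(w * (e - pooled) ** 2 for (e, _), w in zip(studies, weights))
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# 95% CI from the pooled standard error
se = math.sqrt(1 / sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(round(pooled, 3), round(i2, 1), tuple(round(x, 3) for x in ci))
```

High I2 values such as the 94.04% reported for the pooled AUC indicate that most of the observed variability across studies reflects between-study heterogeneity rather than sampling error.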
Moreover, good correlations with gold-standard clinical evaluations were found for the use of smart applications to assess the freezing of gait [45], other gait outcomes [81], tremor [40,43,89,94], balance [68], bradykinesia [41,54,73,88,89,103], dyskinesia [43,45,89,103], functional activities [37,42], and physical activity [42,59]. Some smart applications present good accuracy at detecting the freezing of gait [39,68,79,90,103], differentiating PwP and healthy controls based on other gait disturbances [38,40,90], identifying tremor presence and severity and differentiating PwP with tremor from healthy individuals [49,73,76,81,90,94,103,109], discriminating PwP from healthy controls based on balance measurements [38,109], detecting bradykinesia versus no bradykinesia [42,43], differentiating Parkinson’s disease-specific patterns from those of healthy controls and classifying disease severity and progression within PwP based on the measurement of functional activities [37,42], discriminating between sedentary or upright and walking behavior [45], discriminating between non-fallers and fallers [68] or predicting future falls [64], and successfully distinguishing patients from healthy controls based on the measurement of rigidity [54].
However, inconsistencies in the assessment processes [91], significant differences in gait spatiotemporal parameters due to high heterogeneity in terms of measures and placement of the wearable devices [43,47], contradictory findings (since some of the outcomes may be more robust than others [38]), or difficulties in discriminating between voluntary and non-voluntary movements still hinder the use of smart applications to measure motor clinical variables of Parkinson’s disease [40,43].

3.5.2. Patients’ Diagnosis and Monitoring Based on Non-Motor Clinical Variables

Considering non-motor clinical variables, this umbrella review identified wearable devices [41,44,45,59,66,73,83,90,100,103,104,109], smartphones [41,44,49,73,104,112], and serious computerized games [118] as being used to measure diverse non-motor clinical variables (Table 5). Only the assessments of sleep disturbances, cognition, constipation, depression, voice and speech quality, and urinary dysfunction were reported by more than one review. In terms of sensing devices, diverse types were identified, including blood pressure, electromyography, electroencephalography, polysomnography, and abdominal electrodermal activity devices, smart vests, or ingestible smart pills.
The studies related to non-motor clinical variables reported assessments both in clinical settings and home environments and highly heterogeneous validation methods, which limit generalization of the results. Moreover, some of the non-motor clinical variables (i.e., affective state, anxiety, brain dysfunction, constipation, depression, emotional dysfunction, fatigue, hallucinations and delusions, impulse control disorders, pain, voice and speech quality, and urinary dysfunction) were addressed in a limited number of primary studies, mainly proof-of-concept studies; therefore, there is not sufficient evidence to draw conclusions.
Regarding the remaining non-motor measurements, when analyzing correlations against gold-standard clinical instruments, wide differences between devices were found for heart rate variability [104], significant but modest correlations were found for sleep disturbance [59], and considerable correspondence was found for orthostatic hypotension [104]. In turn, significant differences were observed between PwP and healthy controls for autonomic function [44]. Finally, concerning cognition, contradictory findings were reported: (i) Daalen et al. [104] found sensitivity to progression towards mild cognitive impairment and dementia; (ii) van Wamelen et al. [66] suggested that gait parameters measured by wearable inertial devices could be useful as an indirect indicator of cognitive decline; and (iii) Craig et al. [118] concluded that there are mixed results on how digitalized tests compare with paper-and-pencil versions.

3.5.3. Physical Rehabilitation

Eighteen studies [46,49,61,65,67,70,71,80,92,96,97,99,102,106,112,114,115,116], including six meta-analyses [65,71,80,99,106,115], addressed the physical rehabilitation of PwP.
One study, a meta-analysis [65], compared exergaming rehabilitation with conventional physiotherapy in relation to the quality of life of PwP. The analyzed randomized controlled trials, most of them performed in clinical settings or research labs, showed statistically significant improvements in quality of life and in function when performing daily activities in the exergame rehabilitation experimental groups compared to the conventional physiotherapy control groups.
The other studies addressed the rehabilitation of gait disorders (i.e., 11 studies [46,49,70,71,97,99,102,106,112,115,116]), balance (i.e., 6 studies [67,70,71,80,96,114]), and tremor disorders (i.e., 2 studies [61,92]).
Wearable devices [46,71,97,102,106,115,116], smartphones [49,99,112], and online platforms [67] were used for the rehabilitation of gait disorders, some of them at home. Turning to the results of the meta-analyses of randomized controlled trials, Lai et al. [106] concluded that the intervention group showed a significant improvement in gait speed, and stride length was also shown to improve, but heterogeneity across studies was moderate. Zhang et al. [115] concluded that wearable cueing devices did not demonstrate efficacy on gait functions, Özden et al. [99] found smart application-based rehabilitation not superior to standard rehabilitation, and Bowman et al. [71] concluded that it was not possible to reach firm conclusions about the effectiveness of wearable devices’ biofeedback rehabilitation due to the current quality of the literature.
In terms of balance rehabilitation, the reviews reported on wearable devices [70,71,80], exergames [96], and online platforms [67,114] in clinical settings, research labs, or at home. The meta-analysis performed by Bowman et al. [71] included too few randomized controlled trials to draw significant conclusions, while the meta-analysis by Yue et al. [80], with data from 14 primary studies, indicated that wearable-based exercise had a significant short-term effect on balance, although improvements did not reach statistical significance for other outcomes, such as functional gait capacity or risk of falls.
Finally, two studies [61,92] reported the use of wearable devices to support the rehabilitation of tremor disorders. However, the primary studies related to tremor rehabilitation were limited in number and were mainly cross-sectional and proof-of-concept studies.

3.5.4. Cognitive Rehabilitation

Cognitive rehabilitation was addressed by eight studies [56,62,77,93,98,117,118,121]: (i) five studies [56,62,77,118,121], including a meta-analysis [77], analyzed cognitive training applications based on serious computerized games applied to different settings, including home, community, long-term care, and rehabilitation settings; (ii) one study [98] analyzed smart applications integrating virtual-reality-based computerized cognitive training and wearable devices; and (iii) two studies [93,117] analyzed music interventions in institutional settings [93] supported by wearable devices (e.g., headphones) and mobile applications, and dance interventions [117], including home interventions, supported by mobile and online smart applications.
Due to the small number of primary studies and the heterogeneity of their designs, it was not possible to draw conclusions about the effectiveness of virtual-reality-based computerized cognitive training, music, and dance interventions. In terms of the impact of computerized cognitive training on cognitive functions (e.g., global cognition, attention, memory, or processing speed), Orgeta et al. [56] found no clear evidence that cognitive training improves global cognition at the end of treatment, although the cognitive training groups presented improvements in attention and verbal memory compared to control groups, and Gavelin et al. [77] concluded that the overall cognitive effect of serious computerized games was small and statistically significant with moderate heterogeneity.

3.5.5. Disease Management

In total, ten studies [49,51,52,53,69,82,87,99,111,119], including a meta-analysis [99], addressed disease management smart applications: (i) two studies [53,111] reported the use of wearable devices, smartphones, and online platforms (e.g., synchronous technologies in the form of videoconference) to support the management of communication and swallowing disorders; (ii) four studies [51,52,69,99] addressed smart applications based on the use of wearable devices [51,69,99] or exergames [52] to promote physical activity; (iii) one study [82] addressed peer support based on social networks; and (iv) three studies [49,87,119] addressed smart mobile applications to empower PwP in terms of disease management (e.g., applications providing information on the disease and assistance in treatment).
Theodoros et al. [53] and Boege et al. [111] concluded that smart applications used to support the management of communication and swallowing disorders showed generally high user acceptance, but they were not able to determine the effectiveness of the applications due to the respective study designs. Similarly, the number of primary studies and their quality do not support conclusions about peer support and disease management applications. Finally, in terms of physical activity, due to the heterogeneity of the primary studies, the meta-analysis reported by Özden et al. [99] included only two primary studies.

3.6. Challenges and Open Issues

The challenges and open issues were identified using a thematic analysis of the discussions and conclusions presented by the reviews. As shown in Figure 9, four major categories cut across all areas of application identified by this umbrella review: (i) experimental design, (ii) clinical viability, (iii) acceptability, and (iv) regulatory conformity. The subcategories of these major categories are presented in Table 6.
One important aspect emerging from the thematic analysis is that, at the level of the primary studies, extensive engineering-oriented research has been undertaken, but clinical research significantly lags behind its engineering counterpart. This is evident in the experimental design of the primary studies, since 61 reviews reported several inconsistencies in this respect.
As a result, it is not possible to confirm the clinical viability of the proposed smart applications due to a broad range of issues pointed out by 81 reviews.
Moreover, 36 reviews reported challenges related to the acceptability of smart applications in terms of adherence, user experience, and comfort; improving acceptability requires that the proposed smart applications be minimally invasive and that training and technical assistance be provided. In addition, 15 reviews pointed out unsolved regulatory aspects regarding conformity requirements, security and data protection, risk analysis, and standardized assessment criteria.

4. Discussion

4.1. Measurement and Interaction Technologies

Concerning the measurement and interaction technologies being used (i.e., the first research question), most studies reported the use of biomechanical wearable devices (e.g., inertial measurement units, accelerometers, gyroscopes, magnetometers, force-sensitive insoles, or custom-made gloves), or inertial sensors embedded in current smartphones. The studies also reported the use of other wearable devices to measure physiological parameters (e.g., electrical activity of the brain, heart, or muscles; reactivity cycles; sleep parameters; or galvanic skin response). In addition, serious computerized games, including virtual-reality-based serious games, were used for rehabilitation, and other technologies, such as ambient intelligence (e.g., force platforms or motion capture systems), computer-based objective assessments, or online platforms, were also identified. Notably, ambient intelligence devices have received little attention, although they can be useful for passively monitoring patients [136].
Additionally, some studies used a broad range of machine learning and deep learning techniques, either alone or in combination, to process monitoring data and improve diagnostic accuracy.
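To make this kind of processing concrete, the following minimal sketch (entirely synthetic data and illustrative feature names, not drawn from any included review) shows how summary gait features might feed a simple classifier separating PwP from healthy controls:

```python
# Hypothetical sketch (synthetic data, illustrative feature names only):
# a minimal nearest-centroid classifier separating Parkinson's-like gait
# profiles from control-like ones using three summary gait features.
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic participants per group

# Synthetic cohorts: shorter stride, slower gait, and higher cadence
# variability are assumed for the Parkinson's-like group.
controls = np.column_stack([
    rng.normal(1.35, 0.10, n),  # stride length (m)
    rng.normal(1.20, 0.10, n),  # gait speed (m/s)
    rng.normal(0.02, 0.01, n),  # cadence variability
])
pwp = np.column_stack([
    rng.normal(1.05, 0.12, n),
    rng.normal(0.90, 0.12, n),
    rng.normal(0.05, 0.02, n),
])

X = np.vstack([controls, pwp])
y = np.array([0] * n + [1] * n)  # 0 = control, 1 = PwP

# Shuffle and split 50/50 into training and test sets.
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
Xtr, Xte, ytr, yte = X[:200], X[200:], y[:200], y[200:]

# Z-score features using training statistics, then classify each test
# sample by its nearest class centroid.
mu, sd = Xtr.mean(axis=0), Xtr.std(axis=0)
Ztr, Zte = (Xtr - mu) / sd, (Xte - mu) / sd
c0 = Ztr[ytr == 0].mean(axis=0)
c1 = Ztr[ytr == 1].mean(axis=0)
pred = (np.linalg.norm(Zte - c1, axis=1)
        < np.linalg.norm(Zte - c0, axis=1)).astype(int)

accuracy = (pred == yte).mean()
print(f"test accuracy: {accuracy:.2f}")
```

As the reviews point out, the credibility of such models in practice depends on the transparent reporting of training settings and validation procedures, which many primary studies lacked.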

4.2. Types of Smart Applications

Considering the types of patient-oriented smart applications being proposed for improving healthcare provision for PwP (i.e., the second research question), about 70% of studies investigated the use of smart applications to support continuous patient monitoring, mainly using wearable devices and smartphones. This is an important aspect, considering that continuous monitoring of patients allows for the tracking of disease progression and treatment optimization.
Since wearable devices and smartphones can measure movement parameters, most studies focused on the measurement of motor clinical variables (e.g., gait disturbances, including the freezing of gait, balance, tremor, bradykinesia, dyskinesia, hypokinesia, fine motor impairments, rigidity, impaired bed mobility, functional activities, physical activity, falls, and swallowing disorders). In turn, although non-motor clinical variables (e.g., cognitive impairment or pain) also impact patients’ quality of life, only a small number of studies targeted them.
Symptom changes must be accompanied by responses to changes, including rehabilitation (i.e., physical and cognitive rehabilitation) and disease management (e.g., healthy behaviors) to maintain a healthy life for as long as possible. In this respect, 21% of studies analyzed smart applications to support physical rehabilitation, 7% analyzed smart applications focused on cognitive rehabilitation, and 9% reviewed the use of smart applications to support disease management, including the empowerment of the patients, peer support, and the adoption of healthy behaviors (e.g., physical activity).

4.3. Effectiveness of the Proposed Smart Applications

In terms of the effectiveness of the proposed smart applications (i.e., the third research question), one of the drawbacks of current research on the topic is the heterogeneity of primary studies in terms of objectives, methods, procedures, outcomes, placement of the wearable devices, or characteristics and number of participants, which limits quantitative comparisons. This justifies the small number of meta-analyses included in this umbrella review and the fact that most studies provide a qualitative rather than quantitative synthesis of primary studies, which prevents drawing robust conclusions about the effectiveness of the proposed smart applications. Therefore, the analysis of the effectiveness of the proposed smart applications was based on the results expressed by the reviews, and no standardized qualitative assessments were performed.
Starting with the meta-analyses, in terms of the measurement of motor clinical variables, Alfalahi et al. [78] concluded that the analysis of keystroke dynamics presents good accuracy, sensitivity, and specificity in discriminating PwP from healthy controls, although the quality of evidence was moderate based on the application of the Quality Assessment of Diagnostic Test Accuracy [137] to the included primary studies. Concerning physical rehabilitation, the meta-analyses concluded the following: (i) when compared to conventional physiotherapy, exergaming rehabilitation has a positive, statistically significant impact on patients’ quality of life and their daily activities [65]; (ii) wearable devices are associated with a significant improvement in gait speed and stride length, although heterogeneity across studies was moderate [106]; (iii) physical rehabilitation monitored using wearable devices had a significant impact on balance, but there is no evidence of improved postural control during daily activities [80]; (iv) smart application-based rehabilitation was not superior to standard rehabilitation [99]; and (v) firm conclusions about the effectiveness of biofeedback rehabilitation based on wearable devices could not be drawn due to the current quality of the literature [71]. Regarding cognitive rehabilitation, Orgeta et al. [56] concluded that there was no clear evidence of the impact of cognitive training on global cognition, and Gavelin et al. [77] concluded that the overall cognitive effect of serious computerized games was small and statistically significant with moderate heterogeneity. Finally, considering disease management, Özden et al. [99] could not analyze the effectiveness of physical activities due to the heterogeneity of the primary studies, which meant that the meta-analysis included only two primary studies.
Considering the reviews that present quantitative results, for gait disturbances, significant differences in gait spatiotemporal parameters were found between the primary studies, which presented high variability in terms of the measurement and placement of the wearables [47]. However, according to several studies [39,45,68,79,90,103], wearable devices and smartphones might be used to detect the freezing of gait or to differentiate between PwP and healthy controls [38,40,90], while gait outcomes measured using wearable devices present good correlations with clinical scales such as UPDRS, MDS-UPDRS, and Hoehn and Yahr [81].
In terms of other motor clinical variables, good correlations with gold-standard clinical instruments were found for tremor [40,43,89,94], balance [68], bradykinesia [41,54,73,88,89,90,103], and dyskinesia [43,45,89,103], while other studies found good accuracy in the identification of tremor disorders and their severity [49,73,76,81,90,94,103,109], balance impairment [38,109], bradykinesia [42,43], and dyskinesia [43,45,89,103]. Additionally, Steins et al. [37] and Sou et al. [42] reported applications able to discriminate Parkinson’s disease-specific mobility patterns from those of healthy controls based on the measurement of functional activity. Johansson et al. [45] reported the capacity to discriminate between sedentary or upright and walking behavior, and the quantification of missteps and risk of falls was shown to discriminate between fallers and non-fallers. Furthermore, Abou et al. [68] suggested that smartphone applications have the discriminative ability to predict future falls, and Sica et al. [64] reported that processing measures from wearable devices made it possible to predict future falls even in patients with no previous fall history.
Concerning non-motor clinical variables, considerable correspondence with gold-standard instruments was found for orthostatic hypotension [104], and significant but modest correlations were identified for sleep disturbance [59], but wide differences between devices were identified for heart rate variability [104].
Furthermore, concerning cognition, smart applications were found to be sensitive to progression towards mild cognitive impairment and dementia [104], and gait parameters measured by wearable inertial devices could be useful as an indirect indicator of cognitive decline [68]. However, there are mixed results for how digitalized tests compare with paper/pencil versions [118].
Finally, for disease management, Theodoros et al. [53] and Boege et al. [111] concluded that smart applications used to support the management of communication and swallowing disorders showed generally high user acceptance, although the effectiveness of the applications could not be determined due to the study designs. Similarly, the number of primary studies and their quality did not allow conclusions to be drawn about peer support and disease management applications.

4.4. Limitations and Open Issues in Current Research

Concerning the fourth research question (what limitations and open issues of current research were identified by existing reviews?), the studies identified several limitations and open issues that make it difficult to translate the proposed smart applications into clinical practice. Most of these limitations and open issues are not related to the limitations of existing technologies; they primarily result from the lack of study designs adequately tailored to provide robust evidence on whether smart applications improve health outcomes, a point also made by other tertiary studies related to remote monitoring [138,139]. In fact, the full clinical potential of the proposed applications has not yet been demonstrated for several reasons: the quality of the clinical assessments of those applications, as well as aspects related to their acceptability and compliance with regulatory requirements, have not yet been adequately studied.

4.4.1. Experimental Design

Different methods, including case studies, cross-sectional studies, case–control studies, and randomized and non-randomized longitudinal studies, were employed to assess the proposed smart applications. Moreover, the primary studies had specific objectives or constraints [116], which resulted in significant variability in the outcomes measured and results reported.
As for the number of participants, most primary studies were performed with relatively small sample sizes, so they are not representative of the entire population of PwP, although they involved patients at different stages of the disease. To obtain accurate results, the applications must be validated with a large number of PwP [40]. Moreover, studies on diagnostic accuracy should include patients in the early stage of the disease [40].
Usually, during tests to diagnose Parkinson’s disease, the patient performs specific tasks while clinicians assign scores as defined by clinical instruments such as UPDRS or MDS-UPDRS. Measurements of motor clinical variables based on the reported smart applications follow this paradigm, with patients performing predefined tasks such as resting tasks, postural tasks, or a 10 m walkway [40]. However, tasks vary greatly across studies and, in some cases, are not detailed. Additionally, some studies explicitly specify the placement of the wearable devices, while others merely refer to general anatomical regions [105].
Unfortunately, poorly detailed specifications are not exclusive to applications targeting motor clinical variables; the same occurs in other types of applications, for example, the failure to specify the intensity of cognitive training performed by participants during the validation of smart applications supporting cognitive rehabilitation [102].
All these inconsistencies hinder comparison between studies.
Although a significant number of studies identified correlations between experimental measurements and clinical scales, such as UPDRS or MDS-UPDRS, it is possible that these scales do not accurately reflect patients’ free-living state [54]. Therefore, if we consider the hypothesis that new measurement instruments may be more sensitive than traditional clinical scales, their correlation with these clinical measures may not be desirable [54,120]. In turn, outcomes presenting weak or no correlations with clinical measures are unlikely to be adopted [120]. Consequently, their ability to measure changes that are meaningful to PwP should be analyzed as complementary rather than equivalent to existing gold-standard scales [120].
Other methodological issues identified were the inadequate assessment of learning effects [84] and insufficient detail regarding the influence of confounding variables (e.g., psychological state, fatigue severity, discomfort, medication adherence, and timing) [38]. The paucity of studies conducting repeated trials of the assessment limits the identification of any learning effects associated with the use of a new device [84]. In turn, if confounding variables are not properly considered, manifestations of uncontrolled variables might be reported as significant changes [38].
Therefore, research on the topic would benefit from more focused efforts with aligned methods (namely, longitudinal randomized controlled trials to assess long-term outcomes), protocols, data collection instruments and their specific positioning on the body, collected parameters, tasks performed by the patients, data analysis procedures, and reporting of the results [43].
These more focused efforts should be supported by existing guidelines (e.g., for determining the sample size for research [140], diagnostic accuracy studies [137], or intervention studies [141]). They are fundamental to allowing the reproducibility of experiments and fostering the confirmation, comparison, and, consequently, generalization of the results, which is essential to guaranteeing the clinical viability of the proposed smart applications [39].

4.4.2. Clinical Viability

The results of this umbrella review show that numerous smart applications specifically designed for PwP are potentially useful. However, independently of the type of application (i.e., measurement of motor and non-motor clinical variables, physical and cognitive rehabilitation, and disease management), scientific evidence of their effectiveness is scarce, of poor quality, and marked by contradictory findings, calling for high-quality experimental assessments with rigorous methods and procedures and an adequate number of participants and disease profiles. Moreover, evidence of the clinical utility of smart applications, defined as improved health outcomes or clinical decisions following their use [102], is scarce. Therefore, to facilitate the translation of research into clinical practice, future research should include longitudinal randomized studies to evaluate the long-term impact on health outcomes such as quality of life, self-efficacy, or treatment adherence. Such studies would also allow comprehensive assessments of the sociocultural factors that influence the adherence of patients, relatives, and clinicians to new clinical models and of how these models might be effectively translated to clinical practice [48]. Additionally, cost-effectiveness studies evaluating smart applications to support PwP are required [41,53,103,107].
This means that the real-world benefits of these smart applications in terms of improving clinical outcomes are not yet known. However, this knowledge is essential to translating the research findings into improved healthcare provision [109].
To guarantee the usefulness of smart applications, clinicians should be able to interact with them. Therefore, clinical management platforms are required that provide efficient, user-friendly interfaces for clinicians [40] and guarantee the integration of patients’ clinical information, specifically with the existing electronic health record systems used to support clinical workflows [109].
The effective use of smart applications favors the participation of patients and their informal caregivers and their cooperation with clinicians [48]. Although the validity and usefulness of many smart applications were assessed in controlled settings (e.g., clinic or laboratory settings), models of their remote use in real free-living environments, such as home settings, need to be established to advance patient care [42,54,81,86,90,95].
Real-life validation of the practical effectiveness of smart applications is essential to guaranteeing their translation into clinical practice. However, real-life validation requires longer data collection periods and may require more complex outcome measures [109].
Most research related to measuring motor clinical variables assumes that patients perform predefined scripted tasks. In this respect, passive monitoring based on wearable devices or ambient intelligence devices may provide similar metrics with a reduced participant time burden [84,105]. This is particularly important for patients suffering from apathy or depression, who might not have the motivation to complete the scripted tasks [41]. However, both passive monitoring and the use of ambient intelligence devices (encountered extremely rarely in the literature [81]) need further research [105].
Moreover, in terms of the measurement of motor clinical variables, additional research should validate their application during unscripted and unconstrained daily tasks and activities rather than in scripted tasks (i.e., to support home-based measurements). This requires high-quality algorithms to discriminate between voluntary and involuntary movements [88].
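As a concrete illustration of this kind of discrimination, the following hypothetical sketch uses synthetic accelerometer-like signals; the 4–6 Hz band reflects the typical frequency range of parkinsonian rest tremor, while the signals, window length, and any decision threshold are illustrative assumptions only:

```python
# Hypothetical sketch: flagging tremor-like wrist-accelerometer segments
# by the share of spectral power in the parkinsonian rest-tremor band
# (roughly 4-6 Hz). All signals here are synthetic and illustrative.
import numpy as np

FS = 100.0                    # sampling rate (Hz), assumed
t = np.arange(0, 5, 1 / FS)   # one 5-second analysis window

rng = np.random.default_rng(1)
# Simulated signals: a 5 Hz tremor-like oscillation vs. a slow (~1 Hz)
# voluntary reaching movement, both with additive measurement noise.
tremor = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.standard_normal(t.size)
voluntary = np.sin(2 * np.pi * 1.0 * t) + 0.3 * rng.standard_normal(t.size)

def tremor_band_ratio(signal, fs=FS, band=(4.0, 6.0)):
    """Fraction of (non-DC) spectral power inside the tremor band."""
    sig = signal - signal.mean()           # remove DC component
    spec = np.abs(np.fft.rfft(sig)) ** 2   # one-sided power spectrum
    freqs = np.fft.rfftfreq(sig.size, d=1 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spec[in_band].sum() / spec.sum()

r_tremor = tremor_band_ratio(tremor)
r_voluntary = tremor_band_ratio(voluntary)
print(f"tremor-band ratio: tremor={r_tremor:.2f}, "
      f"voluntary={r_voluntary:.2f}")
```

A real algorithm would have to cope with voluntary and involuntary movements whose spectra overlap in free-living data, which is precisely why the reviews call for high-quality discrimination algorithms.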
Another critical limitation is the high variability in the metrics used to quantify motor symptoms [61]. Considering the broad range of parameters being measured, their clinical utility should be assessed in future studies. Moreover, considering the low importance ascribed to the measurement of non-motor clinical variables when compared with motor clinical variables, additional research is required on this topic [50,59].
Some of the studies analyzed the use of machine learning and deep learning techniques. However, they found that most of the primary studies did not provide sufficient details on the implementation of the machine learning and deep learning models, the training settings, and the validation procedures, which limits the validity of the results and reduces transparency and reproducibility [90]. Consequently, the application of these techniques is difficult to interpret and, therefore, not easily adopted by clinicians [90].
Moreover, translation to clinical practice in the home environment requires optimizing the number and body placement of the monitoring devices.
The selection of an ideal device system for the quantification of motor clinical variables is complex. Different individual devices or combinations of devices were proposed for measuring motor clinical variables. Their selection was mainly based on the type of symptoms being considered, and no single device was able to capture all the motor clinical variables related to PwP [43]. Similarly, multiple body parts were identified for the placement of the wearable devices, since different symptoms are detected by different device placements. Therefore, currently, there are no consensual guidelines regarding the number of monitoring devices and where they should be placed on the body [47]. Thus, accuracy, redundancy of different wearable devices, and body placement deserve further investigation [47].
A possible future direction is the integration of data collected by continuous monitoring of smart applications that can deliver treatments [41]. However, this possibility was not reported by the studies included.
Finally, the interoperability of smart applications and respective devices is a key problem requiring high-quality standards to aggregate data from different non-compatible devices [83] and integrate these data in healthcare workflows. According to the reviews included in this umbrella review, the interoperability of smart applications has not been sufficiently studied, despite having a significant impact on their clinical viability.

4.4.3. Acceptability

Acceptability needs particular attention, as the effective use of patient-oriented smart applications depends on a set of characteristics, such as usability, wearability, comfort of use, aesthetics, instructions for use, and support. These aspects might affect patients physically or psychologically but are frequently ignored by researchers [64], despite the existing literature on how to assess them (e.g., [142,143]).
Smart applications to be used by patients should not negatively affect their daily activities, which means the development of obtrusive, bulky devices should be avoided, since they can be invasive when used in the community [48]. In fact, uncomfortable solutions with a negative user experience, together with their costs and skepticism about their effectiveness, are substantial barriers to adoption [61,107].
Although wearable devices facilitate the capture of relevant motor parameters, to increase their acceptability and reduce their invasiveness, it is important to minimize the number of wearable devices required and place them in a way that does not interfere with daily activities [40].
To increase the acceptability of smart applications, it is imperative to consider the perspectives of PwP and their relatives. Despite being critical for operational and effective clinical smart applications, these perspectives were generally not considered by the primary studies [48]. Moreover, the perspectives of clinicians and service providers are not duly considered by researchers [53]. Therefore, future studies should involve all stakeholders [53,79] and implement user-centered design methods, namely co-design approaches.
Additionally, users should first become familiar with smart applications before using them in daily practice, and clear descriptions of their use should be available [96]. This means preparing structured training packages with adequate documentation. Moreover, to overcome difficulties in daily use, technical support teams should be available to tackle technical issues such as individual calibrations [97] or battery loss [96]. Although technical support may seem simple, patients’ lack of trust in technological solutions, their geographical dispersion, their living conditions, and the heterogeneity of communication infrastructures can make it a highly complex process, and it remains one of the constraints on the use of smart applications.
Although some experimental studies were conducted over several weeks or months, further research is required to determine long-term adherence to smart applications.

4.4.4. Regulatory Conformity

An important issue is approval of the smart application by regulatory authorities (e.g., the European Medicines Agency or the U.S. Food and Drug Administration) [59], which might be supported by the existing literature (e.g., [144,145,146,147]).
Certification by regulatory bodies not only guarantees the medical-grade accuracy, reliability, and safety of the smart applications [49,75], but it is also important to allow for reimbursement of the costs.
However, this certification is associated with several open issues. First, the developers of smart applications generally do not properly consider the conformity requirements needed for certification (e.g., certifiability, safety, security, stability, or trustworthiness). Second, even when smart applications are noninvasive and pose a relatively low safety risk, data protection should be guaranteed [84] (e.g., as defined, for instance, by the Health Insurance Portability and Accountability Act [144]), but this was not generally addressed in the primary studies. Third, additional research is required to develop computationally efficient risk assessment models able to provide continuous risk assessments of the smart applications [78]. Finally, the heterogeneity of parameters, measurement protocols, types of devices, and body locations can hinder standardization, although it is expected that obligatory certification by recognized regulatory authorities will require standard criteria for the assessment of clinical applications [100].

4.5. Limitations of the Umbrella Review

When analyzing the results of this review, some limitations inherent to its scope and methodology must be considered. Given the vastness of the topic, defining the search strategies was challenging, which means that the chosen keywords, the databases used, and the authors’ judgment when screening the articles may have excluded relevant studies.
The results of this umbrella review should be analyzed with caution considering the methodological issues of the primary studies reported by the included reviews. Moreover, the quality assessment performed on the reviews shows that they present some limitations, specifically regarding explicit statements that the review protocol was registered before the study was carried out, explanations of the selection of the designs of the primary studies, comprehensive literature search strategies, duplicated study selection and data extraction, and satisfactory techniques for assessing the risk of bias. Additionally, most of the reviews included in the umbrella review only performed qualitative synthesis, either because this was their objective or because the small number or heterogeneity of the primary studies they reviewed did not allow meta-analyses. Therefore, an important limitation of this umbrella review was the impossibility of presenting more consolidated results on the effectiveness of smart applications to support healthcare provision for PwP.

5. Conclusions

Based on 85 reviews analyzing a total of 1867 primary studies, this umbrella review systematized the efforts of a very active research community. Through extensive engineering-oriented research, a considerable number of smart applications have been developed. However, it is not possible to reach conclusions about the clinical viability of most of them due to limited research evidence, contradictory findings about their clinical effectiveness, inconsistencies in their assessments, and unresolved issues related to their acceptability and regulatory conformity. Therefore, future research on the topic requires more focused efforts with aligned methods, protocols, data collection, data analysis, and reporting of results, allowing for the reproducibility of experiments and the confirmation and comparison of findings.
An important research gap is the translation of patient-oriented applications to clinical practice to support the diagnosis, rehabilitation, and care of PwP. This translation requires the integration of interoperable smart applications with existing electronic health record systems supporting clinical workflows, as well as their real-life validation, which demands longer data collection periods and more complex outcome measures. For application in home environments, smart applications should be optimized in terms of algorithms capable of analyzing unscripted, unconstrained daily tasks and activities rather than scripted tasks, the measurement of non-motor clinical variables, metrics to quantify motor symptoms, and the number and body placement of the monitoring devices.
Finally, since numerous separate groups around the world are working on similar smart applications to support healthcare provision for PwP, collaborative efforts are needed to avoid replicating developments already achieved and instead to promote advances on the topic.

Author Contributions

Conceptualization, N.P.R.; methodology, A.I.M., A.G.S. and N.P.R.; investigation, R.B., J.P., A.I.M., A.G.S. and N.P.R.; writing—original draft preparation, R.B., J.P. and N.P.R.; writing—review and editing, R.B., J.P., A.I.M., A.G.S. and N.P.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data sharing is not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Dorsey, E.R.; Sherer, T.; Okun, M.S.; Bloem, B.R. The emerging evidence of the Parkinson pandemic. J. Park. Dis. 2018, 8, S3–S8. [Google Scholar] [CrossRef]
  2. Chaudhuri, K.R.; Azulay, J.P.; Odin, P.; Lindvall, S.; Domingos, J.; Alobaidi, A.; Kandukuri, P.L.; Chaudhari, V.S.; Parra, J.C.; Yamazaki, T.; et al. Economic burden of Parkinson’s disease: A multinational, real-world, cost-of-illness study. Drugs Real World Outcomes 2024, 11, 1–11. [Google Scholar] [CrossRef]
  3. Dorsey, E.; Bloem, B.R. The Parkinson pandemic—A call to action. JAMA Neurol. 2018, 75, 9–10. [Google Scholar] [CrossRef]
  4. Stocchi, F.; Bravi, D.; Emmi, A.; Antonini, A. Parkinson disease therapy: Current strategies and future research priorities. Nat. Rev. Neurol. 2024, 20, 695–707. [Google Scholar] [CrossRef]
  5. Antonini, A.; Moro, E.; Godeiro, C.; Reichmann, H. Medical and surgical management of advanced Parkinson’s disease. Mov. Disord. 2018, 33, 900–908. [Google Scholar] [CrossRef] [PubMed]
  6. Samson, E.; Noseworthy, M.D. A review of diagnostic imaging approaches to assessing Parkinson’s disease. Brain Disord. 2022, 6, 100037. [Google Scholar] [CrossRef]
  7. Wang, J.; Xue, L.; Jiang, J.; Liu, F.; Wu, P.; Lu, J.; Zhang, H.; Bao, W.; Xu, Q.; Ju, Z.; et al. Diagnostic performance of artificial intelligence-assisted PET imaging for Parkinson’s disease: A systematic review and meta-analysis. NPJ Digit. Med. 2024, 7, 17. [Google Scholar] [CrossRef]
  8. Grigas, O.; Maskeliunas, R.; Damaševičius, R. Early detection of dementia using artificial intelligence and multimodal features with a focus on neuroimaging: A systematic literature review. Health Technol. 2024, 14, 201–237. [Google Scholar] [CrossRef]
  9. Ghaderi, S.; Mohammadi, M.; Sayehmiri, F.; Mohammadi, S.; Tavasol, A.; Rezaei, M.; Ghalyanchi-Langeroudi, A. Machine learning approaches to identify affected brain regions in movement disorders using MRI data: A systematic review and diagnostic meta-analysis. J. Magn. Reson. Imaging 2024, 60, 2518–2546. [Google Scholar] [CrossRef]
  10. Bacon, E.J.; He, D.; Achi, N.; Wang, L.; Li, H.; Yao-Digba, P.D.; Monkam, P.; Qi, S. Neuroimage analysis using artificial intelligence approaches: A systematic review. Med. Biol. Eng. Comput. 2024, 62, 2599–2627. [Google Scholar] [CrossRef]
  11. Aggarwal, N.; Saini, B.S.; Gupta, S. Role of artificial intelligence techniques and neuroimaging modalities in detection of Parkinson’s disease: A systematic review. Cogn. Comput. 2024, 16, 2078–2115. [Google Scholar] [CrossRef]
  12. Chandler, J.; Cumpston, M.; Li, T.; Page, M.J.; Welch, V.J.H.W. Cochrane Handbook for Systematic Reviews of Interventions, 2nd ed.; Wiley: Hoboken, NJ, USA, 2019. [Google Scholar]
  13. Aromataris, E.; Fernandez, R.; Godfrey, C.M.; Holly, C.; Khalil, H.; Tungpunkom, P. Summarizing systematic reviews: Methodological development, conduct and reporting of an umbrella review approach. JBI Evid. Implement. 2015, 13, 132–140. [Google Scholar] [CrossRef] [PubMed]
  14. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; Prisma Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Int. J. Surg. 2010, 8, 336–341. [Google Scholar] [CrossRef]
  15. Shea, B.J.; Bouter, L.M.; Peterson, J.; Boers, M.; Andersson, N.; Ortiz, Z.; Ramsay, T.; Bai, A.; Shukla, V.K.; Grimshaw, J.M. External validation of a measurement tool to assess systematic reviews (AMSTAR). PLoS ONE 2007, 2, e1350. [Google Scholar] [CrossRef]
  16. Shea, B.J.; Reeves, B.C.; Wells, G.; Thuku, M.; Hamel, C.; Moran, J.; Moher, D.; Tugwell, P.; Welch, V.; Kristjansson, E.; et al. AMSTAR 2: A critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ 2017, 358, j4008. [Google Scholar] [CrossRef]
  17. Aljarallah, N.A.; Dutta, A.K.; Sait, A.R.W. A systematic review of genetics-and molecular-pathway-based machine learning models for neurological disorder diagnosis. Int. J. Mol. Sci. 2024, 25, 6422. [Google Scholar] [CrossRef]
  18. Kaushal, A. A central role of stimulator of interferon genes’ adaptor protein in defensive immune response. Immunol. Res. 2025, 73, 39. [Google Scholar] [CrossRef]
  19. Kaur, A.; Mittal, M.; Bhatti, J.S.; Thareja, S.; Singh, S. A systematic literature review on the significance of deep learning and machine learning in predicting Alzheimer’s disease. Artif. Intell. Med. 2024, 154, 102928. [Google Scholar] [CrossRef]
  20. Mahavar, A.; Patel, A.; Patel, A. A Comprehensive Review on Deep Learning Techniques in Alzheimer’s Disease Diagnosis. Curr. Top. Med. Chem. 2025, 25, 335–349. [Google Scholar] [CrossRef]
  21. Dennis, A.G.P.; Strafella, A.P. The identification of cognitive impairment in Parkinson’s disease using biofluids, neuroimaging, and artificial intelligence. Front. Neurosci. 2024, 18, 1446878. [Google Scholar] [CrossRef]
  22. Dzialas, V.; Doering, E.; Eich, H.; Strafella, A.P.; Vaillancourt, D.E.; Simonyan, K.; van Eimeren, T.; International Parkinson Movement Disorders Society-Neuroimaging Study Group. Houston, we have AI problem! Quality issues with neuroimaging-based artificial intelligence in Parkinson’s disease: A systematic review. Mov. Disord. 2024, 39, 2130–2143. [Google Scholar] [CrossRef]
  23. Lamba, R.; Gulati, T.; Alharbi, H.F.; Jain, A. A hybrid system for Parkinson’s disease diagnosis using machine learning techniques. Int. J. Speech Technol. 2022, 25, 583–593. [Google Scholar] [CrossRef]
  24. Bruno, M.K.; Dhall, R.; Duquette, A.; Haq, I.U.; Honig, L.S.; Lamotte, G.; Mari, Z.; McFarland, N.R.; Montaser-Kouhsari, L.; Rodriguez-Porcel, F.; et al. A general neurologist’s practical diagnostic algorithm for atypical parkinsonian disorders: A consensus statement. Neurol. Clin. Pract. 2024, 14, e200345. [Google Scholar] [CrossRef]
  25. Pratihar, R.; Sankar, R. Advancements in Parkinson’s Disease Diagnosis: A Comprehensive Survey on Biomarker Integration and Machine Learning. Computers 2024, 13, 293. [Google Scholar] [CrossRef]
  26. Sankineni, S.; Saraswat, A.; Suchetha, M.; Aakur, S.N.; Sehastrajit, S.; Dhas, D.E. An insight on recent advancements and future perspectives in detection techniques of Parkinson’s disease. Evol. Intell. 2024, 17, 1715–1731. [Google Scholar] [CrossRef]
  27. Bawa, A.; Banitsas, K.; Abbod, M. A review on the use of Microsoft Kinect for gait abnormality and postural disorder assessment. J. Healthc. Eng. 2021, 2021, 4360122. [Google Scholar] [CrossRef] [PubMed]
  28. Anikwe, C.V.; Nweke, H.F.; Ikegwu, A.C.; Egwuonwu, C.A.; Onu, F.U.; Alo, U.R.; Teh, Y.W. Mobile and wearable sensors for data-driven health monitoring system: State-of-the-art and future prospect. Expert Syst. Appl. 2022, 202, 117362. [Google Scholar] [CrossRef]
  29. Yen, J.M.; Lim, J.H. A clinical perspective on bespoke sensing mechanisms for remote monitoring and rehabilitation of neurological diseases: Scoping review. Sensors 2023, 23, 536. [Google Scholar] [CrossRef]
  30. Ngo, Q.C.; Motin, M.A.; Pah, N.D.; Drotár, P.; Kempster, P.; Kumar, D. Computerized analysis of speech and voice for Parkinson’s disease: A systematic review. Comput. Methods Programs Biomed. 2022, 226, 107133. [Google Scholar] [CrossRef]
  31. Lam, W.W.; Tang, Y.M.; Fong, K.N. A systematic review of the applications of markerless motion capture (MMC) technology for clinical measurement in rehabilitation. J. Neuroeng. Rehabil. 2023, 20, 57. [Google Scholar] [CrossRef]
  32. Islam, M.A.; Majumder, M.Z.H.; Hussein, M.A.; Hossain, K.M.; Miah, M.S. A review of machine learning and deep learning algorithms for Parkinson’s disease detection using handwriting and voice datasets. Heliyon 2024, 10, e25469. [Google Scholar] [CrossRef]
  33. Park, K.W.; Mirian, M.S.; McKeown, M.J. Artificial intelligence-based video monitoring of movement disorders in the elderly: A review on current and future landscapes. Singap. Med. J. 2024, 65, 141–149. [Google Scholar] [CrossRef] [PubMed]
  34. Pereira, C.R.; Pereira, D.R.; Weber, S.A.; Hook, C.; De Albuquerque, V.H.C.; Papa, J.P. A survey on computer-assisted Parkinson’s disease diagnosis. Artif. Intell. Med. 2019, 95, 48–63. [Google Scholar] [CrossRef] [PubMed]
  35. Chudzik, A.; Śledzianowski, A.; Przybyszewski, A.W. Machine learning and digital biomarkers can detect early stages of neurodegenerative diseases. Sensors 2024, 24, 1572. [Google Scholar] [CrossRef] [PubMed]
  36. Sánchez-Fernández, L.P. Biomechanics of Parkinson’s Disease with Systems Based on Expert Knowledge and Machine Learning: A Scoping Review. Computation 2024, 12, 230. [Google Scholar] [CrossRef]
  37. Steins, D.; Dawes, H.; Esser, P.; Collett, J. Wearable accelerometry-based technology capable of assessing functional activities in neurological populations in community settings: A systematic review. J. Neuroeng. Rehabil. 2014, 11, 36. [Google Scholar] [CrossRef]
  38. Hubble, R.P.; Naughton, G.A.; Silburn, P.A.; Cole, M.H. Wearable sensor use for assessing standing balance and walking stability in people with Parkinson’s disease: A systematic review. PLoS ONE 2015, 10, e0123705. [Google Scholar] [CrossRef]
  39. Silva de Lima, A.L.; Evers, L.J.; Hahn, T.; Bataille, L.; Hamilton, J.L.; Little, M.A.; Okuma, Y.; Bloem, B.; Faber, M.J. Freezing of gait and fall detection in Parkinson’s disease using wearable sensors: A systematic review. J. Neurol. 2017, 264, 1642–1654. [Google Scholar] [CrossRef]
  40. Rovini, E.; Maremmani, C.; Cavallo, F. How wearable sensors can support Parkinson’s disease diagnosis and treatment: A systematic review. Front. Neurosci. 2017, 11, 555. [Google Scholar] [CrossRef]
  41. Hasan, H.; Athauda, D.S.; Foltynie, T.; Noyce, A.J. Technologies assessing limb bradykinesia in Parkinson’s disease. J. Park. Dis. 2017, 7, 65–77. [Google Scholar] [CrossRef]
  42. Son, H.; Park, W.S.; Kim, H. Mobility monitoring using smart technologies for Parkinson’s disease in free-living environment. Collegian 2018, 25, 549–560. [Google Scholar] [CrossRef]
  43. Thorp, J.E.; Adamczyk, P.G.; Ploeg, H.L.; Pickett, K.A. Monitoring motor symptoms during activities of daily living in individuals with Parkinson’s disease. Front. Neurol. 2018, 9, 1036. [Google Scholar] [CrossRef] [PubMed]
  44. Merola, A.; Sturchio, A.; Hacker, S.; Serna, S.; Vizcarra, J.A.; Marsili, L.; Fasano, A.; Espay, A.J. Technology-based assessment of motor and nonmotor phenomena in Parkinson disease. Expert Rev. Neurother. 2018, 18, 825–845. [Google Scholar] [CrossRef] [PubMed]
  45. Johansson, D.; Malmgren, K.; Murphy, M.A. Wearable sensors for clinical applications in epilepsy, Parkinson’s disease, and stroke: A mixed-methods systematic review. J. Neurol. 2018, 265, 1740–1752. [Google Scholar] [CrossRef]
  46. Sweeney, D.; Quinlan, L.R.; Browne, P.; Richardson, M.; Meskell, P.; ÓLaighin, G. A technological review of wearable cueing devices addressing freezing of gait in Parkinson’s disease. Sensors 2019, 19, 1277. [Google Scholar] [CrossRef]
  47. Brognara, L.; Palumbo, P.; Grimm, B.; Palmerini, L. Assessing gait in Parkinson’s disease using wearable motion sensors: A systematic review. Diseases 2019, 7, 18. [Google Scholar] [CrossRef]
  48. Rovini, E.; Maremmani, C.; Cavallo, F. Automated systems based on wearable sensors for the management of Parkinson’s disease at home: A systematic review. Telemed. E-Health 2019, 25, 167–183. [Google Scholar] [CrossRef]
  49. Linares-Del Rey, M.; Vela-Desojo, L.; Cano-de La Cuerda, R. Mobile phone applications in Parkinson’s disease: A systematic review. Neurología 2019, 34, 38–54. [Google Scholar] [CrossRef]
  50. Godoi, B.B.; Amorim, G.D.; Quiroga, D.G.; Holanda, V.M.; Júlio, T.; Tournier, M.B. Parkinson’s disease and wearable devices, new perspectives for a public health issue: An integrative literature review. Rev. Da Assoc. Médica Bras. 2019, 65, 1413–1420. [Google Scholar] [CrossRef]
  51. Vaartio-Rajalin, H.; Rauhala, A.; Fagerström, L. Person-centered home-based rehabilitation for persons with Parkinson’s disease: A scoping review. Int. J. Nurs. Stud. 2019, 99, 103395. [Google Scholar] [CrossRef]
  52. Garcia-Agundez, A.; Folkerts, A.K.; Konrad, R.; Caserman, P.; Tregel, T.; Goosses, M.; Göbel, S.; Kalbe, E. Recent advances in rehabilitation for Parkinson’s Disease with Exergames: A Systematic Review. J. Neuroeng. Rehabil. 2019, 16, 17. [Google Scholar] [CrossRef]
  53. Theodoros, D.; Aldridge, D.; Hill, A.J.; Russell, T. Technology-enabled management of communication and swallowing disorders in Parkinson’s disease: A systematic scoping review. Int. J. Lang. Commun. Disord. 2019, 54, 170–188. [Google Scholar] [CrossRef]
  54. Teshuva, I.; Hillel, I.; Gazit, E.; Giladi, N.; Mirelman, A.; Hausdorff, J.M. Using wearables to assess bradykinesia and rigidity in patients with Parkinson’s disease: A focused, narrative review of the literature. J. Neural Transm. 2019, 126, 699–710. [Google Scholar] [CrossRef]
  55. Pardoel, S.; Kofman, J.; Nantel, J.; Lemaire, E.D. Wearable-sensor-based detection and prediction of freezing of gait in Parkinson’s disease: A review. Sensors 2019, 19, 5141. [Google Scholar] [CrossRef]
  56. Orgeta, V.; McDonald, K.R.; Poliakoff, E.; Hindle, J.V.; Clare, L.; Leroi, I. Cognitive training interventions for dementia and mild cognitive impairment in Parkinson’s disease. Cochrane Database Syst. Rev. 2020, 2, CD011961. [Google Scholar] [CrossRef] [PubMed]
  57. Ponciano, V.; Pires, I.M.; Ribeiro, F.R.; Marques, G.; Villasana, M.V.; Garcia, N.M.; Zdravevski, E.; Spinsante, S. Identification of diseases based on the use of inertial sensors: A systematic review. Electronics 2020, 9, 778. [Google Scholar] [CrossRef]
  58. de Oliveira Gondim, I.T.G.; de Souza, C.D.C.B.; Rodrigues, M.A.B.; Azevedo, I.M.; de Sales, M.D.G.W.; Lins, O.G. Portable accelerometers for the evaluation of spatio-temporal gait parameters in people with Parkinson’s disease: An integrative review. Arch. Gerontol. Geriatr. 2020, 90, 104097. [Google Scholar] [CrossRef]
  59. Morgan, C.; Rolinski, M.; McNaney, R.; Jones, B.; Rochester, L.; Maetzler, W.; Craddock, I.; Whone, A.L. Systematic review looking at the use of technology to measure free-living symptom and activity outcomes in Parkinson’s disease in the home or a home-like environment. J. Park. Dis. 2020, 10, 429–454. [Google Scholar] [CrossRef]
  60. Corrà, M.F.; Warmerdam, E.; Vila-Chã, N.; Maetzler, W.; Maia, L. Wearable health technology to quantify the functional impact of peripheral neuropathy on mobility in Parkinson’s disease: A systematic review. Sensors 2020, 20, 6627. [Google Scholar] [CrossRef]
  61. Lora-Millan, J.S.; Delgado-Oleas, G.; Benito-León, J.; Rocon, E. A review on wearable technologies for tremor suppression. Front. Neurol. 2021, 12, 700600. [Google Scholar] [CrossRef]
  62. de Oliveira, L.C.; Mendes, L.C.; de Lopes, R.A.; Carneiro, J.A.; Cardoso, A.; Júnior, E.A.; de Oliveira Andrade, A. A systematic review of serious games used for rehabilitation of individuals with Parkinson’s disease. Res. Biomed. Eng. 2021, 37, 849–865. [Google Scholar] [CrossRef]
  63. Keogh, A.; Argent, R.; Anderson, A.; Caulfield, B.; Johnston, W. Assessing the usability of wearable devices to measure gait and physical activity in chronic conditions: A systematic review. J. Neuroeng. Rehabil. 2021, 18, 138. [Google Scholar] [CrossRef] [PubMed]
  64. Sica, M.; Tedesco, S.; Crowe, C.; Kenny, L.; Moore, K.; Timmons, S.; Barton, J.; O’Flynn, B.; Komaris, D.S. Continuous home monitoring of Parkinson’s disease using inertial sensors: A systematic review. PLoS ONE 2021, 16, e0246528. [Google Scholar] [CrossRef] [PubMed]
  65. Elena, P.; Demetris, S.; Christina, M.; Marios, P. Differences between exergaming rehabilitation and conventional physiotherapy on quality of life in Parkinson’s disease: A systematic review and meta-analysis. Front. Neurol. 2021, 12, 683385. [Google Scholar] [CrossRef]
  66. van Wamelen, D.J.; Sringean, J.; Trivedi, D.; Carroll, C.B.; Schrag, A.E.; Odin, P.; Antonini, A.; Bloem, B.R.; Bhidayasiri, R.; Chaudhuri, K.R.; et al. Digital health technology for non-motor symptoms in people with Parkinson’s disease: Futile or future? Park. Relat. Disord. 2021, 89, 186–194. [Google Scholar] [CrossRef] [PubMed]
  67. Vellata, C.; Belli, S.; Balsamo, F.; Giordano, A.; Colombo, R.; Maggioni, G. Effectiveness of telerehabilitation on motor impairments, non-motor symptoms and compliance in patients with Parkinson’s disease: A systematic review. Front. Neurol. 2021, 12, 627999. [Google Scholar] [CrossRef]
  68. Abou, L.; Peters, J.; Wong, E.; Akers, R.; Dossou, M.S.; Sosnoff, J.J.; Rice, L.A. Gait and balance assessments using smartphone applications in Parkinson’s disease: A systematic review. J. Med. Syst. 2021, 45, 87. [Google Scholar] [CrossRef]
  69. McDermott, A.; Haberlin, C.; Moran, J. The use of ehealth to promote physical activity in people living with Parkinson’s disease: A systematic review. Physiother. Pract. Res. 2021, 42, 79–92. [Google Scholar] [CrossRef]
  70. Gonçalves, H.R.; Rodrigues, A.M.; Santos, C.P. Vibrotactile biofeedback devices in Parkinson’s disease: A narrative review. Med. Biol. Eng. Comput. 2021, 59, 1185–1199. [Google Scholar] [CrossRef]
  71. Bowman, T.; Gervasoni, E.; Arienti, C.; Lazzarini, S.G.; Negrini, S.; Crea, S.; Cattaneo, D.; Carrozza, M.C. Wearable devices for biofeedback rehabilitation: A systematic review and meta-analysis to design application rules and estimate the effectiveness on balance and gait outcomes in neurological diseases. Sensors 2021, 21, 3444. [Google Scholar] [CrossRef]
  72. Ó Breasail, M.; Biswas, B.; Smith, M.D.; Mazhar, M.K.A.; Tenison, E.; Cullen, A.; Lithander, F.; Roudaut, A.; Henderson, E.J. Wearable GPS and accelerometer technologies for monitoring mobility and physical activity in neurodegenerative disorders: A systematic review. Sensors 2021, 21, 8261. [Google Scholar] [CrossRef]
  73. Albán-Cadena, A.C.; Villalba-Meneses, F.; Pila-Varela, K.O.; Moreno-Calvo, A.; Villalba-Meneses, C.P.; Almeida-Galárraga, D.A. Wearable sensors in the diagnosis and study of Parkinson’s disease symptoms: A systematic review. J. Med. Eng. Technol. 2021, 45, 532–545. [Google Scholar] [CrossRef] [PubMed]
  74. Barrachina-Fernández, M.; Maitín, A.M.; Sánchez-Ávila, C.; Romero, J.P. Wearable technology to detect motor fluctuations in Parkinson’s disease patients: Current state and challenges. Sensors 2021, 21, 4188. [Google Scholar] [CrossRef] [PubMed]
  75. Prieto-Avalos, G.; Sánchez-Morales, L.N.; Alor-Hernández, G.; Sánchez-Cervantes, J.L. A review of commercial and non-commercial wearables devices for monitoring motor impairments caused by neurodegenerative diseases. Biosensors 2022, 13, 72. [Google Scholar] [CrossRef]
  76. Tripathi, S.; Malhotra, A.; Qazi, M.; Chou, J.; Wang, F.; Barkan, S.; Hellmers, N.; Henchcliffe, C.; Sarva, H. Clinical review of smartphone applications in Parkinson’s disease. Neurologist 2022, 27, 183–193. [Google Scholar] [CrossRef]
  77. Gavelin, H.M.; Domellöf, M.E.; Leung, I.; Neely, A.S.; Launder, N.H.; Nategh, L.; Finke, C.; Lampit, A. Computerized cognitive training in Parkinson’s disease: A systematic review and meta-analysis. Ageing Res. Rev. 2022, 80, 101671. [Google Scholar] [CrossRef]
  78. Alfalahi, H.; Khandoker, A.H.; Chowdhury, N.; Iakovakis, D.; Dias, S.B.; Chaudhuri, K.R.; Hadjileontiadis, L.J. Diagnostic accuracy of keystroke dynamics as digital biomarkers for fine motor decline in neuropsychiatric disorders: A systematic review and meta-analysis. Sci. Rep. 2022, 12, 7690. [Google Scholar] [CrossRef]
  79. Guo, C.C.; Chiesa, P.A.; de Moor, C.; Fazeli, M.S.; Schofield, T.; Hofer, K.; Belachew, S.; Scotland, A. Digital devices for assessing motor functions in mobility-impaired and healthy populations: Systematic literature review. J. Med. Internet Res. 2022, 24, e37683. [Google Scholar] [CrossRef]
  80. Li, X.; Chen, Z.; Yue, Y.; Zhou, X.; Gu, S.; Tao, J.; Guo, H.; Zhu, M.; Du, Q. Effect of wearable sensor-based exercise on musculoskeletal disorders in individuals with neurodegenerative diseases: A systematic review and meta-analysis. Front. Aging Neurosci. 2022, 14, 934844. [Google Scholar] [CrossRef]
  81. Giannakopoulou, K.M.; Roussaki, I.; Demestichas, K. Internet of things technologies and machine learning methods for Parkinson’s disease diagnosis, monitoring and management: A systematic review. Sensors 2022, 22, 1799. [Google Scholar] [CrossRef]
  82. Gerritzen, E.V.; Lee, A.R.; McDermott, O.; Coulson, N.; Orrell, M. Online peer support for people with Parkinson disease: Narrative synthesis systematic review. JMIR Aging 2022, 5, e35425. [Google Scholar] [CrossRef]
  83. Mughal, H.; Javed, A.R.; Rizwan, M.; Almadhor, A.S.; Kryvinska, N. Parkinson’s disease management via wearable sensors: A systematic review. IEEE Access 2022, 10, 35219–35237. [Google Scholar] [CrossRef]
  84. Gopal, A.; Hsu, W.Y.; Allen, D.D.; Bove, R. Remote assessments of hand function in neurological disorders: Systematic review. JMIR Rehabil. Assist. Technol. 2022, 9, e33157. [Google Scholar] [CrossRef]
  85. Sakamaki, T.; Furusawa, Y.; Hayashi, A.; Otsuka, M.; Fernandez, J. Remote patient monitoring for neuropsychiatric disorders: A scoping review of current trends and future perspectives from recent publications and upcoming clinical trials. Telemed. E-Health 2022, 28, 1235–1250. [Google Scholar] [CrossRef]
  86. Lim, A.C.Y.; Natarajan, P.; Fonseka, R.D.; Maharaj, M.; Mobbs, R.J. The application of artificial intelligence and custom algorithms with inertial wearable devices for gait analysis and detection of gait-altering pathologies in adults: A scoping review of literature. Digit. Health 2022, 8, 20552076221074128. [Google Scholar] [CrossRef]
  87. Lee, J.; Yeom, I.; Chung, M.L.; Kim, Y.; Yoo, S.; Kim, E. Use of mobile apps for self-care in people with Parkinson disease: Systematic review. JMIR Mhealth Uhealth 2022, 10, e33944. [Google Scholar] [CrossRef] [PubMed]
  88. Ancona, S.; Faraci, F.D.; Khatab, E.; Fiorillo, L.; Gnarra, O.; Nef, T.; Bassetti, C.L.A.; Bargiotas, P. Wearables in the home-based assessment of abnormal movements in Parkinson’s disease: A systematic review of the literature. J. Neurol. 2022, 269, 100–110. [Google Scholar] [CrossRef] [PubMed]
  89. Vanmechelen, I.; Haberfehlner, H.; De Vleeschhauwer, J.; Van Wonterghem, E.; Feys, H.; Desloovere, K.; Aerts, J.M.; Monbaliu, E. Assessment of movement disorders using wearable sensors during upper limb tasks: A scoping review. Front. Robot. AI 2023, 9, 1068413. [Google Scholar] [CrossRef] [PubMed]
  90. Sigcha, L.; Borzì, L.; Amato, F.; Rechichi, I.; Ramos-Romero, C.; Cárdenas, A.; Gascó, L.; Olmo, G. Deep learning and wearable sensors for the diagnosis and monitoring of Parkinson’s disease: A systematic review. Expert Syst. Appl. 2023, 229, 120541. [Google Scholar] [CrossRef]
  91. Zhang, W.; Sun, H.; Huang, D.; Zhang, Z.; Li, J.; Wu, C.; Sun, Y.; Gong, M.; Wang, Z.; Sun, C.; et al. Detection and prediction of freezing of gait with wearable sensors in Parkinson’s disease. Neurol. Sci. 2023, 45, 431–453. [Google Scholar] [CrossRef]
  92. Fujikawa, J.; Morigaki, R.; Yamamoto, N.; Nakanishi, H.; Oda, T.; Izumi, Y.; Takagi, Y. Diagnosis and treatment of tremor in Parkinson’s disease using mechanical devices. Life 2023, 13, 78. [Google Scholar] [CrossRef] [PubMed]
  93. Scataglini, S.; Van Dyck, Z.; Declercq, V.; Van Cleemput, G.; Struyf, N.; Truijen, S. Effect of music-based therapy rhythmic auditory stimulation (RAS) using wearable device in rehabilitation of neurological patients: A systematic review. Sensors 2023, 23, 5933. [Google Scholar] [CrossRef] [PubMed]
  94. Moreta-de-Esteban, P.; Martin-Casas, P.; Ortiz-Gutierrez, R.M.; Straudi, S.; Cano-de-la-Cuerda, R. Mobile applications for resting tremor assessment in Parkinson’s disease: A systematic review. J. Clin. Med. 2023, 12, 2334. [Google Scholar] [CrossRef] [PubMed]
  95. Huang, T.; Li, M.; Huang, J. Recent trends in wearable device used to detect freezing of gait and falls in people with Parkinson’s disease: A systematic review. Front. Aging Neurosci. 2023, 15, 1119956. [Google Scholar] [CrossRef]
  96. Laar, A.; de Lima, A.L.S.; Maas, B.R.; Bloem, B.R.; de Vries, N.M. Successful implementation of technology in the management of Parkinson’s disease: Barriers and facilitators. Clin. Park. Relat. Disord. 2023, 8, 100188. [Google Scholar] [CrossRef]
  97. Bansal, S.K.; Basumatary, B.; Bansal, R.; Sahani, A.K. Techniques for the detection and management of freezing of gait in Parkinson’s disease–A systematic review and future perspectives. MethodsX 2023, 10, 102106. [Google Scholar] [CrossRef]
  98. Tuena, C.; Borghesi, F.; Bruni, F.; Cavedoni, S.; Maestri, S.; Riva, G.; Tettamanti, M.; Liperoti, R.; Rossi, L.; Ferrarin, M.; et al. Technology-assisted cognitive motor dual-task rehabilitation in chronic age-related conditions: Systematic review. J. Med. Internet Res. 2023, 25, e44484. [Google Scholar] [CrossRef]
  99. Özden, F. The effect of mobile application-based rehabilitation in patients with Parkinson’s disease: A systematic review and meta-analysis. Clin. Neurol. Neurosurg. 2023, 225, 107579. [Google Scholar] [CrossRef]
  100. Li, P.; van Wezel, R.; He, F.; Zhao, Y.; Wang, Y. The role of wrist-worn technology in the management of Parkinson’s disease in daily life: A narrative review. Front. Neuroinform. 2023, 17, 1135300. [Google Scholar] [CrossRef]
  101. Taniguchi, S.; Yamamoto, A.; D’cruz, N. Assessing impaired bed mobility in patients with Parkinson’s disease: A scoping review. Physiotherapy 2024, 124, 29–39. [Google Scholar] [CrossRef]
  102. Sapienza, S.; Tsurkalenko, O.; Giraitis, M.; Mejia, A.C.; Zelimkhanov, G.; Schwaninger, I.; Klucken, J. Assessing the clinical utility of inertial sensors for home monitoring in Parkinson’s disease: A comprehensive review. NPJ Park. Dis. 2024, 10, 161. [Google Scholar] [CrossRef]
  103. Cox, E.; Wade, R.; Hodgson, R.; Fulbright, H.; Phung, T.H.; Meader, N.; Walker, S.; Rothery, C.; Simmonds, M. Devices for remote continuous monitoring of people with Parkinson’s disease: A systematic review and cost-effectiveness analysis. Health Technol. Assess. 2024, 28, 1. [Google Scholar] [CrossRef]
  104. Janssen Daalen, J.M.; van den Bergh, R.; Prins, E.M.; Moghadam, M.S.C.; van den Heuvel, R.; Veen, J.; Mathur, S.; Meijerink, H.; Mirelman, A.; Darweesh, S.K.L.; et al. Digital biomarkers for non-motor symptoms in Parkinson’s disease: The state of the art. NPJ Digit. Med. 2024, 7, 186. [Google Scholar] [CrossRef] [PubMed]
  105. Polvorinos-Fernández, C.; Sigcha, L.; Borzì, L.; Olmo, G.; Asensio, C.; López, J.M.; Arcas, G.; Pavón, I. Evaluating Motor Symptoms in Parkinson’s Disease Through Wearable Sensors: A Systematic Review of Digital Biomarkers. Appl. Sci. 2024, 14, 10189. [Google Scholar] [CrossRef]
  106. Lai, P.; Zhang, J.; Lai, Q.; Li, J.; Liang, Z. Impact of Wearable Device-Based Walking Programs on Gait Speed in Older Adults: A Systematic Review and Meta-Analysis. Geriatr. Orthop. Surg. Rehabil. 2024, 15, 21514593241284473. [Google Scholar] [CrossRef] [PubMed]
  107. Elbatanouny, H.; Kleanthous, N.; Dahrouj, H.; Alusi, S.; Almajali, E.; Mahmoud, S.; Hussain, A. Insights into Parkinson’s Disease-Related Freezing of Gait Detection and Prediction Approaches: A Meta Analysis. Sensors 2024, 24, 3959. [Google Scholar] [CrossRef]
  108. Peng, Y.; Ma, C.; Li, M.; Liu, Y.; Yu, J.; Pan, L.; Zhang, Z. Intelligent devices for assessing essential tremor: A comprehensive review. J. Neurol. 2024, 271, 4733–4750. [Google Scholar] [CrossRef]
  109. di Biase, L.; Pecoraro, P.M.; Pecoraro, G.; Shah, S.A.; Di Lazzaro, V. Machine learning and wearable sensors for automated Parkinson’s disease diagnosis aid: A systematic review. J. Neurol. 2024, 271, 6452–6470. [Google Scholar] [CrossRef]
  110. Jeyasingh-Jacob, J.; Crook-Rumsey, M.; Shah, H.; Joseph, T.; Abulikemu, S.; Daniels, S.; Sharp, D.; Haar, S. Markerless motion capture to quantify functional performance in neurodegeneration: Systematic review. JMIR Aging 2024, 7, e52582. [Google Scholar] [CrossRef]
  111. Boege, S.; Milne-Ives, M.; Ananthakrishnan, A.; Carroll, C.; Meinert, E. Self-Management Systems for Patients and Clinicians in Parkinson’s Disease Care: A Scoping Review. J. Park. Dis. 2024, 14, 1387–1404. [Google Scholar] [CrossRef]
  112. Willemse, I.H.; Schootemeijer, S.; van den Bergh, R.; Dawes, H.; Nonnekes, J.H.; van de Warrenburg, B.P. Smartphone applications for Movement Disorders: Towards collaboration and re-use. Park. Relat. Disord. 2024, 120, 105988. [Google Scholar] [CrossRef]
  113. Fu, Y.; Zhang, Y.; Ye, B.; Babineau, J.; Zhao, Y.; Gao, Z.; Mihailidis, A. Smartphone-Based Hand Function Assessment: Systematic Review. J. Med. Internet Res. 2024, 26, e51564. [Google Scholar] [CrossRef] [PubMed]
  114. Silva-Batista, C.; de Almeida, F.O.; Wilhelm, J.L.; Horak, F.B.; Mancini, M.; King, L.A. Telerehabilitation by Videoconferencing for Balance and Gait in People with Parkinson’s Disease: A Scoping Review. Geriatrics 2024, 9, 66. [Google Scholar] [CrossRef] [PubMed]
  115. Zhang, T.; Meng, D.T.; Lyu, D.Y.; Fang, B.Y. The Efficacy of Wearable Cueing Devices on Gait and Motor Function in Parkinson Disease: A Systematic Review and Meta-analysis of Randomized Controlled Trials. Arch. Phys. Med. Rehabil. 2024, 105, 369–380. [Google Scholar] [CrossRef] [PubMed]
  116. Franco, A.; Russo, M.; Amboni, M.; Ponsiglione, A.M.; Di Filippo, F.; Romano, M.; Amato, F.; Ricciardi, C. The Role of Deep Learning and Gait Analysis in Parkinson’s Disease: A Systematic Review. Sensors 2024, 24, 5957. [Google Scholar] [CrossRef]
  117. Tao, D.; Awan-Scully, R.; Ash, G.I.; Cole, A.; Zhong, P.; Gao, Y.; Sun, Y.; Shao, S.; Wiltshire, H.; Baker, J.S. The Role of Technology-based Dance Intervention for Enhancing Wellness-A Systematic Scoping Review and Meta-synthesis. Ageing Res. Rev. 2024, 100, 102462. [Google Scholar] [CrossRef]
  118. Craig, S.N.; Dempster, M.; Curran, D.; Cuddihy, A.M.; Lyttle, N. A systematic review of the effectiveness of digital cognitive assessments of cognitive impairment in Parkinson’s disease. Appl. Neuropsychol. Adult 2025, 1–13. [Google Scholar] [CrossRef]
  119. Hall, A.M.; Allgar, V.; Carroll, C.B.; Meinert, E. Digital health technologies and self-efficacy in Parkinson’s: A scoping review. BMJ Open 2025, 15, e088616. [Google Scholar] [CrossRef]
  120. Rábano-Suárez, P.; Del Campo, N.; Benatru, I.; Moreau, C.; Desjardins, C.; Sánchez-Ferro, Á.; Fabbri, M. Digital Outcomes as Biomarkers of Disease Progression in Early Parkinson’s Disease: A Systematic Review. Mov. Disord. 2025, 40, 184–203. [Google Scholar] [CrossRef]
  121. Gattoni, M.F.; Gobbo, S.; Feroldi, S.; Salvatore, A.; Navarro, J.; Sorbi, S.; Saibene, F.L. Identification of Cognitive Training for Individuals with Parkinson’s Disease: A Systematic Review. Brain Sci. 2025, 15, 61. [Google Scholar] [CrossRef]
  122. Tripoliti, E.E.; Tzallas, A.T.; Tsipouras, M.G.; Rigas, G.; Bougia, P.; Leontiou, M.; Konitsiotis, S.; Chondrogiorgi, M.; Tsouli, S.; Fotiadis, D.I. Automatic detection of freezing of gait events in patients with Parkinson’s disease. Comput. Methods Programs Biomed. 2013, 110, 12–26. [Google Scholar] [CrossRef]
  123. Tzallas, A.T.; Tsipouras, M.G.; Rigas, G.; Tsalikakis, D.G.; Karvounis, E.C.; Chondrogiorgi, M.; Psomadellis, F.; Cancela, J.; Pastorino, M.; Arredondo, M.T.; et al. PERFORM: A system for monitoring, assessment and management of patients with Parkinson’s disease. Sensors 2014, 14, 21329–21357. [Google Scholar] [CrossRef]
  124. Arora, S.; Venkataraman, V.; Zhan, A.; Donohue, S.; Biglan, K.M.; Dorsey, E.R.; Little, M.A. Detecting and monitoring the symptoms of Parkinson’s disease using smartphones: A pilot study. Park. Relat. Disord. 2015, 21, 650–653. [Google Scholar] [CrossRef] [PubMed]
  125. Ferreira, J.J.; Godinho, C.; Santos, A.T.; Domingos, J.; Abreu, D.; Lobo, R.; Gonçalves, N.; Barra, M.; Larsen, F.; Fagerbakke, Ø.; et al. Quantitative home-based assessment of Parkinson’s symptoms: The SENSE-PARK feasibility and usability study. BMC Neurol. 2015, 15, 89. [Google Scholar] [CrossRef] [PubMed]
  126. Kim, H.; Lee, H.J.; Lee, W.; Kwon, S.; Kim, S.K.; Jeon, H.S.; Park, H.; Shin, C.W.; Yi, W.J.; Jeon, B.S.; et al. Unconstrained detection of freezing of Gait in Parkinson’s disease patients using smartphone. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milano, Italy, 25–29 August 2015. [Google Scholar] [CrossRef]
  127. Pan, D.; Dhall, R.; Lieberman, A.; Petitti, D.B. A mobile cloud-based Parkinson’s disease assessment system for home-based monitoring. JMIR Mhealth Uhealth 2015, 3, e3956. [Google Scholar] [CrossRef]
  128. Rezvanian, S.; Lockhart, T.E. Towards real-time detection of freezing of gait using wavelet transform on wireless accelerometer data. Sensors 2016, 16, 475. [Google Scholar] [CrossRef]
  129. Capecci, M.; Pepa, L.; Verdini, F.; Ceravolo, M.G. A smartphone-based architecture to detect and quantify freezing of gait in Parkinson’s disease. Gait Posture 2016, 50, 28–33. [Google Scholar] [CrossRef]
  130. Ginis, P.; Nieuwboer, A.; Dorfman, M.; Ferrari, A.; Gazit, E.; Canning, C.G.; Rocchi, L.; Chiari, L.; Hausdorff, J.M.; Mirelman, A. Feasibility and effects of home-based smartphone-delivered automated feedback training for gait in people with Parkinson’s disease: A pilot randomized controlled trial. Park. Relat. Disord. 2016, 22, 28–34. [Google Scholar] [CrossRef]
  131. Kassavetis, P.; Saifee, T.A.; Roussos, G.; Drougkas, L.; Kojovic, M.; Rothwell, J.C.; Edwards, M.; Bhatia, K.P. Developing a tool for remote digital assessment of Parkinson’s disease. Mov. Disord. Clin. Pract. 2016, 3, 59–64. [Google Scholar] [CrossRef]
  132. Lee, C.Y.; Kang, S.J.; Kim, Y.E.; Lee, U.; Ma, H.I.; Kim, Y.J. A validation study of a smartphone-based finger tapping application for quantitative assessment of bradykinesia in Parkinson’s disease: 553. Mov. Disord. 2016, 31, S180. [Google Scholar] [CrossRef]
  133. Camps, J.; Samà, A.; Martín, M.; Rodríguez-Martín, D.; Pérez-López, C.; Alcaine, S.; Mestre, B.; Prats, A.; Crespo, M.C.; Cabestany, J.; et al. Deep learning for detecting freezing of gait episodes in Parkinson’s disease based on accelerometers. In Proceedings of the 14th International Work-Conference on Artificial Neural Networks, IWANN 2017, Cadiz, Spain, 14–16 June 2017. [Google Scholar] [CrossRef]
  134. Rodríguez-Martín, D.; Samà, A.; Pérez-López, C.; Català, A.; Moreno Arostegui, J.M.; Cabestany, J.; Bayés, À.; Alcaine, S.; Mestre, B.; Prats, A.; et al. Home detection of freezing of gait using support vector machines through a single waist-worn triaxial accelerometer. PLoS ONE 2017, 12, e0171764. [Google Scholar] [CrossRef]
  135. Lipsmeier, F.; Taylor, K.I.; Kilchenmann, T.; Wolf, D.; Scotland, A.; Schjodt-Eriksen, J.; Cheng, W.Y.; Fernandez-Garcia, I.; Siebourg-Polster, J.; Jin, L.; et al. Evaluation of smartphone-based testing to generate exploratory outcome measures in a phase 1 Parkinson’s disease clinical trial. Mov. Disord. 2018, 33, 1287–1297. [Google Scholar] [CrossRef] [PubMed]
  136. Queirós, A.; Dias, A.; Silva, A.G.; Rocha, N.P. Ambient assisted living and health-related outcomes—A systematic literature review. Informatics 2017, 4, 19. [Google Scholar] [CrossRef]
  137. Whiting, P.F.; Rutjes, A.W.; Westwood, M.E.; Mallett, S.; Deeks, J.J.; Reitsma, J.B.; Leeflang, M.M.; Sterne, J.A.; Bossuyt, P.M.; QUADAS-2 Group. QUADAS-2: A revised tool for the quality assessment of diagnostic accuracy studies. Ann. Intern. Med. 2011, 155, 529–536. [Google Scholar] [CrossRef]
  138. Boyle, L.D.; Giriteka, L.; Marty, B.; Sandgathe, L.; Haugarvoll, K.; Steihaug, O.M.; Husebo, B.; Patrascu, M. Activity and Behavioral Recognition Using Sensing Technology in Persons with Parkinson’s Disease or Dementia: An Umbrella Review of the Literature. Sensors 2025, 25, 668. [Google Scholar] [CrossRef]
  139. Wartenberg, C.; Elden, H.; Frerichs, M.; Jivegård, L.L.; Magnusson, K.; Mourtzinis, G.; Nyström, O.; Quitz, K.; Sjöland, H.; Svanberg, T.; et al. Clinical benefits and risks of remote patient monitoring: An overview and assessment of methodological rigour of systematic reviews for selected patient groups. BMC Health Serv. Res. 2025, 25, 133. [Google Scholar] [CrossRef]
  140. Ahmed, S.K. How to choose a sampling technique and determine sample size for research: A simplified guide for researchers. Oral Oncol. Rep. 2024, 12, 100662. [Google Scholar] [CrossRef]
  141. Hopewell, S.; Chan, A.W.; Collins, G.S.; Hróbjartsson, A.; Moher, D.; Schulz, K.F.; Tunn, R.; Aggarwal, R.; Berkwits, M.; Boutron, I. CONSORT 2025 statement: Updated guideline for reporting randomised trials. Lancet 2025, 405, 1633–1640. [Google Scholar] [CrossRef]
  142. Martins, A.I.; Queirós, A.; Silva, A.G.; Rocha, N.P. Usability evaluation methods: A systematic review. In Human Factors in Software Development and Design; Daeed, I., Bajwa, I.S., Mahmood, Z., Eds.; IGI Global: Hershey, PA, USA, 2014; pp. 250–273. [Google Scholar] [CrossRef]
  143. Silva, A.G.; Caravau, H.; Martins, A.; Almeida, A.M.P.; Silva, T.; Ribeiro, Ó.; Santinha, G.; Rocha, N.P. Procedures of user-centered usability assessment for digital solutions: Scoping review of reviews reporting on digital solutions relevant for older adults. JMIR Hum. Factors 2021, 8, e22774. [Google Scholar] [CrossRef]
  144. O’Herrin, J.K.; Fost, N.; Kudsk, K.A. Health Insurance Portability Accountability Act (HIPAA) regulations: Effect on medical record research. Ann. Surg. 2004, 239, 772–778. [Google Scholar] [CrossRef]
  145. Shuren, J.; Patel, B.; Gottlieb, S. FDA regulation of mobile medical apps. JAMA 2018, 320, 337–338. [Google Scholar] [CrossRef]
  146. Keutzer, L.; Simonsson, U.S. Medical device apps: An introduction to regulatory affairs for developers. JMIR Mhealth Uhealth 2020, 8, e17567. [Google Scholar] [CrossRef]
  147. van Vroonhoven, J. Risk Management for Medical Devices and the New BS EN ISO 14971; BSI Standards Ltd.: London, UK, 2020. [Google Scholar]
Figure 1. PRISMA flowchart.
Figure 2. Number of studies published by year.
Figure 3. The quality of the reviews included according to the appraisal items of AMSTAR 2 [16]: Q1—research questions and inclusion criteria; Q2—review methods established a priori; Q3—explanation of the selection of the studies’ designs; Q4—comprehensive literature search strategy; Q5—study selection performed in duplicate; Q6—data selection performed in duplicate; Q7—list of excluded studies and exclusions justified; Q8—description of the included studies in adequate detail; Q9—satisfactory technique for assessing the risk of bias; Q10—report on the sources of funding for the studies included in the review; Q11—use of appropriate methods for statistical combination of results when a meta-analysis was performed; Q12—assessment of the potential impact of risk of bias when a meta-analysis was performed; Q13—risk of bias accounted for when interpreting or discussing the results of the review; Q14—a satisfactory explanation for any heterogeneity observed in the results of the review; Q15—quantitative synthesis performed and publication bias discussed when a meta-analysis was conducted; Q16—report of potential sources of conflict of interest, including funding sources.
Figure 4. Measurement and interaction technologies reported by the reviews included: Steins et al., 2014 [37]; Hubble et al., 2015 [38]; Silva de Lima et al., 2017 [39]; Rovini et al., 2017 [40]; Hasan et al., 2017 [41]; Son et al., 2018 [42]; Thorp et al., 2018 [43]; Merola et al., 2018 [44]; Johansson et al., 2018 [45]; Sweeney et al., 2019 [46]; Brognara et al., 2019 [47]; Rovini et al., 2019 [48]; Linares-Del Rey et al., 2019 [49]; Godoi et al., 2019 [50]; Vaartio-Rajalin et al., 2019 [51]; Garcia-Agundez et al., 2019 [52]; Theodoros et al., 2019 [53]; Teshuva et al., 2019 [54]; Pardoel et al., 2019 [55]; Orgeta et al., 2020 [56]; Ponciano et al., 2020 [57]; de Oliveira Gondim et al., 2020 [58]; Morgan et al., 2020 [59]; Corrà et al., 2020 [60]; Lora-Millan et al., 2021 [61]; de Oliveira et al., 2021 [62]; Keogh et al., 2021 [63]; Sica et al., 2021 [64]; Elena et al., 2021 [65]; van Wamelen et al., 2021 [66]; Vellata et al., 2021 [67]; Abou et al., 2021 [68]; McDermott et al., 2021 [69]; Gonçalves et al., 2021 [70]; Bowman et al., 2021 [71]; Ó Breasail et al., 2021 [72]; Albán-Cadena et al., 2021 [73]; Barrachina-Fernández et al., 2021 [74]; Prieto-Avalos et al., 2022 [75]; Tripathi et al., 2022 [76]; Gavelin et al., 2022 [77]; Alfalahi et al., 2022 [78]; Guo et al., 2022 [79]; Yue et al., 2022 [80]; Giannakopoulou et al., 2022 [81]; Gerritzen et al., 2022 [82]; Mughal et al., 2022 [83]; Gopal et al., 2022 [84]; Sakamaki et al., 2022 [85]; Lim et al., 2022 [86]; Chung et al., 2022 [87]; Ancona et al., 2022 [88]; Vanmechelen et al., 2023 [89]; Sigcha et al., 2023 [90]; Zhang et al., 2023 [91]; Fujikawa et al., 2023 [92]; Scataglini et al., 2023 [93]; Moreta-de-Esteban et al., 2023 [94]; Huang et al., 2023 [95]; Laar et al., 2023 [96]; Bansal et al., 2023 [97]; Tuena et al., 2023 [98]; Özden et al., 2023 [99]; Li et al., 2023 [100]; Taniguchi et al., 2024 [101]; Sapienza et al., 2024 [102]; Cox et al., 2024 [103]; Janssen Daalen et al., 2024 [104]; Polvorinos-Fernández et al., 2024 [105]; Lai et al., 2024 [106]; Elbatanouny et al., 2024 [107]; Peng et al., 2024 [108]; di Biase et al., 2024 [109]; Jeyasingh-Jacob et al., 2024 [110]; Boege et al., 2024 [111]; Willemse et al., 2024 [112]; Fu et al., 2024 [113]; Silva-Batista et al., 2024 [114]; Zhang et al., 2024 [115]; Franco et al., 2024 [116]; Tao et al., 2024 [117]; Craig et al., 2025 [118]; Hall et al., 2025 [119]; Rábano-Suárez et al., 2025 [120]; and Gattoni et al., 2025 [121].
Figure 5. Application areas. Steins et al., 2014 [37]; Hubble et al., 2015 [38]; Silva de Lima et al., 2017 [39]; Rovini et al., 2017 [40]; Hasan et al., 2017 [41]; Son et al., 2018 [42]; Thorp et al., 2018 [43]; Merola et al., 2018 [44]; Johansson et al., 2018 [45]; Sweeney et al., 2019 [46]; Brognara et al., 2019 [47]; Rovini et al., 2019 [48]; Linares-Del Rey et al., 2019 [49]; Godoi et al., 2019 [50]; Vaartio-Rajalin et al., 2019 [51]; Garcia-Agundez et al., 2019 [52]; Theodoros et al., 2019 [53]; Teshuva et al., 2019 [54]; Pardoel et al., 2019 [55]; Orgeta et al., 2020 [56]; Ponciano et al., 2020 [57]; de Oliveira Gondim et al., 2020 [58]; Morgan et al., 2020 [59]; Corrà et al., 2020 [60]; Lora-Millan et al., 2021 [61]; de Oliveira et al., 2021 [62]; Keogh et al., 2021 [63]; Sica et al., 2021 [64]; Elena et al., 2021 [65]; van Wamelen et al., 2021 [66]; Vellata et al., 2021 [67]; Abou et al., 2021 [68]; McDermott et al., 2021 [69]; Gonçalves et al., 2021 [70]; Bowman et al., 2021 [71]; Ó Breasail et al., 2021 [72]; Albán-Cadena et al., 2021 [73]; Barrachina-Fernández et al., 2021 [74]; Prieto-Avalos et al., 2022 [75]; Tripathi et al., 2022 [76]; Gavelin et al., 2022 [77]; Alfalahi et al., 2022 [78]; Guo et al., 2022 [79]; Yue et al., 2022 [80]; Giannakopoulou et al., 2022 [81]; Gerritzen et al., 2022 [82]; Mughal et al., 2022 [83]; Gopal et al., 2022 [84]; Sakamaki et al., 2022 [85]; Lim et al., 2022 [86]; Chung et al., 2022 [87]; Ancona et al., 2022 [88]; Vanmechelen et al., 2023 [89]; Sigcha et al., 2023 [90]; Zhang et al., 2023 [91]; Fujikawa et al., 2023 [92]; Scataglini et al., 2023 [93]; Moreta-de-Esteban et al., 2023 [94]; Huang et al., 2023 [95]; Laar et al., 2023 [96]; Bansal et al., 2023 [97]; Tuena et al., 2023 [98]; Özden et al., 2023 [99]; Li et al., 2023 [100]; Taniguchi et al., 2024 [101]; Sapienza et al., 2024 [102]; Cox et al., 2024 [103]; Janssen Daalen et al., 2024 [104]; Polvorinos-Fernández et al., 2024 [105]; Lai et al., 2024 [106]; Elbatanouny et al., 2024 [107]; Peng et al., 2024 [108]; di Biase et al., 2024 [109]; Jeyasingh-Jacob et al., 2024 [110]; Boege et al., 2024 [111]; Willemse et al., 2024 [112]; Fu et al., 2024 [113]; Silva-Batista et al., 2024 [114]; Zhang et al., 2024 [115]; Franco et al., 2024 [116]; Tao et al., 2024 [117]; Craig et al., 2025 [118]; Hall et al., 2025 [119]; Rábano-Suárez et al., 2025 [120]; and Gattoni et al., 2025 [121].
Figure 6. Distribution of the types of measurement devices related to motor clinical variables reported in the studies.
Figure 7. Distribution of the number of studies according to the body location of the wearable devices being reported.
Figure 8. Frequency of the motor clinical variables addressed in the reviews.
Figure 9. Challenges and open issues.
Table 1. Instantiation of the general combination of keywords to Scopus, Web of Science, and PubMed.
Database | Query
Scopus | TITLE-ABS ((Parkinson OR neurodegenerative OR neurological) AND (smart OR remote OR computerized OR ehealth OR digital OR mobile OR sensor OR biosensor OR wearable OR “artificial intelligence” OR “machine learning” OR “deep learning” OR ((technology OR technologies) AND (information OR communication))) AND (review OR secondary OR “systematic mapping”))
Web of Science | TS = (Parkinson OR neurodegenerative OR neurological) AND (smart OR remote OR computerized OR ehealth OR digital OR mobile OR sensor OR biosensor OR wearable OR “artificial intelligence” OR “machine learning” OR “deep learning” OR ((technology OR technologies) AND (information OR communication))) AND TS = (review OR secondary OR “systematic mapping”)
PubMed | (Parkinson [Title/Abstract] OR neurodegenerative [Title/Abstract] OR neurological [Title/Abstract]) AND (smart [Title/Abstract] OR remote [Title/Abstract] OR computerized [Title/Abstract] OR ehealth [Title/Abstract] OR digital [Title/Abstract] OR mobile [Title/Abstract] OR sensor [Title/Abstract] OR biosensor [Title/Abstract] OR wearable [Title/Abstract] OR “artificial intelligence” [Title/Abstract] OR “machine learning” [Title/Abstract] OR “deep learning” [Title/Abstract] OR ((technology [Title/Abstract] OR technologies [Title/Abstract]) AND (information [Title/Abstract] OR communication [Title/Abstract]))) AND (review [Title/Abstract] OR secondary [Title/Abstract] OR “systematic mapping” [Title/Abstract])
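To make the instantiation concrete, the boolean string for a row of Table 1 can be assembled programmatically. The sketch below is illustrative only: it builds a simplified version of the PubMed query (omitting the nested technology/information group), and the helper `tag` is hypothetical, not part of the review's search protocol, which would have been run through each database's own interface.

```python
# Illustrative sketch: assembling a simplified PubMed boolean query from the
# keyword groups shown in Table 1. The helper `tag` is hypothetical, and the
# nested (technology AND information/communication) group is omitted for
# brevity; the review's actual searches were run in the database interfaces.

def tag(terms, field="Title/Abstract"):
    """OR-join a keyword group, tagging each term with a PubMed search field."""
    return "(" + " OR ".join(f"{t} [{field}]" for t in terms) + ")"

condition = tag(["Parkinson", "neurodegenerative", "neurological"])
technology = tag(["smart", "remote", "computerized", "ehealth", "digital",
                  "mobile", "sensor", "biosensor", "wearable",
                  '"artificial intelligence"', '"machine learning"',
                  '"deep learning"'])
review = tag(["review", "secondary", '"systematic mapping"'])

# Three top-level groups joined by AND, mirroring the Scopus/WoS structure.
query = f"{condition} AND {technology} AND {review}"
print(query)
```

Keeping the three groups (condition, technology, review type) as separate AND-ed blocks is what allows the same general keyword combination to be re-instantiated for each database's field syntax.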
Table 2. Eligibility criteria.
Inclusion criteria:
IC1: Reviews related to patient-oriented smart applications to support the diagnosis, rehabilitation, and care of PwP published in peer-reviewed journals.
IC2: Reviews with a well-documented literature search strategy, allowing for its replication.
Exclusion criteria:
EC1: References without authors or an abstract.
EC2: Reviews focusing on smart applications oriented towards formal and informal caregivers.
EC3: Reviews published in conference proceedings or as book chapters.
EC4: Reviews that, although including studies targeting PwP, targeted other health conditions in more than 70% of their primary studies.
EC5: Reviews that, although considering patient-oriented smart applications, focused on smart applications oriented towards formal and informal caregivers in more than 70% of their primary studies.
EC6: Reviews published in languages other than English.
EC7: Articles whose full text was not available.
Table 3. Motor clinical variables and measurement devices considered in the reviews included.
Motor Clinical Variables | Wearable Devices | Smartphones | Other Technologies
Freezing of gait | [39,40,43,44,45,48,55,58,59,64,73,79,81,83,85,90,91,95,96,97,103,107,116,120] | [44,59,68,73,79,81,112,116,120] | [97,110] (a)
Other gait disturbances | [37,38,40,41,42,44,45,47,50,57,58,59,63,64,73,74,75,79,81,83,85,86,88,90,96,100,102,105,109,110,116,120] | [41,42,44,49,57,59,68,73,79,81,112,116,120] | [110] (a)
Tremor | [40,41,42,43,44,45,73,75,79,81,83,85,89,90,92,100,103,105,108,109] | [41,42,44,49,73,76,79,81,92,94,108,112] |
Balance | [38,40,42,44,45,58,73,75,79,81,85,88,105,109,120] | [42,44,68,73,76,79,81,112,120] |
Bradykinesia | [41,42,43,44,49,54,64,73,75,79,83,88,89,90,100,103] | [41,42,44] |
Dyskinesia | [40,41,42,43,45,48,73,75,79,85,88,89,100,103,109] | [41,42,73,79] |
Functional activities | [37,40,42,59,60,72,73] | [42,73,112,120] |
Physical activity | [42,45,59,63,64,72,73,96,100] | [42,73,112] |
Falls | [39,45,48,59,64,73,75,85,95] | [42,68,73] |
Fine motor impairments | [73,79,81,84,105,109] | [73,79,81,112,113] | [78,84] (b,c)
Rigidity | [40,41,44,54,83,109] | [41,44] |
Swallowing disorders | [104,109] | [104] |
Hypokinesia | [43,73] | [73] |
Impaired bed mobility | [101] | | [101] (d)
(a) Motion capture systems; (b) computer-based objective assessments based on special-purpose keyboards; (c) online platforms; (d) ambient intelligence.
Table 4. Assessment results of the proposed smart applications based on the quantitative results presented by the reviews.
Motor Clinical Variables | Assessment Results
Freezing of gait | Johansson et al. [45] showed good agreement between wearable and video-based ratings regarding the number of freezing episodes and the percentage of time with freezing of gait, while other studies [39,68,79,90,103] concluded that wearables and smartphone devices might be used to detect freezing of gait. However, inconsistencies in the assessment processes still hinder the use of these applications in clinical practice [91].
Gait disturbances excluding freezing of gait | Some gait outcomes, such as jerk, harmonic stability or oscillation range, were able to differentiate PwP and healthy controls [38,40,90]. Additionally, good correlations were found between gait outcomes measured by wearable devices and clinical scales such as the UPDRS, MDS-UPDRS or Hoehn and Yahr [81]. However, significant differences in gait spatiotemporal parameters were found between the primary studies, with high variability in terms of parameters and placement of the wearable devices [47]. This led to contradictory findings, which seem to indicate that some parameters are more consistent than others [38].
Tremor | Some reviews [40,43,89,94] found good results for correlation with gold-standard clinical tremor measurements (e.g., UPDRS or MDS-UPDRS), while other studies [49,73,76,81,90,94,103,109] demonstrated good accuracy in identifying tremor presence and severity and in differentiating PwP with tremor from healthy individuals.
Balance | Considering diverse parameters (e.g., mean acceleration, jerk, or sway distance), Abou et al. [68] found significant correlations between smartphone balance assessments and balance clinical tests, Barrachina-Fernández et al. [73] demonstrated that wearable devices can detect fluctuations of the center of gravity, and Johansson et al. [38] and di Biase et al. [109] found smart applications with good accuracy when discriminating PwP from healthy controls. However, it seems that the measured parameters have different consistency levels [38].
Bradykinesia | Some applications can accurately measure bradykinesia [41,54,73,88,89,90,103] since their outcomes present good correlations with clinical assessment scales [41,54,73,88,89,103] or the severity of bradykinesia assessed by experienced clinicians [90]. Moreover, Son et al. [42] and Thorp et al. [43] reported smart applications with good accuracy in detecting bradykinesia versus no bradykinesia.
Dyskinesia | Some measurements provided by the proposed smart applications might support discrimination between dyskinetic and non-dyskinetic events [43,45,89,103] compared to clinical ratings [43,45,89,103] such as the UPDRS or an experienced observer [43]. However, considering the significant differences in accuracy between the proposed smart applications, Thorp et al. [43] suggested that the location of the wearable devices influences the precision of the measurements, making the detection of dyskinesia more accurate when the wearable devices are not attached to body parts involved in the tasks performed by patients. Moreover, Rovini et al. [40] and Thorp et al. [43] concluded that detecting dyskinesia during daily activities is particularly complex due to the difficulty in distinguishing voluntary movements from dyskinetic movements.
Functional activities | According to two studies [37,42], the proposed applications are not only capable of assessing type, quantity and quality measures, but also of differentiating Parkinson’s disease-specific mobility patterns from those of healthy controls and classifying disease severity and progression within PwP.
Physical activity | There is some evidence of good correlations between physical activity measurements and clinical instruments [42,59] and of the capacity to discriminate sedentary from upright and walking behavior [45].
Detection of falls | Johansson et al. [45] reported that the quantification of missteps and risk of falls was shown to discriminate non-fallers and fallers, Abou et al. [68] suggested the discriminative ability of smartphone applications to predict future falls, and Sica et al. [64] reported the ability to predict future falls, even in patients without a prior history of falling.
Fine motor impairments | Keystroke dynamics presents good accuracy, sensitivity and specificity to discriminate and classify PwP [78]. Moreover, Tripathi et al. [76] and Gopal et al. [84] concluded that statistically significant correlations with traditional clinical metrics were most frequently reported, and Polvorinos-Fernández et al. [105] highlighted finger tapping as a particularly strong indicator for diagnosing and monitoring disease progression.
Rigidity | Teshuva et al. [54] suggested that some of the proposed applications have the capacity to discriminate between patients with rigidity and healthy controls but concluded that the accuracy of the applications measuring rigidity was not very high.
Other motor clinical variables (i.e., swallowing disorders, hypokinesia, and impaired bed mobility) | Due to the reduced number of primary studies addressing swallowing disorders, hypokinesia, and impaired bed mobility, it was not possible to systematize evidence about the effectiveness of the respective applications.
Table 5. Non-motor clinical variables.
Non-Motor Clinical Variables | Wearable Devices | Smartphones | Serious Computerized Games
Affective state | | [49] |
Anxiety | [104] | |
Autonomic function | [44] | [44] |
Brain dysfunction | [90] | |
Cognition | [41,66,90,104] | [41,104,112] | [118]
Constipation | [66,83,104] | |
Depression | [66,83,104] | |
Emotional dysfunction | [90] | |
Fatigue | [104] | |
Hallucinations and delusions | [104] | |
Heart rate variability | [104] | |
Impulse control disorder | [66] | |
Skin impedance | [104] | |
Orthostatic hypotension | [104] | |
Pain | [104] | |
Sleep disturbances | [44,45,59,66,73,83,103,104] | [44,73] |
Voice and speech quality | [41,90,109] | [41] |
Urinary dysfunction | [66,104] | |
Table 6. Challenges and open issues: categories and sub-categories.
Experimental design:
- Heterogeneity of methods.
- Heterogeneity of outcomes.
- Heterogeneity of participants.
- Representativeness of the participants.
- Number of participants.
- Variability and specification of scripted tasks.
- Specification of the placement of the wearable devices.
- Correlations with clinical assessment instruments.
- Learning effects.
- Confounding variables.
- Reproducibility.

Clinical viability:
- Clinical utility.
- Limited research evidence on the efficiency and efficacy of smart applications, with contradictory findings, namely when used in home environments.
- Translation to clinical practice.
- Long-term impact on health outcomes.
- Sociocultural factors affecting adherence to new clinical models.
- Cost-effectiveness.
- Integration within clinical workflows.
- Efficient clinical management tools.
- Unscripted and unconstrained daily tasks and activities.
- Variability of metrics.
- Usefulness of some metrics.
- Non-motor clinical variables.
- Variability of the number and placement of the wearable devices.
- Transparency of the outcomes.
- Passive monitoring versus active monitoring.
- Closed-loop applications.
- Interoperability.

Acceptability:
- Adherence.
- User experience.
- Comfort.
- Invasiveness.
- Stakeholders’ perspectives.
- Technical and training assistance.

Regulatory conformity:
- Conformance requirements.
- Security and data protection.
- Risk analysis.
- Standardized assessment criteria.
Bastardo, R.; Pavão, J.; Martins, A.I.; Silva, A.G.; Rocha, N.P. Patient-Oriented Smart Applications to Support the Diagnosis, Rehabilitation, and Care of Patients with Parkinson’s: An Umbrella Review. Future Internet 2025, 17, 376. https://doi.org/10.3390/fi17080376