Perspective

Wearable Assistive Robotics: A Perspective on Current Challenges and Future Trends

1 Multimodal Inte-R-Action Lab, University of Bath, Bath BA2 7AY, UK
2 Centre for Autonomous Robotics (CENTAUR), University of Bath, Bath BA2 7AY, UK
3 Centre for Biosensors, Bioelectronics and Biodevices (C3Bio), University of Bath, Bath BA2 7AY, UK
4 Department of Electronics and Electrical Engineering, University of Bath, Bath BA2 7AY, UK
* Author to whom correspondence should be addressed.
Sensors 2021, 21(20), 6751; https://doi.org/10.3390/s21206751
Submission received: 30 August 2021 / Revised: 30 September 2021 / Accepted: 6 October 2021 / Published: 12 October 2021

Abstract:
Wearable assistive robotics is an emerging technology with the potential to assist humans with sensorimotor impairments to perform daily activities. This assistance enables individuals to be physically and socially active, perform activities independently, and recover quality of life. These benefits to society have motivated the study of several robotic approaches, developing systems ranging from rigid to soft robots, with single and multimodal sensing, heuristic and machine learning methods, and from manual to autonomous control for assistance of the upper and lower limbs. This type of wearable robotic technology, being in direct contact and interaction with the body, needs to comply with a variety of requirements to make the system and its assistance efficient, safe and usable on a daily basis by the individual. This paper presents a brief review of the progress achieved in recent years, and the current challenges and trends for the design and deployment of wearable assistive robotics, including clinical and user needs, material and sensing technology, machine learning methods for perception and control, adaptability and acceptability, datasets and standards, and translation from the lab to the real world.

Graphical Abstract

1. Introduction

1.1. Importance of Wearable Assistive Robotics

Wearable assistive robotics has emerged as a promising technology to assist humans by enhancing, supplementing or replacing limb motor functions, which are commonly impaired after an injury or a stroke, or as a result of natural ageing [1,2]. This robotic assistance is important to enable humans to perform physical and social activities of daily living (ADLs) independently, contributing to both dignity and improved quality of life [3,4]. Wearable robots can be found as exoskeletons, orthotics and prosthetics, capable of extending the strength of human limbs, restoring lost or weak limb functions and substituting lost limbs, respectively (Figure 1) [5,6,7,8,9,10,11,12]. These assistive devices are designed to be worn by humans and closely interact with the human body. Therefore, wearable robots need to be safe, reliable and intelligent, but also compliant, lightweight and comfortable, to ensure correct assistance, the safety of the user, and the acceptability and usability of the device [13,14]. These requirements can be achieved by making use of technological advances such as multimodal wearable sensors, soft and hybrid materials, actuation systems, data fusion, machine learning and robotic architectures.
Section 2, Section 3 and Section 4 describe the aspects involved in the design of wearable assistive robots, including the user perspective, methods for data processing and decision-making, and actuators and materials for fabrication. Section 5 and Section 6 describe the challenges faced by current assistive technologies and forecast future trends in the field.

1.2. Clinical Need—Target User Groups

The World Health Organization (WHO) estimates that 2 billion people will require assistive devices by 2050, doubling the current estimate [15]. This increase is driven by the growth of the ageing population [16] and of people with upper and lower limb impairments, noncommunicable diseases and mental health conditions [17,18,19]. Wearable assistive robotic devices can enable their users to gradually recover the capability to perform ADLs independently and naturally, leading to a healthier life. Despite the importance of this technology, only 10% of those in need of assistance have access to these robotic devices [15]. This limited access represents an issue for a sustainable future for all, one of the Sustainable Development Goals identified by the United Nations (UN), which strive to “leave no-one behind” [20,21]. The Global Cooperation on Assistive Technology (GATE) is another initiative created to improve global access to assistive devices [22], including wearables such as fall detectors, hearing aids, lower-limb prostheses and talking and touching watches [23].
The spread of smart and wearable technologies offers a strong set of tools to develop wearable assistive robots [24] that can positively impact the physical and social lives of users [25]. Wearables also have an advantage over other forms of assistive technology (e.g., handheld devices, mobility aids and distributed systems [26]) in their continuous close proximity to the user and compliance with the human body. These aspects enable such systems to collect vital data to provide customised assistance, which is highly valued by people with different levels of physical, sensory and cognitive impairment [25,27].

2. Wearable Assistive Robots Technologies

Wearable assistive robots are designed with the goal of assisting humans with physical impairments, particularly in the upper and lower limbs and their joints. These robots, which work in close proximity with the human body, can be built using different material technologies, usually rigid, soft or hybrid.

2.1. Rigid Materials in Wearable Robotics

Rigid and semi-rigid materials have traditionally been used for the development of exoskeletons, widely employed for assistance in locomotion activities. The ReWalk robotic system, with a semi-rigid structure, can assist the knee and hip of adults with partial and complete mobility impairments [12], detecting and enhancing the user’s walking action. Vanderbilt, REX and HAL are other examples of these types of assistive robots built with rigid structures (Figure 2). They can assist humans to keep balance while walking, but also to accomplish daily activities including sitting, climbing stairs and kneeling [28,29,30,31,32,33]. Wearable robot hands can assist with daily activities such as buttoning, grasping, pouring liquids, and closing and opening zips and jars. HandSOME, HandeXos-Beta, HexoSYS, HES Hand and Vanderbilt Hand are wearable hands with rigid structures that use arrangements of motors and tendons to provide the required assistance (Figure 2) [34,35,36,37,38,39]. These lower and upper limb devices can be configured to respond quickly to the user's movement intention using data from electromyographic (EMG), inertial measurement unit (IMU) and torque sensors. Unfortunately, these devices tend to be bulky, heavy (lower limb robots weighing between 15 kg and 25 kg) and expensive. The rigid structure of these robots can also make them uncomfortable and constrain the natural movement of human limbs in certain orientations.

2.2. Soft Materials in Wearable Robotics

Soft materials are becoming popular in the area of assistive robotics, with different systems developed in recent years to assist upper and lower limbs (Figure 3) [40,41]. Assistive robots made with soft materials tend to be lighter and more comfortable than robots built with rigid structures. Pneumatic artificial muscles, Bowden cables, textiles and shape memory alloys are the main material technologies that have been employed in a variety of wearable soft robots. Soft wearable knee and ankle robots have been used to assist leg contraction/extension movements, as well as foot movements in dorsiflexion, plantarflexion, inversion and eversion [42,43]. Exosuits using textiles and Bowden cables are among the most advanced and lightweight devices for assisting the hip, leg and ankle-foot while walking on flat surfaces [44,45,46]. This technology has been explored with soft gloves, wrists and elbows for reaching, grasping, holding and manipulating objects, but also to potentially assist with buttoning and feeding (holding cutlery) [47,48,49,50]. Modular and customisable systems capable of adapting to the user's body and required assistance have been investigated with designs based on tendons, shape memory alloys and pneumatic technology [51,52,53]. Soft materials offer a promising approach for wearable assistive robots that are comfortable, lightweight and do not constrain the movement of upper and lower limbs. However, soft assistive robots still require large external gearboxes that affect the portability of the system, and pumps that can slow down the system response. Table 1 compares the wearable assistive robot technologies presented in this section based on their applications, fabrication materials, degrees of freedom (DoF), body segment assisted, actuation type and weight.

3. Wearable Sensing Technologies

Rapid progress in flexible electronics and materials has enabled the development of advanced wearable sensors for monitoring the state of the human body and the assistive robot. State-of-the-art wearable sensors can be found as e-textiles (e.g., smart garments) and e-patches (e.g., sensor patches) that monitor aspects such as the position and orientation of human limbs, human motion and muscle activation [54,55,56,57], heart function [58,59,60,61], brain activity, sleep apnoea [62,63] and Parkinson's disease [64,65,66,67,68], as well as chemical and electrochemical quantities [69,70,71]. Examples of these sensors are shown in Figure 4. Many of these sensors have been used with healthy people, for example to monitor sporting activities, and with people affected by health conditions, or in clinical tests to detect underlying conditions before they become more serious, providing powerful diagnostic tools. These wearable sensors have also been used in devices such as ankle-foot robot orthotics, exosuits, soft wearable gloves and hand exoskeletons to help humans rehabilitate and perform ADLs. Another interesting aspect of wearables is that many of these devices can be used by caregivers to monitor patients [72,73] and to detect and manage treatments or conditions at home [74,75], which has had a direct impact on quality of life and on the care sector. Wireless communication technologies, low-power electronics, soft materials and the Internet of Things (IoT) [76] have largely helped wearable sensors become smaller, lighter and less cumbersome (e.g., fewer or no wires), but also made the sensor data accessible remotely for online monitoring, processing and control.

4. Machine Learning in Wearable Assistive Robots

Machine Learning (ML), with its capacity for perception and decision-making, offers a promising approach to developing intelligent wearable robots that can safely and accurately assist humans. This section provides a brief description of ML methods and Deep Learning (DL) based approaches for activity recognition with different wearable sensor platforms.

4.1. Traditional Machine Learning Methods

Traditional ML methods are generally well understood and can provide state-of-the-art results. This category includes methods such as Support Vector Machines (SVM), Random Forests (RF), Bayesian inference, Decision Trees (DT), k-Nearest Neighbours (kNN) and logistic regression. Recognition of hand gestures has been studied using wearable sensors (bending and force sensors) and SVM, kNN and DT methods. SVM achieved the highest recognition accuracy in real-time mode, although it required the longest training and prediction time compared to the kNN and DT methods [77]. SVM has also outperformed kNN, logistic regression and RF methods for recognition of grip actions from an assistive tactile arm brace (TAB) worn on the forearm of participants [78]. However, in human activity recognition (HAR) with IMU data, RF methods were shown to perform better than SVM, DT, NN, kNN and Naive Bayes methods [79]. RF methods have also shown accurate results for HAR with datasets containing IMU, audio and skeletal data [80]. Bayesian methods and sequential analysis have achieved accurate recognition of sit-to-stand activities using a single wearable accelerometer [81]. The combination of traditional machine learning methods and DL, such as Hidden Markov Models (HMM) and NN, has enabled reliable prediction of sequential gait stages [82]. Data-driven and Fuzzy Logic approaches are model-free and make use of knowledge from experts; they have been used for classification and system control [83,84], and recently for the recognition of ADLs and the control of wearable assistive and rehabilitation robots [85,86,87].
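A typical HAR pipeline of the kind used in the works above segments the sensor stream into windows, extracts simple statistical features and trains a classifier. The sketch below is illustrative only, using synthetic tri-axial accelerometer traces and a Random Forest; the window length, step and features are our assumptions, not taken from any of the cited studies.

```python
# Minimal sketch: sliding-window HAR from synthetic tri-axial IMU data
# with a Random Forest classifier. All signals and parameters are mock.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_signal(n, freq, noise):
    """Synthetic tri-axial accelerometer trace for one mock activity."""
    t = np.arange(n)
    base = np.sin(2 * np.pi * freq * t)[:, None]
    return base + noise * rng.standard_normal((n, 3))

def windows(signal, label, win=64, step=32):
    """Segment a trace into overlapping windows of statistical features."""
    out = []
    for s in range(0, len(signal) - win + 1, step):
        w = signal[s:s + win]
        feats = np.concatenate([w.mean(0), w.std(0), w.min(0), w.max(0)])
        out.append((feats, label))
    return out

# Two mock activities ("walking" vs "sitting") with different dynamics.
data = (windows(make_signal(2000, 0.05, 0.3), 0)
        + windows(make_signal(2000, 0.005, 0.1), 1))
X = np.array([f for f, _ in data])
y = np.array([l for _, l in data])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[::2], y[::2])
acc = clf.score(X[1::2], y[1::2])   # accuracy on held-out windows
```

In real deployments the features and window parameters are tuned per sensor placement and activity set, which is one reason protocols differ so much between datasets.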

4.2. Deep Learning

Deep Learning methods are primarily based on Neural Network (NN) architectures, and have demonstrated their ability to perform accurate activity recognition on complex datasets where other machine learning methods fail [88]. This section briefly presents three widely used DL architectures: the Multilayer Perceptron (MLP), the Convolutional Neural Network (CNN) and the Recurrent Neural Network (RNN).
Multilayer Perceptrons have been applied to data from accelerometers, gyroscopes and velocity sensors on the shank of users to predict the attitude angles of the thigh for walking assistance [89]. MLPs can recognise activities from IMU data [90]; however, CNN and RNN methods have achieved higher recognition accuracy. Surface electromyography (sEMG) sensors, together with MLPs and Linear Discriminant Analysis (LDA), have been used to recognise hand gestures and perform better than SVM [91]. CNNs have become popular for feature extraction from complex datasets for classification and recognition tasks. Image recognition and segmentation are key applications of CNNs in wearable devices that assist people with visual impairment. FuseNet and GoogLeNet architectures are further examples of DL methods for semantic scene segmentation in human auditory assistance for obstacle avoidance [92]. CNN architectures with IMU data have seen extensive use in HAR tasks and have achieved improved results compared to traditional methods [93,94,95,96,97]. RNN methods are becoming widely used for tasks that require the analysis of sequential data or events; examples include the Long Short-Term Memory (LSTM) backend of the DeepConvLSTM architecture [98,99] and InnoHAR, an Inception-based CNN architecture followed by two Gated Recurrent Unit (GRU) layers [100]. Recent developments in attention-based models have shown their potential for accurate labelling of multimodal wearable sensor data, such as data from HAR tasks performed with the upper and lower limbs [101,102]. Figure 5 shows examples of the use of LSTM, ANN, HMM, DBN, RF and DT methods for activity recognition with different wearable sensor platforms [81,82,89,103,104].
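To make the CNN case concrete, the sketch below shows the core operation such architectures slide over raw multichannel sensor windows: a 1D convolution with a ReLU nonlinearity followed by global max pooling. This is a pure-NumPy illustration of the principle, not code from DeepConvLSTM, InnoHAR or any cited work; the window size, channel count and filter shapes are arbitrary assumptions.

```python
# Illustrative 1D convolution layer over a multichannel sensor window,
# of the kind CNN-based HAR models use for local temporal features.
import numpy as np

def conv1d(window, kernels, bias):
    """window: (T, C) samples x channels; kernels: (K, k, C); returns (T-k+1, K)."""
    T, C = window.shape
    K, k, _ = kernels.shape
    out = np.empty((T - k + 1, K))
    for t in range(T - k + 1):
        patch = window[t:t + k]  # (k, C) local temporal slice
        out[t] = np.tensordot(kernels, patch, axes=([1, 2], [0, 1])) + bias
    return np.maximum(out, 0.0)  # ReLU nonlinearity

rng = np.random.default_rng(1)
imu_window = rng.standard_normal((128, 6))      # 128 samples, 6 IMU channels
kernels = rng.standard_normal((8, 5, 6)) * 0.1  # 8 filters of length 5
features = conv1d(imu_window, kernels, bias=np.zeros(8))
# Global max pooling gives a fixed-size feature vector for a classifier head
# (or, in a DeepConvLSTM-style model, the per-step input to the LSTM backend).
pooled = features.max(axis=0)
```

In trained networks the kernels are learned from labelled windows; stacking several such layers before an LSTM is what gives architectures like DeepConvLSTM their ability to model both local and sequential structure.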

4.3. Sensor Fusion

Although sensor fusion is not an ML method, it is a process that can provide a better representation of the input data for subsequent ML stages, and improve the certainty, accuracy and completeness of the results compared with using a single sensor [105]. The fusion process can be classified as competitive, cooperative or complementary. Competitive fusion uses multiple sensors to measure the same feature, while complementary fusion measures different aspects of the same phenomenon. Cooperative fusion refers to sensors measuring different attributes, all of which are required to form a complete understanding of what is measured [106,107].
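The competitive case admits a very short numerical illustration: two redundant sensors measuring the same quantity can be combined by inverse-variance weighting, yielding an estimate with lower variance than either sensor alone. The scenario and values below are entirely synthetic.

```python
# Toy competitive fusion: two noisy sensors measure the same joint angle
# and are combined with inverse-variance weights.
import numpy as np

rng = np.random.default_rng(2)
true_angle = 30.0                        # ground-truth joint angle (degrees)
n = 10_000
s1 = true_angle + rng.normal(0, 2.0, n)  # sensor 1, sigma = 2
s2 = true_angle + rng.normal(0, 1.0, n)  # sensor 2, sigma = 1

w1, w2 = 1 / 2.0**2, 1 / 1.0**2          # inverse-variance weights
fused = (w1 * s1 + w2 * s2) / (w1 + w2)

# The fused standard deviation approaches sqrt(1/(w1+w2)) ~ 0.89,
# below even the better sensor's sigma of 1.
```

This is the same principle that underlies Kalman-style fusion of redundant wearable sensors, where the weights are updated online from estimated noise covariances.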
Various fusion methods have been studied with wearable sensor data and ML methods. A stacked generalisation architecture using RF first-learner and meta-learner classifiers was used for the fusion of audio and accelerometer data recorded while performing ADLs [80]. In a similar architecture using LSTM first-learners on IMU data, an RF meta-learner outperformed SVM and kNN methods, although a soft-voting approach provided higher accuracy still. The fusion of RF methods outperformed a single RF method when recognising heart disease with a variety of body-worn sensors [108]. Fusing IMU and vision data with CNN first-learners and an LSTM fusion method was shown to outperform a variety of methods for the recognition of upper limb actions in assembly tasks in an industrial setting [109]. Fusion of acceleration and angular velocity data images with a CNN ResNet fusion classifier achieved higher recognition accuracy than other fusion rules such as sum, average and maximum [110]. Gesture recognition was improved by fusing wearable pressure sensors and radar data from finger movements with an SVM method [111].
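The stacked-generalisation pattern recurring in these works can be sketched compactly: one base learner per modality, with a meta-learner trained on their out-of-fold predicted probabilities. The code below is a hedged illustration using synthetic data split into two pretend "modalities" (e.g., accelerometer and audio features); it does not reproduce any cited architecture.

```python
# Sketch of stacked generalisation: per-modality RF base learners feed a
# logistic-regression meta-learner. Data and modality split are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict, train_test_split

X, y = make_classification(n_samples=600, n_features=12, random_state=0)
Xa, Xb = X[:, :6], X[:, 6:]          # pretend: modality A / modality B
idx = np.arange(len(y))
tr, te = train_test_split(idx, test_size=0.3, random_state=0, stratify=y)

base_a = RandomForestClassifier(n_estimators=50, random_state=0)
base_b = RandomForestClassifier(n_estimators=50, random_state=1)

# Out-of-fold predictions avoid leaking training labels into the meta-learner.
za = cross_val_predict(base_a, Xa[tr], y[tr], cv=5, method="predict_proba")
zb = cross_val_predict(base_b, Xb[tr], y[tr], cv=5, method="predict_proba")
meta = LogisticRegression().fit(np.hstack([za, zb]), y[tr])

base_a.fit(Xa[tr], y[tr])
base_b.fit(Xb[tr], y[tr])
z_test = np.hstack([base_a.predict_proba(Xa[te]), base_b.predict_proba(Xb[te])])
acc = meta.score(z_test, y[te])
```

Swapping the base learners for LSTMs or the meta-learner for an RF or a soft-voting rule reproduces the design variants compared in the studies above.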

5. Current Challenges in Wearable Assistive Robots

Wearable assistive robotics still faces various challenges in ensuring that these systems are reliable, functional in real environments and compliant with user needs. These challenges, which include device adaptability, translation from the lab to the real world, sensing technology and user acceptability, are discussed in this section.

5.1. Device Adaptability to the User (Personalised Robotics)

The need for personalised and adaptive assistive devices is widely recognised. Adaptivity refers to how devices adapt to changes within their operating environment; such changes might include user habits, situations, individual preferences and exogenous changes. In practice, adaptivity means changing the way the assistive robot responds to specific user commands, detecting user intent, reacting to exogenous changes such as sensor instability, or modifying the intervention modality [112]. These changes can be split into two broad categories, user-driven and exogenous, and they require the use of contextual information together with a model of the user or system to ensure a correct adaptation process. Adaptivity is often viewed as a machine learning problem, with much work on Human Activity Recognition (HAR) making use of supervised learning techniques to provide the vital contextual awareness [113]. Activities are often identified by comparing sensor events over sliding time windows with templates fitted to specific ADLs; these templates can be iteratively developed but are generally fixed, and are used to drive an ontological model [114]. Such model-based approaches lack the flexibility required to allow wearable assistive robots to deal with the level of change that might be expected in outdoor environments, and may not accurately capture the complexity or diversity of many use cases. Hybrid solutions exist that combine data-driven (activity recognition) and knowledge-driven (model-based) approaches [115]. Initial knowledge can be used to form templates that enable initial activity detection, while newly discovered activities are recorded for later labelling, often with the active participation of the user. More recently, Smith et al. have proposed the Adaptivity as a Service (AaaS) approach, which encompasses all the dimensions of adaptation.
AaaS addresses the three types of long-term adaptation: contextual awareness, adaptation of the intervention modality, and rapid adaptation to new users [112]. Various works have proposed methods to address the adaptivity of assistive systems; however, ensuring safe, reliable and efficient robot assistance and device acceptability remains an open challenge.
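The template-based sliding-window detection described above can be illustrated with a few lines of code: each window of the sensor stream is labelled with the nearest fixed activity template. The templates, stream and distance metric below are our own toy assumptions, chosen only to show the mechanism (and its rigidity, since the templates never adapt).

```python
# Toy sliding-window template matching for activity detection.
import numpy as np

rng = np.random.default_rng(3)
templates = {
    "walk": np.sin(np.linspace(0, 4 * np.pi, 50)),  # fixed per-activity templates
    "rest": np.zeros(50),
}

def label_stream(stream, templates, step=25):
    """Assign each window the activity whose template is nearest (Euclidean)."""
    labels = []
    for s in range(0, len(stream) - 50 + 1, step):
        w = stream[s:s + 50]
        labels.append(min(templates, key=lambda k: np.linalg.norm(w - templates[k])))
    return labels

stream = np.concatenate([
    np.sin(np.linspace(0, 4 * np.pi, 50)) + 0.1 * rng.standard_normal(50),  # walking
    0.05 * rng.standard_normal(50),                                          # resting
])
labels = label_stream(stream, templates)
```

Hybrid schemes extend exactly this loop: windows that match no template well are stored as candidate new activities for later labelling, often with the user in the loop.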

5.2. Translational Issues from the Lab to the Real World

Wearable assistive robots are usually designed and tested in labs under well-controlled conditions and structured environments. The testing processes tend to follow protocols designed for the lab only, where the user performs actions in a predefined order, in a controlled setting and as indicated by the researcher; for instance, walking along a set circuit composed of flat surfaces, stairs and ramps, or grasping objects and moving them from one point to another [116,117]. In lab settings, these experiments involve the use of treadmills with controlled speeds and inclinations, and objects specifically designed for manipulation. This approach allows systematic data analysis, monitoring and control of the assistive device while responding to different user movements. However, it still does not reflect the situations that a person can experience in outdoor environments, such as walking on different terrains and under changing environmental conditions, varying walking speed, and grasping and handling real objects of daily use [118]. This difference in testing settings affects the robot's performance in sensing, making decisions and controlling the assistance required by the user. The translation of assistive robots from the lab to real environments represents an important issue that needs to be addressed to obtain robots that can be accepted and used daily by individuals. Computational methods are also generally designed and trained assuming the availability of clean and accurate data from the wearable sensors and the robotic system [86]. This assumption works very well in simulation and in well-controlled laboratory environments; however, performance is drastically affected when the methods are tested in real environments. Furthermore, the unexpected, varied and continuously changing daily situations experienced by the person inevitably generate unseen, complex and unlabelled data, decreasing the performance of the assistive robot and putting the user's safety at risk [119,120].
Therefore, it is important to have systems with methods capable of learning continuously from the state of the human body, robot and environment to adapt to daily and unexpected situations safely and accurately.

5.3. Wearable Sensing Technology

Even though sensor technology has advanced rapidly, allowing multimodal monitoring capabilities in wearable assistive devices, elements such as weight, power consumption, battery lifespan and calibration remain a challenge. Sensing devices with low power consumption can extend the monitoring time between charges and reduce weight by requiring smaller energy storage, but can also open opportunities to exploit emerging technologies such as energy harvesting [121] directly from the user and the wearable robot, e.g., while performing locomotion activities. Battery lifespan, weight and maintenance are major elements to be factored into the sensor design process. These aspects enable assistive robots to be used for longer periods without recharging the battery, and without the need to calibrate or replace the sensors. There are also many elements not directly related to materials and new technologies that play a critical role in wearable technologies and present major challenges. These include communication modules (e.g., WiFi, Bluetooth), data integrity and protection, difficulties in precise localisation using, for example, GPS and mobile phones [122], and the design of suitable interfaces that allow information extraction from users who may have disabilities or difficulties interacting with digital systems.
These technical challenges need to be addressed in the design of wearable sensing technology. However, there is also a lack of robust, systematic testing on large numbers of patients with different monitoring and assistance requirements, which represents a bottleneck for the use of wearable sensors in a wide range of monitoring and assistive systems [123,124]. This testing is usually limited or even ignored due to the time and cost of recruiting patients and preparing and running the tests, thus confining the design of wearable sensors to academic and well-controlled environments. Understanding this sensor-patient interface, by keeping patients in the loop, is essential to develop wearable sensors that are clinically viable and approved to operate under real conditions with patients, but also robust, reliable and comfortable for the user [125].

5.4. User Acceptability

Even with the technological advances in assistive technology, device abandonment remains one of the main challenges. Upper-limb prostheses can be taken as an example, where up to 75% of users reject their prosthesis [126] because they find the device uncomfortable and insufficiently functional [127]. Acceptance studies show similar trends across different assistive technologies [128], where the reasons behind device abandonment vary between users but tend to emphasise the mismatch between expectations and reality [129,130]. These issues have motivated researchers to move towards a more user-centred design process. Co-creation is a design framework that has gained interest recently [131,132,133,134]. It ensures the participation of different stakeholders to minimise the mismatch between user needs and the developed product's features, and to maximise impact. The stakeholders include users and healthcare professionals, as well as industrial representatives and policy makers, each providing a distinct perspective. The users are at the heart of the process, enabling the designers to gain empathy and learn from the users' lived experience [135]. Healthcare professionals provide a more holistic perspective on common concerns and issues; their involvement is particularly important as they guide and support users during the decision-making process for assistive technology [136]. Investors and policy makers are key to ensuring that the translation of research does not end in what has been described as the “valley of death”, the gap between available technologies and commercial products [137]. Weight reduction of the wearable assistive robot is another challenging aspect of making the system comfortable and acceptable for the user.
Weight and portability are affected by heavier power supply systems, which are needed to provide longer autonomy and to comply with activity-dependent assistive force requirements, e.g., walking, running and sit-to-stand [138,139]. Some works have proposed a compact variable gravity compensation approach that generates torque using a cam and lever mechanism to improve the energy efficiency of assistive systems, surgical robots and mobile robots [140]. Spring-based gravity compensation mechanisms and arrangements of springs have been used to improve energy efficiency in the delivery of fixed torques [141,142]. The Mechanically Adjustable Compliance and Controllable Equilibrium Position Actuation (MACCEPA) mechanism has been shown to be an energy-efficient actuation system for ankle-foot assistive robots and gait rehabilitation [143,144]. Recently, a design optimisation method using mechanism parameters and mechanism architecture has been employed to obtain the optimal arrangement of actuators and improve the energy efficiency of soft lower limb exoskeletons [138]. These works have shown impressive progress in actuation systems that assist with specific activities. However, the development of simple and energy-efficient actuation systems, which would enable the use of small power supplies, remains a challenge in wearable robotics for assistance with ADLs.
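The principle behind spring-based gravity compensation can be stated in one line of statics (a textbook analysis, not a derivation taken from the works cited above). A limb of mass $m$ with centre of mass at distance $l$ from the joint, deflected by angle $\theta$ from the vertical, loads the joint with a gravitational torque that a zero-free-length spring of stiffness $k$, attached at distances $a$ and $b$ from the joint, can cancel at every angle:

```latex
\tau_g(\theta) = m\,g\,l\,\sin\theta, \qquad
\tau_s(\theta) = k\,a\,b\,\sin\theta
\;\;\Longrightarrow\;\;
k\,a\,b = m\,g\,l \;\Rightarrow\; \tau_g(\theta) - \tau_s(\theta) = 0 \quad \forall\theta
```

Because both torques share the same $\sin\theta$ dependence, the balance holds statically over the whole range of motion, which is why such mechanisms let the actuator supply only the dynamic component of the assistive torque.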
Another important aspect to consider for the acceptability of a device is the user's perception of the risk of using AI and the IoT for data collection, data sharing and system control. These aspects raise ethical, privacy and safety concerns, which are more common amongst younger people, the future users of the assistive devices currently being developed [145]. Such concerns should be considered during the design process of assistive robots to ensure that individuals are able to trust and use the devices. Researchers also need to be familiar with frameworks such as the Technology Acceptance Model (TAM) and the transparency paradigm to ensure that the relevant acceptance factors are included in the robot, and to minimise the perceived safety and privacy risks of AI and the IoT [146,147,148,149].

6. Forecasting Future Trends on Wearable Assistive Robots

Undoubtedly, the field of wearable assistive robotics has seen impressive achievements, given the advances in elements such as sensor technology, fabrication materials, machine learning and control methods, and computational power. However, to reach the aim of smart and comfortable wearable systems that can assist humans to perform daily activities independently, naturally, safely and efficiently, aspects such as materials and sensing, learning and adaptability, and datasets and standards still need to be considered and improved in the future development of wearable assistive robotics.

6.1. Hybrid Wearable Assistive Systems

Currently, most assistive robots are built using rigid or soft materials. Rigid materials have been widely used to develop systems that help humans perform daily activities, but also to enhance human strength for carrying out industrial tasks. Soft materials have been gaining attention in recent years for the development of wearable robots, given that they are lightweight, compliant and do not constrain the natural movement of the human body, which rigid materials usually do. Despite these characteristics, soft materials cannot provide the torque required to assist the human in, for example, lifting the legs during locomotion activities. Hybrid approaches, composed of rigid and soft elements, can offer a better trade-off for the design of assistive robots [150,151]. This approach can make the robot structure lightweight and capable of adapting to the human body, while still delivering the required torque safely [152]. This type of robot design also broadens the sensing modalities and technologies that can be integrated within the hybrid structure [153,154], which can provide richer and larger datasets and increase the opportunities for research on novel machine learning and control methods for learning and adapting the level and type of required assistance [155,156]. Hybrid approaches, embedded with a large variety of sensing technologies, are a key aspect that we expect to see in the development of wearable assistive robots.

6.2. Learning and Adaptability to the User

Two key capabilities required for robots that can safely and efficiently assist humans are learning and adapting to the user over time [157,158]. Current wearable assistive robots are designed to support users under specific and controlled conditions, for example, assisting with walking and sit-to-stand, or with reaching and grasping an object, in well-controlled laboratory settings. These systems tend to fail when tested in outdoor environments, or even when there are slight changes in the test settings. For this reason, it is important to research and develop methods that allow the design of robots capable of learning and adapting to the user and to changes in the surrounding environment. Advanced intelligent robot architectures, which process data at different levels of abstraction, offer an approach to developing safe and reliable assistive systems, as has been seen in other robotic applications [159,160,161]. Usually, these architectures include sensing, data fusion, perception, decision-making and control modules, but also memory modules and reactive and adaptive layers for continuous learning and adaptation [162,163]. This approach can be implemented in wearable assistive robot architectures, together with novel machine learning and control methods, for example to allow the robot to identify the activity performed by the human and thus deliver the required assistance. Another key aspect of achieving adaptability in future assistive systems is the context of the task or activity [164,165]. Wearable robots that know, for example, whether the person is at home or at work can identify the most likely actions the person would perform, increasing the reliability of the decisions and assistive actions made by the robot [166].
This approach offers the potential for smart wearable assistive robots that can safely react to unseen or unexpected events and data, and adapt to different and changing surrounding environments.
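The context-conditioned decision process described above can be illustrated with a minimal probabilistic sketch. The sketch below assumes a simple Bayesian combination of a context-dependent activity prior with a sensor-based likelihood; the contexts, activities and prior values are purely illustrative, not taken from any of the cited systems.

```python
# Illustrative sketch: context-conditioned activity recognition.
# The activity prior depends on the current context (e.g., home vs. work);
# combining it with a sensor-based likelihood via Bayes' rule biases the
# posterior towards activities that are plausible in that context.

# Illustrative context-dependent priors (assumed values, for demonstration).
CONTEXT_PRIORS = {
    "home": {"walk": 0.3, "sit_to_stand": 0.4, "reach_grasp": 0.3},
    "work": {"walk": 0.6, "sit_to_stand": 0.1, "reach_grasp": 0.3},
}


def activity_posterior(context: str, likelihood: dict) -> dict:
    """Combine the context prior with sensor likelihoods (Bayes' rule)."""
    prior = CONTEXT_PRIORS[context]
    unnormalised = {a: prior[a] * likelihood[a] for a in prior}
    z = sum(unnormalised.values())  # normalising constant
    return {a: p / z for a, p in unnormalised.items()}
```

With the same sensor evidence, the most probable activity can differ between contexts, which is how context knowledge increases the reliability of the robot's decision.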

6.3. Datasets and Standards

The significant progress achieved in robotics in recent decades can offer huge societal benefits; however, these technologies must be appropriately regulated and ethically developed. Datasets and standards are two key pillars of the ethical development of new robots. At present, significant research is undertaken purely because of dataset availability rather than because of a strong unmet clinical need (there is often poor correlation between clinical need and dataset availability). Several publicly available datasets exist for ADLs [167,168]; however, they are usually collected and prepared using different protocols, complicating analysis and replication. It is critical to have standards for the robust and reliable collection and preparation of datasets that correlate with clinical needs, while also enabling researchers to replicate the data collection and analysis. Likewise, it is important that emerging ethical frameworks are strongly linked to the development of standards and the implementation of regulations [169]. Current ethical frameworks include the EPSRC Principles of Robotics [170] and the 2006 EURON Roboethics Roadmap [171], and recent standards include ISO 13482 (safety requirements for personal care robots) [172] and BS 8611:2016 (robots and robotic devices in general) [173]. The IEEE has embarked on an ambitious programme of standards under the banner of the IEEE P7000 series, including standards on Data Privacy Processes (P7002), an Ontological Standard for Ethically Driven Robotics and Automation Systems (P7007) and the IEEE Recommended Practice for Assessing the Impact of Autonomous and Intelligent Systems on Human Well-Being (Std 7010-2020) [174].
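One way to make collection protocols replicable, as argued above, is to require that every dataset ship a machine-readable protocol descriptor. The sketch below illustrates this idea under stated assumptions: the field names (`activities`, `sensors`, `sampling_rate_hz`, and so on) and the example values are hypothetical, not drawn from any existing standard.

```python
# Illustrative sketch: validating a dataset's protocol descriptor.
# The required fields are assumptions chosen to illustrate the idea that
# replication needs the sensors, their placement, the sampling rate,
# participant count and ethics approval to be recorded explicitly.

REQUIRED_FIELDS = {
    "activities", "sensors", "sampling_rate_hz",
    "sensor_placement", "participants", "ethics_approval",
}


def missing_fields(descriptor: dict) -> set:
    """Return the protocol fields absent from a descriptor (empty = valid)."""
    return REQUIRED_FIELDS - descriptor.keys()


# Hypothetical descriptor for an ADL dataset (all values illustrative).
example_descriptor = {
    "activities": ["walk", "sit_to_stand", "reach_grasp"],
    "sensors": ["imu", "semg"],
    "sampling_rate_hz": 100,
    "sensor_placement": {"imu": "shank", "semg": "tibialis_anterior"},
    "participants": 12,
    "ethics_approval": "REC-0000",  # placeholder approval identifier
}
```

A shared checklist of this kind would let reviewers and users verify at a glance whether a published dataset carries enough protocol metadata to be reproduced.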
The combination of human factors with machine learning, sensors, materials and clinical feedback is crucial for the development of smart, efficient and comfortable wearable assistive robots for daily use by patients. The future development of this type of robot looks promising, with great achievements and benefits for society expected in the coming years.

7. Conclusions

Wearable assistive robots have the potential to offer alternative platforms and ways to deliver assistance, rehabilitation and care to users. This perspective paper has presented a list of design requirements for developing reliable, efficient, safe and comfortable assistive devices that can be used on a daily basis by patients. It has been shown that robots that are lightweight, portable and easy to put on and take off are key to user comfort. Soft robots tend to be lighter and more comfortable than rigid structures; however, they still cannot deliver the required assistive forces, which suggests that the optimal solution is a hybrid approach with energy-efficient actuation and control systems. Sensing technologies and computational methods have shown impressive progress in multimodal data collection and the recognition of ADLs in well-controlled environments and for specific groups of activities. Computational methods still face the challenge of performing reliably in daily life environments and responding safely to unseen data and unexpected body motions. This suggests that assistive systems need to be designed with the capability to learn continuously and to adapt to the user and terrain autonomously. These aspects of autonomy and learning will require appropriate regulations for the robust collection of datasets, privacy in data sharing and the ethical design of assistive robots, as well as the direct involvement of clinicians and patients in the design process. Together, these aspects will ensure the development of reliable, safe and comfortable assistive robots that are transparent in the decisions they make and the assistive actions they perform, and thus earn the user's trust and acceptance for daily use.

Author Contributions

Conceptualization, U.M.-H., B.M., D.Z.; investigation, U.M.-H., B.M., T.A., L.J., J.M.; writing original draft, review and editing, U.M.-H., B.M., T.A., L.J., J.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

  128. Rodgers, M.M.; Alon, G.; Pai, V.M.; Conroy, R.S. Wearable technologies for active living and rehabilitation: Current research challenges and future opportunities. J. Rehabil. Assist. Technol. Eng. 2019, 6, 2055668319839607. [Google Scholar] [CrossRef]
  129. Sugawara, A.T.; Ramos, V.D.; Alfieri, F.M.; Battistella, L.R. Abandonment of assistive products: Assessing abandonment levels and factors that impact on it. Disabil. Rehabil. Assist. Technol. 2018, 13, 716–723. [Google Scholar] [CrossRef] [PubMed]
  130. Holloway, C.; Dawes, H. Disrupting the world of disability: The next generation of assistive technologies and rehabilitation practices. Healthc. Technol. Lett. 2016, 3, 254–256. [Google Scholar] [CrossRef] [Green Version]
  131. van den Kieboom, R.C.; Bongers, I.M.; Mark, R.E.; Snaphaan, L.J. User-driven living lab for assistive technology to support people with dementia living at home: Protocol for developing co-creation–based innovations. JMIR Res. Protoc. 2019, 8, e10952. [Google Scholar] [CrossRef]
  132. Callari, T.C.; Moody, L.; Magee, P.; Yang, D. ‘Smart–not only intelligent!’ Co-creating priorities and design direction for ‘smart’ footwear to support independent ageing. Int. J. Fash. Des. Technol. Educ. 2019, 12, 313–324. [Google Scholar] [CrossRef]
  133. Ugulino, W.C.; Fuks, H. Prototyping wearables for supporting cognitive mapping by the blind: Lessons from co-creation workshops. In Proceedings of the 2015 Workshop on Wearable Systems and Applications, Florence, Italy, 18 May 2015; pp. 39–44. [Google Scholar]
  134. van Dijk-de Vries, A.; Stevens, A.; van der Weijden, T.; Beurskens, A.J. How to support a co-creative research approach in order to foster impact. The development of a Co-creation Impact Compass for healthcare researchers. PLoS ONE 2020, 15, e0240543. [Google Scholar] [CrossRef]
  135. Van der Zijpp, T.; Wouters, E.; Sturm, J. To use or not to use: The design, implementation and acceptance of technology in the context of health care. In Assistive Technologies in Smart Cities; InTechOpen: London, UK, 2018. [Google Scholar]
  136. Ran, M.; Banes, D.; Scherer, M.J. Basic principles for the development of an AI-based tool for assistive technology decision making. Disabil. Rehabil. Assist. Technol. 2020, 1–4. [Google Scholar] [CrossRef]
  137. Seyhan, A.A. Lost in translation: The valley of death across preclinical and clinical divide-identification of problems and overcoming obstacles. Transl. Med. Commun. 2019, 4, 1–19. [Google Scholar] [CrossRef] [Green Version]
  138. Ortiz, J.; Poliero, T.; Cairoli, G.; Graf, E.; Caldwell, D.G. Energy efficiency analysis and design optimization of an actuation system in a soft modular lower limb exoskeleton. IEEE Robot. Autom. Lett. 2017, 3, 484–491. [Google Scholar] [CrossRef] [Green Version]
  139. Kim, J.; Lee, G.; Heimgartner, R.; Revi, D.A.; Karavas, N.; Nathanson, D.; Galiana, I.; Eckert-Erdheim, A.; Murphy, P.; Perry, D.; et al. Reducing the metabolic rate of walking and running with a versatile, portable exosuit. Science 2019, 365, 668–672. [Google Scholar] [CrossRef] [PubMed]
  140. Kim, J.; Moon, J.; Kim, J.; Lee, G. Compact Variable Gravity Compensation Mechanism With a Geometrically Optimized Lever for Maximizing Variable Ratio of Torque Generation. IEEE/ASME Trans. Mechatron. 2020, 25, 2019–2026. [Google Scholar] [CrossRef]
  141. Lin, P.Y.; Shieh, W.B.; Chen, D.Z. A theoretical study of weight-balanced mechanisms for design of spring assistive mobile arm support (MAS). Mech. Mach. Theory 2013, 61, 156–167. [Google Scholar] [CrossRef]
  142. Lenzo, B.; Zanotto, D.; Vashista, V.; Frisoli, A.; Agrawal, S. A new Constant Pushing Force Device for human walking analysis. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 6174–6179. [Google Scholar]
  143. Cherelle, P.; Grosu, V.; Beyl, P.; Mathys, A.; Van Ham, R.; Van Damme, M.; Vanderborght, B.; Lefeber, D. The MACCEPA actuation system as torque actuator in the gait rehabilitation robot ALTACRO. In Proceedings of the 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, Tokyo, Japan, 26–29 September 2010; pp. 27–32. [Google Scholar]
  144. Moltedo, M.; Bacek, T.; Junius, K.; Vanderborght, B.; Lefeber, D. Mechanical design of a lightweight compliant and adaptable active ankle foot orthosis. In Proceedings of the 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob), Singapore, 26–29 June 2016; pp. 1224–1229. [Google Scholar]
  145. Niknejad, N.; Ismail, W.B.; Mardani, A.; Liao, H.; Ghani, I. A comprehensive overview of smart wearables: The state of the art literature, recent advances, and future challenges. Eng. Appl. Artif. Intell. 2020, 90, 103529. [Google Scholar] [CrossRef]
  146. Lee, Y.; Kozar, K.A.; Larsen, K.R. The technology acceptance model: Past, present, and future. Commun. Assoc. Inf. Syst. 2003, 12, 50. [Google Scholar] [CrossRef]
  147. Arias, O.; Wurm, J.; Hoang, K.; Jin, Y. Privacy and security in internet of things and wearable devices. IEEE Trans. Multi-Scale Comput. Syst. 2015, 1, 99–109. [Google Scholar] [CrossRef]
  148. Glikson, E.; Woolley, A.W. Human trust in artificial intelligence: Review of empirical research. Acad. Manag. Ann. 2020, 14, 627–660. [Google Scholar] [CrossRef]
  149. Cheng, M.; Nazarian, S.; Bogdan, P. There is hope after all: Quantifying opinion and trustworthiness in neural networks. Front. Artif. Intell. 2020, 3, 54. [Google Scholar] [CrossRef]
  150. Stokes, A.A.; Shepherd, R.F.; Morin, S.A.; Ilievski, F.; Whitesides, G.M. A hybrid combining hard and soft robots. Soft Robot. 2014, 1, 70–74. [Google Scholar] [CrossRef] [Green Version]
  151. Yang, H.D.; Asbeck, A.T. A new manufacturing process for soft robots and soft/rigid hybrid robots. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 8039–8046. [Google Scholar]
  152. Yu, S.; Huang, T.H.; Wang, D.; Lynn, B.; Sayd, D.; Silivanov, V.; Park, Y.S.; Tian, Y.; Su, H. Design and control of a high-torque and highly backdrivable hybrid soft exoskeleton for knee injury prevention during squatting. IEEE Robot. Autom. Lett. 2019, 4, 4579–4586. [Google Scholar] [CrossRef]
  153. Rose, C.G.; O’Malley, M.K. Hybrid rigid-soft hand exoskeleton to assist functional dexterity. IEEE Robot. Autom. Lett. 2018, 4, 73–80. [Google Scholar] [CrossRef]
  154. Araromi, O.A.; Walsh, C.J.; Wood, R.J. Hybrid carbon fiber-textile compliant force sensors for high-load sensing in soft exosuits. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 1798–1803. [Google Scholar]
  155. Gerez, L.; Dwivedi, A.; Liarokapis, M. A hybrid, soft exoskeleton glove equipped with a telescopic extra thumb and abduction capabilities. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 9100–9106. [Google Scholar]
  156. Park, S.; Meeker, C.; Weber, L.M.; Bishop, L.; Stein, J.; Ciocarlie, M. Multimodal sensing and interaction for a robotic hand orthosis. IEEE Robot. Autom. Lett. 2018, 4, 315–322. [Google Scholar] [CrossRef] [Green Version]
  157. Chiri, A.; Cempini, M.; De Rossi, S.M.M.; Lenzi, T.; Giovacchini, F.; Vitiello, N.; Carrozza, M.C. On the design of ergonomic wearable robotic devices for motion assistance and rehabilitation. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August–1 September 2012; pp. 6124–6127. [Google Scholar]
  158. Heerink, M.; Krose, B.; Evers, V.; Wielinga, B. Measuring acceptance of an assistive social robot: A suggested toolkit. In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2009, Toyama, Japan, 27 September–2 October 2009; pp. 528–533. [Google Scholar]
  159. Zhang, B.; Wang, S.; Zhou, M.; Xu, W. An adaptive framework of real-time continuous gait phase variable estimation for lower-limb wearable robots. Robot. Auton. Syst. 2021, 143, 103842. [Google Scholar] [CrossRef]
  160. Umbrico, A.; Cesta, A.; Cortellessa, G.; Orlandini, A. A holistic approach to behavior adaptation for socially assistive robots. Int. J. Soc. Robot. 2020, 12, 617–637. [Google Scholar] [CrossRef]
  161. Cangelosi, A.; Invitto, S. Human-Robot Interaction and Neuroprosthetics: A review of new technologies. IEEE Consum. Electron. Mag. 2017, 6, 24–33. [Google Scholar] [CrossRef]
  162. Cesta, A.; Cortellessa, G.; Orlandini, A.; Umbrico, A. A cognitive loop for assistive robots-connecting reasoning on sensed data to acting. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 826–831. [Google Scholar]
  163. Moreno, J.C.; Asin, G.; Pons, J.L.; Cuypers, H.; Vanderborght, B.; Lefeber, D.; Ceseracciu, E.; Reggiani, M.; Thorsteinsson, F.; Del-Ama, A.; et al. Symbiotic wearable robotic exoskeletons: The concept of the biomot project. In Proceedings of the International Workshop on Symbiotic Interaction, Berlin, Germany, 7–8 October 2015; pp. 72–83. [Google Scholar]
  164. Khattak, A.M.; La, V.; Hung, D.V.; Truc, P.T.H.; Hung, L.X.; Guan, D.; Pervez, Z.; Han, M.; Lee, S.; Lee, Y.K.; et al. Context-aware human activity recognition and decision making. In Proceedings of the 12th IEEE International Conference on e-Health Networking, Applications and Services, Lyon, France, 1–3 July 2010; pp. 112–118. [Google Scholar]
  165. Riboni, D.; Bettini, C. COSAR: Hybrid reasoning for context-aware activity recognition. Pers. Ubiquitous Comput. 2011, 15, 271–289. [Google Scholar] [CrossRef]
  166. Wongpatikaseree, K.; Ikeda, M.; Buranarach, M.; Supnithi, T.; Lim, A.O.; Tan, Y. Activity recognition using context-aware infrastructure ontology in smart home domain. In Proceedings of the 2012 Seventh International Conference on Knowledge, Information and Creativity Support Systems, Melbourne, VIC, Australia, 8–10 November 2012; pp. 50–57. [Google Scholar]
  167. Cumin, J.; Lefebvre, G.; Ramparany, F.; Crowley, J.L. A dataset of routine daily activities in an instrumented home. In Proceedings of the International Conference on Ubiquitous Computing and Ambient Intelligence, Philadelphia, PA, USA, 7–10 November 2017; pp. 413–425. [Google Scholar]
  168. Alshammari, T.; Alshammari, N.; Sedky, M.; Howard, C. SIMADL: Simulated activities of daily living dataset. Data 2018, 3, 11. [Google Scholar] [CrossRef] [Green Version]
  169. Winfield, A.F.; Jirotka, M. Ethical governance is essential to building trust in robotics and artificial intelligence systems. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2018, 376, 20180085. [Google Scholar] [CrossRef] [Green Version]
  170. Boden, M.; Bryson, J.; Caldwell, D.; Dautenhahn, K.; Edwards, L.; Kember, S.; Newman, P.; Parry, V.; Pegman, G.; Rodden, T.; et al. Principles of robotics: Regulating robots in the real world. Connect. Sci. 2017, 29, 124–129. [Google Scholar] [CrossRef]
  171. Veruggio, G. The euron roboethics roadmap. In Proceedings of the 6th IEEE-RAS International Conference on Humanoid Robots, Genova, Italy, 4–6 December 2006; pp. 612–617. [Google Scholar]
  172. International Organization for Standardization. ISO 13482:2014. In Robots and Robotic Devices: Safety Requirements for Personal Care Robots; International Organization for Standardization: Geneva, Switzerland, 2014. [Google Scholar]
  173. British Standards Institution. BS 8611:2016. In Robots and Robotic Devices: Guide to the Ethical Design and Application of Robots and Robotic Systems; British Standards Institution: London, UK, 2016. [Google Scholar]
  174. Spiekermann, S. IEEE P7000—The first global standard process for addressing ethical concerns in system design. Proceedings 2017, 1, 159. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Wearable assistive robots such as (A) prosthetics [5] (Licensed under (CC BY 4.0)), (B) orthotics [6] (Reproduced with permission from Elsevier) and (C) exoskeletons [8] (Licensed under (CC BY 4.0)), can help users to restore weak limb functions and substitute lost limbs.
Figure 2. Wearable assistive robots with rigid and semi-rigid structure. Lower limb assistive robots: (A) ReWalk [12] (Reproduced with permission from Elsevier), (B) HAL [28] (Licensed under (CC BY 2.0)), (C) REX [32] (Licensed under (CC BY 4.0)), (D) Vanderbilt exoskeleton [33] (Reproduced with permission from Elsevier). Wearable assistive hands: (E) HandeXos-Beta [34] (Reproduced with permission from Elsevier), (F) HexoSYS [35] (Reproduced with permission from Elsevier), (G) HES Hand [36] (Reproduced with permission from Elsevier).
Figure 3. Wearable assistive robots with soft structure. Lower limb assistive robots: (A) soft-inflatable knee exosuit for rehabilitation [42] (Licensed under (CC BY 4.0)), (B) soft exosuit for hip assistance [44] (Reproduced with permission from Elsevier), (C) multi-articular hip and knee exosuit [46] (Reproduced with permission from Elsevier). Upper limb assistive robots: (D) soft robotic glove for assistance at home [47] (Reproduced with permission from Elsevier), (E) soft wearable wrist for rehabilitation [48] (Licensed under (CC BY 4.0)), (F) soft robotic elbow sleeve for assistance [49] (Licensed under (CC BY 4.0)).
Figure 4. Wearable sensors. (A) Zigbee-based system to monitor physiological parameters [54] (Reproduced with permission from Elsevier). (B) Optoelectronic wrist sensor for health monitoring [58] (Licensed under (CC BY 4.0)). (C) Stretchable electrochemical sensor for glucose monitoring [69] (Reproduced with permission from ACS). (D) On-shoe wearable sensor for gait analysis in Parkinson’s Disease [64] (Licensed under (CC BY 4.0)).
Figure 5. Examples of wearable systems that use machine learning methods for the activity recognition needed by assistive robots. (A) Multimodal sensor fusion with Long Short-Term Memory networks for recognition of walking, sitting, lying, standing, driving and eating [103] (Licensed under (CC BY 4.0)). (B) Artificial Neural Networks for recognition of dynamic motion of human limbs [89] (Licensed under (CC BY 4.0)). (C) Gait detection using multimodal sensors with Hidden Markov Models and Artificial Neural Networks [82] (Reproduced with permission from Elsevier). (D) Detection of sit-to-stand and stand-to-sit using IMU sensors and Dynamic Bayesian Networks [81] (Reproduced with permission from Elsevier). (E) Prediction of elbow motion using multimodal sensors with Random Forests and Decision Trees [104] (Licensed under (CC BY 4.0)).
Table 1. Wearable robot technology for assistance to lower and upper limbs.
| Assistive System | Application | Material Structure | Degrees of Freedom (DoF) | Assisted Body Segments | Actuation Type | Weight |
|---|---|---|---|---|---|---|
| ReWalk [12] | Assistance to stand upright and walk | Rigid materials | 6 DoF | Hip flexion/extension; knee flexion/extension | Electric motors | 23 kg |
| HAL [28] | Human gait rehabilitation, strength augmentation | Rigid materials | 4 DoF | Hip flexion/extension; knee flexion/extension | Electric motors | 21 kg |
| REX [32] | Human locomotion in forward and backward directions, turning and climbing stairs | Rigid materials | 10 DoF | Hip flexion/extension; knee flexion/extension; posture support | Electric motors | 38 kg |
| Vanderbilt exoskeleton [33] | Assistance for walking, sitting, standing, and walking up and down stairs | Rigid materials | 4 DoF | Hip flexion/extension; knee flexion/extension | Electric motors | 12 kg |
| HandeXos-Beta [34] | Hand motion rehabilitation for multiple grip configurations | Rigid materials | 5 DoF | Index finger flexion/extension; thumb flexion/extension and circumduction | Electric motors | 0.42 kg |
| HexoSYS [35] | Hand motion rehabilitation | Rigid materials | 4 DoF | All fingers flexion/extension and abduction/adduction | Electric motors | 1 kg |
| HES Hand [36] | Hand motion rehabilitation to recover hand motor skills | Rigid materials | 5 DoF | All fingers flexion/extension | Electric motors | 1.5 kg |
| Soft-inflatable knee exosuit [42] | Gait training for stroke rehabilitation | Soft pneumatic materials | 1 DoF | Knee flexion | Pneumatic system, inflatable actuators | 0.16 kg |
| Soft hip exosuit [44] | Assistance for level-ground walking | Soft textile materials | 1 DoF | Hip extension | Electric motors, fabric bands | 0.17 kg |
| Multi-articular hip and knee exosuit [46] | Assistance for gait impairments in sit-to-stand and stair ascent | Soft materials and Bowden cables | 1 DoF | Hip and knee extension | Electric motors, Bowden cables | - |
| Soft robotic glove for assistance at home [47] | Assistance for hand rehabilitation in grasping movements | Soft elastomeric chambers | 3 DoF | All fingers flexion/extension | Pneumatic system | 0.5 kg |
| Soft wearable wrist [48] | Assistance for rehabilitation of wrist movements | Soft reverse pneumatic artificial muscles | 2 DoF | Wrist flexion/extension and abduction/adduction | Pneumatic system | - |
| Soft robotic elbow sleeve [49] | Assistance for rehabilitation of elbow movements | Elastomeric and fabric-based pneumatic actuators | 2 DoF | Elbow flexion/extension | Pneumatic system | - |
Martinez-Hernandez, U.; Metcalfe, B.; Assaf, T.; Jabban, L.; Male, J.; Zhang, D. Wearable Assistive Robotics: A Perspective on Current Challenges and Future Trends. Sensors 2021, 21, 6751. https://doi.org/10.3390/s21206751
