Review

Recent Progress in Sensing and Computing Techniques for Human Activity Recognition and Motion Analysis

School of Mechanical Engineering, Hebei University of Technology, Tianjin 300401, China
* Author to whom correspondence should be addressed.
Electronics 2020, 9(9), 1357; https://doi.org/10.3390/electronics9091357
Submission received: 28 July 2020 / Revised: 15 August 2020 / Accepted: 18 August 2020 / Published: 21 August 2020
(This article belongs to the Special Issue Recent Advances in Motion Analysis)

Abstract

The recent scientific and technical advances in Internet of Things (IoT)-based pervasive sensing and computing have created opportunities for the continuous monitoring of human activities for different purposes. The topic of human activity recognition (HAR) and motion analysis, owing to its potential in human–machine interaction (HMI), medical care, sports analysis, physical rehabilitation, assisted daily living (ADL), and children and elderly care, has recently gained increasing attention. The emergence of novel sensing devices featuring miniature size, light weight, and wireless data transmission, the availability of wireless communication infrastructure, the progress of machine learning and deep learning algorithms, and the spread of IoT applications have promised new opportunities for significant progress in this field. Motivated by the great demand for HAR-related applications and the lack of a timely report of the recent contributions to knowledge in this area, this investigation aims to provide a comprehensive survey and in-depth analysis of the recent advances in the diverse techniques and methods of human activity recognition and motion analysis. The focus of this investigation falls on the fundamental theories, the innovative applications with their underlying sensing techniques, data fusion and processing, and human activity classification methods. Based on the state-of-the-art, the technical challenges are identified, and perspectives on the future rich-sensing, intelligent IoT world are given in order to provide a reference for research and practice in the related fields.

1. Introduction

In future information systems featuring rich sensing, wireless inter-connection, and data analytics, usually denoted as the Internet of Things (IoT) and cyber-physical systems (CPS) [1,2], human motions and activities can be recognized to perform intelligent actions and recommendations. The monitoring, recognition, and in-depth analysis of human gesture, posture, gait, motion, and other daily activities have received growing attention in a number of application domains, and significant progress has been made, owing to the latest technology developments and application demands [3]. As a result, new sensing techniques and mathematical methods are dedicated to the numerical analysis of the posture and activities of human body parts, and plenty of related studies can be found in the literature.
The significant progress in human activity recognition (HAR) and motion analysis may be driven by the following forces: (1) new, powerful information systems require more efficient human–machine interactions for potential intelligence; (2) the development of integrated circuits (ICs) and circuit fabrication techniques has prompted the emergence of low-power, low-cost, miniature-sized, and flexible wearable sensing devices; (3) there have been recent advances in data analytics techniques, including filtering and machine learning techniques for data processing and classification; and (4) new emerging application fields, including wearable sensor-based medical care, sports analysis, physical rehabilitation, assisted daily living (ADL), etc., have provided new opportunities for these techniques to revolutionize the traditional approaches in those fields. As a result, much research effort has been devoted to this area, and a number of technical solutions have been proposed.
There have been plenty of innovative studies introducing novel sensing techniques to human activity recognition and motion analysis. Human–machine interaction (HMI) may be a critical area, where human gesture, posture, or motion can be recognized for efficient interaction with machines, covering a wide field of applications including human–object interaction, virtual reality, immersive entertainment, etc. [4,5,6]. Human activity recognition and motion analysis has also been an effective means of sports performance evaluation, with peer investigations identified for golf swing analysis [7], swimming velocity estimation [8], and sports training [9]. Medical care has become a new research interest in recent years, where a precise and quantitative representation of human motion can help physicians in diagnosis, treatment planning, and progress evaluation; typical applications include gait analysis for stroke rehabilitation therapy [10], clinical finger movement analysis [11], Parkinson’s disease treatment [12], etc. In addition, many other innovative applications are found in assisted daily living, elderly and children care, etc. Through these applications, the innovative usage of the mentioned techniques has revolutionized the traditional approaches and resulted in convenience, efficiency, and intelligence that have never been seen before.
Many different techniques are employed to obtain the raw sensor data for monitoring human activities. The most commonly used are smartphone built-in inertial measurement units (IMUs) and optical cameras [13,14]. Since smartphones have almost become a must-have assistant in people’s everyday lives, many human body and human activity related studies are carried out by taking advantage of them. The optical camera, as a widely used sensing device, is a mainstream solution for human activity recognition, and the depth camera, with its added dimension of depth information, has unique strengths compared to normal optical cameras [15]. Electromagnetic-wave and related techniques, such as frequency modulated continuous wave (FMCW) radar [16], impulse radio ultra-wide band (IR-UWB) radar [17], WiFi [18], and capacitive sensor arrays [19], featuring non-contact and non-line-of-sight (NLOS) operation, have also been introduced to human activity recognition. For these sensing techniques, accuracy, safety, privacy protection, and convenience are key factors for their application. Low-cost, lightweight, and miniature-sized wearable IMU sensor devices have been a prevalent technique employed by peer investigations for human activity recognition, and the integration of two or more techniques may result in better performance [20]. In addition, discrete sensing devices may be combined to constitute a wireless body area network (WBAN) for the establishment of wearable computing in many high-performance systems [21].
In addition to sensing and networking techniques, data processing techniques, including filtering, segmentation, feature extraction, and classification, are also indispensable enablers. For the pre-processing of sensing data, the moving average filter (MAF), Kalman filter (KF), and complementary filter (CF) are the common approaches [22,23,24]. For classification, different machine learning algorithms have been used to create recognition models, such as the decision tree (DT), Bayesian network (BN), principal component analysis (PCA), support vector machine (SVM), artificial neural networks (ANNs), logistic regression, hidden Markov model (HMM), K-nearest neighbors (KNN), and deep neural networks (DNNs) [25,26,27,28]. Deep learning methods, such as the convolutional neural network (CNN) and recurrent neural network (RNN), have also recently gained interest as tools, owing to their performance and wide acceptance in data analysis [29,30,31]. The data processing techniques play an important role in guaranteeing the efficiency and accuracy of the recognition and analysis of human activities.
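As a minimal, hedged illustration of the pre-processing step, the sketch below implements two of the filters named above: a moving average filter for smoothing, and a first-order complementary filter that fuses accelerometer-derived angles with integrated gyroscope rates. The sampling interval and the blending coefficient alpha are illustrative assumptions, not values taken from the surveyed studies.

```python
import numpy as np

def moving_average(signal, window=5):
    """Smooth a 1-D signal with a moving average filter (MAF)."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def complementary_filter(acc_angle, gyro_rate, dt=0.01, alpha=0.98):
    """Fuse accelerometer angles (noisy but drift-free) with gyroscope
    rates (smooth, but drifting once integrated). alpha = 0.98 is a
    common illustrative weight, not a value from the surveyed papers."""
    angle = np.empty_like(acc_angle, dtype=float)
    angle[0] = acc_angle[0]
    for k in range(1, len(acc_angle)):
        predicted = angle[k - 1] + gyro_rate[k] * dt  # gyro integration
        angle[k] = alpha * predicted + (1.0 - alpha) * acc_angle[k]
    return angle
```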
Admittedly, due to the potential in the different application fields, there has been a great demand for accurate human activity recognition techniques, which can lead to more convenient, efficient, and intelligent information systems, new opportunities for medical treatment, more convenient daily living assistance, etc. However, there is a lack of timely reporting on the recent technical advances, with an in-depth analysis of the underlying technical challenges and future perspectives. Motivated by both the great demand for more efficient interaction between humans and information systems and the lack of investigations into the new contributions to knowledge in the field, this paper aims to provide a comprehensive survey and in-depth analysis of the recent advances in the diverse techniques and methods for human activity recognition and motion analysis.
The rest of this paper is organized as follows: Section 2 summarizes the innovative applications regarding human activity recognition and motion analysis; Section 3 illustrates the fundamentals, including the common methodology, the modeling of human body parts, and identifiable human activities. Section 4 and Section 5 present the novel sensing techniques and mathematical methods, respectively. Then, Section 6 gives the underlying technical challenges and future perspectives, followed by Section 7, which concludes the work.

2. Latest Progress in HAR-Related Applications

The past few decades have witnessed unprecedented prosperity of electronics techniques and information systems, which have resulted in a revolutionary development in almost all aspects of technological domains, including aeronautics and astronautics, automotive industry, manufacturing and logistics, consumer electronics and entertainment, etc. Human activity recognition and motion analysis, due to its potential in wide areas of applications, has attracted much research interest and made remarkable progress in recent years. This section gives an overview of the latest technical progress and a summary of the application domains of human activity recognition and motion analysis.

2.1. Overview of the Latest Technical Progress

An overview of the latest technical progress in human activity recognition is given in Figure 1. The technical progress of HAR is found to focus on the following aspects: new sensing devices and methods, innovative mathematical methods, novel networking and computing paradigms, emerging consumer electronics, and convergence with different subject areas.
  • New sensing devices and methods: The acquisition of raw data is the first step toward accurate and effective activity recognition. The cost reduction of electronic devices has accelerated pervasive sensing and computing systems, such as location and velocity tracking [32]. In addition to traditional video cameras (including depth cameras), many techniques that were previously impractical due to their cost, size, or technical readiness, such as FMCW radar, CW-Doppler radar, WiFi, ultrasound, radio frequency identification (RFID), and wearable IMU and electromyography (EMG) sensors, are now introduced for human activity related studies [15,16,17,18]. Among these candidates, FMCW radar, CW-Doppler radar, WiFi, ultrasound, and RFID are both NLOS and contactless. The wearable IMU is NLOS and body-worn, so its applications are not limited to specific areas. Micro-electro-mechanical system (MEMS) IMUs, with their advantages of low power, low cost, and miniature size, as well as rich sensing output, have become a dominant technical approach in HAR studies [33,34].
  • Innovative mathematical methods: To take advantage of the sensor data, appropriate modeling of the activities of human body parts, pre-processing of raw data, feature extraction, classification, and application-oriented analysis are pivotal enablers for the success of HAR-related functions. With respect to pre-processing, a number of lightweight, time-domain filtering algorithms appropriate for resource-limited computing, including the KF, extended Kalman filter (EKF), unscented Kalman filter (UKF), and Mahony complementary filter, are common alternatives for wearable electronics [35,36]. In terms of feature extraction, time-domain and frequency-domain features, including the mean, deviation, zero crossing rate, and statistical characteristics of amplitude and frequency, are the most fundamental parameters [37]. The fast Fourier transform (FFT), as a classical algorithm, is the main solution for frequency-domain feature analysis. With respect to classification, much effort in classical machine learning, including DT, BN, PCA, SVM, KNN, and ANN, and in deep learning methods, including CNN and RNN, has been devoted to this particular area in recent years [25,38].
  • Novel networking and computing paradigms: Sensing and computing platforms are crucial factors for effective human activity recognition. For vision-based solutions, high-performance graphics processing units (GPUs) have paved the way for the intensive computations in HAR [39]. For wearable sensing solutions, many new pervasive and mobile networking and computing techniques for battery-powered electronics have been investigated. Various wireless networking and communication protocols, including Bluetooth, Bluetooth low energy (BLE), Zigbee, and WiFi, have been introduced for building a body area network (BAN) for HAR [40]. New computation paradigms, such as mobile computing and wearable computing, have been proposed to handle location-free and resource-limited computing scenarios, where the computation is sometimes conducted on an 8-bit or 16-bit micro-controller unit (MCU) [41]. The novel networking and computing paradigms customized for HAR are more efficient and flexible for the related investigations and practices.
  • Emerging consumer electronics for HAR: Further evidence of the progress in HAR is the emergence of HAR consumer electronics, including the Kinect, Fibaro, Mi Band, and Fitbit. The Kinect-based somatosensory game is a typical use case of recognizing human motion as an input for human–computer interaction [42]. Fibaro detects human motion and changes in location for smart home applications [43]. Some wearable consumer electronics, such as the Mi Band and Fitbit, can provide users with health and motion parameters, including heartbeats, exercise intensity, walking or running steps, sleep quality evaluation, etc. [44,45]. In addition, there are electronic devices such as the MAX25205 (Maxim, San Jose, CA, USA), used for gesture sensing in automotive applications with an IR-based sensor, where hand swipe gestures and finger and hand rotation can be recognized [46]. These devices have stepped into people’s daily lives to help people better understand their health and physical activities, or to perform automatic control and intelligent recommendation via HAR.
  • Convergence with different subject areas: HAR techniques are also converging with many other subject areas, continually allowing new applications to be created. Typically, HAR is merging with medical care, which has resulted in medical treatment and physical rehabilitation methods for diseases such as stroke and Parkinson’s disease [10,12]. HAR has also been introduced for sports analysis, with the purpose of enhancing athletes’ performance [7,8]. HAR-assisted daily living is another field of application, where power consumption, home appliance control, and intelligent recommendations can be implemented to customize the living environment to people’s preferences [2,47].

2.2. Widespread Application Domains

Driven by the substantial technical progress in the related fields, HAR has been extended to a wide spectrum of application domains. Since human involvement is the most critical part of many information systems, the introduction of HAR can potentially result in greater efficiency and intelligence in the interaction between humans and information systems, which also creates new opportunities for human body or human activity related studies and applications. The fields of application are summarized as follows:
  • Human–machine interaction (HMI)—HAR is used to recognize the gesture, posture, or motion of human body parts and to use the results as inputs for information systems, in order to perform efficient interactions or provide people with intelligent recommendations [48,49].
  • Medical care/physical rehabilitation—HAR is used to record the motion trajectories of human body parts and conduct numerical analysis, the results of which can serve as a reference for medical treatment [12,50].
  • Sports analysis—HAR is used to record information such as acceleration, angular velocity, velocity, etc., for the numerical analysis of human body part motions, in order to evaluate sports performance and suggest improvement strategies [7,51].
  • Assisted daily living—HAR is used to recognize people’s gestures and motions for home appliance control, or to analyze people’s activities and habits for living environment customization and optimization.
  • Children and elderly care—HAR is used to analyze children’s or elderly people’s activities, such as fall detection and disease recognition, and to provide assistance or send warnings to caregivers [52,53].
  • Human intent analysis—HAR is used to analyze people’s motions and activities in order to predict people’s intents in a passive way for particular applications, such as intelligent recommendation, crime detection in public areas, etc. [54].
Although many new techniques and methods have been introduced into HAR for different applications, and remarkable progress has been made, there is still considerable room for improving the performance of the methods and systems. More convenient, lightweight, and computationally powerful sensing devices and systems, more accurate classification algorithms, and application-oriented studies will continue to gain attention and play an ever more important role in various information systems and in people’s daily lives.

3. Fundamentals

In order to give a comprehensive understanding of the HAR techniques and methods, this section presents the fundamentals of HAR regarding the sensing devices and data processing techniques. Since many different sensing devices, pre-processing techniques, and classification techniques are involved, the abstract, common methodology covering the different techniques and methods is illustrated, and the correspondence between the identifiable human activities and the innovative applications is reported as well.

3.1. Common Methodology

In view of the sensing and processing techniques for HAR, the methods can be described with a common methodology as shown in Figure 2, which may be applicable for both optical tracking and wearable sensing techniques [55,56]. The common methodology consists of four steps: data acquisition and pre-processing, segmentation, feature extraction, and classification [57].
  • Data acquisition and pre-processing—Motion sensors such as 3D cameras, IMU, IR-radar, CW radar, and ultrasound arrays, either mounted onto a fixed location or body-worn, are used to obtain raw data including human activity information, such as position, acceleration, velocity, angular velocity, and angles. In addition, various noise cancellation techniques, either time-domain or frequency-domain ones, need to be applied to optimize the quality of the signal for further processing.
  • Segmentation—Human motion normally occurs within particular time spans, either occasionally or periodically. The signals need to be split into window segments for activity analysis, and the sliding window technique is usually used to handle activities that span segment boundaries (a minimal sketch of this step is given after this list).
  • Feature extraction—The features of activities including joint angle, acceleration, velocity and angular velocity, relative angle and position, etc., can be directly obtained for further analysis. There may also be indirect features that can be used for the analysis of particular activities.
  • Classification/model application—The extracted features can be used by various machine learning and deep learning algorithms for the classification and recognition of human activities. Some explicit models, such as skeleton segment kinematic models, can be applied directly for decision-making.
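To make the segmentation and feature extraction steps concrete, the following sketch slices a pre-processed acceleration stream into overlapping windows and computes a few of the time-domain features discussed in this survey. It is a minimal illustration only: the 128-sample window, 50% overlap, and the synthetic input are assumptions, not parameters from any surveyed study.

```python
import numpy as np

def sliding_windows(signal, width=128, overlap=0.5):
    """Yield fixed-width segments of a 1-D signal with fractional overlap."""
    step = int(width * (1.0 - overlap))
    for start in range(0, len(signal) - width + 1, step):
        yield signal[start:start + width]

def time_domain_features(window):
    """Basic statistical features commonly used in HAR pipelines."""
    zero_crossings = np.sum(np.diff(np.sign(window)) != 0)
    return {
        "mean": float(np.mean(window)),
        "std": float(np.std(window)),
        "zcr": zero_crossings / len(window),   # zero crossing rate
        "energy": float(np.sum(window ** 2) / len(window)),
    }

acc = np.random.randn(1024)  # synthetic stand-in for one acceleration axis
features = [time_domain_features(w) for w in sliding_windows(acc)]
```

Each per-window feature set would then be passed to the classification step, either to a simple threshold rule or, after vectorization, to a machine learning model.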

3.2. Modeling of Human Body Parts for HAR

Since the human body is a systematic whole, a primary step in HAR is to build mechanical and kinematic models of the motion of human body parts. Human skeleton joints are the key features for optical tracking based HAR methods, where markers are sometimes applied for human gesture and posture recognition as well as motion tracking [53,58]. Figure 3 shows a skeleton-based multi-section model of the hand and human body. For wearable IMU-based HAR, the multi-segment model of the human skeleton is also used as a basis to build a kinematic model of body segments. For example, the kinematic model of an elbow joint can be established with two segments connected by two revolute joints, allowing two degrees of freedom (DoF), as shown in Figure 4 [59]; the DoF kinematic model of the left leg, including a three-DoF ankle joint, a one-DoF knee joint, and a three-DoF hip joint, is presented in [60], and the kinematics from the ankle to the hip and trunk is discussed in [61]. With the multi-segment model and kinematic model, the posture and motion of human body parts and the whole body can be estimated from the angle and position data obtained with sensor devices.
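As a brief illustration of how such multi-segment models are used, the sketch below computes the elbow and wrist positions of a two-segment arm from shoulder and elbow flexion angles in a single plane. The planar simplification and the segment lengths are illustrative assumptions and do not reproduce the full models of [59,60,61].

```python
import numpy as np

def planar_arm_kinematics(shoulder_angle, elbow_angle,
                          upper_arm=0.30, forearm=0.25):
    """Forward kinematics of a planar two-segment arm (angles in radians).

    Returns elbow and wrist positions relative to the shoulder joint;
    the segment lengths (in meters) are illustrative assumptions.
    """
    elbow = np.array([upper_arm * np.cos(shoulder_angle),
                      upper_arm * np.sin(shoulder_angle)])
    total = shoulder_angle + elbow_angle  # orientation of the forearm
    wrist = elbow + np.array([forearm * np.cos(total),
                              forearm * np.sin(total)])
    return elbow, wrist
```

Driving such a model with joint angles estimated from IMU or optical data yields the segment trajectories used for posture and motion analysis.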

3.3. Identifiable Human Activities

Since human body parts are not rigid structures but deform and move in different ways following complex kinematic patterns, the activity parameters of these parts are very hard to measure. Many new technical solutions have been customized for human activity measurement, and remarkable progress has been made. The identifiable human activities are summarized in Table 1.
As shown in Table 1, many human body parts and the human body as a whole can be used for activity recognition for different purposes, including fingers, hand, arm, elbow, knee, ankle, and the whole body. Of course, the identifiable human activities and the corresponding applications are not limited to those listed in the table. Rapid technical progress has resulted in the continual addition of new possibilities and the creation of new applications in this particular field.

4. Novel Sensing Techniques

Appropriate measurement techniques and quality sensor data are fundamental to effective HAR and its further applications. In this section, the sensing techniques for HAR are summarized in a taxonomy and compared using the proposed indexes. The sensor network solutions for wireless data collection and multi-sensor nodes for whole-body monitoring are also discussed.

4.1. Taxonomy and Evaluation

According to the principles of the sensor devices, the sensing techniques can be divided into six categories: optical tracking, radio frequency techniques, acoustic techniques, inertial measurement, force sensors, and EMG sensors. Each of these categories may include several different sensing techniques that can exhibit different performance in HAR applications. The feasibility of sensing techniques for HAR can be evaluated with five indicators: convenience of use, application scenarios, richness of information, quality of raw data, and cost. The discussion and analysis of the sensing techniques are conducted with the abovementioned categories and evaluation indicators, as shown in Figure 5.

4.2. Sensing Techniques with Example Applications

To give a comprehensive overview and in-depth discussion of the sensing techniques for HAR and related studies, the five groups of sensing techniques and their corresponding typical applications are summarized in Table 2.
  • Optical tracking—Optical tracking with one or a few common cameras is the most traditional way of recognizing human activity. The temporal and spatial features of the human body and its activities can be extracted with image processing algorithms, and classification based on these features can be carried out with machine learning or deep learning techniques. This method is competitive in accuracy and covers wide application domains, including entertainment, industrial surveillance, public security, etc. With the progress in high-speed cameras and depth sensors in recent years, more innovative investigations and applications are found in both academic studies and practice. The strengths of optical tracking methods are high accuracy, contactless monitoring, and rich activity information, while the weaknesses are the fixed location of use and the potential risk of privacy leakage.
  • RF and acoustic techniques—The sensing techniques working under the principle of radar include IR-UWB, FMCW radar, WiFi, and ultrasound. The motion of human body parts or the whole body can be perceived using the Doppler frequency shift (DFS), and time–frequency signatures can be obtained for further analysis. Machine learning and deep learning techniques are widely used for the classification of human activities. Evidently, the strengths of these methods are contactless measurement, NLOS operation, and no risk of privacy leakage. Specifically, commodity WiFi takes advantage of the existing communication infrastructure without added cost. Their weakness is the monotony and implicitness of the information provided by the sensors, which requires specialized processing.
  • Inertial sensors—Inertial sensors, especially MEMS IMUs, have become a dominant technical approach for human activity recognition and analysis. They provide 3-axis acceleration, 3-axis angular velocity, and 3-axis magnetometer signals, which can be employed to estimate the attitude and motion of the human body parts they are mounted on. The strengths of IMUs in HAR are their miniature size, low cost, low power, and rich information output, which make them competitive for wearable applications without location constraints. Their weakness is contact measurement, which may be inconvenient for the recognition of daily activities.
  • Force sensors and EMG sensors—Force sensors include piezoelectric sensors, force-sensitive resistors (FSRs), and thin-film pressure sensors of different materials. They may provide pressure or multi-axis force information on human gait, hand grasp, etc., for sports analysis, physical rehabilitation, or bio-feedback. EMG sensors work in a similar way and implement similar functions using the EMG signal. Since EMG reflects muscle activity, it may provide more useful information for medical care, such as rehabilitation and amputee care. The strength of force sensors and EMG sensors is their capability to obtain useful information from local areas of body parts. Their weakness is also contact measurement, which may be inconvenient for the recognition of daily activities.
  • Multiple sensor fusion—In addition to the above, there are also multi-sensor fusion based techniques that combine more than one of the above alternatives. For instance, inertial and magnetic sensors are combined for body segment motion observation [24]; an optical sensor, IMU, and force sensor are combined for pose estimation in human bicycle riding [66]; a knee electro-goniometer, foot-switches, and EMG are combined for gait phase and event recognition [95]; and FMCW radar and wearable IMUs are combined for fall detection [96]. The purpose of the combination is to compensate for the limitations of one sensor technology by taking advantage of another, so as to pursue maximal performance. For example, IMU and UWB radar are usually combined with real-time data fusion to overcome the continuous error accumulation of the IMU and the NLOS shadowing and random noise of UWB [23]; a minimal sketch of this idea is given below.
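The sketch below is a minimal, one-dimensional illustration of the IMU/UWB fusion idea from the last bullet: a Kalman-style filter whose prediction integrates the (drifting) IMU velocity and whose update blends in the (noisy but drift-free) UWB position. The noise variances and time step are illustrative assumptions, not values from [23].

```python
import numpy as np

def fuse_imu_uwb(imu_velocity, uwb_position, dt=0.1,
                 process_var=0.05, measurement_var=0.2):
    """1-D Kalman fusion of IMU dead reckoning and UWB ranging."""
    x, p = uwb_position[0], 1.0          # state estimate and its variance
    estimates = [x]
    for v, z in zip(imu_velocity[1:], uwb_position[1:]):
        x, p = x + v * dt, p + process_var     # predict from IMU velocity
        k = p / (p + measurement_var)          # Kalman gain
        x, p = x + k * (z - x), (1.0 - k) * p  # correct with UWB position
        estimates.append(x)
    return np.array(estimates)
```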
The different sensing techniques thus behave differently with respect to the evaluation indexes. In summary, regarding convenience of use, the optical tracking and radar techniques are non-contact, but they are normally installed at particular locations. Inertial sensors are wearable and provide rich, useful information for human activity analysis, but contact measurement is involved, which may inconvenience users’ daily activities. As contact measurement methods, force sensors and EMG can reveal information from local areas of body parts, which is competitive in medical care.

4.3. Body Area Sensor Networks

For wearable sensing techniques, it is important to carry out data collection without interrupting people’s normal activities. Therefore, power cords and data wires should be replaced with miniature batteries and wireless communication nodes. Commonly used wireless communication techniques include Bluetooth, WiFi, BLE, Zigbee, 4G/5G, NB-IoT, and LoRa [97]. Since 4G/5G, NB-IoT, and LoRa are commonly used for long-range data transmission, Bluetooth, WiFi, BLE, and Zigbee are more likely to be chosen for short-range wearable communications. A comparison of the wireless techniques is given in Table 3.
Since wearable sensing devices are powered by batteries, energy consumption, data rate, and network topology are the key parameters to be considered when establishing the systems. Normally, Bluetooth is used for data collection from single sensor devices: for example, Bluetooth is employed to transmit IMU data for sports analysis [7], force data from piezoelectric sensors for gait recognition [89], and IMU data for the assessment of elbow spasticity [98]. BLE and ZigBee are competitive for building a BAN for data collection in multi-node systems; ZigBee is used for multiple-IMU data collection in [66], while a BLE-based wireless BAN (WBAN) was used for joint angle estimation in [99].

5. Mathematical Methods

Once the sensor data are obtained, how to make use of them is also a challenging task. As presented in the common methodology given in Figure 2, the key steps include pre-processing, segmentation, feature extraction, and classification/model application. In this section, we focus on the mathematical methods for two key steps: feature extraction and classification.

5.1. Features and Feature Extraction

The feature extraction methods can be categorized into four types according to their outputs: heuristic, time-domain, time–frequency, and frequency-domain features [85,100], as shown in Table 4. Heuristic features are those derived from an intuitive understanding of how activities produce signal changes. Time-domain features are typically statistical measures obtained from a windowed signal that are not directly related to specific aspects of individual movements or postures. Time–frequency features, such as wavelet coefficients, are competitive for detecting the transitions between human activities [65]. Finally, frequency-domain features are usually the preferred option for human activity recognition, with mathematical tools such as the FFT, short-time Fourier transform (STFT), and discrete cosine transform (DCT) being the commonly used methods.
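As a short illustration of the frequency-domain option, the sketch below uses the FFT to extract the dominant frequency and spectral energy of one window; the 50 Hz sampling rate is an illustrative assumption.

```python
import numpy as np

def frequency_features(window, fs=50.0):
    """Dominant frequency (Hz) and spectral energy of a windowed signal."""
    spectrum = np.abs(np.fft.rfft(window - np.mean(window)))  # drop DC
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    return {
        "dominant_freq_hz": float(freqs[np.argmax(spectrum)]),
        "spectral_energy": float(np.sum(spectrum ** 2) / len(window)),
    }
```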

5.2. Classification and Decision-Making

For most human activity recognition studies, the classification that discriminates the different types of activities using the extracted features is a critical step. Due to the complexity of human gestures, postures, and daily activities, how to discriminate them with acceptable accuracy using different mathematical models has become a research focus. The classification methods fall into the following categories:
  • Threshold-based methods—Thresholds can easily be used to detect many simple gestures, postures, and motions. For example, fall detection, hand shaking, and static standing can be recognized using a threshold on the acceleration magnitude $a_{TH} = \sqrt{a_x^2 + a_y^2 + a_z^2}$, and walking or running can be recognized using Doppler frequency shift thresholds (a minimal sketch of this approach is given after this list).
  • Machine learning techniques—Machine learning is a method of data analysis that automates analytical model building, which is suitable for the classification of human activities using sensor data. Its feasibility and efficiency have been demonstrated by many published studies reporting accuracies over 95%. Typically, an HMM achieves an accuracy of 95.71% for gesture recognition [101]; SVM and KNN are employed for human gait classification between walking and running with an accuracy over 99% [102]; PCA and SVM recognize three different actions for sports training with an accuracy of 97% [9]; and KNN achieves an accuracy of 95% for dynamic activity classification [65]. There are also peer investigations providing evaluations of different machine learning methods [9,88,103].
  • Deep learning methods—Deep learning techniques take advantage of many layers of non-linear information processing for supervised or unsupervised feature extraction and transformation, as well as for pattern analysis and classification. They present advantages over traditional approaches and have gained continual research interest in many different fields, including HAR, in recent years. For example, a CNN achieves an accuracy of 97% for the recognition of nine different activities in [31], 95.87% for arm movement classification [64], 99.7% for sEMG-based gesture recognition in [93], and 95% for RFID-based in-car activity recognition in [81]. These investigations have demonstrated that deep learning is an effective classification approach for human activity recognition. There are also investigations applying long short-term memory (LSTM) [104] or bidirectional LSTM (Bi-LSTM) [76,96] recurrent neural networks to sports and motion recognition, which have exhibited competitive performance compared to CNNs.
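A minimal sketch of the threshold-based approach from the first bullet is given below: it flags candidate falls where the acceleration magnitude exceeds a threshold. The 2.5 g threshold is an illustrative assumption that would need tuning on real data.

```python
import numpy as np

def acceleration_magnitude(ax, ay, az):
    """Per-sample magnitude a_TH = sqrt(ax^2 + ay^2 + az^2)."""
    return np.sqrt(ax ** 2 + ay ** 2 + az ** 2)

def detect_fall_candidates(ax, ay, az, threshold_g=2.5):
    """Return the sample indices where the magnitude exceeds the threshold."""
    return np.nonzero(acceleration_magnitude(ax, ay, az) > threshold_g)[0]
```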
Due to the superior performance of machine learning and deep learning techniques, many tool libraries for different development environments are available, such as the MATLAB machine learning toolbox, pandas, and scikit-learn [105], which makes the implementation of the classification quite convenient. The different machine learning techniques will continue to be the dominant tools for classification in various sensor data based human activity recognition systems.
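As an example of how such libraries keep the classification step short, the following sketch trains an SVM on windowed feature vectors with scikit-learn. The synthetic data are a stand-in for real extracted features, so the printed accuracy is meaningless; the snippet only shows the typical train/evaluate workflow.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 8))     # 400 windows x 8 features (synthetic)
y = rng.integers(0, 4, size=400)  # 4 activity classes (synthetic)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```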

6. Discussions

Driven by the related sensing, communication, and data processing techniques, HAR has undergone rapid progress and has been extended to widespread fields of application. This section summarizes the state-of-the-art, the underlying technical challenges, and the future trends.

6.1. Performance of the State-of-the-Art

Based on the above investigation, how to select the appropriate sensing techniques and mathematical methods for HAR applications is a critical issue. It is found that the technical solutions are highly dependent on the particular HAR task. For accurate indoor human activity recognition, optical depth sensors with PCA, HMM, SVM, KNN, or CNN are commonly seen to obtain an accuracy over 95%, and the combination of UWB and IMU can also find applications with an accuracy over 90%. For wearable outdoor applications in consumer electronics, sports analysis, and physical rehabilitation, a single IMU is a competitive choice, normally using a Kalman filter or a complementary filter for noise cancellation and an SVM or CNN for decision-making; the accuracy for walking and running step counting and other motion recognition is normally over 90%. For medical care, physical rehabilitation, or bio-feedback, thin-film force sensors and EMG sensors are commonly used to obtain information on local areas of body parts, where deep learning is the most prevalent choice and may result in an accuracy of about 90%. According to the analysis in Section 4 and Section 5, each sensing technique and mathematical method has its own pros and cons and suitable application scenarios. The combination of two or more sensing techniques or mathematical methods may overcome the limitations of one technique by taking advantage of the others.

6.2. Technical Challenges

According to the survey of the state-of-the-art, the techniques for HAR are far from technically ready for the various applications in the different fields. The technical challenges for human activity recognition and motion analysis techniques are summarized as follows:
  • Limitation of sensing techniques for convenient use—The sensing techniques, including optical tracking, radar, inertial measurement, force, and EMG sensors, all have their limitations in performing accurate, non-contact, location-free human activity sensing. Optical sensors are competitive in accuracy, but they are LOS and limited to specific areas. Radar techniques, such as FMCW, IR-UWB, WiFi, and ultrasound, are NLOS, but they also work in specific areas and are less accurate. IMUs, force sensors, and EMG are low-cost and low-power, making them suitable for wearable applications, but the battery and wireless modules make the devices inconveniently large for daily use. Sensing techniques that do not interrupt people’s normal daily activities are still lacking.
  • Dependency on PCs for data processing—For body-worn sensing devices, data acquisition and pre-processing are usually completed on low-power MCUs, while further processing such as feature extraction and classification is conducted on PCs. This results in the requirement of a high transmission data rate and the involvement of a powerful PC. The high data rate is also a challenge for the deployment of multiple sensor nodes to establish a BAN. This traditional division of communication and computation has constrained the application of body-worn HAR.
  • Difficulties in minor gesture recognition—Most of the existing techniques target the recognition of regular and large-scale human gestures, postures, and movements, such as hand gestures, arm swings, gait, and whole-body postures, with classification techniques. It is still a challenge to quantitatively discriminate minor gesture variations, which may be potentially valuable for many applications.
  • Specificity in human activity recognition and motion analysis—Most of the state-of-the-art focuses on the postures or activities of a particular body part, or on a particular posture or activity of the whole body. The focus usually falls on specific sensing devices and classification techniques, while comprehensive posture and activity recognition, and a deeper understanding of the kinematics of the human body and of people’s behaviors based on in-depth analysis of their daily activities, are still lacking.

6.3. Future Perspective

To overcome the technical challenges mentioned above, a number of future investigations are expected to focus on the following aspects:
  • Unobtrusive human activity monitoring techniques—Sensing techniques that can perform unobtrusive perception of human activities, with minimal or no interference to people’s normal activities, and that can be used anywhere, need greater study. New investigations into minimizing the size of sensing devices and into new ways of powering them may become new research interests.
  • Optimization of sensing networks and computation power—It is critical to explore the optimal BAN of sensing devices for HAR, and to distribute the computation between sensor nodes and the central processor so as to relieve the data transmission burden on the sensor network. A new paradigm of wearable computing for wearable BANs may become a topic attracting much effort.
  • Further studies on minor gesture and posture recognition—Further investigation of the sensing and feature extraction techniques for minor gesture recognition may find applications in different fields, such as HMI. For example, UWB Doppler feature based human finger motion recognition for the operation of handheld electronics may be one potential use case.
  • Comprehensive recognition and understanding of human activities—Based on the recognition of human gestures and movements, the positional and angular kinematics of the human body and human behaviors can be investigated further. Research outputs may provide a valuable reference for studies of bionic robotics, for customized human intent estimation, and for smart recommendations in smart environments.
The above technical challenges may draw attention to future investigations in this field, and the related studies on unobtrusive sensing, optimized sensing networks and computing systems, minor gesture and posture recognition, and further understanding of human body kinematics and behavior are promising research topics that may add new features to HAR and create new opportunities for research and applications. Since HMI is a key attribute of the usability of information systems, HAR, as a promising solution for efficient and convenient interaction between humans and information systems, will play an important role in the future rich-sensing IoT world.

7. Conclusions

This article presented a comprehensive overview of the recent progress in the sensing techniques that have been introduced to human activity recognition and motion analysis. Remarkable technical progress in new sensing devices and methods, innovative mathematical models for feature extraction and classification, novel networking and computing paradigms, and convergence with different subject areas has been identified, which has extended HAR to widespread application fields. To provide a comprehensive understanding of the fundamentals, the skeleton-based multi-segment models and kinematic modeling of human body parts were first presented. Then, the sensing techniques were summarized and classified into six categories: optical tracking, RF sensors, acoustic sensors, inertial sensors, force sensors, and EMG sensors, followed by in-depth discussions of their pros and cons against the proposed evaluation indexes. Beyond the sensing devices, the mathematical methods, including feature extraction and classification techniques, were discussed as well. According to the state-of-the-art HAR techniques, the technical challenges were found to include the limitation of sensing techniques for convenient use, dependency on PCs for data processing, difficulties in minor gesture recognition, and specificity in human activity recognition and motion analysis. The solutions to these challenges are considered the development trends of future studies. Since human activity recognition and motion analysis is a promising way to achieve efficient interaction between humans and information systems, it may play a more important role in future IoT-enabled intelligent information systems.

Author Contributions

Conceptualization, Z.M. and M.Z.; methodology, M.Z. and H.Z.; formal analysis, Z.M.; investigation, M.Z., C.G., and Q.F.; resources, Z.M.; writing—original draft preparation, M.Z., C.G., and Q.F.; writing—review and editing, Z.M. and H.Z.; visualization, supervision, N.G. and Z.Z.; project administration, Z.M.; funding acquisition, Z.M. All authors have read and agreed to the published version of the manuscript.

Funding

The work presented in this paper is supported by the National Natural Science Foundation of China (NSFC) (51805143), Natural Science Foundation of Hebei province (E2019202131), and the Department of Human Resources and Social Security of Hebei Province (E2019050014 and C20190324).

Acknowledgments

The authors would like to thank the Key Laboratory of Nondestructive Detection and Monitoring Technology for High Speed Transportation Facilities, Ministry of Industry and Information Technology, for its support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Dian, F.J.; Vahidnia, R.; Rahmati, A. Wearables and the Internet of Things (IoT), Applications, Opportunities, and Challenges: A Survey. IEEE Access 2020, 8, 69200–69211. [Google Scholar] [CrossRef]
  2. Aksanli, B.; Rosing, T.S. Human Behavior Aware Energy Management in Residential Cyber-Physical Systems. IEEE Trans. Emerg. Top. Comput. 2017, 845–857. [Google Scholar] [CrossRef]
  3. Chen, L.; Hoey, J.; Nugent, C.D.; Cook, D.J.; Yu, Z. Sensor-based Activity Recognition. IEEE Trans. Syst. ManCybern. Part C Appl. Rev. 2012, 42, 790–808. [Google Scholar] [CrossRef]
  4. Ziaeefard, M.; Bergevin, R. Semantic Human Activity Recognition: A Literature Review. Pattern Recognit. 2015, 48, 2329–2345. [Google Scholar] [CrossRef]
  5. Zhang, F. Human–Computer Interactive Gesture Feature Capture and Recognition in Virtual Reality. Ergon. Des. Q. Hum. Factors Appl. 2020. [Google Scholar] [CrossRef]
  6. Vrigkas, M.; Nikou, C.; Kakadiaris, I.A. A review of Human Activity Recognition Methods. Front. Robot. AI 2015, 2, 28. [Google Scholar] [CrossRef]
  7. Kim, Y.J.; Kim, K.D.; Kim, S.H.; Lee, S.; Lee, H.S. Golf Swing Analysis System with a Dual Band and Motion Analysis Algorithm. IEEE Trans. Consum. Electron. 2017, 63, 309–316. [Google Scholar] [CrossRef]
  8. Dadashi, F.; Millet, G.P.; Aminian, K. Gaussian Process Framework for Pervasive Estimation of Swimming Velocity with Body-worn IMU. Electron. Lett. 2013, 49, 44–46. [Google Scholar] [CrossRef]
  9. Wang, Y.; Chen, M.; Wang, X.; Chan, R.H.M.; Li, W.J. IoT for Next Generation Racket Sports Training. IEEE Internet Things J. 2018, 5, 4559–4566. [Google Scholar] [CrossRef]
  10. Wang, L.; Sun, Y.; Li, Q.; Liu, T.; Yi, J. Two Shank-Mounted IMUs-based Gait Analysis and Classification for Neurological Disease Patients. IEEE Robot. Autom. Lett. 2020, 5, 1970–1976. [Google Scholar] [CrossRef]
  11. Connolly, J.; Condell, J.; O’Flynn, B.; Sanchez, J.T.; Gardiner, P. IMU Sensor-Based Electronic Goniometric Glove for Clinical Finger Movement Analysis. IEEE Sens. J. 2018, 18, 1273–1281. [Google Scholar] [CrossRef]
  12. Nguyen, H.; Lebel, K.; Bogard, S.; Goubault, E.; Boissy, P.; Duval, C. Using Inertial Sensors to Automatically Detect and Segment Activities of Daily Living in People with Parkinson’s Disease. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 197–204. [Google Scholar] [CrossRef] [PubMed]
  13. Mukhopadhyay, S.C. Wearable Sensors for Human Activity Monitoring: A Review. IEEE Sens. J. 2015, 15, 1321–1330. [Google Scholar] [CrossRef]
  14. Lara, O.D.; Labrador, M.A. A Survey on Human Activity Recognition Using Wearable Sensors. IEEE Commun. Surv. Tutor. 2013, 15, 1192–1209. [Google Scholar] [CrossRef]
  15. Rossol, N.; Cheng, I.; Basu, A. A Multisensor Technique for Gesture Recognition Through Intelligent Skeletal Pose Analysis. IEEE Trans. Hum. Mach. Syst. 2016, 46, 350–359. [Google Scholar] [CrossRef]
  16. Vaishnav, P.; Santra, A. Continuous Human Activity Classification with Unscented Kalman Filter Tracking Using FMCW Radar. IEEE Sens. J. 2020, 4, 7001704. [Google Scholar] [CrossRef]
  17. Rana, S.P.; Dey, M.; Ghavami, M.; Dudley, S. Non-Contact Human Gait Identification Through IR-UWB Edge-based Monitoring Sensor. IEEE Sens. J. 2019, 19, 9282–9293. [Google Scholar] [CrossRef]
  18. Wang, H.; Zhang, D.; Wang, Y.; Ma, Y.; Li, S. RT-Fall: A Real-Time a Contactless Fall Detection Systems with Commodity WiFi Devices. IEEE Trans. Mob. Comput. 2017, 16, 511–526. [Google Scholar] [CrossRef]
  19. Tariq, O.B.; Lazarescu, M.T.; Lavagno, L. Neural Networks for Indoor Human Activity Reconstructions. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
  20. Li, T.; Fong, S.; Wong, K.K.L.; Wu, Y.; Yang, X.-S.; Li, X. Fusing Wearable and Remote Sensing Data Streams by Fast Incremental Learning with Swarm Decision Table for Human Activity Recognition. Inf. Fusion 2020, 60, 41–64. [Google Scholar] [CrossRef]
  21. Paoletti, M.; Belli, A.; Palma, L.; Vallasciani, M.; Pierleoni, P. A Wireless Body Sensor Network for Clinical Assessment of the Flexion-Relaxation Phenomenon. Electronics 2020, 9, 1044. [Google Scholar] [CrossRef]
  22. Baldi, T.; Farina, F.; Garulli, A.; Giannitrapani, A.; Prattichizzo, D. Upper Body Pose Estimation Using Wearable Inertial Sensors and Multiplicative Kalman Filter. IEEE Sens. J. 2020, 20, 492–500. [Google Scholar] [CrossRef] [Green Version]
  23. Zhang, H.; Zhang, Z.; Gao, N.; Xiao, Y.; Meng, Z.; Li, Z. Cost-Effective Wearable Indoor Localization and Motion Analysis via the Integration of UWB and IMU. Sensors 2020, 20, 344. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Fourati, H.; Manamanni, N.; Afilal, L.; Handrich, Y. Complementary Observer for Body Segments Motion Capturing by Inertial and Magnetic Sensor. IEEE ASME Trans. Mechatron. 2014, 19, 149–157. [Google Scholar] [CrossRef] [Green Version]
  25. Ferrari, A.; Micucci, D.; Mobilio, M.; Napoletano, P. On the Personalization of Classification Models for Human Activity Recognition. IEEE Access 2020, 8, 32066. [Google Scholar] [CrossRef]
  26. Stikic, M.; Larlus, D.; Ebert, S.; Schiele, B. Weakly Supervised Recognition of Daily Life Activities with Wearable Sensors. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 2521–2537. [Google Scholar] [CrossRef]
  27. Ghazal, S.; Khan, U.S.; Saleem, M.M.; Rashid, N.; Iqbal, J. Human Activity Recognition Using 2D Skeleton Data and Supervised Machine Learning. IET Image Process. 2019, 13, 2572–2578. [Google Scholar] [CrossRef]
  28. Manzi, A.; Dario, P.; Cavallo, F. A human Activity Recognition System based on Dynamic Clustering of Skeleton Data. Sensors 2017, 17, 1100. [Google Scholar] [CrossRef] [Green Version]
  29. Plotz, T.; Guan, Y. Deep Learning for Human Activity Recognition in Mobile Computing. Computer 2018, 51, 50–59. [Google Scholar] [CrossRef]
  30. Xu, C.; Chai, D.; He, J.; Zhang, X.; Duan, S. InnoHAR: A Deep Neural Network for Complex Human Activity Recognition. IEEE Access 2019, 7, 9893. [Google Scholar] [CrossRef]
  31. Bianchi, V.; Bassoli, M.; Lombardo, G.; Fornacciari, P.; Mordonini, M.; Munari, I.D. IoT Wearable Sensor and Deep Learning: An Integrated Approach for Personalized Human Activity Recognition in a Smart Home Environment. IEEE Internet Things J. 2019, 6, 8553–8562. [Google Scholar] [CrossRef]
  32. Yuan, Q.; Chen, I.-M. Localization and Velocity Tracking of Human via 3 IMU Sensors. Sens. Actuators A Phys. 2014, 212, 25–33. [Google Scholar] [CrossRef]
  33. Tian, Y.; Meng, X.; Tao, D.; Liu, D.; Feng, C. Upper Limb Motion Tracking with the Integration of IMU and Kinect. Neurocomputing 2015, 159, 207–218. [Google Scholar] [CrossRef]
  34. Zihajehzadeh, S.; Loh, D.; Lee, T.J.; Hoskinson, R.; Park, E.J. A Cascaded Kalman Filter-based GPS/MEMS-IMU Integrated for Sports Applications. Measurement 2018, 73, 200–210. [Google Scholar] [CrossRef]
  35. Tong, X.; Li, Z.; Han, G.; Liu, N.; Su, Y.; Ning, J.; Yang, F. Adaptive EKF based on HMM Recognizer for Attitude Estimation Using MEMS MARG Sensors. IEEE Sens. J. 2018, 18, 3299–3310. [Google Scholar] [CrossRef]
  36. Enayati, N.; Momi, E.D.; Ferrigno, G. A Quaternion-Based Unscented Kalman Filter for Robust Optical/Inertial Motion Tracking in Computer-Assisted Surgery. IEEE Trans. Instrum. Meas. 2015, 64, 2291–2301. [Google Scholar] [CrossRef] [Green Version]
  37. Wagstaff, B.; Peretroukhin, V.; Kelly, J. Robust Data-Driven Zero-Velocity Detection for Foot-Mounted Inertial Navigation. IEEE Sens. J. 2020, 20, 957–967. [Google Scholar] [CrossRef] [Green Version]
  38. Jia, H.; Chen, S. Integrated Data and Knowledge Driven Methodology for Human Activity Recognition. Inf. Sci. 2020, 536, 409–430. [Google Scholar] [CrossRef]
  39. Khaire, P.; Kumar, P.; Imran, J. Combining CNN Streams of RGB-D and Skeletal Data for Human Activity Recognition. Pattern Recognit. Lett. 2018, 115, 107–116. [Google Scholar] [CrossRef]
  40. Xie, X.; Huang, G.; Zarei, R.; Ji, Z.; Ye, H.; He, J. A Novel Nest-Based Scheduling Method for Mobile Wireless Body Area Networks. Digit. Commun. Netw. 2020. [Google Scholar] [CrossRef]
  41. Zhou, X. Wearable Health Monitoring System based on Human Motion State Recognition. Comput. Commun. 2020, 150, 62–71. [Google Scholar] [CrossRef]
  42. Li, G.; Li, C. Learning Skeleton Information for Human Action Analysis Using Kinect. Signal Process. Image Commun. 2020, 84, 115814. [Google Scholar] [CrossRef]
  43. Motion Sensor—Motion, Light and Temperature Sensor. Available online: https://www.fibaro.com/en/products/motion-sensor/ (accessed on 12 August 2020).
  44. Technology That’s Inventing the Future. Available online: https://www.fitbit.com/us/technology (accessed on 12 August 2020).
  45. Mi Band—Understand Your Every Move. Available online: https://www.mi.com/global/miband (accessed on 12 August 2020).
  46. MAX25205—Gesture Sensor for Automotive Applications. Available online: https://www.maximintegrated.com/en/products/sensors/MAX25205.html (accessed on 12 August 2020).
  47. De, P.; Chatterjee, A.; Rakshit, A. Recognition of Human Behavior for Assisted Living Using Dictionary Learning Approach. IEEE Sens. J. 2018, 16, 2434–2441. [Google Scholar] [CrossRef]
  48. Lima, Y.; Gardia, A.; Pongsakornsathien, N.; Sabatini, R.; Ezer, N.; Kistan, T. Experimental Characterization of Eye-tracking Sensors for Adaptive Human-Machine Systems. Measurement 2019, 140, 151–160. [Google Scholar] [CrossRef]
  49. Shu, Y.; Xiong, C.; Fan, S. Interactive Design of Intelligent Machine Vision based on Human–Computer Interaction Mode. Microprocess. Microsyst. 2020, 75, 103059. [Google Scholar] [CrossRef]
  50. Anitha, G.; Priya, S.B. Posture based Health Monitoring and Unusual Behavior Recognition System for Elderly Using Dynamic Bayesian Network. Clust. Comput. 2019, 22, 13583–13590. [Google Scholar] [CrossRef]
  51. Wang, Z.; Wang, J.; Zhao, H.; Qiu, S.; Li, J.; Gao, F.; Shi, X. Using Wearable Sensors to Capture Posture of the Human Lumbar Spine in Competitive Swimming. IEEE Trans. Hum. Mach. Syst. 2019, 49, 194–205. [Google Scholar] [CrossRef]
  52. Concepción, M.A.; Morillo, L.M.S.; García, J.A.A.; González-Abril, L. Mobile Activity Recognition and Fall Detection System for Elderly People Using Ameva Algorithm. Pervasive Mob. Comput. 2017, 34, 3–13. [Google Scholar] [CrossRef] [Green Version]
  53. Hbali, Y.; Hbali, S.; Ballihi, L.; Sadgal, M. Skeleton-based Human Activity Recognition for Elderly Monitoring Systems. IET Comput. Vis. 2018, 12, 16–26. [Google Scholar] [CrossRef]
  54. Yu, Z.; Lee, M. Human Motion based Intent Recognition Using a Deep Dynamic Neural Model. Robot. Auton. Syst. 2015, 71, 134–149. [Google Scholar] [CrossRef]
  55. Bragança, H.; Colonna, J.G.; Lima, W.S.; Souto, E. A Smartphone Lightweight Method for Human Activity Recognition Based on Information Theory. Sensors 2018, 20, 1856. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  56. Cardenas, E.J.E.; Chavez, G.C. Multimodel Hand Gesture Recognition Combining Temporal and Pose Information based on CNN Descriptors and Histogram of Cumulative Magnitude. J. Vis. Commun. Image Represent. 2020, 71, 102772. [Google Scholar] [CrossRef]
  57. Lima, W.S.; Souto, E.; El-Khatib, K.; Jalali, R.; Gama, J. Human Activity Recognition Using Inertial Sensors in a Smartphone: An Overview. Sensors 2019, 19, 3213. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  58. Smedt, Q.D.; Wannous, H.; Vandeborre, J.-P. Heterogeneous Hand Gesture Recognition Using 3D Dynamic Skeletal Data. Comput. Vis. Image Underst. 2019, 181, 60–72. [Google Scholar] [CrossRef] [Green Version]
  59. Muller, P.; Begin, M.-A.; Schauer, T.; Seel, T. Alignment-Free, Self-Calibrating Elbow Angles Measurement Using Inertial Sensors. IEEE J. Biomed. Health Inform. 2017, 21, 312–319. [Google Scholar] [CrossRef]
  60. Kianifar, R.; Lee, A.; Raina, S.; Kulic, N. Automated Assessment of Dynamic Knee Valgus and Risk of Knee Injury During the Single Leg Squat. IEEE J. Transl. Eng. Health Med. 2017, 5, 2100213. [Google Scholar] [CrossRef]
  61. Baghdadi, A.; Cavuoto, L.A.; Crassidis, J.H. Hip and Trunk Kinematics Estimation in Gait Through Kalman Filter Using IMU Data at the Ankle. IEEE Sens. J. 2018, 18, 4243–4260. [Google Scholar] [CrossRef]
  62. Zhu, C.; Sheng, W. Wearable Sensor-Based Hand Gesture and Daily Activity Recognition for Robot-Assisted Living. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2011, 41, 569–573. [Google Scholar] [CrossRef]
  63. Guo, Z.; Xiao, F.; Sheng, B.; Fei, H.; Yu, S. WiReader: Adaptive Air Handwriting Recognition Based on Commercial Wi-Fi Signal. IEEE Internet Things J. 2020. [Google Scholar] [CrossRef]
  64. Panwar, M.; Biswas, D.; Bajaj, H.; Jobges, M.; Turk, R.; Maharatna, K.; Acharyya, A. Rehab-Net: Deep Learning Framework for Arm Movement Classification Using Wearable Sensors for Stroke Rehabilitation. IEEE Trans. Biomed. Eng. 2019, 66, 3026–3037. [Google Scholar] [CrossRef]
  65. Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.J.; Howard, D. A Comparison of Feature Extraction Methods for the Classification of Dynamic Activities from Accelerometer Data. IEEE Trans. Biomed. Eng. 2009, 56, 871–879. [Google Scholar] [CrossRef] [PubMed]
  66. Zhang, Y.; Chen, K.; Yi, J.; Liu, T.; Pan, Q. Whole-Body Pose Estimation in Human Bicycle Riding Using a Small Set of Wearable Sensors. IEEE ASME Trans. Mechatron. 2016, 21, 163–174. [Google Scholar] [CrossRef]
  67. Ross, G.; Dowling, B.; Troje, N.F.; Fischer, S.L.; Graham, R.B. Objectively Differentiating Whole-body Movement Patterns between Elite and Novice Athletes. Med. Sci. Sports Exerc. 2018, 50, 1457–1464. [Google Scholar] [CrossRef]
  68. Dahmani, D.; Larabi, S. User-Independent System for Sign Language Finger Spelling Recognition. J. Vis. Commun. Image Represent. 2014, 25, 1240–1250. [Google Scholar] [CrossRef]
  69. Ni, P.; Lv, S.; Zhu, X.; Cao, Q.; Zhang, W. A Light-weight On-line Action Detection with Hand Trajectory for Industrial Surveillance. Digit. Commun. Netw. 2020. [Google Scholar] [CrossRef]
  70. Yagi, K.; Sugiura, Y.; Hasegawa, K.; Saito, H. Gait Measurement at Home Using a Single RGB Camera. Gait Posture 2020, 76, 136–140. [Google Scholar] [CrossRef] [PubMed]
  71. Devanne, M.; Berretti, S.; Pala, P.; Wannous, H.; Daoudi, M.; Bimbo, A.D. Motion Segment Decomposition of RGB-D Sequences for Human Behavior Understanding. Pattern Recognit. 2017, 61, 222–233. [Google Scholar] [CrossRef] [Green Version]
  72. Xu, W.; Su, P.; Cheung, S.S. Human Body Reshaping and Its Application Using Multiple RGB-D Sensors. Signal Process. Image Commun. 2019, 79, 71–81. [Google Scholar] [CrossRef]
  73. Wu, H.; Huang, Z.; Hu, B.; Yu, Z.; Li, X.; Gao, M.; Shen, Z. Real-Time Continuous Action Recognition Using Pose Contexts with Depth Sensors. IEEE Access 2018, 6, 51708. [Google Scholar] [CrossRef]
  74. Ding, C.; Zhang, L.; Gu, C.; Bai, L.; Liao, Z.; Hong, H.; Li, Y.; Zhu, X. Non-Contact Human Motion Recognition based on UWB Radar. IEEE J. Emerg. Sel. Top. Circuits Syst. 2018, 8, 306–315. [Google Scholar] [CrossRef]
  75. Kim, S.-H.; Geem, Z.W.; Han, G.-T. A Novel Human Respiration Pattern Recognition Using Signals of Ultra-Wideband Radar Sensors. Sensors 2019, 19, 3340. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  76. Shrestha, A.; Li, H.; Kernec, J.L.; Fioranelli, F. Continuous Human Activity Classification from FMCW Radar with Bi-LSTM Networks. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
  77. Wang, Y.; Zheng, Y. An FMCW Radar Transceiver Chip for Object Positioning and Human Limb Motion Detection. IEEE Sens. J. 2017, 17, 236–237. [Google Scholar] [CrossRef]
  78. Han, Z.; Lu, Z.; Wen, X.; Zhao, J.; Guo, L.; Liu, Y. In-Air Handwriting by Passive Gesture Tracking Using Commodity WiFi. IEEE Commun. Lett. 2020. [Google Scholar] [CrossRef]
  79. Li, C.; Liu, M.; Cao, Z. WiHF: Gesture and User Recognition with WiFi. IEEE Trans. Mob. Comput. 2020. [Google Scholar] [CrossRef]
  80. Oguntala, G.A.; Abd-Alhameed, R.A.; Hu, Y.-F.; Noras, J.M.; Eya, N.N.; Elfergani, I.; Rodriguez, J. SmartWall: Novel RFID-Enabled Ambient Human Activity Recognition Using Machine Learning for Unobtrusive Health Monitoring. IEEE Access 2019, 7, 68022. [Google Scholar] [CrossRef]
  81. Wang, F.; Liu, J.; Gong, W. Multi-Adversarial In-Car Activity Recognition Using RFIDs. IEEE Trans. Mob. Comput. 2020. [Google Scholar] [CrossRef]
  82. Jahanandish, M.H.; Fey, N.P.; Hoyt, K. Lower Limb Motion Estimation Using Ultrasound Imaging: A Framework for Assistive Device Control. IEEE J. Biomed. Health Inform. 2019, 23, 2505–2514. [Google Scholar] [CrossRef]
  83. Zhou, F.; Li, X.; Wang, Z. Efficient High Cross-User Recognition Rate Ultrasonic Hand Gesture Recognition System. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
  84. Ling, K.; Dai, H.; Liu, Y.; Liu, A.X. UltraGesture: Fine-Grained Gesture Sensing and Recognition. In Proceedings of the 15th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), Hong Kong, China, 11–13 June 2018. [Google Scholar]
  85. Vanrell, S.R.; Milone, D.H.; Rufiner, H.L. Assessment of Homomorphic Analysis for Human Activity Recognition from Acceleration Signals. IEEE J. Biomed. Health Inform. 2018, 22, 1001–1010. [Google Scholar] [CrossRef]
  86. Hsu, Y.-L.; Yang, S.-C.; Chang, H.-C.; Lai, H.-C. Human Daily and Sport Activity Recognition Using a Wearable Inertial Sensor Network. IEEE Access 2018, 6, 31715. [Google Scholar] [CrossRef]
  87. Villeneuve, E.; Harwin, W.; Holderbaum, W.; Janko, B.; Sherratt, R.S. Reconstruction of Angular Kinematics From Wrist-Worn Inertial Sensor Data for Smart Home Healthcare. IEEE Access 2017, 5, 2351. [Google Scholar] [CrossRef]
  88. Chelli, A.; Patzold, M. A Machine Learning Approach for Fall Detection and Daily Living Activity Recognition. IEEE Access 2019, 7, 38670. [Google Scholar] [CrossRef]
  89. Cha, Y.; Kim, H.; Kim, D. Flexible Piezoelectric Sensor-Based Gait Recognition. Sensors 2018, 18, 468. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  90. Wang, H.; Tong, Y.; Zhao, X.; Tang, Q.; Liu, Y. Flexible, High-Sensitive, and Wearable Strain Sensor based on Organic Crystal for Human Motion Detection. Org. Electron. 2018, 61, 304–311. [Google Scholar] [CrossRef]
  91. Redd, C.B.; Bamberg, S.J.M. A Wireless Sensory Feedback Device for Real-time Gait Feedback and Training. IEEE ASME Trans. Mechatron. 2012, 17, 425–433. [Google Scholar] [CrossRef]
  92. Jarrassé, N.; Nicol, C.; Touillet, A.; Richer, F.; Martinet, N.; Paysant, J.; Graaf, J.B. Classification of Phantom Finger, Hand, Wrist, and Elbow Voluntary Gestures in Transhumeral Amputees with sEMG. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 68–77. [Google Scholar] [CrossRef] [Green Version]
  93. Li, Z.; Guan, X.; Zou, K.; Xu, C. Estimation of Knee Movement from Surface EMG Using Random Forest with Principal Component Analysis. Electronics 2020, 9, 43. [Google Scholar] [CrossRef] [Green Version]
  94. Raurale, S.A.; McAllister, J.; Rincon, J.M. Real-Time Embedded EMG Signal Analysis for Wrist-Hand Pose Identification. IEEE Trans. Signal Process. 2020, 68, 2713–2723. [Google Scholar] [CrossRef]
  95. Di Nardo, F.; Morbidoni, C.; Cucchiarelli, A.; Fioretti, S. Recognition of Gait Phases with a Single Knee Electrogoniometer: A Deep Learning Approach. Electronics 2020, 9, 355. [Google Scholar] [CrossRef] [Green Version]
  96. Li, H.; Shrestha, A.; Heidari, H.; Kernec, J.L.; Fioranelli, F. Bi-LSTM Network for Multimodal Continuous Human Activity Recognition and Fall Detection. IEEE Sens. J. 2020, 20, 1191–1201. [Google Scholar] [CrossRef] [Green Version]
  97. Lee, H.-C.; Ke, K.-H. Monitoring of Large-Area IoT Sensors Using a LoRa Wireless Mesh Network System: Design and Evaluation. IEEE Trans. Instrum. Meas. 2018, 67, 2177–2187. [Google Scholar] [CrossRef]
  98. Kim, J.-Y.; Park, G.; Lee, S.-A.; Nam, Y. Analysis of Machine Learning-based Assessment of Elbow Spasticity Using Inertial Sensors. Sensors 2020, 20, 1622. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  99. Lee, J.K.; Han, S.J.; Kim, K.; Kim, Y.H.; Lee, S. Wireless Epidermal Six-Axis Inertial Measurement Units for Real-time Joint Angle Estimation. Appl. Sci. 2020, 10, 2240. [Google Scholar] [CrossRef] [Green Version]
  100. Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.J.; Howard, D.; Meijer, K.; Crompton, R. Activity Identification Using Body-Mounted Sensors—A Review of Classification Techniques. Physiol. Meas. 2009, 30, 1–33. [Google Scholar] [CrossRef]
  101. Sun, H.; Lu, Z.; Chen, C.-L.; Cao, J.; Tan, Z. Accurate Human Gesture Sensing with Coarse-grained RF Signatures. IEEE Access 2019, 7, 81228. [Google Scholar] [CrossRef]
  102. Piris, C.; Gärtner, L.; González, M.A.; Noailly, J.; Stöcker, F.; Schönfelder, M.; Adams, T.; Tassani, S. In-Ear Accelerometer-Based Sensor for Gait Classification. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
  103. Vu, C.C.; Kim, J. Human Motion Recognition by Textile Sensors based on Machine Learning Algorithms. Sensors 2018, 18, 3109. [Google Scholar] [CrossRef] [Green Version]
  104. Barut, O.; Zhou, L.; Luo, Y. Multi-task LSTM Model for Human Activity Recognition and Intensity Estimation Using Wearable Sensor Data. IEEE Internet Things J. 2020. [Google Scholar] [CrossRef]
  105. Christ, M.; Braun, N.; Neuffer, J.; Kempa-Liehr, A.W. Time Series FeatuRe Extraction on basis of Scalable Hypothesis Tests (tsfresh—A Python Package). Neurocomputing 2018, 307, 72–77. [Google Scholar] [CrossRef]
Figure 1. Overview of the latest technical progress of human activity recognition (HAR) investigations.
Figure 2. Common methodology for HAR.
Figure 3. Multi-segment model of the human body: (a) hand skeletal model; (b) whole-body skeletal model.
Figure 4. Kinematic model of the elbow joint.
Figure 5. Taxonomy and evaluation indexes for HAR sensing techniques.
Table 1. Identifiable human activities and example applications.

| Human Body Parts | Activities | Example Applications |
|---|---|---|
| Hand | Finger movement | Rheumatoid arthritis treatment [11] |
| Hand | Hand gesture | Robot-assisted living [62] |
| Hand | Hand movement | Handwriting recognition [63] |
| Upper limb | Forearm movement | Home health care [64] |
| Upper limb | Arm swing | Golf swing analysis [7] |
| Upper limb | Elbow angle | Stroke rehabilitation [59] |
| Lower limb | Gait analysis | Gait rehabilitation training [10] |
| Lower limb | Knee angles in movement | Risk assessment of knee injury [60] |
| Lower limb | Ankle movement | Hip and trunk kinematics estimation [61] |
| Lower limb | Stairs ascent/descent | Human activity classification [65] |
| Spine | Swimming | Swimming motion evaluation [51] |
| Whole body | Fall detection | Elderly care [18] |
| Whole body | Whole-body posture | Bicycle riding analysis [66] |
| Whole body | Whole-body movement | Differentiating elite and novice athletes [67] |
Table 2. Sensing techniques with typical applications.

| Type | Sensing Techniques | Typical Applications |
|---|---|---|
| Optical | RGB camera | Finger spelling recognition [68], detection of hand trajectories for industrial surveillance [69], gait measurement at home [70] |
| Optical | RGB-D | Hand gesture recognition [51], human behavior understanding [71], human body reshaping [72] |
| Optical | Depth sensor | 3D dynamic gesture recognition [58], real-time continuous action recognition [73], full hand pose recognition [15] |
| RF | IR-UWB 1 | Non-contact human gait identification [17], non-contact human motion recognition [74], human respiration pattern recognition [75] |
| RF | FMCW radar | Continuous human activity classification [16,76], human limb motion detection [77] |
| RF | Commodity WiFi | Contactless fall detection [18], handwriting recognition [63,78], gesture and user recognition [79] |
| RF | UHF 2 RFID | Ambient assisted living [80], in-car activity recognition [81] |
| Acoustic | Ultrasound | Lower limb motion estimation [82], hand gesture recognition [83], finger motion perception and recognition [84] |
| Inertial | 3-axis accelerometer | Discrimination of human activity [85], arm movement classification for stroke rehabilitation [64] |
| Inertial | 6- or 9-axis IMU | Recognition of daily and sport activity [86], hip and trunk motion estimation [61], assessment of leg squat [60], measurement of elbow angles for physical rehabilitation [59], reconstruction of angular kinematics for smart home healthcare [87], fall detection and daily activity recognition [88] |
| Force | Piezoelectric, FSR 3, thin-film strain sensor, etc. | Wearable gait recognition [89], human motion detection [90], real-time gait feedback and training [91] |
| EMG | EMG electrodes | Classification of finger, hand, wrist, and elbow gestures [92]; knee movement estimation [93]; wrist-hand pose identification [94] |

1 IR-UWB, 2 UHF, and 3 FSR are the abbreviations of impulse radio ultra-wide band, ultra-high frequency, and force sensitive resistor, respectively.
Table 3. Alternative wireless communication standards for HAR.

| Standards | Range | Maximum Data Rate | Carrier Frequency | Energy Consumption | Network Topology |
|---|---|---|---|---|---|
| Bluetooth (Ver. 4.2) | 10 m | 2 Mbps | 2.4 GHz | Medium | P2P |
| BLE | 10 m | 1 Mbps | 2.4 GHz | Low | Star, mesh |
| ZigBee | 50 m | 250 kbps | 2.4 GHz | Low | Star, tree, mesh |
| WiFi (802.11n) | 10–50 m | 450 Mbps | 2.4 GHz | High | Star, tree, mesh, P2P |
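As a rough sanity check on these figures, the short calculation below (a worked example with assumed sensor parameters, not drawn from the cited works) estimates the raw throughput of a single body-worn 9-axis IMU node and compares it against the data rates in Table 3:

```python
# Back-of-the-envelope throughput estimate for one wearable IMU node.
# Sampling rate and resolution are illustrative assumptions.
sampling_rate_hz = 100   # assumed IMU output data rate
channels = 9             # 3-axis accelerometer + gyroscope + magnetometer
bits_per_sample = 16     # assumed sensor resolution

required_kbps = sampling_rate_hz * channels * bits_per_sample / 1000
print(f"Raw throughput per node: {required_kbps:.1f} kbps")  # -> 14.4 kbps
```

Even with a generous allowance for protocol overhead, such a stream sits far below the 1 Mbps of BLE and the 250 kbps of ZigBee, which is why these low-energy standards are the usual choices for body-worn sensor networks, while high-rate sources such as video are better served by WiFi.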
Table 4. Types of features for HAR [60,71,80,95].

| Types | Examples of Features |
|---|---|
| Heuristic | Minimum, maximum, zero-crossing, threshold amplitude, inclination angles, signal magnitude area (SMA), etc. |
| Time domain | Mean, variance, standard deviation (SD), skewness, kurtosis, root mean square (RMS), median absolute square (MAS), interquartile range, minimum, auto-correlation, cross-correlation, motion length, motion cycle duration, kinematic asymmetry, etc. |
| Frequency domain | Spectral distribution, coefficients sum, spectral entropy, etc. |
| Time–frequency | Approximation coefficients, detail coefficients, transition times |
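To make the table concrete, the minimal sketch below (illustrative code, not taken from any surveyed work) computes a handful of the listed heuristic, time-domain, and frequency-domain features over a single window of 3-axis accelerometer data:

```python
# A minimal feature-extraction sketch for one window of 3-axis accelerometer
# data, covering a subset of the features in Table 4.
import numpy as np
from scipy.stats import skew, kurtosis

def extract_features(window, fs=50.0):
    """window: (N, 3) array of accelerometer samples; fs: sampling rate in Hz."""
    feats = {}
    for i, axis in enumerate("xyz"):
        sig = window[:, i]
        # Time-domain features
        feats[f"mean_{axis}"] = float(sig.mean())
        feats[f"sd_{axis}"] = float(sig.std())
        feats[f"rms_{axis}"] = float(np.sqrt(np.mean(sig ** 2)))
        feats[f"skew_{axis}"] = float(skew(sig))
        feats[f"kurtosis_{axis}"] = float(kurtosis(sig))
        feats[f"iqr_{axis}"] = float(np.percentile(sig, 75) - np.percentile(sig, 25))
        # Heuristic features
        feats[f"min_{axis}"] = float(sig.min())
        feats[f"max_{axis}"] = float(sig.max())
        centered = sig - sig.mean()
        feats[f"zero_crossings_{axis}"] = int(np.sum(np.diff(np.sign(centered)) != 0))
        # Frequency-domain feature: spectral entropy of the normalized power spectrum
        psd = np.abs(np.fft.rfft(centered)) ** 2
        psd_norm = psd / psd.sum()
        feats[f"spectral_entropy_{axis}"] = float(-np.sum(psd_norm * np.log2(psd_norm + 1e-12)))
    # Signal magnitude area (SMA) over all three axes
    feats["sma"] = float(np.mean(np.sum(np.abs(window), axis=1)))
    return feats

# Example: one 2.56 s window at 50 Hz (128 samples) of synthetic data
features = extract_features(np.random.randn(128, 3))
```

In practice, packages such as tsfresh [105] automate the extraction and statistical selection of hundreds of features of this kind, which is helpful when hand-picking features for each activity class does not scale.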
