Article

The State-of-the-Art Sensing Techniques in Human Activity Recognition: A Survey

German Research Centre for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany
* Author to whom correspondence should be addressed.
Sensors 2022, 22(12), 4596; https://doi.org/10.3390/s22124596
Submission received: 13 May 2022 / Revised: 13 June 2022 / Accepted: 16 June 2022 / Published: 17 June 2022
(This article belongs to the Special Issue Sensors for Human Activity Recognition)

Abstract: Human activity recognition (HAR) has become an intensive research topic in the past decade because of the pervasive user scenarios and the overwhelming development of advanced algorithms and novel sensing approaches. Previous HAR-related sensing surveys focused either on a specific branch, such as wearable sensing or video-based sensing, or on a full-stack presentation of both sensing and data processing techniques, resulting in a weak focus on HAR-related sensing techniques. This work presents a thorough, in-depth survey of the state-of-the-art sensing modalities in HAR tasks to supply new researchers in the community with a solid understanding of the various sensing principles. First, we categorize the HAR-related sensing modalities into five classes: mechanical kinematic sensing, field-based sensing, wave-based sensing, physiological sensing, and hybrid/others. Specific sensing modalities are then presented in each category, with a thorough description of the sensing techniques and the latest related works. We also discuss the strengths and weaknesses of each modality across the categorization so that newcomers can get a better overview of the characteristics of each sensing modality for HAR tasks and choose proper approaches for their specific applications. Finally, we summarize the presented sensing techniques with a comparison of selected performance metrics and propose a few outlooks on future sensing techniques for HAR tasks.

1. Introduction

A better understanding of human behavior benefits individuals on a large scale, including healthcare, well-being, social interaction, life assistance, etc. Thus, human activity recognition (HAR) has been explored tremendously in recent years, driven by enormous technical advances in sensing and computation and by immense human-centric user scenarios. Explosive advancement in machine learning and hardware architecture has dramatically improved the accuracy and robustness of HAR tasks and enabled the technique to be deployed at the far edge, near the body. Besides computational ability, the sensing technique plays a fundamental and critical role in HAR tasks. Therefore, a broad range of sensing modalities has been explored in recent years, aiming to boost the development of reliable digital recording of body activity. The proposed sensing modalities range from traditional motion sensing methods such as accelerometers to novel ToF-based sensing such as mmWave, and from neural network-aided image processing for activity abstraction to very straightforward proximity detection approaches such as RF tags.
To acquire a comprehensive overview of the state-of-the-art sensing modalities in human activity recognition, categorizing the adopted sensors is an efficient way to reach a deeper understanding of the sensing medium. Researchers have already categorized the related sensors into different classes, such as active and passive sensors depending on the need for external excitation [1], or intrusive and non-intrusive sensors depending on the interference of the sensors in the process flow [2,3]. Going a step further, we categorized the HAR-related sensing modalities into five classes according to the underlying sensing principle: mechanical kinematic sensing, field-based sensing, wave-based sensing, physiological sensing, and hybrid or other approaches, as Figure 1 presents. We enumerate most of the sensing modalities within each class with an in-depth description of their sensing techniques in HAR tasks.

1.1. Relevant Surveys

Despite the enormous scope of sensing modalities in HAR tasks, related survey works are limited. The existing surveys on HAR sensing focus primarily either on a specific scenario (such as wearable sensing or video-based sensing) or on a full-stack presentation of both sensing and data processing techniques, which results in a weak focus on HAR-related sensing techniques. Table 1 lists the latest HAR sensing-related surveys from recent years. These highly relevant surveys (as well as the other references listed in this paper) were first retrieved using keywords such as human activity recognition, survey, overview, and sensing technique on platforms including Google Scholar, IEEE Xplore, Microsoft Academic, etc. Second, the survey papers cited in those retrieved surveys were also considered. As can be seen, nearly all the existing surveys focus only on a specific domain of HAR sensing techniques, such as device-free sensors [4], smartphone sensors [5], radar sensors [6], etc. Such surveys supply detailed research results on a particular sensing domain but lack an overall focus on the sensors adopted in HAR. In contrast, only a few surveys [7,8] cover a thorough range of sensor modalities; however, an in-depth introduction and comparison of the sensing techniques is still lacking.

1.2. Paper Aims and Contribution

This work tries to fill the gap by presenting an extensive and in-depth survey on the state-of-the-art sensing modalities in HAR tasks, aiming to supply a solid understanding of most sensing modalities for researchers in the community. Overall, we provide the following contributions in this survey:
  • For a clear overview of the multifaceted nature of HAR tasks, we first sorted human activities into three types: body position-related problems (“where”), body action-related problems (“what”), and body status-related problems (“how”). Such sorting coarsely but concisely introduces the final objective of the utilized sensing technique and gives readers an elementary entry point to the sensing concepts.
  • We then categorized the sensing techniques in HAR tasks into five classes based on the underlying physical principle: mechanical kinematic sensing, wave-based sensing, field-based sensing, physiological sensing, and hybrid or others. We broadly enumerated the adopted sensing modalities within each category and supplied an in-depth description of the underlying technical details. Such a sensor-oriented categorization gives readers a further understanding of the distinct HAR tasks.
  • We compared each sensing modality within and across classes using eight metrics, to better understand each modality’s limitations, dominant properties, and typical applications in HAR. Finally, we provided a few insights regarding future development.
This survey is structured as follows: in Section 1, we briefly state the motivation of this survey considering the existing works and the development of state-of-the-art HAR sensing techniques. In Section 2, we summarize and categorize the human activities in the research scope according to the activity attribute, followed by a brief description of the general process of a HAR task. Section 3 presents our categorization of the current HAR sensing techniques and gives an in-depth and extensive description of each sensing modality, followed by a summary regarding eight critical sensing performance metrics. Section 4 and Section 5 present a few outlooks on the future development of HAR-related sensing techniques and the conclusion of our work.

2. Background

2.1. Objectives of Human Activity Recognition (HAR)

Human activities refer to human behaviors concerning the body or the environment. Human activity recognition aims to capture the action/status of agents from a series of observations. Successful recognition can provide personalized support in plenty of human-centric applications [16,17]. Since HAR tasks cover a wide range of activities, it is necessary to sort the related topics in an expressive and compact way. Most research works sort the task into a few levels according to activity complexity [4,7] (from gestures to actions), followed by human–object/human–human interaction. Group activities [18,19] are the most complicated, requiring multiple people and essentially being composed of series of gestures, actions, and interactions. In this work, we sorted human activity recognition into three problems (Figure 2) according to the attributes of the targeted task: the body position-related problem, the body action-related problem, and the body status-related problem, corresponding to the questions of “where”, “what”, and “how”, respectively. The “where” problem addresses position-related recognition, such as indoor positioning [20], tracking [21], proximity [22], etc. The “what” problem deals with action-related recognition, the most widely researched part of the HAR task; examples are fall detection [23], gait analysis [24], ADL (activities of daily living) [25], etc. The last is the “how” problem, referring to body status-related research such as emotion sensing [26], respiration/heart rate sensing [27], healthcare [28], etc. This task-oriented categorization aims to supply a basic concept of the objectives of human activity recognition. As can be seen, HAR is a multifaceted topic covering almost all human-related activities and needs interdisciplinary knowledge to understand the behaviors and provide assistance properly.

2.2. General Process of Human Activity Recognition

Human activity recognition interprets comprehensive body behaviors, aiming to supply assistance that respects ethics. A complete recognition task is generally composed of three steps (Figure 3): sensing, data processing, and decision making. Sensing techniques play a fundamental role in the procedure, trying to perceive as much contextual knowledge as possible so that reliable recognition becomes possible. A successful HAR task depends firstly on the quality of the data perceived by the applied sensors and secondly on the processing of the acquired data. With developments in physics, electronics, and other fundamental subjects, novel sensors and devices are emerging to supply more efficient signal patterns for human activity recognition [29,30,31]. The evolution of the ToF camera, as an example, has enabled the camera to move from simply capturing streamed images to providing additional depth information, thus provoking a wide range of recognition tasks such as hand gestures [32] and facial expressions [33]. Recently, significant advances in the detection accuracy, range, and power consumption of ToF sensors have continued to boost novel applications in both industrial automation [34] and consumer electronics [35]. Diverse sensing techniques have been utilized for specific HAR scenarios and have provided outstanding recognition performance, which motivated us to write this survey focusing on those state-of-the-art sensing techniques with in-depth exploration and extensive analysis.

After acquiring knowledge through the sensing approaches, the second step is to process the data. Depending on the data quality and the deployed algorithms, a pre-processing step such as normalization or calibration might be needed. For small-volume data, such as single-dimensional ECG data or RSSI-based positioning data, a rule-based [36] or multilateration [37] algorithm supplies good results. For large-volume data such as images and speech, the algorithms deployed on the pre-processed data are dominated by deep neural network-based models [38] with a training process, owing to their cutting-edge recognition performance compared with traditional approaches [39] such as feature descriptors in object detection. Currently, a large number of HAR tasks are conducted on image streams captured by different kinds of cameras; those works focus on the spatio-temporal relations of the individuals in the scene. Traditionally, researchers handcraft features [40] to deduce the target activity, but such approaches firstly rely heavily on individual experience for the selection of highly relevant features; secondly, the handcrafted features might be inefficient and lack generalization in dynamic environments. The last decade’s exploration of machine learning has impressively influenced the processing pipeline in HAR applications. Network models based on convolutional computing [41] or attention mechanisms [42] for feature abstraction have come to dominate data processing and present state-of-the-art recognition performance. The corresponding general framework comprises data acquisition from the applied sensing technique, feature abstraction with distinct network models, and target decision making based on the inference result of the network model [43,44]. Once the patterns of the activities are acquired from the data in the processing step, a decision on the activity recognition can be concluded as a final step.
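To make the three-step pipeline concrete, the sketch below strings together window segmentation, hand-crafted feature abstraction, and a classifier for a generic three-axis sensor stream. The window length, feature set, and random-forest classifier are illustrative assumptions, not the choices of any cited work.

```python
# Minimal sketch of the sensing -> processing -> decision-making pipeline,
# assuming a generic (T, 3) sensor stream; all parameters are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def segment(stream, win=100, step=50):
    """Slice a (T, 3) sensor stream into overlapping windows."""
    return np.stack([stream[i:i + win]
                     for i in range(0, len(stream) - win + 1, step)])

def features(windows):
    """Hand-crafted statistical features per window and axis."""
    return np.hstack([windows.mean(axis=1),                           # static component
                      windows.std(axis=1),                            # motion intensity
                      np.abs(np.diff(windows, axis=1)).mean(axis=1)]) # jerkiness

stream = np.random.randn(10_000, 3)              # placeholder sensing output
X = features(segment(stream))                    # processing step
y = np.random.randint(0, 5, size=len(X))         # placeholder activity labels
clf = RandomForestClassifier().fit(X, y)         # decision-making step
print(clf.predict(X[:3]))
```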
This survey will, however, not cover the recognition algorithms adopted in the data processing step or the final inference step based on network models. The aim of this survey is to supply a detailed explanation of the physical principles underlying the sensing techniques applied in HAR tasks and to discuss the differences between them so that researchers can choose the right one for their applications. As the first component in the pipeline of a HAR application, the sensing techniques transform human physical activities into numerical information that can be further processed. The following section extensively presents the related HAR-targeted sensing approaches and the principles behind them.

3. Sensing Techniques

As Figure 1 depicts, we categorized the sensing techniques into five classes according to the sensing principles: mechanical kinematic sensing, field sensing, wave sensing, physiological sensing, and hybrid or others. Compared with other categorization approaches, such as by deployment (wearable, object, environmental, etc.), the principle-based categorization gives a better understanding of each sensing technique’s physical background. In the following subsections, we enumerate the leading sensing modalities in each class with their sensing techniques and related state-of-the-art research works. After the enumeration, we also provide an evaluation and comparison of the sensing modalities with the following performance metrics:
  • Cost. Low: less than 10 USD. High: hundreds to thousands of USD.
  • Power consumption. Low: mW level. High: W level.
  • Signal source. Active or passive, according to whether the measured physical characteristic is naturally present or emitted by the sensing system.
  • Robustness. The ability to tolerate perturbations that might affect the performance of the sensor.
  • Privacy concern. Whether the sensing approach records individual information beyond the need of interest.
  • Computational load. The demand on hardware resources for successful decision making.
  • Typical application. A list of HAR tasks addressed by the sensing approach.
  • Other criteria. Such as installation/maintenance complexity, environmental dependency (line of sight, etc.), accuracy and sensitivity, etc.

3.1. Mechanical Kinematic Sensing

Mechanical sensing refers to mechanical mobility and deformation when a force is applied on/from the target. The mobility and deformation are perceived by mechanical sensors, which transform the mechanical variation into electric signals. Mechanical sensors, such as kinematic sensors, have been widely used to monitor body activity.
In physics and mathematics, kinematics is the field of study exploring the geometry of motion. Kinematic sensing in HAR is based on human body-related motion properties such as velocity, acceleration, rotation, etc. Since body motion is the most common objective of HAR tasks, compared with other objectives such as positioning or status monitoring, kinematic sensors have become the dominant sensing approach in scientific research and industrial application. The most popular deployed sensors are inertial ones, such as accelerometers and gyroscopes. Another reason for the massive usage of inertial sensors is their power efficiency and small size, enabling pervasive embedding of the sensing unit into personal assistant devices such as smartphones and wearable devices such as fitness bands.
Nearly all current commercial wearable devices embed inertial sensors that deliver motion signals of a distinct body part without much concern about power consumption or comfort. Both academic and industrial researchers have developed plenty of works with inertial-sensor-embedded wearable applications. For example, Hristijan et al. [45] explored a weighted ensemble learning algorithm with data from head-mounted inertial sensors to recognize eight everyday activities. Tobias et al. [46] proposed respiration rate monitoring using an in-ear headphone inertial sensor. Wrist-, hand-, and finger-worn inertial sensors are primarily used for gesture recognition as a means of human–machine interfacing [47,48,49]; related wearables are smart gloves, smart watches, smart rings, wristbands, etc. Another popular motion-recognition-enabled wearable modality is the smart garment. Kang et al. [50] designed IMU- and conductive-yarn-integrated clothes to prevent spinal disease through continuous posture monitoring. Zhang [51] evaluated an innovative full-body wearable garment system based on IMUs for motion analysis during different exercises. Wang et al. [52] evaluated stroke patients’ acceptance of an IMU-embedded smart garment for supporting upper extremity rehabilitation and received positive responses in a clinical setting. Besides wearable electronic devices and smart garments, inertial sensors can also be integrated into shoes and soles for foot- and leg-related motion research, such as gait analysis [53], indoor pedestrian navigation [54], workout recognition [55], injury prevention [56], etc.
Besides their advantages in wearability (power consumption, small size, low cost, pervasiveness), inertial sensors also excel in data quality regarding sensitivity and accuracy. A high-resolution accelerometer can sense minor vibrations on the body. Cesareo et al. [57] assessed breathing parameters using an IMU-based system; with the proposed algorithm, they reconstructed respiration-induced movement and precisely perceived the respiratory rate through an automatic method. Huang et al. [58] demonstrated a novel method for 3D pose reconstruction with six IMUs, which outperformed camera-based methods in situations such as heavy occlusion and fast motion.
Given the above-listed advantages, inertial sensors currently play the most critical role in HAR tasks, even in the niche cases of commercial wearables targeting motion-related applications [59]. However, inertial sensors need to be mounted on the target body part to sense its motion pattern, which might be annoying when long-term continuous motion monitoring is demanded and might cause burden and discomfort for users. For highly accurate motion reconstruction, inertial sensors also face the challenge of accumulated error, which needs to be addressed by constant recalibration.
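As an illustration of such drift compensation, the minimal sketch below fuses gyroscope integration (accurate short-term, drifting long-term) with accelerometer tilt estimates (noisy but drift-free) through a standard complementary filter. The sampling interval, blending factor, and the small-angle treatment of the gyroscope rates are assumed simplifications.

```python
import numpy as np

def complementary_filter(gyro, acc, dt=0.01, alpha=0.98):
    """Fuse (T, 3) gyroscope [rad/s] and accelerometer [m/s^2] streams
    into pitch/roll estimates with bounded drift (illustrative sketch)."""
    angle = np.zeros(2)  # [pitch, roll] in radians
    history = []
    for w, a in zip(gyro, acc):
        # Gyro integration: smooth and responsive, but drifts over time.
        angle = angle + w[:2] * dt
        # Accelerometer tilt from gravity: noisy, but drift-free reference.
        pitch_acc = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        roll_acc = np.arctan2(a[1], a[2])
        # Blend: trust the gyro at high frequency, the accelerometer at low.
        angle = alpha * angle + (1 - alpha) * np.array([pitch_acc, roll_acc])
        history.append(angle.copy())
    return np.array(history)
```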

3.2. Wave Sensing

Wave sensing is a non-contact sensing technique based on the propagation properties of waves. Three kinds of wave sensing approaches are mainly used for HAR tasks. The first is RF signals such as WiFi, BT, and mmWave, referring to wireless electromagnetic signals with radio frequencies ranging from 3 kHz to 300 GHz; the propagation of a wireless electromagnetic wave is based on electric and magnetic fields that are orthogonal to each other. The second is the acoustic signal, a mechanical wave that includes vibration, sound, ultrasound, and infrasound. The third is the optical signal, an electromagnetic signal with an extremely high typical frequency on the order of THz. In HAR, these wave sensing approaches have been explored widely and deeply. For example, image-based activity recognition analyzes the target actions in the frames of a video and can supply recognition with high accuracy; since video information is captured by a camera that takes in light rays and focuses them via the lens onto a grid of tiny light-sensitive photosites, it is essentially optic-enabled sensing. RF and acoustic signals, as ambient sensors, offer advantages in both privacy protection and reducing the extra burden on objects.
Two kinds of sensing methods exist in wave-based human-centric sensing: active and passive sensing. Figure 4 shows the essential difference between the two methods. Active sensing requires an external source of energy. The source emits waves to the measured object and receives the wave’s reflection, transmission, and absorption. Features abstracted from the received information are then utilized for object description. On the other hand, passive sensing does not need an active wave source and perceives the object variables by receiving a measured wave signal from the object.
(A)
RF Signal
RF-based HAR is a non-intrusive approach that can bypass the burden and discomfort caused by wearable activity monitoring sensors. The basic principle of an RF-based HAR system is that the propagation path of the RF wave is affected by the intrusion of the human body; the resulting variations in the received wave can then be used as features to deduce different activities.
A series of RF signals have been explored for HAR tasks, such as WiFi, UWB, mmWave, etc. Among them, WiFi is the most popular due to its pervasiveness in indoor environments. The critical intuition of WiFi-based HAR is that motions of the human body introduce different multipath distortions in WiFi signals and generate different patterns in the time series of channel state information (CSI). Li et al. [60] proposed a system named Wi-Motion that jointly leverages the amplitude and phase information extracted from the CSI sequence and achieves a mean accuracy of 96.6% in a line-of-sight environment and 92% in a non-line-of-sight environment for five predefined typical human activities (bend, half squat, step, leg stretch, and jump). Liu et al. [61] designed a WiFi-based sleep monitoring system that abstracts fine-grained sleep information, such as a person’s respiration, sleeping postures, and rollovers, by continuously collecting fine-grained wireless CSI. Besides activity recognition, the WiFi signal can be leveraged for indoor localization tasks; an example is the work of Wang et al. [62], where the authors proposed a dual-task residual convolutional neural network with one-dimensional convolutional layers for the joint task of activity recognition and indoor localization. Bluetooth technology is another RF approach for HAR tasks. However, compared with the WiFi signal, the Bluetooth signal is relatively weak [63]; thus its accuracy and reach are limited. It nonetheless enjoys advantages in cost and ease of use; therefore, Bluetooth technology is mainly used for indoor localization by densely deploying many small form-factor, power-saving, cost-efficient tags [64].
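As a rough illustration of the CSI intuition described above, the sketch below removes the static multipath component per subcarrier and computes a moving variance of the residual amplitude, which rises when body motion perturbs the propagation paths. The window size and the variance feature are illustrative assumptions, not the processing of any cited system.

```python
import numpy as np

def csi_motion_feature(csi_amp, win=50):
    """csi_amp: (T, S) CSI amplitude over T packets and S subcarriers.
    Returns a coarse per-window motion indicator (illustrative sketch)."""
    # Remove static multipath components (walls, furniture) per subcarrier.
    dynamic = csi_amp - csi_amp.mean(axis=0, keepdims=True)
    # Moving variance over time: body motion raises amplitude variance
    # jointly across subcarriers.
    T = len(dynamic)
    return np.array([dynamic[i:i + win].var()
                     for i in range(0, T - win + 1, win)])
    # Downstream, this feature can be thresholded or fed to a classifier.
```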
Besides the WiFi and BT wave signals, mmWave technology, which operates in the frequency range between 30 GHz and 300 GHz, has recently become highly attractive to researchers. Since a higher frequency means a smaller antenna size, mmWave radar is compact in form factor, and many antennas can be packed into a small space to enable highly directional beams. Moreover, the mmWave signal enjoys a larger bandwidth than WiFi signals and a higher range resolution. Recent advances in small, low-cost, single-chip consumer radar systems operating at mmWave frequencies have opened up many new applications, such as automotive radar, health monitoring, etc. HAR has also been explored with mmWave-based approaches and has achieved outstanding results with fine-grained classifiers. Zhang et al. [65] predicted target behavior by using the micro-Doppler effect (induced by the micromotion dynamics of a target or its structure) from mmWave radar; using a neural network-based classifier, they reached 95.19% accuracy on the bulk motion of the body and the micromotions of arms and legs. Zhao et al. [66] proposed a system named mBeats, in which a robot mounted with an mmWave radar provides periodic heart rate measurements under different user poses. A fall-detection system based on mmWave radar was also presented by Sun et al. [67], supported by a recurrent neural network with long short-term memory units. Li et al. [68] designed another interesting mmWave radar-enabled system called ThuMouse, which regressively tracks the position of a finger aided by a deep neural network. MmWave-related exploration is still at an early stage and will see explosive growth in the following years, triggered by its distinctive capabilities compared with WiFi and BT and by large-scale chip-level commercialization.
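A minimal sketch of how a micro-Doppler signature is typically obtained follows: a short-time Fourier transform of the slow-time radar samples of one range bin yields a time–frequency image in which bulk torso motion and limb micromotions appear as distinct traces. The STFT parameters and the synthetic test signal are assumptions; the cited works’ actual radar pipelines are more involved.

```python
import numpy as np
from scipy.signal import stft

def micro_doppler(iq, fs):
    """iq: complex slow-time samples of one range bin; fs: chirp/frame rate.
    Returns a time-frequency magnitude image (micro-Doppler signature):
    torso motion shows as a strong slowly varying band, swinging limbs as
    oscillating sidebands around it."""
    f, t, Z = stft(iq, fs=fs, nperseg=128, noverlap=96, return_onesided=False)
    sig = 20 * np.log10(np.abs(np.fft.fftshift(Z, axes=0)) + 1e-12)
    return np.fft.fftshift(f), t, sig  # 2D image commonly fed to a CNN

# Synthetic target: a 30 Hz Doppler tone with a 2 Hz limb-like oscillation
fs = 1000
t = np.arange(0, 2, 1 / fs)
iq = np.exp(2j * np.pi * (30 * t + 10 * np.sin(2 * np.pi * 2 * t)))
print(micro_doppler(iq, fs)[2].shape)
```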
Another highly promising and widely used RF signal is ultra-wideband (UWB), a decades-old wireless technology used for short-range, high-bandwidth communication with a high data rate. Now it also serves as a standard for high-accuracy location services. According to FiRa, a consortium founded by the leading companies behind UWB standards, the reborn UWB will mainly focus on three use cases: hands-free access control, location-based services, and peer-to-peer communication, complementing the currently dominant wireless solutions. Recently, UWB support has started to appear in high-end smartphones, and there is no question that UWB will spark another wave of related applications. Figure 5 shows the wide spectrum of UWB compared with other technologies, allowing UWB to operate at a very low power level and to build stable connectivity with other devices in a crowded radio environment. Thanks to the higher base frequency, UWB devices can provide positioning accuracy on the level of around 10 cm [69], which is highly dominant compared with WiFi- or BT-based positioning with meter-level accuracy [70]. Another key feature is that UWB is resistant to the multipath effect, a common issue for most RF-based wave sensing technologies; the multipath effect refers to the radio signal being received over more than one path because of reflection or refraction caused by objects near the main signal path. The large bandwidth of UWB provides frequency diversity that can make the time-modulated ultra-wideband (TM-UWB) signal resistant to multipath effects [71]. Researchers have explored plentiful HAR-related applications with UWB, such as activity recognition in smart homes [72], gesture recognition [73], sleep postural transition recognition [74], healthcare monitoring [75], etc. With the popularization of low-cost UWB chips in wearable devices, there will be more short distance-based novel applications of the UWB technique, such as swarm intelligence, social distancing, etc. However, despite the above-described advantages, wide deployment of UWB will still take some time considering its higher cost. Moreover, regarding the data streaming rate, UWB is not a good option for large data interactions between devices compared with conventional narrowband radio systems.
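To illustrate the ranging-based positioning that UWB enables, here is a minimal least-squares multilateration sketch under idealized, noise-free ranges to four fixed anchors; real deployments must additionally handle ranging noise, clock errors, and non-line-of-sight outliers.

```python
import numpy as np

def multilaterate(anchors, dists):
    """Least-squares 2D position from ranges to fixed anchors.
    anchors: (N, 2) known coordinates; dists: (N,) measured ToF ranges."""
    # Subtracting the first equation linearizes ||p - a_i||^2 = d_i^2.
    A = 2 * (anchors[1:] - anchors[0])
    b = (dists[0] ** 2 - dists[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

anchors = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
true_p = np.array([1.2, 3.4])
dists = np.linalg.norm(anchors - true_p, axis=1)  # ideal, noise-free ranges
print(multilaterate(anchors, dists))              # ~ [1.2, 3.4]
```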
(B)
Acoustic Signal
An acoustic signal is a mechanical wave resulting from an oscillation of pressure that travels through a solid, liquid, or gas in the form of a wave. A well-known acoustic signal is the audible sound a speaker produces through the vibration of the vocal folds; the vibration travels through air and reaches the outer ear and the eardrum. There are two kinds of sound outside the audible frequency range (20 Hz–20 kHz): infrasound and ultrasound. An example of infrasound is the atmospheric infrasound caused by an earthquake, when the earth’s surface near the epicenter and surrounding regions oscillates at a low frequency. Ultrasound is an acoustic signal with a frequency above the upper audible limit of human hearing; a widely used example is medical imaging, where ultrasound waves travel through the body and create a sonogram of organs, tissues, etc.
As an ambient sensing modality, ultrasound can firstly supply mm-level indoor positioning accuracy based on time of flight [76,77]. Such a positioning system is based on several wireless ultrasonic beacons with fixed, known coordinates in an indoor environment, which receive or emit ultrasonic signals that are finally used for position deduction; a wireless module (WiFi, Bluetooth, or other) is used for data interaction and time synchronization. Finger motion recognition is another ultrasound-based application, leveraging the detected morphological changes of deep muscles and tendons; Yang et al. [78] obtained an accuracy of 95.4% for real-time finger motion recognition. Mokhtari et al. [79] proposed a resident identification system as an innovative smart home platform, using ultrasound arrays to detect the height of the moving resident and other sensors, such as pyroelectric infrared sensors, to detect the moving direction. Wang et al. proposed a novel contactless respiration monitoring approach using ultrasound signals with off-the-shelf audio devices. Unlike other works based on chest displacement, where false detection may often occur, they monitor respiration by directly sensing the exhaled airflow from breathing. The principle is that the exhaled airflow can be regarded as air turbulence that scatters the sound wave and produces a Doppler effect. The experimental results showed an accuracy of 0.3 breaths/min (2%), and it was concluded that ambient noise and variations in respiration rate, respiration style, sensing distance, and transmitted signal frequency have little effect on the respiration monitoring accuracy of the system.
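Two relations underlie the ultrasonic approaches above: time-of-flight ranging between synchronized beacons and the Doppler shift produced by moving air or body surfaces. The toy functions below state both, assuming the speed of sound in air at room temperature; the function names and parameters are illustrative.

```python
C_AIR = 343.0  # speed of sound in air at ~20 degrees C, in m/s

def tof_distance(t_emit, t_receive):
    """One-way ToF ranging between a time-synchronized beacon and receiver."""
    return C_AIR * (t_receive - t_emit)

def doppler_shift(v, f0):
    """Approximate Doppler shift of a continuous tone of frequency f0
    scattered back by a surface (or airflow) moving at velocity v."""
    return 2 * v * f0 / C_AIR

print(tof_distance(0.0, 0.01))        # ~3.43 m
print(doppler_shift(0.1, 40_000))     # ~23 Hz for slow breathing airflow
```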
Previous works on sound (captured by the microphone of a smartphone) have mainly focused on the following application cases: environment assessment [80,81], proximity sensing [82,83], and indoor positioning [84,85]. The sources of sound are either fine-tuned tags or the surroundings. In the work of Benjamin et al. [82], an algorithm using inaudible sound patterns was explored to accurately detect whether two mobile phones are within a few meters of each other. The method can be implemented as a standard smartphone application with real-time inference, enabling smartphone-based collaborative activity detection together with other embedded sensors.
Overall, acoustic signals provide an alternative and competitive approach for highly accurate human or robot positioning and distance-related activity recognition. The method is non-intrusive, thus reducing users’ extra burden and protecting privacy. However, it still suffers from the computational load and is limited by complex environmental acoustic sources. For example, the accuracy and robustness of ultrasound-based indoor positioning decrease enormously when a collision-like sound occurs or when a significant barrier stands between tags.
(C)
Optical Signal
Optical signals for HAR tasks mainly refer to deep learning-enabled processing of the images captured by the photosensitive elements of cameras. Most related works focus on spatio-temporal relations among the objects in the scene. Those works involve tracking multiple agents, evaluating their appearance, aggregating independent and joint features, segmenting their movements, extracting their actions, and then perceiving their activities. Image-based systems can cover almost every HAR task and achieve very high recognition accuracy because of the complete view of the scene captured in the data. The covered tasks include positioning, navigation, body-part monitoring, full-body monitoring, individual activity recognition, group activity recognition, etc. Sathyamoorthy et al. [86] designed a system named COVID-robot for social distancing monitoring in crowded scenarios; with the help of an RGB-D camera and a 2D lidar, the mobile robot can avoid collisions in a crowd and estimate the distance between all detected individuals in the camera view while navigating autonomously. Lee et al. [87] presented an innovative wearable navigation system based on an RGB-D camera to help the visually impaired; a glasses-mounted RGB-D camera collects the environment information, which serves as the input to their navigation algorithm of real-time 6-DOF feature-based visual odometry. Kim et al. [88] proposed a hand gesture control system based on tactile feedback to the user’s hand. Amit et al. [89] proposed an approach to analyze a user’s body posture during a workout and compare it to a professional’s reference workout, providing visual feedback while the workout is performed; the system aims to assist people in completing the exercises independently and to prevent incorrectly performed motions that may eventually cause severe long-term injuries. Meng et al. [90] addressed the problem of recognizing person–person interactions with depth cameras providing multi-view data: they first divided each person–person interaction into body-part interactions and then used the pairwise features of these body-part interactions to analyze the person–person interaction; the method was demonstrated on three public datasets. As can be seen, image-based HAR tasks depend profoundly on neural network-based algorithms, and most of the research effort in this field goes into advanced algorithm exploration to reach the state of the art.
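As a simplified illustration of body-part pairwise features in the spirit of Meng et al. [90], the sketch below computes scale-normalized distances between all keypoint pairs of two detected persons. The keypoint layout, the torso-based normalization, and the function name are assumptions for illustration, not the cited method.

```python
import numpy as np

def pairwise_interaction_features(kp_a, kp_b):
    """kp_a, kp_b: (J, 2) body keypoints of two persons from a pose estimator.
    Distances between all body-part pairs coarsely encode the interaction
    (e.g., a handshake yields a small hand-hand distance)."""
    diff = kp_a[:, None, :] - kp_b[None, :, :]      # (J, J, 2) pair offsets
    dists = np.linalg.norm(diff, axis=-1).ravel()   # (J*J,) pair distances
    # Normalize by person A's torso length for scale invariance,
    # assuming keypoints 0 and 1 are, e.g., neck and pelvis.
    scale = np.linalg.norm(kp_a[0] - kp_a[1]) + 1e-6
    return dists / scale  # feature vector for a downstream classifier
```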
Undoubtedly, camera-based HAR systems have succeeded in different scenarios, including indoor monitoring and outdoor surveillance. However, the approach might not be well accepted due to severe privacy concerns; this is one reason that sensor-based HAR is still prevalent in research communities and has led to many research contributions recently. Another significant disadvantage of image-based solutions lies in the computational load. Since image-based HAR needs strong hardware support (GPU, CPU, memory, bus) to run the millions of parameters (weights and activations) of a trained deep neural network, the cost of hardware resources, power, and maintenance is enormous. Additionally, since this is an optical sensing solution, the performance is deeply influenced by environmental conditions such as light, temperature, air quality, etc.

3.3. Physiological Sensing

The term “physiological sensing” refers to both the natural physiological signals and the kinematic signals generated by the organism. Physiological variables have been widely used in diagnosis, drug discovery, healthcare monitoring, etc. In human activity recognition, the human body, a compound of biochemistry, has a rich set of electrophysiological and kinematic variables that can be measured on the body to indicate the status and action of the subject. Figure 6 summarizes the biological variables used in HAR tasks.
(A)
Electrophysiological Signals
Electrophysiology studies the electrical properties of the neurons, molecules, and cells of living beings. The behavior of neurons is essentially based on electrical and chemical signals inside the physical body, and a series of high-level expressions and actions can be interpreted by monitoring those signals. EMG (electromyography), ECG (electrocardiography), EEG (electroencephalography), and EOG (electrooculography) are commonly monitored electrophysiological signals in clinical scenarios, and research works in the last decade have shown a significant contribution of electrophysiological signals to human behavior interpretation. For example, electromyography is a diagnostic procedure that monitors the electrical signals of muscles and motor neurons. Pancholi et al. [91] developed a low-cost EMG sensing system to recognize arm activities such as hand open/close or wrist extension/flexion. Srikanth et al. [92] focused on the recognition of complex construction activities with wearable EMG and IMU sensors in a neural network-based way. Similar work has explored hand gesture recognition [93,94], human–computer interaction [95,96], etc. ECG records the electrical signal during the heartbeat; with up to twelve electrodes, ECG signals are commonly used to check different heart conditions. The ECG signal is also popular for HAR and is commonly combined with inertial sensors [97,98]. Since the cells in the brain communicate through fast electrical impulses, researchers developed EEG equipment to record the brain’s electrical activity using small metal electrodes attached to the scalp [99]; the signal has also been explored in HAR, for example, for eyes open/closed detection [100], emotion recognition [101], etc. EOG is a technique for recording the cornea–retina standing potential difference of the eyes; its typical basic applications are ophthalmological diagnosis and eye movement recording, but researchers have already explored the potential of EOG signals in HAR [102]. Lu et al. [103] proposed a dual model to achieve EOG-based human activity recognition with an average recognition accuracy of 88.15% on three types of activities (i.e., reading, writing, and resting). Besides the commonly used electrophysiological signals listed above, many other related signals describing various electrical body-related variables can be explored for HAR tasks. Electrophysiological signals need more effort for activity interpretation than other sensing approaches because of the complexity of body anatomy and are mostly used in an auxiliary role. However, they have advantages such as ubiquity and on-body measurement, indicating the potential of wearables in the implementation stage.
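As a small example of how a physiological time series is turned into an activity-relevant quantity, the following sketch estimates heart rate from a single-lead ECG trace by detecting R-peaks. The height and distance thresholds are illustrative assumptions; practical ECG pipelines also include filtering and artifact rejection.

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ecg, fs):
    """Estimate mean heart rate from a single-lead ECG trace sampled at fs Hz.
    R-peaks are the dominant spikes; a minimum peak spacing of 0.4 s avoids
    counting one beat twice (caps detection at 150 bpm)."""
    peaks, _ = find_peaks(ecg,
                          height=np.percentile(ecg, 95),  # keep only large spikes
                          distance=int(0.4 * fs))         # refractory period
    rr_intervals = np.diff(peaks) / fs                    # seconds between beats
    return 60.0 / rr_intervals.mean()
```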
(B)
Other Physiological Signals
An example is Paolo Palatini’s study [104] exploring the relation between sports and blood pressure. One of its conclusions is that both systolic and diastolic blood pressure increase significantly during weight lifting, which solidly supports the current belief that people with hypertension should not engage in isometric sports. Besides blood pressure observation, monitoring kinematic signals such as respiration and heart rate plays a critical role in sleep studies, sports training, patient monitoring, etc. Lu et al. [105] designed a wearable sensor system fusing heart rate, respiration, and motion measurement sensors to enhance energy expenditure estimation; their study shows that the fusion design supplies more stable estimation than existing systems. Brouwer et al. [106] improved real-life emotion estimates based on heart rate. Li et al. [107] proposed a sleep and wake classification model with heart rate and respiration signals for long-term sleep studies and reached 88% classification accuracy. Plenty of research works utilize the two sensing modalities in wearable configurations to monitor medication and health state [108,109]. Phonation is the production of certain sounds through the vibration of the vocal folds, and it has also been explored to help disabled and unhealthy individuals express themselves or be understood better. Lee et al. [110] developed a lip-reading algorithm using optical flow and properties of articulatory phonation for hearing-impaired people, supplying them with continuous feedback on their pronunciation and phonation through lip-reading training, aiming for more effective communication with people without hearing disabilities. Gomez et al. [111] proposed a monitoring approach for Parkinson’s disease leveraging the biomechanical instability of phonation for frequent evaluation at a distance. Muscle (facial or on other body parts) and joint movement monitoring is a more straightforward way of recognizing human activity; the movement can be perceived by a series of sensors such as fabric stretch sensors, capacitive sensors, laser Doppler vibrometry, etc. Applications based on muscle/joint movement monitoring include hand gesture recognition [112], physical stress detection [113], gait cycle estimation [114], chronic pain level recognition [115], etc. Like electrophysiological sensing, kinematic biological sensing is an on-body approach in which the monitoring device can be placed near the body, enabling continuous observation and remote feedback, especially for healthcare, diagnosis, and rehabilitation applications.

3.4. Field Sensing

A field is a concept in physics referring to a region in which each point is affected by a force. For example, electric charges form an electric field: when a charged particle is placed in the electric field, it bears an electric force that either repels or attracts it. A magnet generates a magnetic field in its surroundings, and a paper clip within the range of the field is pulled towards the magnet; two like magnetic poles repel each other when they are close enough to be within range of either magnetic field. Any object with mass on Earth falls to the ground because it is affected by the force of Earth’s gravitational field.
The field strength is the magnitude of a vector-valued field. In the electric field, the strength is expressed in volts per meter (V/m). In the magnetic field, the field strength is expressed in oersteds (Oe) or amperes per meter (A/m); when the strength is defined by the flux density, gauss (G) or tesla (T) units are used. The gravitational field strength is measured in meters per second squared (m/s²) or newtons per kilogram (N/kg). All these quantities describing field strength are vector-valued. Another way to judge field strength is to look at the field contour lines: the closer the lines are, the stronger the force in that part of the field, and the stronger the field strength.
Figure 7, Figure 8 and Figure 9 show the electric field of a parallel-plate capacitor, the magnetic field generated by a Helmholtz coil, and the gravitational field of the Earth, respectively. Field-based sensing relies either on direct field strength measurement (such as magnetic field strength) or on the strength variation caused indirectly by the characteristic of interest (such as the potential change of a capacitor or the pressure of an object caused by gravity).
(A)
Electric Field
The electric field is ubiquitous in our environment since any potential difference constructs an electric field. Both powered objects (such as appliances, in-wall power cables, etc.) and non-powered conductive items (such as metal frames near the power cables in a building, the human body, etc.) form an electric field with nearby objects at a different potential level (especially the ground). The potential difference is essentially a difference in charge distribution. A typical example is that people sometimes feel mildly shocked when touching an appliance, even when the appliance is powered off: residual charge may remain inside the capacitors of the electronic circuits, which takes a little time to discharge, and when the appliance is not appropriately grounded, touching it causes a mild shock as the charge is transferred to the neutral body.
There are mainly two kinds of electric field-based HAR applications, active and passive, depending on the emitter of the field. An active electric field-based HAR application derives its signal from field variations when the field is emitted from the environment and the human acts as an intruder. A passive one derives the signal from variations in the electric field emitted from the body itself to the ground, since the human body is a good conductor and can store charge. The passive electric field describes a biological property of the body, the human body capacitance, which will be introduced in the later subsection on hybrid sensing techniques in HAR. Here we first focus on active electric field-based HAR applications.
A very representative work is from Zhang et al. [116], who introduced room-scale interactive and context-aware applications with a system named Wall++, a low-cost sensing technique that turns ordinary walls into smart infrastructure. The system can first track users’ touch and gestures and estimate body pose when the user is close, based on the principle of active mutual capacitance sensing, which measures the capacitance between two electrodes (namely, the electric field strength between the electrodes). When a body part is near a transmitter–receiver pair, it interferes with the projected electric field, reducing the received current, which can be measured for inference; on the other hand, if the user’s body touches an electrode, the capacitance and the received current increase dramatically. Secondly, the system can also work in a passive airborne electromagnetic sensing mode to detect and track active appliances, and users when they wear an electromagnetic emitter. Another typical work is from Cheng et al. [117], where the authors used conductive textile-based electrodes, easy to integrate into garments, to measure changes in the electric field strength (as capacitance) inside the human body. Since those changes are related to motions and shape changes of muscle, skin, and other tissue, the authors abstracted high-level knowledge from the changes and inferred a broad range of activities and physiological parameters; for example, they embedded the prototype into a collar and quantitatively evaluated the recognition accuracy of actions such as chewing, swallowing, speaking, sighing (taking a deep breath), and different head motions and positions. There are other similar works based on active electric field sensing, such as touch detection [118], body tracking based on a smart floor [119], recording of respiration, heart rate, and stereotyped motor behavior [120], hand gesture recognition [121], etc.
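The mutual capacitance principle described above can be reduced to a simple decision rule: shunting by a nearby body lowers the received signal slightly, while direct touch raises it sharply. The toy classifier below encodes that logic with hypothetical thresholds; it is a sketch of the principle, not the Wall++ implementation.

```python
def classify_electrode_pair(c_measured, c_baseline,
                            hover_drop=0.05, touch_rise=0.5):
    """Classify one transmitter-receiver electrode pair by its relative
    capacitance change. A nearby body shunts part of the projected field
    to ground (small drop); direct touch couples the body in (sharp rise).
    The two thresholds are illustrative and would need per-setup tuning."""
    delta = (c_measured - c_baseline) / c_baseline
    if delta > touch_rise:
        return "touch"
    if delta < -hover_drop:
        return "hover"
    return "idle"

print(classify_electrode_pair(1.6, 1.0))   # -> "touch"
print(classify_electrode_pair(0.92, 1.0))  # -> "hover"
```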
Active electric field sensing is non-intrusive, low-cost, and low-power, and it has excellent potential for pervasive, privacy-respecting environmental sensing. However, its hardware construction is more complex than that of the passive electric field sensing mode, and it can be affected by electromagnetic interference; thus its reliable operation places demands on the environmental conditions.
(B)
Magnetic Field
Magnetic field sensing is an active approach for distance-based motion sensing. There are mainly two kinds of magnetic field-based motion-sensing systems, depending on whether the magnetic field is generated by direct current (DC) or alternating current (AC).
In DC magnetic field motion-sensing systems, electromagnets or permanent magnets are often used to generate the magnetic field, and a magnetic sensor (magnetometer) senses the magnetic field strength. Since magnetometers are widely embedded in wearable devices, DC motion-sensing systems have been extensively explored for finger/hand tracking to enable novel machine input approaches. Chen et al. [122] designed a system named uTrack, which turns the thumb and fingers into a 3D input system using magnetic field sensing: a permanent magnet is affixed to the back of the thumb, a pair of magnetometers is worn on the back of the fingers, and a continuous data stream, obtained by moving the thumb across the fingers, is used for 3D pointing. The system shows a tracking accuracy of 4.84 mm in 3D space. Similar works [123,124] were conducted using a permanent magnet as the field generator for motion tracking.
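A rough sketch of the underlying geometry: on the magnet’s axis and in the far field, the flux density of a permanent magnet falls off roughly with the cube of distance, so a single calibrated magnitude reading can be inverted into a distance estimate. The calibration constant and the numbers below are assumed; real trackers such as uTrack solve a richer multi-sensor model rather than this one-line inversion.

```python
def dipole_distance(b_measured, k_dipole):
    """Invert the far-field dipole law |B| ~ k / r^3 for on-axis distance.
    k_dipole bundles the magnet's moment and the permeability of air and
    must come from a one-time calibration at a known distance."""
    return (k_dipole / b_measured) ** (1.0 / 3.0)

# Example: calibrate with |B| = 2e-5 T measured at r = 0.05 m ...
k = 2e-5 * 0.05 ** 3
# ... then a weaker reading of 2.5e-6 T maps to ~0.1 m
print(dipole_distance(2.5e-6, k))
```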
In contrast, AC magnetic field sensing mostly comprises oscillation-based magnetic field transmitters and receivers. The transmitter mostly uses coils to generate an alternating magnetic field; the receiver also integrates a coil to sense the strength of the magnetic field at different distances from the transmitter coil. The principle is that the oscillating magnetic flux through the receiver coil induces an oscillating voltage of the same frequency, which is then used for distance or pose estimation. Oscillating magnetic fields have been explored in a variety of HAR tasks, such as indoor localization [125], finger tracking [126], human–computer interaction [127], wearable social distance monitoring [128,129,130], etc. They can also be implemented for underwater positioning to enable the tracking or navigation of unmanned underwater vehicles or divers [131].
The advantage of the DC magnetic field motion-sensing system is that the magnet used for field generation is easy to access, and the sensing unit is at the chip level, thus enjoying pervasiveness thanks to the wide use of smart wearable devices; moreover, the tracking accuracy can reach mm level. Its disadvantage lies in the short sensing range: since the field attenuates quickly, the detection range is limited to several centimeters. The performance of AC magnetic field sensing in range and accuracy mainly depends on the coil design; the detection range can reach up to ten meters with a larger transmitter coil. Ordinary everyday furniture made of wood and textile will not deform the distribution of the generated field. The drawback is that metallic objects cause magnetic field distortions; fortunately, researchers have addressed this issue with a secondary calibration step (either with a look-up table or with neural network-based calibration) and achieved outstanding results [132].
(C)
Gravitational Field
A gravitational field explains the gravitational phenomenon whereby a massive body produces a force on another massive body. Earth’s gravity is denoted by g, describing the net acceleration imparted to physical objects by the combined effect of gravitation (caused by the mass distribution within Earth) and the centrifugal force (caused by Earth’s rotation). On Earth, gravity gives weight to physical objects; the weight is calculated by multiplying the gravitational acceleration by the mass. Gravitational field-based HAR tasks mainly utilize the pressure caused by the body’s weight, sensed by pressure sensors. Different pressure sensors have been presented for HAR tasks, such as the commercially available force-sensitive resistor (FSR), resistive textiles, etc. By analyzing the pressure patterns caused by the motion of the body, extensive HAR applications have been explored, such as gait analysis [133], workout recognition and user identification [134], indoor localization [135], smart furniture [136], rehabilitation [137], etc. The textile-resistive pressure sensor is composed of a matrix of resistive units; by sensing the pressure of each unit in the matrix, the user’s motion patterns can be derived. For a small number of resistive units, such as a few FSR units integrated into an insole, a one-dimensional data stream is used for action recognition. For a large number of resistive units, such as on a mat-like surface, the data stream is usually converted into pressure images as two-dimensional arrays, which can be processed by the neural network-based algorithms used in computer vision tasks for more accurate activity recognition.
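A minimal sketch of the conversion step mentioned above, turning one scan of an R × C resistive matrix into a normalized 2D pressure image ready for a vision-style classifier; the ADC range and the median-based baseline handling are assumptions.

```python
import numpy as np

def pressure_frame_to_image(adc_frame, adc_max=1023):
    """Convert one scan of an R x C resistive matrix (raw ADC counts)
    into a normalized 2D 'pressure image' for CNN-based recognition."""
    img = adc_frame.astype(np.float32) / adc_max  # scale counts to [0, 1]
    img -= np.median(img)                         # remove unloaded baseline
    return np.clip(img, 0.0, 1.0)                 # keep valid pressure range

frame = np.random.randint(0, 1024, size=(32, 32))  # placeholder matrix scan
print(pressure_frame_to_image(frame).shape)        # (32, 32) image
```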
One advantage of pressure-based sensors is that the sensing component can be customized to any shape and size; thus it suits a wide variety of surfaces that need to be sensed. The sensing precision can also be adjusted by arranging the density of the sensing units. The sensing unit is commonly constructed from cheap, commercially available layered films; thus the overall system is affordable to build. However, costs arise when deploying the system over a large area (such as floors for localization and tracking), since sensing only occurs during contact, which is a drawback compared with other sensing modalities, such as RF-based sensing, that have no contact limitation. In summary, gravitational field-based HAR is a non-intrusive and straightforward method for monitoring and analyzing motion and actions. It can be extensively deployed for intelligent ambient sensing but is limited by the contact constraint and the cost of deployment over large areas.

3.5. Hybrid/Others

(A)
Human Body Capacitance
Human body capacitance (HBC) is essentially a biological variable describing the capacitance between the human body and the environment, mainly the ground. It is also a passive electric field-based sensing approach, since the capacitance model comprises two conductive plates that store charges (corresponding to the body and the environment in the human electric field model) and a dielectric medium (corresponding to the air between body and ground). Figure 10 depicts the human body capacitance in a living room, where multiple electric fields exist, for example, the field between an appliance and the ground, between the metal frames of the window/door and the ground, as well as the human body capacitance between the body and the environment. HBC is a ubiquitous biological parameter that can be explored for a wide range of human-centric, motion-related applications based on its sensitivity to both the body’s motion and the variation of the environment.
Unlike other biological features such as ECG, EMG, etc., HBC is a feature that interacts with the surroundings, especially the ground. Insulated by clothing, the body and the surroundings form a natural capacitor, and HBC describes the charge stored in the body. A series of studies [138,139,140,141] indicate a body capacitance value of 100–400 pF; the value varies with skin state [142,143], garment [144], body posture [145], etc. Based on this concept, researchers have explored applications such as communication [146], cooperation perception [147], motion monitoring [117,148,149], etc., and it has continued to attract researchers’ attention recently. Since HBC is a passive signal, the sensing units are mostly designed in a small form factor with low power consumption [149,150]. Wilmsdorff et al. [151] explored this passive capacitive sensing technique in a wide range of indoor and outdoor applications. In [152], the authors presented an HBC-based capacitive sensor for full-body gym exercise recognition and counting; by sensing the local potential variation of the body, different kinds of body actions could be classified. Besides motion sensing, HBC can also be used for proximity and joint activity recognition [147] by exploring the human body capacitance variation caused by the proximity and motion of an intruder.
As a passive motion-sensing approach, systems based on human body capacitance enjoy low cost, low power consumption, portability, and full-body sensing ability. However, although its sensitivity to motion and environmental variation underlies the potential of this variable, it also limits its development: any action, either from the body or from the environment, induces a signal, making it difficult to recognize the source of the signal.
(B)
Infrared
Infrared is electromagnetic radiation with wavelengths longer than those of visible light. The heat energy of objects with a temperature above absolute zero is emitted as electromagnetic radiation, caused by the constant motion of the molecules embodying heat. Electrons jump to a higher energy band when they absorb energy by colliding with other particles, and they release energy in the form of photons when falling back to a lower energy band. A hot molecule moves fast and generates higher frequencies (shorter wavelengths) of electromagnetic waves. The human eye usually cannot sense radiation at infrared wavelengths, but specific electronic sensors can measure it. Sensing the human body’s infrared can deliver information such as body temperature, motion trajectory, etc. Two kinds of sensors are commonly utilized for this purpose: the passive infrared (PIR) sensor and the thermographic camera.
The PIR sensor is designed to measure infrared (IR) light from objects. The term passive indicates that this sensor does not emit energy during the detection process; instead, it detects the energy of infrared radiation emitted by objects. It is widely used in applications ranging from motion detection to automatic lighting. In the field of HAR, PIR has been widely explored for indoor positioning [153,154], device-free activity recognition [155,156], etc. The sensor is widely available on the market at low cost and with low power consumption, and PIR-based systems are privacy-preserving and easy to deploy and maintain. However, a PIR sensor only detects general movement; it gives no information on who or what moved. For that purpose, a thermographic camera for imaging IR is required.
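As an illustration of how little computation PIR motion detection requires, the following hedged sketch flags motion events by thresholding the deviation of a raw analog PIR voltage from its resting baseline; the baseline, threshold, and trace values are hypothetical, not taken from any specific sensor datasheet:

```python
# Minimal sketch of motion-event detection on a raw analog PIR signal.
# A PIR element outputs a voltage deviation when a warm body moves across
# its field of view; thresholding the deviation from the resting baseline
# yields binary motion events. All values below are illustrative.

def detect_motion(samples, baseline=1.65, threshold=0.3):
    """Return per-sample True/False motion flags for a voltage trace."""
    return [abs(v - baseline) > threshold for v in samples]

trace = [1.65, 1.66, 1.64, 2.30, 2.45, 1.70, 1.65]  # synthetic voltages (V)
print(detect_motion(trace))  # [False, False, False, True, True, False, False]
```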
A thermographic camera generates an image from infrared radiation, in contrast to a common camera sensing visible light. Any object with a temperature above absolute zero can be detected by a thermographic camera, and hotter objects emit more radiation, so temperature variations are directly visible in the thermogram. For example, humans and other warm-blooded animals stand out very well against the environment, regardless of whether it is day or night. Thermography has been widely used in medical diagnosis, in the military, etc. In HAR applications, it has been combined with image-processing algorithms for activity detection in residential spaces [157,158], muscle activity evaluation [159], respiration monitoring [160,161], etc. For detection in dark lighting conditions, Uddin et al. [162] applied the OpenPose framework to thermal images to check the possibility of body skeleton extraction; their results show that thermal images can monitor humans in dark environments where typical RGB cameras fail. Although thermographic sensing supplies more detailed information on body action than the PIR approach, it comes at a higher cost and computing load.
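A minimal sketch of the simplest thermographic processing step, segmenting a warm body from the background by temperature thresholding, is given below; the frame and threshold are synthetic and purely illustrative:

```python
# Minimal sketch: segmenting a warm body from a thermogram by temperature
# thresholding, the simplest precursor to the skeleton-extraction pipelines
# discussed above. The frame and threshold are synthetic/illustrative.
import numpy as np

frame = np.full((6, 8), 21.0)  # 6x8 "thermal image" of a 21 C room
frame[2:5, 3:5] = 34.0         # a warm body-like blob at skin temperature

body_mask = frame > 30.0       # pixels warmer than 30 C
print(f"{body_mask.sum()} of {body_mask.size} pixels classified as body")
# Connected-component analysis on body_mask would then isolate individuals.
```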

3.6. Summary

Depending on the targeted application, researchers have explored different sensing modalities to accomplish their HAR tasks. Table 2 summarizes the mainstream sensing modalities and compares them in terms of cost, power consumption level, working type (active or passive), privacy concern, computing load, typical applications, and their critical advantages and shortcomings. We also list some publicly available datasets for each sensing modality, so that readers can inspect the data properties of each sensing technique or try their own mining approaches. Cost and power consumption express the practicability of a sensing modality; the IMU, for example, is a low-cost, low-power approach and the most widely explored one in HAR tasks. Computing load and robustness, ranging from high to low, were ranked with specific references: computations that require large memory (over hundreds of megabytes) and complex instructions (such as floating-point multiplication) are regarded as a high computing load, whereas a low computing load needs only a few instructions per inference on weak devices such as microcontrollers. High robustness indicates that the signal can hardly be interfered with by the surroundings, as with the gravitational field; Bluetooth signal strength, in contrast, is easily affected by variations in the nearby environment. The typical applications list activities coarsely at a high level; activities of daily living (ADL), for instance, include all fundamental everyday actions such as sleeping, eating, and dressing. Positioning includes the location of the whole body as well as body parts such as the hand and fingers. Gesture recognition covers gestures performed by the hand, fingers, arm, etc. The active/passive attribute indicates the complexity of the sensing modality owing to the presence or absence of a signal source. A privacy-respecting sensor does not extract identity-sensitive information from users and is thus more acceptable. Computing load and robustness describe the sensor's working performance and are categorized into three levels: “low”, “medium”, and “high”. Depending on the usage scenario, each sensing modality can be deployed for different tasks among “where”, “what”, and “how”; a passive electric field, for example, can be used for both positioning and action sensing.
The IMU sensor and optical approaches (mainly video-based) are the two most popular sensing modalities in the community: the IMU is pervasively deployed in smart devices and excels in power consumption, cost, and size, while the visual modality supplies high recognition accuracy by benefiting from advanced deep neural network models for feature extraction. Both are applied to a much wider range of human activity recognition tasks than other sensors. However, certain limitations remain; in particular, both suffer from computational load. The vision-based approach especially must process high-resolution, high-frame-rate 2D or 3D data streams with hundreds of thousands of convolutional operations, challenging the hardware resources, so its computational load is high compared with other sensing modalities. Since video images capture abundant identity information, privacy issues also need to be considered. The IMU sensors, in turn, face accumulated errors, which demand a reconfiguration at every new start for positioning applications requiring high accuracy.
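The accumulated-error problem can be illustrated with a short simulation: a small constant accelerometer bias, double-integrated into position, grows quadratically with time. The bias and sampling values below are illustrative assumptions:

```python
# Sketch of why IMU positioning drifts: a tiny constant accelerometer bias,
# double-integrated into position, grows quadratically with time
# (x_err = 0.5 * b * t^2). Bias and rate values are illustrative.

bias = 0.01                # residual accelerometer bias, m/s^2
dt, duration = 0.01, 60.0  # 100 Hz samples over one minute

v = x = t = 0.0
while t < duration:
    v += bias * dt  # first integration: velocity error
    x += v * dt     # second integration: position error
    t += dt

print(f"Position error after {duration:.0f} s: {x:.1f} m")
# ~18 m from a 0.01 m/s^2 bias alone, hence the need for per-start
# recalibration or absolute anchor points (see sensor fusion in Section 4).
```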
Wave-based sensing modalities (RF waves and acoustic waves) are active approaches that demand signal sources from the sensing system and are mainly used for ambient intelligence. The corresponding systems are generally weak in robustness, since the wave signal can be affected by the multipath effect (except for UWB) and environmental noise. However, they are particularly suitable for privacy-sensitive scenarios, since no information beyond the wave properties is collected. The cost and power consumption of such systems are much higher than those of IMU-based solutions, but still lower than those of the visual approach.
Electrophysiological signals (ExG) are mostly perceived by devices with high-resolution analog-to-digital chips for health monitoring, such as mental state, stress level, and sports quality evaluation. The cost of such a system is relatively high compared with the IMU and most field-based approaches. Since the sensing units are mainly chip-level designs, low power consumption is an obvious advantage of these approaches. The computational load of electrophysiological signals depends on the number of channels. For the ECG signal, for example, a simple rule-based approach needing only a few computer instructions can efficiently detect the critical features. The EEG signal, on the other hand, requires more complex algorithms to extract features from multiple channels and uncover the information behind them.
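As an illustration of such a rule-based ECG approach, the following sketch detects R-peaks with an amplitude threshold and a refractory period; the signal and parameter values are synthetic and illustrative, and this is not a clinically validated algorithm:

```python
# Minimal sketch of the "few instructions" rule-based ECG processing
# mentioned above: R-peaks found by amplitude threshold plus a refractory
# period. Signal, threshold, and refractory values are illustrative only.

def detect_r_peaks(ecg, fs_hz, threshold=0.6, refractory_s=0.25):
    """Return sample indices of R-peaks in a normalized ECG trace."""
    peaks, last = [], -10**9
    for i, v in enumerate(ecg):
        if v > threshold and (i - last) / fs_hz > refractory_s:
            peaks.append(i)
            last = i
    return peaks

# Synthetic 1-second trace at 10 Hz with two beat-like spikes
ecg = [0.0, 0.1, 0.9, 0.1, 0.0, 0.0, 0.1, 1.0, 0.2, 0.0]
print(detect_r_peaks(ecg, fs_hz=10))  # [2, 7] -> 0.5 s interval, i.e., 120 bpm
```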
Pressure sensing is versatile, since the sensing unit (mainly composed of conductive layers) is highly customizable. Because the weight signal perceived by such a sensing system is quite straightforward, the detection accuracy for certain human actions is high. However, maintaining such systems is costly because of the deployment complexity and the limited lifetime caused by long-term mechanical stress on the contacts.
The magnetic field is a robust distance-based approach that can deliver reliable distance information with a low computational load. The approach is low-cost and wearable (after miniaturization), without multipath or line-of-sight limitations. More importantly, it can be used for positioning in underwater environments, which block most positioning techniques because their media (such as RF signals) attenuate quickly in water. However, the detection range is limited to a few meters with an active magnetic field and a few decimeters with a passive magnetic field.
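The low computational load of magnetic ranging can be illustrated with the far-field dipole model, where the field magnitude falls off with the cube of distance; the source constant in the sketch below is an illustrative assumption that would in practice be obtained by calibration:

```python
# Sketch of magnetic-field ranging: in the far field of a coil (magnetic
# dipole), field magnitude falls off as 1/r^3, so distance follows from a
# single cube root once the source constant k is calibrated. The value of
# k below is an illustrative assumption, not a specific device's constant.

def distance_from_field(b_tesla, k=1e-7):
    """Invert B = k / r^3 to estimate range r in meters."""
    return (k / b_tesla) ** (1.0 / 3.0)

for b in (1e-7, 1.25e-8, 1e-9):
    print(f"B = {b:.2e} T  ->  r = {distance_from_field(b):.2f} m")
# 1.00 m, 2.00 m, 4.64 m: far cheaper than RF multipath or fingerprint models.
```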
The electric field has recently become a novel sensing approach for HAR tasks, distinguished by its ability to sense full-body motion and environmental electric variations. It also enjoys the advantages of low power consumption and wearability. Since electric charges exist everywhere people live, including on the human body, body motion deforms the distribution of the electric field; human activity can therefore be deduced by perceiving the electric field variation, either on the environmental side or on the body side. However, environmental noise is a major challenge for electric field-based sensing and is hard to overcome because of the pervasiveness of surrounding objects acting as noise sources.

4. Outlooks

HAR covers a wide range of tasks that digitalize daily life, aiming to help people achieve a better quality of life. As its keystone, sensing techniques for HAR tasks are still under intensive development. Based on the most prominent sensing techniques surveyed in this manuscript, we conclude with some outlooks on their future development.
  • Sensor fusion: Sensor fusion has great potential to improve sensing robustness by combining data from different sensors. Each sensor modality has inherent strengths and weaknesses; by merging data from various modalities, a robust data model can be generated. For example, long-term positioning with a high-rate IMU sensor is disturbed by integration errors, which can be addressed by a lower-rate sensor that provides absolute anchor points (such as visual features). Classic and efficient algorithms such as the Kalman filter [180] can be employed for this fusion (see the sketch after this list). As another example, electric field sensors can perceive straightforward proximity information about an individual but are also sensitive to environmental variation, resulting in multi-source issues; by deploying motion sensors such as IMUs on both the individual and the environment, the source of the electric field signal can be recognized. Such fusion approaches not only address the weaknesses of a particular sensing modality but also provide a more holistic view of the system being monitored.
  • Smart sensors: Driven by pervasive practical user scenarios, power-efficient data-processing techniques, and advances in chip manufacturing, there is a clear trend toward sensors becoming smarter, with the ability to process signal data locally. Compared to conventional sensor systems, smart sensors take advantage of emerging machine learning algorithms and modern computer hardware to create sophisticated, intelligent systems tailored to specific sensing applications. In recent years, many smart sensors have been proposed for HAR tasks, such as pedometer-integrated IMUs (BMI270) and gesture-recognition-integrated sensors (PAJ7620). All recognition, classification, and decision processes are executed locally on the smart sensor instead of uploading raw data to the cloud for inference; thus, the user's privacy is well protected, and the computing load of the central processing unit is significantly reduced.
  • Novel sensors: With the development of materials and fabrication technology, novel sensing techniques and devices are emerging to provide a broader ability to perceive the body and the environment in which people live. Novel sensors for HAR offer an alternative or complementary approach to existing solutions; more importantly, they collect knowledge about the body or environment that current sensing techniques cannot supply. One example is a microelectrode-based biosensor proposed for long-term monitoring of sweat glucose levels [181]. This multi-function biosensor is fabricated on a flexible substrate, which offers greater wearing comfort than rigid sensors and thus enables long-term on-skin health monitoring.
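As referenced in the sensor fusion item above, the following minimal one-dimensional Kalman filter sketch fuses a high-rate velocity input (as an IMU would provide) with occasional absolute position fixes (as visual anchor points would provide); all noise parameters and measurement values are illustrative assumptions:

```python
# Minimal 1-D Kalman filter sketch for the IMU + absolute-anchor fusion
# idea above: a high-rate velocity prediction (IMU-derived) is corrected
# by occasional absolute position fixes (e.g., visual anchors).
# All noise parameters and measurements below are illustrative assumptions.

def kf_step(x, p, u, z, q=0.05, r=0.5, dt=0.1):
    """One predict/update cycle for position state x with variance p.
    u: velocity input (m/s); z: absolute position fix (m) or None."""
    # Predict: dead-reckon with the velocity input; uncertainty grows.
    x, p = x + u * dt, p + q
    # Update: blend in the absolute fix when one is available.
    if z is not None:
        k = p / (p + r)  # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p
    return x, p

x, p = 0.0, 1.0
fixes = [None, None, None, 1.1, None, None, None, 2.4]  # sparse anchors
for step, z in enumerate(fixes):
    x, p = kf_step(x, p, u=3.0, z=z)  # constant 3 m/s velocity input
    print(f"t={0.1 * (step + 1):.1f}s  x={x:.2f} m  var={p:.2f}")
```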
Besides that, sensors are becoming more compact and power-efficient to provide always-on monitoring, or they remain active only when triggered by a specific event. Energy harvesting techniques also draw ambient energy to extend sensors' battery life. With the growing number of wearable devices, health-monitoring sensors [182] are being deployed ever closer to the body for continuous, real-time analysis of sweat, blood, etc. Examples include ECG monitoring by smartwatches and stress sensing by the Fitbit smart band, which uses an electrodermal activity skin-response sensor to obtain a reading when the palm of the user's hand presses the metal outer rim, after which the companion app analyzes the overall stress.

5. Conclusions

This work focused on the mainstream sensing techniques for HAR tasks, aiming to supply a concrete understanding of the various sensing principles for younger researchers of the community. We categorized human activities into three classes: where, what, and how, corresponding to body position-related, body action-related, and body status-related services. This task-oriented categorization aims to convey the basic objectives of human activity recognition. We also categorized the HAR-related sensing modalities into five classes: mechanical kinematic sensing, field-based sensing, wave-based sensing, physiological sensing, and hybrid/others, based on the properties of the sensing medium, aiming to give a better understanding of each technique's physical background. Specific sensing modalities were presented in each category with state-of-the-art publications and a discussion of each modality's advantages and limitations. A summary and an outlook on the sensing techniques were also given. We hope this survey helps newcomers gain a better overview of the characteristics of each sensing modality for HAR tasks and choose the proper approaches for their specific applications.

Author Contributions

Manuscript writing: S.B. and M.L.; project administration: P.L.; formal analysis: B.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the European Union project HUMANE-AI-NET (H2020-ICT-2019-3 #952026).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ToF	Time of Flight
RSS	Received Signal Strength
LiDAR	Light Detection and Ranging
UWB	Ultra-wideband
RF	Radio Frequency
ID	Identification
MAE	Mean Absolute Error
IMU	Inertial Measurement Unit
PWM	Pulse-Width Modulation
AUV	Autonomous Underwater Vehicle
SLAM	Simultaneous Localization and Mapping

References

  1. Sigg, S.; Shi, S.; Buesching, F.; Ji, Y.; Wolf, L. Leveraging RF-channel fluctuation for activity recognition: Active and passive systems, continuous and RSSI-based signal features. In Proceedings of the International Conference on Advances in Mobile Computing & Multimedia, Vienna, Austria, 2–4 December 2013; pp. 43–52. [Google Scholar]
  2. Ramos, R.G.; Domingo, J.D.; Zalama, E.; Gómez-García-Bermejo, J. Daily Human Activity Recognition Using Non-Intrusive Sensors. Sensors 2021, 21, 5270. [Google Scholar] [CrossRef] [PubMed]
  3. Samadi, M.R.H.; Cooke, N. EEG signal processing for eye tracking. In Proceedings of the 2014 22nd European Signal Processing Conference (EUSIPCO), Lisbon, Portugal, 1–5 September 2014; pp. 2030–2034. [Google Scholar]
  4. Hussain, Z.; Sheng, Q.Z.; Zhang, W.E. A review and categorization of techniques on device-free human activity recognition. J. Netw. Comput. Appl. 2020, 167, 102738. [Google Scholar] [CrossRef]
  5. Yuan, G.; Wang, Z.; Meng, F.; Yan, Q.; Xia, S. An overview of human activity recognition based on smartphone. Sens. Rev. 2019, 39, 288–306. [Google Scholar] [CrossRef]
  6. Li, Y.; Wang, S.; Jin, C.; Zhang, Y.; Jiang, T. A survey of underwater magnetic induction communications: Fundamental issues, recent advances, and challenges. IEEE Commun. Surv. Tutor. 2019, 21, 2466–2487. [Google Scholar] [CrossRef]
  7. Dang, L.M.; Min, K.; Wang, H.; Piran, M.J.; Lee, C.H.; Moon, H. Sensor-based and vision-based human activity recognition: A comprehensive survey. Pattern Recognit. 2020, 108, 107561. [Google Scholar] [CrossRef]
  8. Fu, B.; Damer, N.; Kirchbuchner, F.; Kuijper, A. Sensing technology for human activity recognition: A comprehensive survey. IEEE Access 2020, 8, 83791–83820. [Google Scholar] [CrossRef]
  9. Raval, R.M.; Prajapati, H.B.; Dabhi, V.K. Survey and analysis of human activity recognition in surveillance videos. Intell. Decis. Technol. 2019, 13, 271–294. [Google Scholar] [CrossRef]
  10. Mohamed, R.; Perumal, T.; Sulaiman, M.N.; Mustapha, N. Multi resident complex activity recognition in smart home: A literature review. Int. J. Smart Home 2017, 11, 21–32. [Google Scholar] [CrossRef]
  11. Bux, A.; Angelov, P.; Habib, Z. Vision based human activity recognition: A review. Adv. Comput. Intell. Syst. 2017, 341–371. [Google Scholar]
  12. Ma, J.; Wang, H.; Zhang, D.; Wang, Y.; Wang, Y. A survey on wi-fi based contactless activity recognition. In Proceedings of the International Conference on Ubiquitous Intelligence & Computing, Advanced and Trusted Computing, Scalable Computing and Communications, Cloud and Big Data Computing, Internet of People and Smart World Congress (UIC/ATC/ScalCom/CBDCom/IoP/SmartWorld), Toulouse, France, 18–21 July 2016; pp. 1086–1091. [Google Scholar]
  13. Lioulemes, A.; Papakostas, M.; Gieser, S.N.; Toutountzi, T.; Abujelala, M.; Gupta, S.; Collander, C.; Mcmurrough, C.D.; Makedon, F. A survey of sensing modalities for human activity, behavior, and physiological monitoring. In Proceedings of the PETRA ’16: 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Corfu Island, Greece, 29 June–1 July 2016; pp. 1–8. [Google Scholar]
  14. Vrigkas, M.; Nikou, C.; Kakadiaris, I.A. A review of human activity recognition methods. Front. Robot. AI 2015, 2, 28. [Google Scholar] [CrossRef] [Green Version]
  15. Mukhopadhyay, S.C. Wearable sensors for human activity monitoring: A review. IEEE Sens. J. 2014, 15, 1321–1330. [Google Scholar] [CrossRef]
  16. Jalal, A.; Sarif, N.; Kim, J.T.; Kim, T.S. Human activity recognition via recognized body parts of human depth silhouettes for residents monitoring services at smart home. Indoor Built Environ. 2013, 22, 271–279. [Google Scholar] [CrossRef]
  17. Javed, A.R.; Faheem, R.; Asim, M.; Baker, T.; Beg, M.O. A smartphone sensors-based personalized human activity recognition system for sustainable smart cities. Sustain. Cities Soc. 2021, 71, 102970. [Google Scholar] [CrossRef]
  18. Li, R.; Chellappa, R.; Zhou, S.K. Learning multi-modal densities on discriminative temporal interaction manifold for group activity recognition. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 2450–2457. [Google Scholar]
  19. Cho, N.G.; Kim, Y.J.; Park, U.; Park, J.S.; Lee, S.W. Group activity recognition with group interaction zone based on relative distance between human objects. Int. J. Pattern Recognit. Artif. Intell. 2015, 29, 1555007. [Google Scholar] [CrossRef]
  20. Brena, R.F.; García-Vázquez, J.P.; Galván-Tejada, C.E.; Muñoz-Rodriguez, D.; Vargas-Rosales, C.; Fangmeyer, J. Evolution of indoor positioning technologies: A survey. J. Sens. 2017, 2017, 2630413. [Google Scholar] [CrossRef]
  21. Fuchs, C.; Aschenbruck, N.; Martini, P.; Wieneke, M. Indoor tracking for mission critical scenarios: A survey. Pervasive Mob. Comput. 2011, 7, 1–15. [Google Scholar] [CrossRef]
  22. Navarro, S.E.; Mühlbacher-Karrer, S.; Alagi, H.; Zangl, H.; Koyama, K.; Hein, B.; Duriez, C.; Smith, J.R. Proximity perception in human-centered robotics: A survey on sensing systems and applications. IEEE Trans. Robot. 2021, 38, 1599–1620. [Google Scholar] [CrossRef]
  23. Mubashir, M.; Shao, L.; Seed, L. A survey on fall detection: Principles and approaches. Neurocomputing 2013, 100, 144–152. [Google Scholar] [CrossRef]
  24. Akhtaruzzaman, M.; Shafie, A.A.; Khan, M.R. Gait analysis: Systems, technologies, and importance. J. Mech. Med. Biol. 2016, 16, 1630003. [Google Scholar] [CrossRef]
  25. Verbunt, J.A.; Huijnen, I.P.; Köke, A. Assessment of physical activity in daily life in patients with musculoskeletal pain. Eur. J. Pain 2009, 13, 231–242. [Google Scholar] [CrossRef] [PubMed]
  26. Wang, Z.; Ho, S.B.; Cambria, E. A review of emotion sensing: Categorization models and algorithms. Multimed. Tools Appl. 2020, 79, 35553–35582. [Google Scholar] [CrossRef]
  27. Arakawa, T. A Review of Heartbeat Detection Systems for Automotive Applications. Sensors 2021, 21, 6112. [Google Scholar] [CrossRef] [PubMed]
  28. Alemdar, H.; Ersoy, C. Wireless sensor networks for healthcare: A survey. Comput. Netw. 2010, 54, 2688–2710. [Google Scholar] [CrossRef]
  29. Oguntala, G.A.; Abd-Alhameed, R.A.; Ali, N.T.; Hu, Y.F.; Noras, J.M.; Eya, N.N.; Elfergani, I.; Rodriguez, J. SmartWall: Novel RFID-enabled ambient human activity recognition using machine learning for unobtrusive health monitoring. IEEE Access 2019, 7, 68022–68033. [Google Scholar] [CrossRef]
  30. Gaglio, S.; Re, G.L.; Morana, M. Human activity recognition process using 3-D posture data. IEEE Trans. Hum.-Mach. Syst. 2014, 45, 586–597. [Google Scholar] [CrossRef]
  31. Jiang, S.; Gao, Q.; Liu, H.; Shull, P.B. A novel, co-located EMG-FMG-sensing wearable armband for hand gesture recognition. Sens. Actuators A Phys. 2020, 301, 111738. [Google Scholar] [CrossRef]
  32. Abdalla, H.E.A. Hand Gesture Recognition Based on Time-of-Flight Sensors. Ph.D. Thesis, Politecnico di Torino, Turin, Italy, 2021. [Google Scholar]
  33. Nahler, C.; Plank, H.; Steger, C.; Druml, N. Resource-Constrained Human Presence Detection for Indirect Time-of-Flight Sensors. In Proceedings of the 2021 Digital Image Computing: Techniques and Applications (DICTA), Gold Coast, Australia, 29 November–1 December 2021; pp. 1–5. [Google Scholar]
  34. Hossen, M.A.; Zahir, E.; Ata-E-Rabbi, H.; Azam, M.A.; Rahman, M.H. Developing a Mobile Automated Medical Assistant for Hospitals in Bangladesh. In Proceedings of the 2021 IEEE World AI IoT Congress (AIIoT), Seattle, WA, USA, 10–13 May 2021; pp. 366–372. [Google Scholar]
  35. Lin, J.T.; Newquist, C.; Harnett, C. Multitouch Pressure Sensing with Soft Optical Time-of-Flight Sensors. IEEE Trans. Instrum. Meas. 2022, 71, 7000708. [Google Scholar] [CrossRef]
  36. Bortolan, G.; Christov, I.; Simova, I. Potential of Rule-Based Methods and Deep Learning Architectures for ECG Diagnostics. Diagnostics 2021, 11, 1678. [Google Scholar] [CrossRef] [PubMed]
  37. Ismail, M.I.M.; Dzyauddin, R.A.; Samsul, S.; Azmi, N.A.; Yamada, Y.; Yakub, M.F.M.; Salleh, N.A.B.A. An RSSI-based Wireless Sensor Node Localisation using Trilateration and Multilateration Methods for Outdoor Environment. arXiv 2019, arXiv:1912.07801. [Google Scholar]
  38. Chen, K.; Zhang, D.; Yao, L.; Guo, B.; Yu, Z.; Liu, Y. Deep learning for sensor-based human activity recognition: Overview, challenges, and opportunities. ACM Comput. Surv. (CSUR) 2021, 54, 1–40. [Google Scholar] [CrossRef]
  39. Aggarwal, J.K.; Cai, Q. Human motion analysis: A review. Comput. Vis. Image Underst. 1999, 73, 428–440. [Google Scholar] [CrossRef]
  40. Kellokumpu, V.; Pietikäinen, M.; Heikkilä, J. Human activity recognition using sequences of postures. In Proceedings of the MVA, Tsukuba, Japan, 16–18 May 2005; pp. 570–573. [Google Scholar]
  41. Münzner, S.; Schmidt, P.; Reiss, A.; Hanselmann, M.; Stiefelhagen, R.; Dürichen, R. CNN-based sensor fusion techniques for multimodal human activity recognition. In Proceedings of the 2017 ACM International Symposium on Wearable Computers, Maui, HI, USA, 11–15 September 2017; pp. 158–165. [Google Scholar]
  42. Yang, X.; Cao, R.; Zhou, M.; Xie, L. Temporal-frequency attention-based human activity recognition using commercial WiFi devices. IEEE Access 2020, 8, 137758–137769. [Google Scholar] [CrossRef]
  43. Köping, L.; Shirahama, K.; Grzegorzek, M. A general framework for sensor-based human activity recognition. Comput. Biol. Med. 2018, 95, 248–260. [Google Scholar] [CrossRef] [PubMed]
  44. Bustoni, I.A.; Hidayatulloh, I.; Ningtyas, A.; Purwaningsih, A.; Azhari, S. Classification methods performance on human activity recognition. J. Phys. Conf. Ser. 2020, 1456, 012027. [Google Scholar] [CrossRef] [Green Version]
  45. Gjoreski, H.; Kiprijanovska, I.; Stankoski, S.; Kalabakov, S.; Broulidakis, J.; Nduka, C.; Gjoreski, M. Head-AR: Human Activity Recognition with Head-Mounted IMU Using Weighted Ensemble Learning. In Activity and Behavior Computing; Springer: Singapore, 2021; pp. 153–167. [Google Scholar]
  46. Röddiger, T.; Wolffram, D.; Laubenstein, D.; Budde, M.; Beigl, M. Towards respiration rate monitoring using an in-ear headphone inertial measurement unit. In Proceedings of the 1st International Workshop on Earable Computing, London, UK, 9 September 2019; pp. 48–53. [Google Scholar]
  47. Kim, M.; Cho, J.; Lee, S.; Jung, Y. IMU sensor-based hand gesture recognition for human-machine interfaces. Sensors 2019, 19, 3827. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Georgi, M.; Amma, C.; Schultz, T. Recognizing Hand and Finger Gestures with IMU based Motion and EMG based Muscle Activity Sensing. In Biosignals; Citeseer: Princeton, NJ, USA, 2015; pp. 99–108. [Google Scholar]
  49. Mummadi, C.K.; Leo, F.P.P.; Verma, K.D.; Kasireddy, S.; Scholl, P.M.; Kempfle, J.; Laerhoven, K.V. Real-time and embedded detection of hand gestures with an IMU-based glove. Informatics 2018, 5, 28. [Google Scholar] [CrossRef] [Green Version]
  50. Kang, S.W.; Choi, H.; Park, H.I.; Choi, B.G.; Im, H.; Shin, D.; Jung, Y.G.; Lee, J.Y.; Park, H.W.; Park, S.; et al. The development of an IMU integrated clothes for postural monitoring using conductive yarn and interconnecting technology. Sensors 2017, 17, 2560. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  51. Zhang, Q. Evaluation of a Wearable System for Motion Analysis during Different Exercises. Master’s Thesis, ING School of Industrial and Information Engineering, Milano, Italy, 2019. Available online: https://www.politesi.polimi.it/handle/10589/149029 (accessed on 30 April 2022).
  52. Wang, Q.; Timmermans, A.; Chen, W.; Jia, J.; Ding, L.; Xiong, L.; Rong, J.; Markopoulos, P. Stroke patients’ acceptance of a smart garment for supporting upper extremity rehabilitation. IEEE J. Transl. Eng. Health Med. 2018, 6, 1–9. [Google Scholar] [CrossRef] [PubMed]
  53. Kim, H.; Kang, Y.; Valencia, D.R.; Kim, D. An Integrated System for Gait Analysis Using FSRs and an IMU. In Proceedings of the 2018 Second IEEE International Conference on Robotic Computing (IRC), Laguna Hills, CA, USA, 31 January–2 February 2018; pp. 347–351. [Google Scholar]
  54. Abdulrahim, K.; Hide, C.; Moore, T.; Hill, C. Aiding MEMS IMU with building heading for indoor pedestrian navigation. In Proceedings of the 2010 Ubiquitous Positioning Indoor Navigation and Location Based Service, Kirkkonummi, Finland, 14–15 October 2010; pp. 1–6. [Google Scholar]
  55. Wahjudi, F.; Lin, F.J. IMU-Based Walking Workouts Recognition. In Proceedings of the 2019 IEEE 5th World Forum on Internet of Things (WF-IoT), Limerick, Ireland, 15–18 April 2019; pp. 251–256. [Google Scholar]
  56. Nagano, H.; Begg, R.K. Shoe-insole technology for injury prevention in walking. Sensors 2018, 18, 1468. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  57. Cesareo, A.; Previtali, Y.; Biffi, E.; Aliverti, A. Assessment of breathing parameters using an inertial measurement unit (IMU)-based system. Sensors 2019, 19, 88. [Google Scholar] [CrossRef] [Green Version]
  58. Huang, Y.; Kaufmann, M.; Aksan, E.; Black, M.J.; Hilliges, O.; Pons-Moll, G. Deep inertial poser: Learning to reconstruct human pose from sparse inertial measurements in real time. ACM Trans. Graph. (TOG) 2018, 37, 1–15. [Google Scholar] [CrossRef] [Green Version]
  59. Younas, J.; Margarito, H.; Bian, S.; Lukowicz, P. Finger Air Writing-Movement Reconstruction with Low-cost IMU Sensor. In Proceedings of the MobiQuitous 2020—17th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, Online, 7–9 December 2020; pp. 69–75. [Google Scholar]
  60. Li, H.; He, X.; Chen, X.; Fang, Y.; Fang, Q. Wi-motion: A robust human activity recognition using WiFi signals. IEEE Access 2019, 7, 153287–153299. [Google Scholar] [CrossRef]
  61. Liu, X.; Cao, J.; Tang, S.; Wen, J. Wi-sleep: Contactless sleep monitoring via wifi signals. In Proceedings of the 2014 IEEE Real-Time Systems Symposium, Rome, Italy, 2–5 December 2014; pp. 346–355. [Google Scholar]
  62. Wang, F.; Feng, J.; Zhao, Y.; Zhang, X.; Zhang, S.; Han, J. Joint activity recognition and indoor localization with WiFi fingerprints. IEEE Access 2019, 7, 80058–80068. [Google Scholar] [CrossRef]
  63. Bahle, G.; Fortes Rey, V.; Bian, S.; Bello, H.; Lukowicz, P. Using privacy respecting sound analysis to improve bluetooth based proximity detection for COVID-19 exposure tracing and social distancing. Sensors 2021, 21, 5604. [Google Scholar] [CrossRef]
  64. Hossain, A.M.; Soh, W.S. A comprehensive study of bluetooth signal parameters for localization. In Proceedings of the 2007 IEEE 18th International Symposium on Personal, Indoor and Mobile Radio Communications, Athens, Greece, 3–7 September 2007; pp. 1–5. [Google Scholar]
  65. Zhang, R.; Cao, S. Real-time human motion behavior detection via CNN using mmWave radar. IEEE Sens. Lett. 2018, 3, 1–4. [Google Scholar] [CrossRef]
  66. Zhao, P.; Lu, C.X.; Wang, B.; Chen, C.; Xie, L.; Wang, M.; Trigoni, N.; Markham, A. Heart rate sensing with a robot mounted mmwave radar. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 2812–2818. [Google Scholar]
  67. Sun, Y.; Hang, R.; Li, Z.; Jin, M.; Xu, K. Privacy-Preserving Fall Detection with Deep Learning on mmWave Radar Signal. In Proceedings of the 2019 IEEE Visual Communications and Image Processing (VCIP), Sydney, Australia, 1–4 December 2019; pp. 1–4. [Google Scholar]
  68. Li, Z.; Lei, Z.; Yan, A.; Solovey, E.; Pahlavan, K. ThuMouse: A micro-gesture cursor input through mmWave radar-based interaction. In Proceedings of the 2020 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 4–6 January 2020; pp. 1–9. [Google Scholar]
  69. Cheng, Y.; Zhou, T. UWB indoor positioning algorithm based on TDOA technology. In Proceedings of the 2019 10th International Conference on Information Technology in Medicine and Education (ITME), Qingdao, China, 23–25 August 2019; pp. 777–782. [Google Scholar]
  70. Lee, S.; Kim, J.; Moon, N. Random forest and WiFi fingerprint-based indoor location recognition system using smart watch. Hum.-Centric Comput. Inf. Sci. 2019, 9, 1–14. [Google Scholar] [CrossRef]
  71. Porcino, D.; Hirt, W. Ultra-wideband radio technology: Potential and challenges ahead. IEEE Commun. Mag. 2003, 41, 66–74. [Google Scholar] [CrossRef] [Green Version]
  72. Bouchard, K.; Maitre, J.; Bertuglia, C.; Gaboury, S. Activity recognition in smart homes using UWB radars. Procedia Comput. Sci. 2020, 170, 10–17. [Google Scholar] [CrossRef]
  73. Ren, N.; Quan, X.; Cho, S.H. Algorithm for gesture recognition using an IR-UWB radar sensor. J. Comput. Commun. 2016, 4, 95–100. [Google Scholar] [CrossRef] [Green Version]
  74. Piriyajitakonkij, M.; Warin, P.; Lakhan, P.; Leelaarporn, P.; Kumchaiseemak, N.; Suwajanakorn, S.; Pianpanit, T.; Niparnan, N.; Mukhopadhyay, S.C.; Wilaiprasitporn, T. SleepPoseNet: Multi-view learning for sleep postural transition recognition using UWB. IEEE J. Biomed. Health Inform. 2020, 25, 1305–1314. [Google Scholar] [CrossRef]
  75. Bharadwaj, R.; Parini, C.; Koul, S.K.; Alomainy, A. Effect of Limb Movements on Compact UWB Wearable Antenna Radiation Performance for Healthcare Monitoring. Prog. Electromagn. Res. 2019, 91, 15–26. [Google Scholar] [CrossRef] [Green Version]
  76. Qi, J.; Liu, G.P. A robust high-accuracy ultrasound indoor positioning system based on a wireless sensor network. Sensors 2017, 17, 2554. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  77. Hoeflinger, F.; Saphala, A.; Schott, D.J.; Reindl, L.M.; Schindelhauer, C. Passive indoor-localization using echoes of ultrasound signals. In Proceedings of the 2019 International Conference on Advanced Information Technologies (ICAIT), Yangon, Myanmar, 6–7 November 2019; pp. 60–65. [Google Scholar]
  78. Yang, X.; Sun, X.; Zhou, D.; Li, Y.; Liu, H. Towards wearable A-mode ultrasound sensing for real-time finger motion recognition. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 1199–1208. [Google Scholar] [CrossRef] [Green Version]
  79. Mokhtari, G.; Zhang, Q.; Nourbakhsh, G.; Ball, S.; Karunanithi, M. BLUESOUND: A new resident identification sensor—Using ultrasound array and BLE technology for smart home platform. IEEE Sens. J. 2017, 17, 1503–1512. [Google Scholar] [CrossRef]
  80. Rossi, M.; Feese, S.; Amft, O.; Braune, N.; Martis, S.; Tröster, G. AmbientSense: A real-time ambient sound recognition system for smartphones. In Proceedings of the 2013 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops), San Diego, CA, USA, 18–22 March 2013; pp. 230–235. [Google Scholar]
  81. Garg, S.; Lim, K.M.; Lee, H.P. An averaging method for accurately calibrating smartphone microphones for environmental noise measurement. Appl. Acoust. 2019, 143, 222–228. [Google Scholar] [CrossRef]
  82. Thiel, B.; Kloch, K.; Lukowicz, P. Sound-based proximity detection with mobile phones. In Proceedings of the Third International Workshop on Sensing Applications on Mobile Phones, Toronto, ON, Canada, 6 November 2012; pp. 1–4. [Google Scholar]
  83. Ward, J.A.; Lukowicz, P.; Troster, G.; Starner, T.E. Activity recognition of assembly tasks using body-worn microphones and accelerometers. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 1553–1567. [Google Scholar] [CrossRef] [Green Version]
  84. Murata, S.; Yara, C.; Kaneta, K.; Ioroi, S.; Tanaka, H. Accurate indoor positioning system using near-ultrasonic sound from a smartphone. In Proceedings of the 2014 Eighth International Conference on Next Generation Mobile Apps, Services and Technologies, Oxford, UK, 10–12 September 2014; pp. 13–18. [Google Scholar]
  85. Rossi, M.; Seiter, J.; Amft, O.; Buchmeier, S.; Tröster, G. RoomSense: An indoor positioning system for smartphones using active sound probing. In Proceedings of the 4th Augmented Human International Conference, Stuttgart, Germany, 7–8 March 2013; pp. 89–95. [Google Scholar]
  86. Sathyamoorthy, A.J.; Patel, U.; Savle, Y.A.; Paul, M.; Manocha, D. COVID-robot: Monitoring social distancing constraints in crowded scenarios. arXiv 2020, arXiv:2008.06585. [Google Scholar]
  87. Lee, Y.H.; Medioni, G. Wearable RGBD indoor navigation system for the blind. In Proceedings of the European Conference on Computer Vision, Zurich, Switzerland, 6–12 September 2014; pp. 493–508. [Google Scholar]
  88. Kim, K.; Kim, J.; Choi, J.; Kim, J.; Lee, S. Depth camera-based 3D hand gesture controls with immersive tactile feedback for natural mid-air gesture interactions. Sensors 2015, 15, 1022–1046. [Google Scholar] [CrossRef] [Green Version]
  89. Nagarkoti, A.; Teotia, R.; Mahale, A.K.; Das, P.K. Realtime indoor workout analysis using machine learning & computer vision. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 1440–1443. [Google Scholar]
  90. Li, M.; Leung, H. Multi-view depth-based pairwise feature learning for person-person interaction recognition. Multimed. Tools Appl. 2019, 78, 5731–5749. [Google Scholar] [CrossRef]
  91. Pancholi, S.; Agarwal, R. Development of low cost EMG data acquisition system for Arm Activities Recognition. In Proceedings of the 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Jaipur, India, 21–24 September 2016; pp. 2465–2469. [Google Scholar]
  92. Bangaru, S.S.; Wang, C.; Busam, S.A.; Aghazadeh, F. ANN-based automated scaffold builder activity recognition through wearable EMG and IMU sensors. Autom. Constr. 2021, 126, 103653. [Google Scholar] [CrossRef]
  93. Zhang, X.; Chen, X.; Li, Y.; Lantz, V.; Wang, K.; Yang, J. A framework for hand gesture recognition based on accelerometer and EMG sensors. IEEE Trans. Syst. Man Cybern.-Part A Syst. Humans 2011, 41, 1064–1076. [Google Scholar] [CrossRef]
  94. Kim, J.; Mastnik, S.; André, E. EMG-based hand gesture recognition for realtime biosignal interfacing. In Proceedings of the 13th International Conference on Intelligent User Interfaces, Gran Canaria, Spain, 2008; pp. 30–39. [Google Scholar]
  95. Benatti, S.; Farella, E.; Benini, L. Towards EMG control interface for smart garments. In Proceedings of the 2014 ACM International Symposium on Wearable Computers, Seattle, WA, USA, 13–17 September 2014; pp. 163–170. [Google Scholar]
  96. Ahsan, M.R.; Ibrahimy, M.I.; Khalifa, O.O. EMG signal classification for human computer interaction: A review. Eur. J. Sci. Res. 2009, 33, 480–501. [Google Scholar]
  97. Jia, R.; Liu, B. Human daily activity recognition by fusing accelerometer and multi-lead ECG data. In Proceedings of the 2013 IEEE International Conference on Signal Processing, Communication and Computing (ICSPCC 2013), KunMing, China, 5–8 August 2013; pp. 1–4. [Google Scholar]
  98. Liu, J.; Chen, J.; Jiang, H.; Jia, W.; Lin, Q.; Wang, Z. Activity recognition in wearable ECG monitoring aided by accelerometer data. In Proceedings of the 2018 IEEE international symposium on circuits and systems (ISCAS), Florence, Italy, 27–30 May 2018; pp. 1–4. [Google Scholar]
  99. Zhang, X.; Yao, L.; Zhang, D.; Wang, X.; Sheng, Q.Z.; Gu, T. Multi-person brain activity recognition via comprehensive EEG signal analysis. In Proceedings of the 14th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, Melbourne, Australia, 7–10 November 2017; pp. 28–37. [Google Scholar]
  100. Kaur, B.; Singh, D.; Roy, P.P. Eyes open and eyes close activity recognition using EEG signals. In Proceedings of the International Conference on Cognitive Computing and Information Processing, Bengaluru, India, 15–16 December 2017; pp. 3–9. [Google Scholar]
  101. Liu, Y.; Sourina, O.; Nguyen, M.K. Real-time EEG-based human emotion recognition and visualization. In Proceedings of the 2010 International Conference on Cyberworlds, Singapore, 20–22 October 2010; pp. 262–269. [Google Scholar]
  102. Ishimaru, S.; Kunze, K.; Uema, Y.; Kise, K.; Inami, M.; Tanaka, K. Smarter eyewear: Using commercial EOG glasses for activity recognition. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, Seattle, WA, USA, 13–17 September 2014; pp. 239–242. [Google Scholar]
  103. Lu, Y.; Zhang, C.; Zhou, B.Y.; Gao, X.P.; Lv, Z. A dual model approach to EOG-based human activity recognition. Biomed. Signal Process. Control 2018, 45, 50–57. [Google Scholar] [CrossRef]
  104. Palatini, P. Blood pressure behaviour during physical activity. Sport. Med. 1988, 5, 353–374. [Google Scholar] [CrossRef] [PubMed]
  105. Lu, K.; Yang, L.; Seoane, F.; Abtahi, F.; Forsman, M.; Lindecrantz, K. Fusion of heart rate, respiration and motion measurements from a wearable sensor system to enhance energy expenditure estimation. Sensors 2018, 18, 3092. [Google Scholar] [CrossRef] [Green Version]
  106. Brouwer, A.M.; van Dam, E.; Van Erp, J.B.; Spangler, D.P.; Brooks, J.R. Improving real-life estimates of emotion based on heart rate: A perspective on taking metabolic heart rate into account. Front. Hum. Neurosci. 2018, 12, 284. [Google Scholar] [CrossRef] [Green Version]
  107. Li, W.; Yang, X.; Dai, A.; Chen, K. Sleep and wake classification based on heart rate and respiration rate. IOP Conf. Ser. Mater. Sci. Eng. 2018, 428, 012017. [Google Scholar] [CrossRef]
  108. cheol Jeong, I.; Bychkov, D.; Searson, P.C. Wearable devices for precision medicine and health state monitoring. IEEE Trans. Biomed. Eng. 2018, 66, 1242–1258. [Google Scholar] [CrossRef]
  109. Tateno, S.; Guan, X.; Cao, R.; Qu, Z. Development of drowsiness detection system based on respiration changes using heart rate monitoring. In Proceedings of the 2018 57th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), Nara, Japan, 11–14 September 2018; pp. 1664–1669. [Google Scholar]
  110. Lee, M. A Lip-reading Algorithm Using Optical Flow and Properties of Articulatory Phonation. J. Korea Multimed. Soc. 2018, 21, 745–754. [Google Scholar]
  111. Gomez-Vilda, P.; Palacios-Alonso, D.; Rodellar-Biarge, V.; Álvarez-Marquina, A.; Nieto-Lluis, V.; Martínez-Olalla, R. Parkinson’s disease monitoring by biomechanical instability of phonation. Neurocomputing 2017, 255, 3–16. [Google Scholar] [CrossRef]
  112. Benalcázar, M.E.; Motoche, C.; Zea, J.A.; Jaramillo, A.G.; Anchundia, C.E.; Zambrano, P.; Segura, M.; Palacios, F.B.; Pérez, M. Real-time hand gesture recognition using the Myo armband and muscle activity detection. In Proceedings of the 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM), Salinas, Ecuador, 16–20 October 2017; pp. 1–6. [Google Scholar]
  113. Li, X.; Hong, K.; Liu, G. Detection of physical stress using facial muscle activity. J. Opt. Technol. 2018, 85, 562–569. [Google Scholar] [CrossRef]
  114. Caulcrick, C.; Russell, F.; Wilson, S.; Sawade, C.; Vaidyanathan, R. Unilateral Inertial and Muscle Activity Sensor Fusion for Gait Cycle Progress Estimation. In Proceedings of the 2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), Enschede, The Netherlands, 26–29 August 2018; pp. 1151–1156. [Google Scholar]
  115. Haider, F.; Albert, P.; Luz, S. Automatic Recognition of Low-Back Chronic Pain Level and Protective Movement Behaviour using Physical and Muscle Activity Information. In Proceedings of the 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020), Buenos Aires, Argentina, 16–20 November 2020; pp. 834–838. [Google Scholar]
  116. Zhang, Y.; Yang, C.; Hudson, S.E.; Harrison, C.; Sample, A. Wall++ room-scale interactive and context-aware sensing. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–15. [Google Scholar]
  117. Cheng, J.; Amft, O.; Lukowicz, P. Active capacitive sensing: Exploring a new wearable sensing modality for activity recognition. In Proceedings of the International Conference on Pervasive Computing, Helsinki, Finland, 17–20 May 2010; pp. 319–336. [Google Scholar]
  118. Zhang, Y.; Laput, G.; Harrison, C. Electrick: Low-cost touch sensing using electric field tomography. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1–14. [Google Scholar]
  119. Valtonen, M.; Maentausta, J.; Vanhala, J. Tiletrack: Capacitive human tracking using floor tiles. In Proceedings of the 2009 IEEE International Conference on Pervasive Computing and Communications, Galveston, TX, USA, 9–13 March 2009; pp. 1–10. [Google Scholar]
  120. Noble, D.J.; MacDowell, C.J.; McKinnon, M.L.; Neblett, T.I.; Goolsby, W.N.; Hochman, S. Use of electric field sensors for recording respiration, heart rate, and stereotyped motor behaviors in the rodent home cage. J. Neurosci. Methods 2017, 277, 88–100. [Google Scholar] [CrossRef] [PubMed]
  121. Wong, W.; Juwono, F.H.; Khoo, B.T.T. Multi-Features Capacitive Hand Gesture Recognition Sensor: A Machine Learning Approach. IEEE Sens. J. 2021, 21, 8441–8450. [Google Scholar] [CrossRef]
  122. Chen, K.Y.; Lyons, K.; White, S.; Patel, S. uTrack: 3D input using two magnetic sensors. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, St. Andrews, UK, 8–11 October 2013; pp. 237–244. [Google Scholar]
  123. Reyes, G.; Wu, J.; Juneja, N.; Goldshtein, M.; Edwards, W.K.; Abowd, G.D.; Starner, T. Synchrowatch: One-handed synchronous smartwatch gestures using correlation and magnetic sensing. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 1, 1–26. [Google Scholar] [CrossRef]
  124. Lyons, K. 2D input for virtual reality enclosures with magnetic field sensing. In Proceedings of the 2016 ACM International Symposium on Wearable Computers, Heidelberg, Germany, 12–16 September 2016; pp. 176–183. [Google Scholar]
  125. Pirkl, G.; Lukowicz, P. Robust, low cost indoor positioning using magnetic resonant coupling. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 431–440. [Google Scholar]
  126. Parizi, F.S.; Whitmire, E.; Patel, S. Auraring: Precise electromagnetic finger tracking. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2019, 3, 1–28. [Google Scholar] [CrossRef]
  127. Huang, J.; Mori, T.; Takashima, K.; Hashi, S.; Kitamura, Y. IM6D: Magnetic tracking system with 6-DOF passive markers for dexterous 3D interaction and motion. ACM Trans. Graph. (TOG) 2015, 34, 1–10. [Google Scholar] [CrossRef]
  128. Bian, S.; Zhou, B.; Lukowicz, P. Social distance monitor with a wearable magnetic field proximity sensor. Sensors 2020, 20, 5101. [Google Scholar] [CrossRef]
  129. Bian, S.; Zhou, B.; Bello, H.; Lukowicz, P. A wearable magnetic field based proximity sensing system for monitoring COVID-19 social distancing. In Proceedings of the 2020 International Symposium on Wearable Computers, Virtual Event, 12–16 September 2020; pp. 22–26. [Google Scholar]
  130. Amft, O.; González, L.I.L.; Lukowicz, P.; Bian, S.; Burggraf, P. Wearables to fight COVID-19: From symptom tracking to contact tracing. IEEE Pervasive Comput. 2020, 19, 53–60. [Google Scholar] [CrossRef]
  131. Bian, S.; Hevesi, P.; Christensen, L.; Lukowicz, P. Induced Magnetic Field-Based Indoor Positioning System for Underwater Environments. Sensors 2021, 21, 2218. [Google Scholar] [CrossRef]
  132. Kindratenko, V.V.; Sherman, W.R. Neural network-based calibration of electromagnetic tracking systems. Virtual Real. 2005, 9, 70–78. [Google Scholar] [CrossRef]
  133. Shu, L.; Hua, T.; Wang, Y.; Li, Q.; Feng, D.D.; Tao, X. In-shoe plantar pressure measurement and analysis system based on fabric pressure sensing array. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 767–775. [Google Scholar] [PubMed] [Green Version]
  134. Zhou, B.; Sundholm, M.; Cheng, J.; Cruz, H.; Lukowicz, P. Measuring muscle activities during gym exercises with textile pressure mapping sensors. Pervasive Mob. Comput. 2017, 38, 331–345. [Google Scholar] [CrossRef]
  135. Kaddoura, Y.; King, J.; Helal, A. Cost-precision tradeoffs in unencumbered floor-based indoor location tracking. In Proceedings of the Third International Conference On Smart Homes and Health Telematic (ICOST), Montreal, QC, Canada, 4–6 July 2005. [Google Scholar]
  136. Nakane, H.; Toyama, J.; Kudo, M. Fatigue detection using a pressure sensor chair. In Proceedings of the 2011 IEEE International Conference on Granular Computing, Taiwan, China, 8–10 November 2011; pp. 490–495. [Google Scholar]
  137. Goetschius, J.; Feger, M.A.; Hertel, J.; Hart, J.M. Validating center-of-pressure balance measurements using the MatScan® pressure mat. J. Sport Rehabil. 2018, 27, 1–14. [Google Scholar] [CrossRef] [PubMed]
  138. Aliau-Bonet, C.; Pallàs-Areny, R. A fast method to estimate body capacitance to ground. In Proceedings of the XX IMEKO World Congress 2012, Busan, Korea, 9–14 September 2012; pp. 1–4. [Google Scholar]
  139. Aliau-Bonet, C.; Pallas-Areny, R. A novel method to estimate body capacitance to ground at mid frequencies. IEEE Trans. Instrum. Meas. 2013, 62, 2519–2525. [Google Scholar] [CrossRef]
  140. Buller, W.; Wilson, B. Measurement and modeling mutual capacitance of electrical wiring and humans. IEEE Trans. Instrum. Meas. 2006, 55, 1519–1522. [Google Scholar] [CrossRef]
  141. Bian, S.; Lukowicz, P. A Systematic Study of the Influence of Various User Specific and Environmental Factors on Wearable Human Body Capacitance Sensing. In Proceedings of the EAI International Conference on Body Area Networks, Virtual Event, 25–26 October 2021; pp. 247–274. [Google Scholar]
  142. Goad, N.; Gawkrodger, D. Ambient humidity and the skin: The impact of air humidity in healthy and diseased states. J. Eur. Acad. Dermatol. Venereol. 2016, 30, 1285–1294. [Google Scholar] [CrossRef] [PubMed]
  143. Egawa, M.; Oguri, M.; Kuwahara, T.; Takahashi, M. Effect of exposure of human skin to a dry environment. Skin Res. Technol. 2002, 8, 212–218. [Google Scholar] [CrossRef] [PubMed]
  144. Jonassen, N. Human body capacitance: Static or dynamic concept? [ESD]. In Proceedings of the Electrical Overstress/Electrostatic Discharge Symposium Proceedings 1998 (Cat. No. 98TH8347), Reno, NV, USA, 6–8 October 1998; pp. 111–117. [Google Scholar]
  145. Bian, S.; Yuan, S.; Rey, V.F.; Lukowicz, P. Using human body capacitance sensing to monitor leg motion dominated activities with a wrist worn device. In Proceedings of the Sensor-and Video-Based Activity and Behavior Computing: 3rd International Conference on Activity and Behavior Computing (ABC 2021), Virtual Event, 20–22 October 2021; p. 81. [Google Scholar]
  146. Cohn, G.; Morris, D.; Patel, S.; Tan, D. Humantenna: Using the body as an antenna for real-time whole-body interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 1901–1910. [Google Scholar]
  147. Bian, S.; Rey, V.F.; Younas, J.; Lukowicz, P. Wrist-Worn Capacitive Sensor for Activity and Physical Collaboration Recognition. In Proceedings of the 2019 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Kyoto, Japan, 11–15 March 2019; pp. 261–266. [Google Scholar]
  148. Bian, S.; Lukowicz, P. Capacitive Sensing Based On-board Hand Gesture Recognition with TinyML. In Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, Virtual, 21–26 September 2021; pp. 4–5.
  149. Cohn, G.; Gupta, S.; Lee, T.J.; Morris, D.; Smith, J.R.; Reynolds, M.S.; Tan, D.S.; Patel, S.N. An ultra-low-power human body motion sensor using static electric field sensing. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 99–102. [Google Scholar]
  150. Pouryazdan, A.; Prance, R.J.; Prance, H.; Roggen, D. Wearable electric potential sensing: A new modality sensing hair touch and restless leg movement. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, Heidelberg, Germany, 12–16 September 2016; pp. 846–850. [Google Scholar]
  151. von Wilmsdorff, J.; Kirchbuchner, F.; Fu, B.; Braun, A.; Kuijper, A. An exploratory study on electric field sensing. In Proceedings of the European Conference on Ambient Intelligence, Malaga, Spain, 26–28 April 2017; pp. 247–262. [Google Scholar]
  152. Bian, S.; Rey, V.F.; Hevesi, P.; Lukowicz, P. Passive Capacitive based Approach for Full Body Gym Workout Recognition and Counting. In Proceedings of the 2019 IEEE International Conference on Pervasive Computing and Communications (PerCom), Kyoto, Japan, 11–15 March 2019; pp. 1–10. [Google Scholar]
  153. Yang, D.; Xu, B.; Rao, K.; Sheng, W. Passive infrared (PIR)-based indoor position tracking for smart homes using accessibility maps and a-star algorithm. Sensors 2018, 18, 332. [Google Scholar] [CrossRef] [Green Version]
  154. Yang, B.; Luo, J.; Liu, Q. A novel low-cost and small-size human tracking system with pyroelectric infrared sensor mesh network. Infrared Phys. Technol. 2014, 63, 147–156. [Google Scholar] [CrossRef]
  155. Kashimoto, Y.; Hata, K.; Suwa, H.; Fujimoto, M.; Arakawa, Y.; Shigezumi, T.; Komiya, K.; Konishi, K.; Yasumoto, K. Low-cost and device-free activity recognition system with energy harvesting PIR and door sensors. In Proceedings of the Adjunct Proceedings of the 13th International Conference on Mobile and Ubiquitous Systems: Computing Networking and Services, Hiroshima, Japan, 28 November–1 December 2016; pp. 6–11. [Google Scholar]
  156. Kashimoto, Y.; Fujiwara, M.; Fujimoto, M.; Suwa, H.; Arakawa, Y.; Yasumoto, K. ALPAS: Analog-PIR-sensor-based activity recognition system in smarthome. In Proceedings of the 2017 IEEE 31st International Conference on Advanced Information Networking and Applications (AINA), Taiwan, China, 27–29 March 2017; pp. 880–885. [Google Scholar]
  157. Naik, K.; Pandit, T.; Naik, N.; Shah, P. Activity Recognition in Residential Spaces with Internet of Things Devices and Thermal Imaging. Sensors 2021, 21, 988. [Google Scholar] [CrossRef] [PubMed]
  158. Hossen, J.; Jacobs, E.L.; Chowdhury, F.K. Activity recognition in thermal infrared video. In Proceedings of the SoutheastCon 2015, Fort Lauderdale, FL, USA, 9–12 April 2015; pp. 1–2. [Google Scholar]
  159. Chudecka, M.; Lubkowska, A.; Leźnicka, K.; Krupecki, K. The use of thermal imaging in the evaluation of the symmetry of muscle activity in various types of exercises (symmetrical and asymmetrical). J. Hum. Kinet. 2015, 49, 141. [Google Scholar] [CrossRef] [Green Version]
  160. Al-Khalidi, F.; Saatchi, R.; Elphick, H.; Burke, D. An evaluation of thermal imaging based respiration rate monitoring in children. Am. J. Eng. Appl. Sci. 2011, 4, 586–597. [Google Scholar]
  161. Ruminski, J.; Kwasniewska, A. Evaluation of respiration rate using thermal imaging in mobile conditions. In Application of Infrared to Biomedical Sciences; Springer: Singapore, 2017; pp. 311–346. [Google Scholar]
  162. Uddin, M.Z.; Torresen, J. A deep learning-based human activity recognition in darkness. In Proceedings of the 2018 Colour and Visual Computing Symposium (CVCS), Gjovik, Norway, 19–20 September 2018; pp. 1–5. [Google Scholar]
  163. Baha’A, A.; Almazari, M.M.; Alazrai, R.; Daoud, M.I. A dataset for Wi-Fi-based human activity recognition in line-of-sight and non-line-of-sight indoor environments. Data Brief 2020, 33, 106534. [Google Scholar]
  164. Guo, L.; Wang, L.; Lin, C.; Liu, J.; Lu, B.; Fang, J.; Liu, Z.; Shan, Z.; Yang, J.; Guo, S. Wiar: A public dataset for wifi-based activity recognition. IEEE Access 2019, 7, 154935–154945. [Google Scholar] [CrossRef]
  165. Tian, J.; Yongkun, S.; Yongpeng, D.; Xikun, H.; Yongping, S.; Xiaolong, Z.; Zhifeng, Q. UWB-HA4D-1.0: An Ultra-wideband Radar Human Activity 4D Imaging Dataset. Lei Da Xue Bao 2022, 11, 27–39. [Google Scholar]
  166. Delamare, M.; Duval, F.; Boutteau, R. A new dataset of people flow in an industrial site with uwb and motion capture systems. Sensors 2020, 20, 4511. [Google Scholar] [CrossRef] [PubMed]
  167. Ahmed, S.; Wang, D.; Park, J.; Cho, S.H. UWB-gestures, a public dataset of dynamic hand gestures acquired using impulse radar sensors. Sci. Data 2021, 8, 1–9. [Google Scholar] [CrossRef] [PubMed]
  168. Singh, A.D.; Sandha, S.S.; Garcia, L.; Srivastava, M. Radhar: Human activity recognition from point clouds generated through a millimeter-wave radar. In Proceedings of the 3rd ACM Workshop on Millimeter-wave Networks and Sensing Systems, Los Cabos, Mexico, 25 October 2019; pp. 51–56. [Google Scholar]
  169. Liu, H.; Zhou, A.; Dong, Z.; Sun, Y.; Zhang, J.; Liu, L.; Ma, H.; Liu, J.; Yang, N. M-gesture: Person-independent real-time in-air gesture recognition using commodity millimeter wave radar. IEEE Internet Things J. 2021, 9, 3397–3415. [Google Scholar] [CrossRef]
  170. Kay, W.; Carreira, J.; Simonyan, K.; Zhang, B.; Hillier, C.; Vijayanarasimhan, S.; Viola, F.; Green, T.; Back, T.; Natsev, P.; et al. The kinetics human action video dataset. arXiv 2017, arXiv:1705.06950. [Google Scholar]
  171. Soomro, K.; Zamir, A.R.; Shah, M. UCF101: A dataset of 101 human actions classes from videos in the wild. arXiv 2012, arXiv:1212.0402. [Google Scholar]
  172. Chaquet, J.M.; Carmona, E.J.; Fernández-Caballero, A. A survey of video datasets for human action and activity recognition. Comput. Vis. Image Underst. 2013, 117, 633–659. [Google Scholar] [CrossRef] [Green Version]
  173. Mohino-Herranz, I.; Gil-Pita, R.; Rosa-Zurera, M.; Seoane, F. Activity recognition using wearable physiological measurements: Selection of features from a comprehensive literature study. Sensors 2019, 19, 5524. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  174. Casale, P.; Pujol, O.; Radeva, P. Personalization and user verification in wearable systems using biometric walking patterns. Pers. Ubiquitous Comput. 2012, 16, 563–580. [Google Scholar] [CrossRef]
  175. Zhang, M.; Sawchuk, A.A. USC-HAD: A daily activity dataset for ubiquitous activity recognition using wearable sensors. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 1036–1043. [Google Scholar]
  176. Hanley, D.; Faustino, A.B.; Zelman, S.D.; Degenhardt, D.A.; Bretl, T. MagPIE: A dataset for indoor positioning with magnetic anomalies. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017; pp. 1–8. [Google Scholar]
  177. zhaxidelebsz. Gym Workouts Data Set. 2021. Available online: https://github.com/zhaxidele/Toolkit-for-HBC-sensing (accessed on 30 April 2022).
  178. Pouyan, M.B.; Birjandtalab, J.; Heydarzadeh, M.; Nourani, M.; Ostadabbas, S. A pressure map dataset for posture and subject analytics. In Proceedings of the 2017 IEEE EMBS International Conference on Biomedical & Health Informatics (BHI), Orlando, FL, USA, 16–19 February 2017; pp. 65–68. [Google Scholar]
  179. Chatzaki, C.; Skaramagkas, V.; Tachos, N.; Christodoulakis, G.; Maniadi, E.; Kefalopoulou, Z.; Fotiadis, D.I.; Tsiknakis, M. The smart-insole dataset: Gait analysis using wearable sensors with a focus on elderly and Parkinson’s patients. Sensors 2021, 21, 2821. [Google Scholar] [CrossRef] [PubMed]
  180. Assa, A.; Janabi-Sharifi, F. A Kalman Filter-Based Framework for Enhanced Sensor Fusion. IEEE Sens. J. 2015, 15, 3281–3292. [Google Scholar] [CrossRef]
  181. Han, J.; Li, M.; Li, H.; Li, C.; Ye, J.; Yang, B. Pt-poly(L-lactic acid) microelectrode-based microsensor for in situ glucose detection in sweat. Biosens. Bioelectron. 2020, 170, 112675. [Google Scholar] [CrossRef] [PubMed]
  182. Cheng, S.; Gu, Z.; Zhou, L.; Hao, M.; An, H.; Song, K.; Wu, X.; Zhang, K.; Zhao, Z.; Dong, Y.; et al. Recent Progress in Intelligent Wearable Sensors for Health Monitoring and Wound Healing Based on Biofluids. Front. Bioeng. Biotechnol. 2021, 9, 765987. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Sensing techniques in human activity recognition.
Figure 2. Categorization of human activities.
Figure 3. The general process of an HAR task.
Figure 4. Wave-based human-centric sensing in two modes: active and passive.
Figure 5. The wide UWB power spectrum results in low power consumption compared to other technologies. (Source: FiRa Consortium).
Figure 6. Physiological sensing modalities for HAR.
Figure 7. Electric field (parallel plate capacitor).
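For orientation, the ideal parallel-plate capacitor of Figure 7 obeys the standard electrostatic relations (textbook physics, not a result of the surveyed works):

$$ E = \frac{V}{d}, \qquad C = \frac{\varepsilon_0 \varepsilon_r A}{d} $$

where V is the applied voltage, d the plate separation, A the plate area, and εr the relative permittivity of the dielectric.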
Figure 8. Magnetic field (Helmholtz coils).
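Similarly, the nearly uniform field at the midpoint of the Helmholtz pair in Figure 8 (two coaxial coils of radius R, spaced R apart, each with n turns carrying current I) follows the standard relation:

$$ B = \left(\frac{4}{5}\right)^{3/2} \frac{\mu_0 n I}{R} $$

This field uniformity is part of what makes oscillating magnetic fields attractive for robust, orientation-tolerant proximity sensing.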
Figure 9. Gravitational field of Earth.
Figure 10. Human body capacitance: the static electric field between the body and the environment.
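The principle behind Figure 10 can be stated with the basic capacitor relation (standard circuit theory): any change in the body-to-environment capacitance C or the body potential V induces a small displacement current on a coupled electrode,

$$ i(t) = \frac{d(CV)}{dt} = C\frac{dV}{dt} + V\frac{dC}{dt} $$

which is why body motion relative to the surroundings is detectable with sub-mW passive front ends (cf. the Electric Field (passive) row in Table 2).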
Table 1. Surveys on HAR sensing techniques.

Device-free sensors [4], 2020:
  • Categorized sensors into wearable, object-tagged, device-free, etc.
  • Focused on device-free sensing approaches for 10 kinds of activities.
  • Extensive analysis based on 10 important metrics for each sensing approach.
Full-stack (sensors and algorithms) [7], 2020:
  • Categorized sensors into wearable, object, environmental, and video-based.
  • Focused on data processing approaches.
Overall sensors [8], 2020:
  • Categorized sensors by physical principle (acoustic, optical, etc.).
  • Summarized publicly available databases and common evaluation metrics used to evaluate and compare the developed algorithms and systems.
Smartphone sensors [5], 2019:
  • Enumerated and described the embedded sensors.
  • Covered data labeling, processing, etc.
Surveillance video [9], 2019:
  • Summarized the general process of human action recognition in the video processing domain.
  • Surveyed the features and models used in video surveillance and the related datasets.
Radar sensors [6], 2019:
  • Overview of various radar systems adopted to recognize human activities.
  • Overview of DL techniques applied to radar-based HAR tasks.
Bespoke sensors in smart homes [10], 2017:
  • Highlighted the role of sensing technology in smart home intelligence.
  • Highlighted multi-resident activity recognition, including concurrent, interleaved, and cooperative interaction activities.
Vision-based [11], 2017:
  • Comprehensive survey of the phases of vision-based HAR (image segmentation, feature extraction, activity classification).
WiFi-based [12], 2016:
  • Surveyed WiFi-based contactless HAR from four aspects, including a historical overview, theories and models, and key techniques for applications.
Non-invasive sensors [13], 2016:
  • Surveyed technologies that are close to entering the commercial market or have only recently become available.
Vision-based [14], 2015:
  • Proposed a categorization of human activities into unimodal and multimodal according to the nature of the sensor data they employ.
  • Reviewed various human activity recognition methods and analyzed the strengths and weaknesses of each category separately.
Wearable sensors [15], 2014:
  • Reviewed the latest reported systems for activity monitoring of humans based on wearable sensors.
  • Forecasted light-weight physiological sensors that will lead to more comfortable wearable devices.
Table 2. Sensing techniques in HAR tasks.

Modality | Cost (USD) | Power Level | Active/Passive | Privacy Concern | Compute Load | Robustness | Target | Typical Application | Comment | Accessible Dataset
WiFi | tens | ≈tens of W | active | no | medium | low | where, what | positioning, ADL, ambient intelligence | pervasiveness, environmental sensitivity | [163,164]
UWB | tens | ≈mW | active | no | low | low | where, what | positioning, proximity, ADL, gesture recognition, ambient intelligence | multi-path resistant, high accuracy, costly for massive consumer usage | [165,166,167]
mmWave | tens | ≈W | active | no | medium | low | where, what, how | positioning, proximity, ADL, gesture recognition, health monitoring, ambient intelligence | high accuracy, low power efficiency for massive consumer usage | [168,169]
Ultrasonic | hundreds | ≈mW to W | active | no | low | low | where, what | positioning, proximity, ambient intelligence | high accuracy, weak robustness | -
Optic | tens of hundreds | ≈W and above | passive | yes | high | medium | where, what, how | positioning, proximity, ADL, gait analysis, gesture recognition, surveillance | comprehensive approach, high resource consumption | [170,171,172]
ExG | hundreds | ≈tens of mW | passive | no | medium | high | how, what | sports, healthcare monitoring, ADL | high resolution, noise sensitive | [173]
IMU | a few | ≈mW | passive | no | high | medium | where, what | positioning, ADL, gesture recognition, healthcare monitoring, gait analysis, sports | dominant sensing modality, accumulated bias | [174,175]
Magnetic Field (AC) | tens | ≈hundreds of mW | active | no | low | high | where, what | positioning, proximity | high robustness, limited detection range | -
Magnetic Field (DC) | a few | ≈mW | passive | no | low | high | what | proximity, gesture recognition | high accuracy, short detection range | [176]
Electric Field (active) | tens | ≈mW | active | no | low | low | where, what | positioning, proximity, ambient intelligence | high sensitivity, noise sensitive | -
Electric Field (passive) | a few | sub-mW | passive | no | low | low | where, what | positioning, proximity, sports, gait analysis, ambient intelligence | high sensitivity, noise sensitive | [177]
Gravitational Field | tens of hundreds | area dependent | passive | no | depends | high | where, what | positioning, sports, gait analysis, ambient intelligence | versatility/customizability, costly maintenance | [178,179]
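Where a quick side-by-side screening of these modalities is needed, the rows of Table 2 can be encoded as a small lookup structure. The following minimal Python sketch (the Modality record, the pick() helper, and the transcribed subset of rows are illustrative choices of ours, not an existing toolkit) filters candidates by the active/passive and privacy columns:

from dataclasses import dataclass

@dataclass
class Modality:
    name: str
    cost: str              # rough cost bracket in USD, as listed in Table 2
    power: str             # approximate power level
    active: bool           # True if the modality emits a probing signal
    privacy_concern: bool  # True if the raw data is privacy sensitive
    robustness: str        # low / medium / high

# A subset of Table 2, transcribed verbatim; extend with the remaining rows as needed.
TABLE2 = [
    Modality("WiFi", "tens", "tens of W", True, False, "low"),
    Modality("UWB", "tens", "mW", True, False, "low"),
    Modality("Optic", "tens of hundreds", "W and above", False, True, "medium"),
    Modality("IMU", "a few", "mW", False, False, "medium"),
]

def pick(rows, passive_only=False, privacy_preserving=False):
    """Return the modalities that pass simple screening criteria."""
    keep = []
    for m in rows:
        if passive_only and m.active:
            continue
        if privacy_preserving and m.privacy_concern:
            continue
        keep.append(m)
    return keep

if __name__ == "__main__":
    # Screening for passive, privacy-preserving options leaves only the IMU
    # in this subset, matching a read of the corresponding Table 2 columns.
    for m in pick(TABLE2, passive_only=True, privacy_preserving=True):
        print(m.name)  # prints: IMU

Such a table-as-data encoding is only a convenience for early design-space exploration; the qualitative brackets in Table 2 do not replace application-specific benchmarking.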
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
