Wearable Intelligent Human–Machine Interfaces Ready for Sustainable Edge Computing Systems

1. Jiangsu Provincial Key Laboratory of Advanced Robotics, School of Mechanical and Electrical Engineering, Soochow University, Suzhou 215123, China
2. School of Future Science and Engineering, Soochow University, Suzhou 215299, China
3. Department of Electrical & Computer Engineering, National University of Singapore, 4 Engineering Drive 3, Singapore 117576, Singapore
4. National University of Singapore (Suzhou) Research Institute, Suzhou 215128, China

Abstract

To better serve human life through smart, harmonious communication between the real and digital worlds, wearable human–machine interfaces (HMIs) with edge computing capabilities point the way to the next revolution in information technology. In this review, we focus on wearable HMIs and highlight several key aspects worth investigating. First, we review wearable HMIs powered by commercial-ready technologies and highlight their limitations. Next, to establish a two-way interaction that exchanges comprehensive information, sensing and feedback functions on the human body need to be customized for specific scenarios. Power consumption is another primary issue, critical to wearable applications due to limited space, which can potentially be solved by energy harvesting techniques and self-powered data transmission approaches. To further improve data interpretation with higher intelligence, machine learning (ML)-assisted analysis is preferred for multi-dimensional data. Eventually, with the presence of edge computing systems, these data can be pre-processed locally for downstream applications. Overall, this review offers an overview of the development of intelligent wearable HMIs with edge computing capabilities and self-sustainability, which can greatly enhance the user experience in healthcare, industrial productivity, education, etc.

1. Introduction

Currently, the burgeoning development of micro-nano fabrication processes and soft materials facilitates the miniaturization of sensors, microprocessors, power supply units, and wireless transmission units, which form the foundations of modern wearable systems [1,2,3,4]. Compared with the past decades, the spread of wearable systems and the Internet of Things (IoT) has drastically increased the volume of data communicated by massively deployed devices [5,6]. Cloud computing is seamlessly integrated into these systems to help process the data. However, due to the rapid growth in users and the diversification of multi-functional devices, the networks frequently suffer a heavy burden from redundant data transmissions [7,8,9]. Edge computing, on the other hand, processes raw data immediately within the device and sends out only the necessary data during external communication, making it a promising solution for wearable systems that features low latency, local data processing, and better privacy protection. Furthermore, the high mobility of humans urgently calls for the assistance of edge computing [10,11]. Meanwhile, the development of a desirable edge computing system relies on the cooperation of the individual components indicated above [12,13].
For edge computing wearable systems, controlling power consumption is crucial for extending the working time [14], especially as the system integration level continues to increase. One power-consuming component is the sensor, which is required to capture multimodal signals (e.g., kinematic, tactile, physiological, etc.) to perform complex tasks [15,16,17]. Common sensors employed in wearable systems include inertial, piezoresistive, capacitive, piezoelectric, and triboelectric sensors [18,19,20]. Among these, self-powered sensors, such as triboelectric and piezoelectric sensors [21,22], are gaining much interest because their self-sustaining feature can extend the working time of edge computing wearable systems. In addition to self-powered sensors, the study of wearable energy harvesters is also drawing increasing attention [23,24,25,26]. The diverse energy sources offered by the human body make self-sustainable wearable edge computing systems feasible [27,28,29,30]. Moreover, wireless transmission modules usually consume a considerable amount of energy, so solutions for self-powered or low-power wireless transmission are also critical to further extend the operation time.
Powered by edge computing technology, real-time local data processing decreases system latency, which is essential for applications like medical monitoring and virtual/augmented reality. Feedback components also benefit from the low processing latency, because real-time haptic feedback can significantly enhance user engagement and hence achieve higher efficiency and a more immersive experience [31,32,33,34]. A number of strategies have been proposed to replicate real stimuli on the human body [35], such as vibration, wire actuators, pneumatic actuators, dielectric elastomer actuators, as well as electroresistive- or thermoelectric-based temperature feedback units [36,37]. With the aid of edge computing capability, the fidelity of the response can also be improved by locally deployed neural networks. The intelligence of wearable systems with edge computing capability is growing tremendously [38,39]. Furthermore, since most personal and physiological data are processed locally, the risk of exposing sensitive information is minimized, leading to better privacy protection.
In this review, the typical components of the wearable system are discussed, as shown in Figure 1. Section 2 serves as the baseline for further discussion, introducing conventional, commercial-ready wearable technologies. Section 3 describes basic sensing and energy harvesting mechanisms, laying the foundation for self-sustaining edge computing systems. Sections 4 and 5 introduce wearable sensing and feedback technologies. Considering the power consumption issue in wearable edge computing systems, state-of-the-art energy harvesting and self-powered wireless transmission technologies are reviewed in Section 6. Furthermore, ML-assisted advanced data analysis for wearable systems is addressed in Section 7. The development of the edge computing paradigm is introduced in Section 8, highlighting neuromorphic computing technology. Finally, conclusions and perspectives are given, focusing on several key hurdles for wearable edge computing systems.
Figure 1. Overview of the fundamental components of a wearable human–machine interface (HMI) with edge computing.

2. Commercial-Ready Wearable Sensing Systems

2.1. Vision-Based Wearable Sensing Systems

Thanks to the thriving development of computer vision technologies in recent decades, vision-based wearable sensing systems for human–machine interaction are developing rapidly. For instance, researchers have reported numerous wearable systems [40,41,42,43] for motion capture (i.e., estimating the full-body pose), emotion recognition [44,45,46,47,48,49], facial movement sensing [50,51,52,53,54], etc. Furthermore, hand gesture recognition and even sign language translation have been achieved by systems powered by Leap Motion [55] or AR glasses [56]. Tracking a 3D hand pose is also feasible with wearable devices [57,58]. In addition to hands, the recognition of mouth movement can facilitate many mobile computing applications: for instance, a smart necklace [59] that recognizes bilingual (English and Chinese) silent speech commands has been presented. Moreover, a complex 3D video conference system [60] can be implemented with Microsoft Azure Kinect cameras, shedding light on the potential of vision-based wearable sensing.
However, two major concerns remain for vision-based sensing. On the one hand, in order to capture complete information about the user, cameras traditionally must be fixed at certain locations, and occlusions should be avoided whenever possible. These conditions constrain the scenarios where vision-based sensing can be applied: typically, an indoor, uncrowded environment is preferred. On the other hand, vision-based sensing must photograph users and store the data for analysis, raising potential privacy problems.
One major application of vision-based wearable sensing is full-body pose estimation. Mature commercial motion capture systems for full-body pose estimation, such as Vicon [62] and OptiTrack [63], usually take an "outside-in" approach [61], meaning that externally placed cameras capture the subject (i.e., the user). This type of setup limits the application scenarios: an indoor environment with minimal occlusion and controlled lighting is preferred. To enable outdoor full-body pose estimation, Shiratori et al. [61] implemented a portable motion capture system via the "inside-out" method, in which cameras are mounted on the user and capture the environment. This system, however, requires a static background to estimate the pose and suffers from privacy problems.
EgoCap [40] addressed most of the aforementioned problems with an "inside-in" approach. In this system, a stereo pair of fisheye cameras is attached to a cycling helmet or a head-mounted display (HMD). Thanks to the large field of view (FOV) of the cameras, they can capture the user's full body. The local skeleton pose is estimated by solving an optimization problem, in which the alignment of a projected 3D human model with the human in the left and right fisheye views is maximized at each time step. The estimation error is reported to be 7 cm in challenging scenarios. Incorporating machine learning algorithms, EgoCap can estimate the user's pose even under severe self-occlusions in the image. However, the two cameras and the wooden rig attached to the helmet/HMD still constrain head movement and make the device difficult to put on. Additionally, due to the large FOV of the camera pair, the privacy problem is mitigated but not solved.
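The per-frame optimization at the heart of this approach can be illustrated compactly: minimize the 2D reprojection error of a parameterized skeleton across both fisheye views. The sketch below is a simplified stand-in for EgoCap's actual model-fitting pipeline; the skeleton parameterization, the idealized equidistant fisheye projection, and all numeric values are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch of an EgoCap-style pose optimization: find pose
# parameters that align the projected 3D skeleton with 2D detections in
# the left and right fisheye views. Forward kinematics and the fisheye
# projection are placeholders, not the authors' actual models.

N_JOINTS = 17

def forward_kinematics(pose_params):
    """Map pose parameters to 3D joint positions (placeholder model)."""
    return pose_params.reshape(N_JOINTS, 3)

def fisheye_project(points_3d, cam):
    """Idealized equidistant fisheye projection, r = f * theta."""
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    theta = np.arctan2(np.hypot(x, y), z)        # angle from optical axis
    phi = np.arctan2(y, x)
    r = cam["f"] * theta
    return np.stack([cam["cx"] + r * np.cos(phi),
                     cam["cy"] + r * np.sin(phi)], axis=1)

def reprojection_cost(pose_params, detections, cams):
    """Sum of squared 2D errors over both fisheye views."""
    joints_3d = forward_kinematics(pose_params)
    cost = 0.0
    for det_2d, cam in zip(detections, cams):
        proj = fisheye_project(joints_3d, cam)
        cost += np.sum((proj - det_2d) ** 2)
    return cost

# Toy data: two views with random 2D detections and identical intrinsics.
cams = [{"f": 300.0, "cx": 320.0, "cy": 240.0} for _ in range(2)]
detections = [np.random.rand(N_JOINTS, 2) * 480 for _ in range(2)]
x0 = np.random.randn(N_JOINTS * 3) + np.array([0, 0, 2] * N_JOINTS)

result = minimize(reprojection_cost, x0, args=(detections, cams),
                  method="L-BFGS-B")
print("final reprojection cost:", result.fun)
```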
To further improve wearing comfort, Mo2Cap2 [41], a more lightweight system than EgoCap, has been proposed (Figure 2). This system achieves full-body pose estimation with only one fisheye camera mounted on the brim of a baseball cap. The 3D joint positions are estimated from a 2D joint location heatmap and the distance between the camera and each joint. Specifically, their method estimates the lower-body joints more accurately by using a zoomed-in image focusing on the lower body. The estimation error is 6.14 cm in indoor scenarios and 8.06 cm in outdoor environments.
Later, Tome et al. [42] presented SelfPose (Figure 2), which produces even higher joint position estimation accuracy than Mo2Cap2 [41]. The downward-looking camera is installed on the rim of a virtual reality (VR) HMD, adding a small amount of extra weight to the user. To compute the 3D full-body pose, a 2D pose detector is used to predict the heatmap of 2D joint positions, followed by a multi-branch autoencoder to generate 3D joint positions and other auxiliary outputs required in the training phase. The average joint position error is 4.66 cm in an indoor environment and 5.46 cm in an outdoor environment.
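Mo2Cap2 and SelfPose share a common lifting step: a 2D heatmap fixes each joint's pixel location, and a predicted camera-to-joint distance places the joint along the corresponding viewing ray. The following is a minimal sketch of that back-projection under an assumed pinhole model (the real systems use calibrated fisheye optics); all shapes and values are illustrative.

```python
import numpy as np

# Hypothetical sketch of heatmap-plus-depth lifting in the spirit of
# Mo2Cap2/SelfPose: take the argmax of each joint heatmap as the 2D
# location, then back-project along the camera ray using the predicted
# camera-to-joint distance. A pinhole camera is assumed for simplicity.

def heatmaps_to_2d(heatmaps):
    """heatmaps: (J, H, W) -> (J, 2) pixel coordinates (u, v)."""
    J, H, W = heatmaps.shape
    flat_idx = heatmaps.reshape(J, -1).argmax(axis=1)
    v, u = np.unravel_index(flat_idx, (H, W))
    return np.stack([u, v], axis=1).astype(float)

def backproject(uv, distances, f, cx, cy):
    """Lift 2D joints to 3D given per-joint camera-to-joint distances."""
    # Unit viewing ray through each pixel.
    rays = np.stack([(uv[:, 0] - cx) / f,
                     (uv[:, 1] - cy) / f,
                     np.ones(len(uv))], axis=1)
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)
    return rays * distances[:, None]         # (J, 3) joint positions

heatmaps = np.random.rand(15, 64, 64)         # toy network output
distances = np.random.uniform(0.5, 1.5, 15)   # predicted joint depths (m)
joints_3d = backproject(heatmaps_to_2d(heatmaps), distances,
                        f=50.0, cx=32.0, cy=32.0)
print(joints_3d.shape)                        # (15, 3)
```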
Cameras mounted around the user's head add extra rotational load whenever the head moves, so the device may become uncomfortable to wear for long periods. Thus, some researchers seek to install the camera on other body parts, or even externally, while still obtaining an egocentric view. For instance, Hwang et al. [64] reported a full-body pose estimation system with a single ultra-wide fisheye camera mounted on the user's chest. The average joint position error is 8.49 cm, slightly larger than that of [40,41,42], with no extra weight added to the head. Moreover, Lim et al. implemented BodyTrak [65], which captures the full-body pose with four miniature RGB cameras installed on a wristband. Employing a four-branch CNN with late fusion, the system estimates the 3D joint positions with an average error of 6.34 cm (6.9 cm if using only one camera). Cameras installed on the user's body inevitably hinder the user's movement to some extent. To address this problem, Ahuja et al. presented ControllerPose [66], with two fisheye cameras mounted on each of two VR controllers (Figure 2). The system fuses the 2D joint position estimates with the IMU data from the HMD and the two controllers to compute the 3D joint positions. As the cameras are not mounted on the user, the average position error, 8.59 cm, is slightly larger than that of previous methods.
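The late-fusion design used by BodyTrak, with one convolutional branch per wrist camera and the per-branch features concatenated before regression, can be sketched in PyTorch as follows. This is an illustrative structure only, not the published architecture; all layer sizes are assumed.

```python
import torch
import torch.nn as nn

# Illustrative four-branch CNN with late fusion: each camera gets its
# own convolutional branch, and the per-camera features are
# concatenated ("late fusion") before a shared regressor predicts the
# 3D joint positions. Layer sizes are arbitrary placeholders.

class LateFusionPoseNet(nn.Module):
    def __init__(self, n_cams=4, n_joints=14):
        super().__init__()
        self.n_joints = n_joints
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            ) for _ in range(n_cams)
        ])
        self.regressor = nn.Sequential(
            nn.Linear(32 * n_cams, 128), nn.ReLU(),
            nn.Linear(128, n_joints * 3),
        )

    def forward(self, views):                # views: (B, n_cams, 1, H, W)
        feats = [branch(views[:, i]) for i, branch in enumerate(self.branches)]
        fused = torch.cat(feats, dim=1)      # late fusion by concatenation
        return self.regressor(fused).view(-1, self.n_joints, 3)

model = LateFusionPoseNet()
out = model(torch.randn(2, 4, 1, 64, 64))    # two samples, four cameras
print(out.shape)                             # torch.Size([2, 14, 3])
```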
The previously mentioned "inside-in" methods usually require cameras with an ultra-wide FOV to capture the user. Ng et al. [43] (Figure 2) proposed a full-body pose estimation system with normal forward-looking cameras for when the user is interacting with another person (i.e., the interactee). Their system exploits the pose of the interactee, on the premise that it is inherently related to the pose of the user. The system combines information from the homography, the static scene features, and the 2D pose of the interactee, and feeds them into a long short-term memory (LSTM) network to compute the 3D pose of the user. The average joint position errors are 14.3 cm and 8.6 cm when compared with ground truths produced by the Microsoft Kinect V2 and the Panoptic Studio [67], respectively. A comparison of these full-body pose estimation wearable systems is given in Table 1.
Figure 2. Vision-based wearable systems for full-body pose estimation. Reproduced with permission [41,42,43,66]. Copyright 2019, IEEE. Copyright 2020, IEEE. Copyright 2020, CVF. Copyright 2022, Association for Computing Machinery.
Table 1. Comparison of vision-based wearable systems for body pose estimation.

| Year | Ref. | Sensors | Sensor Location | Methods | Error Mean ± Std. (cm) | Power Consumption | Keypoint No. | FPS |
|------|------|---------|-----------------|---------|------------------------|-------------------|--------------|-----|
| 2016 | [40] | Two fisheye cameras | Helmet or HMD | Three-dimensional generative pose estimation | 7.00 ± 1.00 | ~5 W | 17 | 10–15 |
| 2019 | [41] | One fisheye camera | Baseball cap | Two-dimensional pose estimation + joint depth estimation | 6.14 (indoor), 8.06 (outdoor) | ~2 W | 16 | N.A. |
| 2020 | [43] | One camera (GoPro) | Chest | Homography + two-dimensional pose estimation with LSTM | 14.3 | ~3.5 W | 25 | N.A. |
| 2020 | [42] | One fisheye camera | VR HMD | Two-dimensional pose detection + two-dimensional-to-three-dimensional mapping | 4.66 (indoor), 5.46 (outdoor) | ~3 W | 16 | N.A. |
| 2020 | [64] | One fisheye camera | Chest | Two-dimensional joint heat map + three-dimensional joint position | 8.49 | ~2 W | 15 | N.A. |
| 2022 | [66] | Four fisheye cameras | VR controllers | Two-dimensional pose estimation + three-dimensional joint angle regression | 8.59 ± 5.20 | ~6.5 W | 17 | 7.2 |
| 2022 | [65] | Four cameras | Wrist | Four-branch CNN with late fusion | 6.34 | ~1.5 W | 14 | <5 |
Facial movement sensing is another popular research field for vision-based wearables. There are generally two research categories: the first is emotion recognition, a classification problem based on the user's facial movement; the second is facial movement reconstruction, which seeks to track the user's face and is more challenging.
When wearing a VR headset, the user's eyes are blocked, making it difficult for others to recognize the user's emotion. This can interfere with the user's normal social engagement and undermine the experience of using a VR device. To tackle this problem, Hickson et al. proposed Eyemotion [44] (Figure 3a), a system that recognizes the user's emotion using the IR gaze-tracking cameras integrated in the VR HMD. The system classifies the emotion with a CNN combined with a personalization method, in which the raw image of each user is subtracted by the user's mean neutral-expression image before being fed into the network. The system achieves a mean accuracy of 0.74 over five categories of emotion, although recognition is not performed in real time. Later, Wu et al. [45] presented a wearable system that classifies the user's emotion in real time, with all of the computation performed on their embedded system. In their system, an image of the user's right eye is fed into a CNN-based feature extractor, followed by a personalized classifier for emotion recognition.
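The personalization step shared by these systems, normalizing each user's input against their own neutral-expression appearance before classification, is simple to express. Below is a minimal sketch of the Eyemotion-style mean-image subtraction; image sizes and the downstream classifier are placeholders.

```python
import numpy as np

# Sketch of Eyemotion-style personalization: each incoming eye image is
# normalized by subtracting that user's mean image recorded under a
# neutral expression, so the classifier sees person-independent
# deviations rather than identity-specific appearance. The classifier
# itself is a placeholder here.

def build_neutral_template(neutral_images):
    """Average a user's neutral-expression frames into one template."""
    return np.mean(neutral_images, axis=0)

def personalize(image, neutral_template):
    """Remove identity-specific appearance before classification."""
    return image - neutral_template

neutral_frames = np.random.rand(30, 96, 96)     # calibration session
template = build_neutral_template(neutral_frames)

frame = np.random.rand(96, 96)                  # incoming eye image
net_input = personalize(frame, template)
# net_input would then be fed to the emotion CNN, e.g. cnn(net_input)
print(net_input.mean())
```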
Figure 3. Vision-based wearable devices for facial movement sensing. (a) A VR goggle-based HMI for classifying facial expressions. Reproduced with permission [44]. Copyright 2019, IEEE. (b) A glass-type wearable device achieving emotion recognition via multimodal sensors. Reproduced with permission [46]. Copyright 2021, IEEE. (c) A lightweight, wireless, and low-cost glasses-based wearable platform for emotion sensing and bio-signal acquisition. Reproduced with permission [47]. Copyright 2020, IEEE. (d) A facial expression tracking device based on infrared sensors. Reproduced with permission [50]. Copyright 2016, IEEE. (e) A facial expression tracking head mounted display via photo-reflective sensors. Reproduced with permission [51]. Copyright 2017, IEEE.
Glass-type wearable devices can also be a suitable solution for emotion recognition. For instance, Kwon et al. proposed a glass-type HMI that performs continuous emotion recognition with multimodal sensors [46] (Figure 3b). Furthermore, Yan et al. [47,49] reported a glass-type HMI with edge computing ability, in which a lithium battery and a Raspberry Pi are seamlessly fused into the glasses. The system adopts a CNN with regional attention blocks to classify the user's emotion based on an image of one side of the face. Nie et al. [48] (Figure 3c) also proposed an emotion recognition system built on eyewear. The system employs a CNN to capture the landmarks of the user's eye and eyebrow from an infrared (IR) camera image. These landmarks, together with the eyebrow movement detected by an optical-flow-based algorithm, serve as the input to a decision tree, which classifies the user's emotion. The average accuracy over five categories of emotion is 0.84. Combined with a proximity sensor and an IMU, the system can detect affective states and perform mental health monitoring. However, the computation is not performed entirely on the embedded system, and the emotion classification runs only in semi-real time.
With respect to the problem of facial movement tracking, Cha et al. [50] proposed a facial expression tracking system based on infrared sensors (Figure 3d). The system is integrated into a head-mounted display. In addition, Suzuki et al. [51] reported another HMD-based HMI which can not only track facial expressions but also map them onto avatars (Figure 3e). The application of these two works, however, is constrained by the relatively low spatial resolution of the facial tracking. To this end, Thies et al. [52] presented FaceVR, a wearable system that can construct the face with high resolution in real time. The system uses an infrared camera installed in the VR HMD to capture the eye movement of the user and the eye tracking is performed by a classification approach based on random ferns. Note that to track the whole facial expressions, an externally located RGB-D camera is used so the system is not fully portable.
To track the facial movement, wearing a VR headset is not an ideal solution, as the user’s face is blocked. To achieve facial movement reconstruction without an HMD, Chen et al. proposed C-Face [53] and NeckFace [54], which are able to estimate the feature points on the whole face with cameras installed in earphones/headphones and a necklace/neckband, respectively. The key insight in C-Face and NeckFace is that they exploit the correlation between the contours of the face (also the chin in NeckFace) and the movement of the whole face. In other words, they use deep neural networks to infer the facial movement from the subtle changes in the contours of the face. Moreover, as only the contours of the face are captured, the privacy concerns are mitigated in these systems.

2.2. Non-Vision-Based Commercial-Ready Wearable Sensing Systems

Despite the facile fabrication and satisfactory performance of vision-based wearable systems, the risk of privacy leakage from these devices remains a constant concern for users. Other commercial-ready sensors, such as inertial measurement units (IMUs), are employed as alternatives to visual sensors. The motion information captured by IMUs is less sensitive, yet still sufficient to extract key features for downstream tasks. Nowadays, IMU-based sensing is ubiquitous in daily life: inertial sensors (e.g., accelerometers, gyroscopes, magnetometers, etc.) can be found in smartphones, smart watches, wristbands, earphones, etc. The prevalence of IMUs facilitates various fundamental applications, including human–machine interfaces [68,69], sign language translation [70,71,72], sports training [73], and rehabilitation [74]. From the cost perspective, however, IMUs are relatively expensive and not an ideal choice for forming a body area network (BAN). Some low-cost wearable sensing systems based on other commercial-ready sensory technologies have also been reported [75].
One popular research topic in IMU-based wearable sensing is hand gesture recognition. Fang et al. [68] proposed a glove equipped with 18 inertial and magnetic measurement units (IMMUs) to perform motion capture and hand gesture recognition. The data are collected by a microcontroller unit (MCU) and transmitted via a Bluetooth module. An extreme learning machine (ELM) [76] is trained for gesture classification, with reported accuracies of 89.6% and 82.5% for static and dynamic gestures, respectively. Nevertheless, the large number of sensors and wires limits the system, making it difficult for users to move their upper limbs freely. A hand gesture recognition system (Figure 4a) based on smart watches or armbands is introduced by Vásconez et al. [77]. Based on reinforcement learning algorithms, the system achieves good performance with online learning ability. Moreover, a gesture recognition system based on the Myo armband for training sports referees (Figure 4b) is reported by Pan et al. [73]. The data are collected by an eight-channel surface electromyography sensor and an IMU. To classify the gestures, a cascaded framework is employed: in the first step, large motion gestures and subtle motion gestures are distinguished by a support vector machine (SVM) based on hand-crafted features; in the next step, another SVM, based on hand-crafted features plus features learned by a deep belief network (DBN), finalizes the classification. The average classification accuracies over 11 participants are 92.2% and 90.2% for large and subtle motion gestures, respectively.
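The cascaded classification scheme of [73] can be sketched with scikit-learn: a first SVM separates large from subtle motions, and a scale-specific second SVM assigns the final gesture label. The feature vectors below are random placeholders standing in for the hand-crafted and DBN-learned features.

```python
import numpy as np
from sklearn.svm import SVC

# Sketch of a two-stage cascaded gesture classifier in the spirit of
# [73]: stage 1 decides large vs. subtle motion from hand-crafted
# features; stage 2 uses a scale-specific SVM to assign the final
# gesture label. All features and labels here are random placeholders.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 24))             # hand-crafted feature vectors
is_large = rng.integers(0, 2, 200)         # stage-1 labels (motion scale)
gesture = rng.integers(0, 5, 200)          # final gesture labels

stage1 = SVC().fit(X, is_large)

# One specialized stage-2 SVM per motion scale.
stage2 = {
    scale: SVC().fit(X[is_large == scale], gesture[is_large == scale])
    for scale in (0, 1)
}

def classify(x):
    scale = int(stage1.predict(x[None])[0])    # large or subtle motion?
    return stage2[scale].predict(x[None])[0]   # final gesture class

print(classify(rng.normal(size=24)))
```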
In addition to recognizing basic hand gestures, researchers also seek to recognize and even translate sign languages to help the hearing/speech-impaired people. Hou et al. reported SignSpeaker [70], a sign language translation system utilizing a commercialized smartwatch and a smartphone which runs a text-to-speech (TTS) program to read the translated sentence. The data generated from an accelerometer and a gyroscope are fed into a multi-layer LSTM to learn the semantic representations of sign languages, followed by a classification layer with a connectionist temporal classification (CTC) loss function. The CTC loss function helps to segment the sentence. The reported average word error rate (WER) is 1.04%.
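The LSTM-plus-CTC formulation at the core of SignSpeaker can be sketched in PyTorch: the recurrent encoder emits per-frame word probabilities, and the CTC loss aligns them with the target sentence without frame-level labels. All dimensions below are illustrative assumptions, not those of the published system.

```python
import torch
import torch.nn as nn

# Illustrative SignSpeaker-style model: a multi-layer LSTM encodes the
# accelerometer/gyroscope stream, a linear layer emits per-frame word
# probabilities, and CTC loss lets the network align and segment signs
# without frame-level labels. All dimensions are placeholders.

VOCAB = 100          # word vocabulary size (index 0 = CTC blank)
lstm = nn.LSTM(input_size=6, hidden_size=64, num_layers=2,
               batch_first=True)
head = nn.Linear(64, VOCAB)
ctc = nn.CTCLoss(blank=0)

imu = torch.randn(4, 120, 6)                # (batch, frames, accel+gyro)
hidden, _ = lstm(imu)
log_probs = head(hidden).log_softmax(-1)    # (batch, frames, vocab)

targets = torch.randint(1, VOCAB, (4, 7))   # word indices of sentences
input_lens = torch.full((4,), 120, dtype=torch.long)
target_lens = torch.full((4,), 7, dtype=torch.long)

# CTCLoss expects (frames, batch, vocab).
loss = ctc(log_probs.transpose(0, 1), targets, input_lens, target_lens)
loss.backward()                             # standard training step
print(float(loss))
```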
SignSpeaker is based purely on inertial sensors, which are capable of capturing the motion of the hand but unable to recognize the subtle finger movement [78]. To deal with this limitation, Zhang et al. proposed WearSign [71], a multimodal wearable system fusing the signals from inertial sensors and electromyography (EMG) sensors to better track the motion of fingers. A CNN is deployed to exploit both intra-/inter-modality features. Then an encoder–decoder network performs end-to-end learning, outputting translated sentences directly. Thanks to the multimodal sensing ability and a data synthesis approach, the translation accuracy is greatly improved.
Beyond gesture recognition, inertial sensors are also used in human activity recognition (HAR) [79,80,81]. HAR plays an important role in health care, rehabilitation, and senior care [82,83,84]. For instance, Bianchi et al. [85] implemented a wearable system for long-term personalized HAR (Figure 4c). The system consists of an inertial sensor (MPU9250) and a controller unit (Cortex-M4) that transmits the data over Wi-Fi. Combined with a convolutional neural network, the system recognizes nine different human activities with a single IMU. On a relatively large dataset containing 15 subjects and 15,616 instances, an overall accuracy of 97% is achieved over the nine activities.
Figure 4. Commercial-ready non-vision based wearable HMIs. (a) A hand gesture recognition system based on inertial signals and reinforcement learning. Reproduced with permission [77]. Copyright 2022, MDPI. (b) A Myo armband-based gesture recognition system. Reproduced with permission [73]. Copyright 2022, IEEE. (c) A wearable system for long-term personalized human activity recognition (HAR). Reproduced with permission [85]. Copyright 2019, IEEE. (d) A wearable HAR system based on recurrent neural networks. Reproduced with permission [86]. Copyright 2022, IEEE. (e) A low-cost wearable HAR system based on magnetic induction signals. Reproduced with permission [75]. Copyright 2020, Springer Nature.
Apart from convolutional neural networks, recurrent neural networks are widely used in HAR applications because of their good performance in handling time-series data. Tong et al. [86] reported a wearable system integrating six IMUs with a bidirectional-gated recurrent unit-inception (Bi-GRU-I) network (Figure 4d), which exhibits good performance on a self-collected dataset and other public datasets. The system features a novel deep learning network, Bi-GRU-I, which combines a recurrent neural network (RNN) and a convolutional neural network (CNN). The RNN part, a two-layer bidirectional GRU (Bi-GRU), extracts temporal features from the data, while the CNN part, three inception modules, extracts spatial features. This network structure enhances the classification performance, resulting in an average accuracy of 97.76% across both the self-collected and public datasets.
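A compact sketch of the Bi-GRU-I idea, a bidirectional GRU for temporal features feeding an inception-style block of parallel 1D convolutions, is given below. It illustrates the structure only; channel counts and window sizes are assumed, not those of the published network.

```python
import torch
import torch.nn as nn

# Illustrative Bi-GRU-I-style hybrid for HAR: a bidirectional GRU
# extracts temporal features from multi-IMU windows, then an
# inception-style block applies parallel 1D convolutions with
# different kernel sizes. All sizes are placeholders.

class InceptionBlock1D(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.paths = nn.ModuleList([
            nn.Conv1d(in_ch, out_ch, k, padding=k // 2) for k in (1, 3, 5)
        ])

    def forward(self, x):                      # x: (B, C, T)
        return torch.cat([p(x) for p in self.paths], dim=1)

class BiGRUInception(nn.Module):
    def __init__(self, n_features=36, n_classes=9):
        super().__init__()
        self.gru = nn.GRU(n_features, 32, num_layers=2,
                          batch_first=True, bidirectional=True)
        self.inception = InceptionBlock1D(64, 16)   # 2*32 GRU channels in
        self.classifier = nn.Linear(48, n_classes)  # 3 paths * 16 channels

    def forward(self, x):                      # x: (B, T, n_features)
        h, _ = self.gru(x)                     # (B, T, 64)
        f = self.inception(h.transpose(1, 2))  # (B, 48, T)
        return self.classifier(f.mean(dim=2))  # global average pooling

model = BiGRUInception()
print(model(torch.randn(8, 128, 36)).shape)    # torch.Size([8, 9])
```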
IMU-based wearable systems demonstrate good performance in HAR. Attaching IMUs all over the human body, however, is often unfeasible due to their cost. Therefore, to achieve a wireless body area network (WBAN) [87,88], a low-cost HAR solution is required. To this end, Golestani et al. [75] introduced a wearable HAR system based on magnetic induction signals (Figure 4e). The system comprises a receiving coil worn as a belt around the user's waist and eight transmitting coils fixed on the upper and lower limbs. When the user moves, the relative position and alignment of the coils change, and so do the coupled non-propagating magnetic fields created by the coils. By measuring the variations in the induction signals, the relative movement of the limbs can be inferred. Compared with conventional radiating magnetic fields, the non-propagating magnetic fields decay more quickly, which reduces interference and improves privacy. The most notable advantage of the system lies in its low cost, because cheap coils replace expensive IMUs. To classify the human activity, a recurrent neural network based on long short-term memory (LSTM) is implemented, and an accuracy of 98.9% is reported on the Berkeley Multimodal Human Action Database (MHAD).
In summary, commercial-ready solutions (vision-based or non-vision-based) for wearable HMI systems deliver good performance, but power consumption remains a challenging issue, limiting the long-term usage of these devices. Furthermore, large and rigid commercial-ready sensors make it impossible to build a miniature, conformal wearable HMI system. Therefore, highly integrated, self-sustaining, and flexible HMI systems based on self-powered sensors or energy harvesting technologies are highly desired and actively investigated; these are introduced in the following sections.

3. Wearable Sensing and Energy Harvesting Mechanisms

Owing to their conformable and intimate contact with the human body, wearable systems have the advantage of continuous and accurate detection of multi-modal information on motion status and health conditions [89,90] (Figure 5). Although research into wearable systems has achieved great progress in recent decades, devices with increasing numbers of sensors and high-performance microprocessors also face battery limitations, as battery size and capacity must be balanced against comfort. For edge computing devices in particular, to further extend the operation time of the wearable system, researchers have begun to develop energy harvesters for the different energy sources available from the human body, so that these harvesters can be integrated into wearable systems for a continuous power supply [91,92]. A performance comparison of different wearable energy harvesting mechanisms is given in Table 2. Interestingly, since the outputs of the harvesters vary in response to fluctuations of the energy sources (e.g., body motions, temperature changes, and sweating), the energy harvesters can be further modified to detect those parameters through their self-generated signals; such devices are typically known as self-powered sensors [93]. Hence, the power consumption of the sensors can be reduced. Eventually, wearable sensory systems will migrate from conventional sensors with a power supply to sensor fusion technology, in which some sensors are replaced by self-powered ones.
Figure 5. Working mechanisms of common wearable sensors and energy harvesters. (a) Piezoresistive. (b) Capacitive. (c) Piezoelectric. (d) Triboelectric. (e) Electromagnetic. (f) Thermoelectric.

3.1. Sensors with Power Supply

For wearable HMIs, the mature sensing technologies available on the market are piezoresistive and capacitive sensors [94]. The piezoresistive effect converts mechanical force or deformation into a variation of resistance via changes in carrier mobility or the conduction path. Piezoresistive sensors generally possess good sensitivity and working ranges, as well as simple readout circuits, but still suffer from hysteresis and temperature creep. Capacitive sensors, in contrast, determine the mechanical stimulus through the response of the capacitance, as the parameters of the dielectric layer between the two electrodes are altered. Capacitive sensors offer good sensitivity and frequency response with little temperature influence; however, signal-to-noise ratio issues are frequently encountered.
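As a concrete illustration of the capacitive mechanism, the idealized parallel-plate model C = ε0εrA/d shows how compressing the dielectric gap raises the capacitance, and how a measured capacitance can be inverted back to a deformation. The sketch below uses toy values; real sensors require calibration against nonlinearity and fringing effects.

```python
# Idealized parallel-plate model of a capacitive pressure sensor:
# C = eps0 * eps_r * A / d. Pressing the sensor thins the dielectric
# gap d, so capacitance rises; inverting the model recovers the
# deformation. All values are illustrative.

EPS0 = 8.854e-12          # vacuum permittivity (F/m)

def capacitance(eps_r, area, gap):
    return EPS0 * eps_r * area / gap

def gap_from_capacitance(c, eps_r, area):
    return EPS0 * eps_r * area / c

eps_r, area, d0 = 3.0, 1e-4, 100e-6          # elastomer, 1 cm^2, 100 um
c0 = capacitance(eps_r, area, d0)

c_meas = 1.25 * c0                           # 25% capacitance increase
d = gap_from_capacitance(c_meas, eps_r, area)
print(f"gap compressed from {d0*1e6:.0f} um to {d*1e6:.0f} um")
```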
By leveraging MEMS fabrication processes, silicon-based piezoresistive or capacitive designs are studied for developing inertial sensors or micro tactile sensors with extremely high sensitivities and multi-directional sensing capabilities. Meanwhile, thin-film and soft-material techniques enable e-skin-like sensors for monitoring body motion and interactive events.

3.2. Sensors and Energy Harvesters with Self-Generated Signals

Daily human activity is a natural source of mechanical energy, offering power ranging from a few watts to a few tens of watts depending on the body part. To harvest this energy, several mechanical energy harvesting mechanisms are utilized, including the piezoelectric effect [95,96,97], the triboelectric effect [98,99], and the electromagnetic effect [100,101]. The piezoelectric effect relies on the polarization of the dipole moments of piezoelectric materials under applied force or deformation to collect electric charges [27]. However, materials with high piezoelectric coefficients are usually ceramics, which are less preferable for wearable systems. The triboelectric effect, on the other hand, is defined as the charge transfer during the mechanical interaction of two surfaces with different electronegativities [102]. Unlike the limited range of piezoelectric materials, the wide choice of triboelectric materials gives this mechanism a clear advantage for developing wearable energy harvesters [103,104]. However, the vulnerability of the output signal to environmental influences, such as humidity and electromagnetic interference, is a major problem that needs to be addressed.
As mentioned earlier, the outputs of the piezoelectric and triboelectric effects both correlate well with the intensity of the mechanical stimulus. With a proper design of the sensory structure, both effects can be applied to detect various motions, such as pressing, bending, walking, rotating, and vibration, without consuming power in the sensor itself [105,106].
In addition, at the current stage, piezoelectric (PENG)- and triboelectric (TENG)-based energy harvesters/nanogenerators still lack sufficient output power within the limited space of a wearable system. Alternatively, electromagnetic energy harvesters with much higher power density are another popular option for powering wearable edge computing systems. Electromagnetic generators rely on Faraday's law of electromagnetic induction. In wearable systems, they typically consist of a movable magnet–coil pair, where relative motion between the two (usually caused by body movements) produces an alternating current output. Compared with piezoelectric and triboelectric harvesters, electromagnetic generators generally deliver higher power density and efficiency. However, their integration into flexible and miniaturized platforms is restricted by the bulky magnet and coil components and by limited output under small-scale deformation.
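To make Faraday's law concrete for such a harvester, the open-circuit EMF is e = −N dΦ/dt; for a magnet oscillating at body-motion frequencies, a sinusoidal flux model gives a quick voltage estimate. The sketch below uses illustrative parameter values only.

```python
import numpy as np

# Simplified Faraday's-law estimate for a wearable magnet-coil
# harvester: emf = -N * dPhi/dt. The flux linkage is modeled as a
# sinusoid at the body-motion frequency, which is a strong
# simplification; all parameter values are illustrative only.

N = 500                    # coil turns
phi_peak = 2e-5            # peak flux per turn (Wb)
f = 3.0                    # body-motion frequency (Hz)

t = np.linspace(0, 1, 10_000)
phi = phi_peak * np.sin(2 * np.pi * f * t)
emf = -N * np.gradient(phi, t)               # induced voltage over time

print(f"peak EMF ~ {np.max(np.abs(emf)):.3f} V")
# Analytic check: peak EMF = N * 2*pi*f * phi_peak
print(f"analytic  ~ {N * 2 * np.pi * f * phi_peak:.3f} V")
```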
In addition to mechanical energy, the human body can also be a good source of thermal and moisture-related energy, which can be harvested through pyroelectric and moisture-electric mechanisms. The pyroelectric effect originates from materials possessing a spontaneous polarization that varies with temperature fluctuation (ΔT), thereby generating an electric current or voltage during thermal cycling [107]. When the temperature changes, the alteration of dipole alignment in the pyroelectric material leads to a net flow of charges across the electrodes. This mechanism has been further optimized through thermodynamic cycles, such as the Olsen cycle, to enhance conversion efficiency [108]. Moisture electricity generators, on the other hand, exploit humidity gradients, converting the change in Gibbs free energy to electricity [109]. With asymmetric hydrophilicity, the asymmetric distribution of mobile ions or protons establishes an internal electric field, generating a continuous current [110]. These two mechanisms complement the mechanical energy harvesters: temperature variations or breathing moisture can also contribute to energy generation. However, both mechanisms face challenges, such as the low power density of pyroelectric generators and the need to maintain a humidity gradient in moisture electricity generators.

3.3. Other Energy Harvesters

Apart from mechanical energy, other types of energy sources can potentially be harvested [100,111]. The thermoelectric effect, which uses the temperature difference between a heat source and a heat sink to generate electricity [112], is applicable because the human body maintains a relatively stable temperature of about 37 °C. Although the output power may not be large, the operation is more continuous than that of a mechanical energy harvester, so the accumulated energy stored in the battery can still be acceptable [113].
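A back-of-the-envelope estimate illustrates why body-heat harvesting yields modest but steady power: the open-circuit voltage of a thermoelectric module is V = S·ΔT, and the power delivered into a matched load is V²/(4R). All values in the sketch below are assumptions, not measurements from any cited device.

```python
# Back-of-the-envelope thermoelectric harvesting estimate:
# open-circuit voltage V = S * dT; power into a matched load is
# P = V^2 / (4 * R). All parameter values are illustrative
# assumptions, not data from any cited device.

S = 0.025      # effective Seebeck coefficient of the module (V/K)
dT = 2.0       # realistic skin-to-air temperature drop across the module (K)
R = 10.0       # internal resistance of the module (ohm)

v_oc = S * dT
p_matched = v_oc ** 2 / (4 * R)
print(f"open-circuit voltage: {v_oc*1e3:.1f} mV")
print(f"matched-load power:   {p_matched*1e6:.1f} uW")
```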
Table 2. Comparison of wearable energy harvesting mechanisms.

| Mechanism | Power Density | Working Range | Operation Condition |
|-----------|---------------|---------------|---------------------|
| Triboelectric | 58.82 W/m² | A few Hz | Power backpack for energy harvesting [114] |
| Triboelectric | 107 mW/m² | 0.5–3 Hz | Cardiac monitoring via implantable triboelectric nanogenerator [115] |
| Triboelectric | 0.52 mW/cm² | 4 Hz | Skin-touch-actuated textile-based triboelectric nanogenerator with black phosphorus [116] |
| Piezoelectric | 159.4 W/cm³ | 25 Hz | Piezoelectric energy harvester with frequency up-conversion [117] |
| Piezoelectric | 11 mW/cm³ | 0.33–3 Hz | Nanogenerator based on ZnO nanowire array [118] |
| Electromagnetic | 730 μW/cm³ | 6 Hz | Human motion energy harvester [119] |
| Electromagnetic | 79.9 W/m² | 2 Hz | Rotational pendulum-based electromagnetic/triboelectric hybrid generator for human motion applications [120] |
| Thermoelectric | 1.2 mW/cm² | 50 K temperature difference | Inorganic flexible thermoelectric power generator [121] |

4. Wearable Sensing Applications

Based on the sensing mechanisms introduced in the previous section, numerous innovations in structural design and material development have been brought into the field of wearable sensing applications [122,123,124]. As a result, various sensors with their own strengths (Table 3), such as high sensitivity, wide sensing range, fast response, low hysteresis, multi-modal sensing, and low power consumption, are frequently presented to help extract all kinds of information from the human body in a more convenient manner [125,126]. For edge computing wearable systems, sensors with low power consumption (or even self-powered sensors) are especially beneficial, as they can greatly extend the operation time of the device.

4.1. Sensors for Human–Machine Interaction

Inertial sensor-based HMIs are widely adopted in commercial products. Accelerometers and gyroscopes can accurately capture orientation and motion information, and with the aid of additional algorithms, more information related to physical interactions can be extracted. Meanwhile, inertial sensors are usually integrated with other sensors for comprehensive sensing. The glove presented in Figure 6a uses a combination of 20 bending sensors, 16 tri-axial accelerometers, and 11 force sensors on a flexible PCB to capture hand motions without requiring calibration [127].
Recently, new material developments have introduced a flexible iontronic tactile sensor utilizing a highly conductive polymer thin film decorated with microspheres (Figure 6b) [128]. This ultrasensitive tactile sensor can detect dynamic micromotions, such as touching sandpapers with different surface textures. With a sensor array designed into a smart glove, it successfully demonstrates fast and accurate Braille recognition.
For most tactile sensors, cross interference among different mechanical deformations is one of the main concerns. Several solutions have been proposed to isolate these interferences, including reference sensors, strain relief structures, and compensation units. Based on wafer-scale stretchable graphene film and patterning techniques, a semitransparent stretchable TENG has been prepared as a self-powered tactile sensor array with strain-insensitive characteristics, as depicted in Figure 6c [129]. An 8 × 8 stretchable tactile sensor array is fabricated to map the distribution and intensity of applied pressure without the influence of strain. Moreover, temperature fluctuation also affects piezoresistive sensors, especially liquid metal-based ones [130]. To solve this problem, a microfluidic tactile sensor based on a diaphragm pressure sensor with an embedded Galinstan microchannel has been reported (Figure 6d) [131]. With four primary sets of sensing grids, including two tangential and two radial grids, the proposed sensor not only shows high sensitivity, linearity, low limit of detection, and high resolution, but also achieves temperature self-compensation at 20–50 °C by using an embedded equivalent Wheatstone bridge circuit (see the sketch after this paragraph). Similarly, an interference-free bimodal tactile sensor for pressure and temperature sensing is presented in Figure 6e. The variations of resistance and luminescence caused by the thermoresistive effect and the piezo/tribophotonic effect enable pressure and temperature to be sensed independently and without crosstalk. It also allows the wireless transmission of pressure and temperature data via mobile phone [132]. These sensors, by handling the cross interference of different signals, pave the way for multimodal sensing on wearable edge computing devices. Powerful onboard processors can then analyze the latent features of the multimodal signals and perform complex downstream tasks (e.g., human activity recognition, emotion recognition, etc.) powered by neural networks.
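The temperature self-compensation of the microfluidic sensor above follows from the differential nature of the Wheatstone bridge: a uniform thermal drift shifts all four arms together and cancels at the output, while a pressure-induced differential change does not. A simplified numeric sketch (illustrative values only) follows.

```python
# Simplified Wheatstone-bridge model showing why an embedded bridge
# self-compensates for temperature: a uniform (thermal) resistance
# change shifts all four arms together and cancels in the differential
# output, while a pressure-induced differential change does not.
# Values are illustrative, not from the cited sensor.

def bridge_output(r1, r2, r3, r4, v_exc=1.0):
    """Differential output of a Wheatstone bridge (r1-r2 and r3-r4 legs)."""
    return v_exc * (r2 / (r1 + r2) - r4 / (r3 + r4))

R0 = 100.0                       # nominal arm resistance (ohm)

# Pure temperature rise: all arms drift by the same +5%.
thermal = bridge_output(R0 * 1.05, R0 * 1.05, R0 * 1.05, R0 * 1.05)

# Pressure: opposite arms change differentially by +/-1%.
pressure = bridge_output(R0 * 1.01, R0 * 0.99, R0 * 0.99, R0 * 1.01)

print(f"thermal-only output: {thermal:.6f} V (cancels)")
print(f"pressure output:     {pressure:.6f} V")
```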
Considering wearability, the abovementioned sensors are often designed on a flexible or stretchable substrate and detect information at a specific body part [133]. To establish a whole-body sensory network, additional platforms, such as exoskeletons, can be considered. Meanwhile, different body parts usually have their own motion patterns, which require different sensing mechanisms or designs. In Figure 6f, a triboelectric bi-directional (TBD) sensor that can be universally applied to different parts of a customized exoskeleton to capture the motions of the entire upper limbs is reported [134]. With a facile switch and a basic grating structure, it realizes bidirectional sensing of both rotational and linear motions; the single type of pulse-like signal greatly simplifies the back-end signal processing. The good consistency between the exoskeleton and the structure of the human body allows further kinetic analysis of other physical parameters, such as displacement, velocity, and force. All the TBD sensors in the exoskeleton are self-powered, which greatly increases the possible operation time of the device.

4.2. Sensors for Healthcare and Sports Monitoring

Healthcare and sports-related monitoring are key reasons behind the popularity of wearable systems [135,136,137]. With the help of diversified wearable devices, such as wristbands, watches, glasses, sleeves, and insoles, the continuous monitoring of body conditions is becoming increasingly convenient [19,138,139]. Edge computing technology is well suited to healthcare and sports monitoring, as the local processing of data decreases system latency and achieves real-time monitoring and notification. To further boost functionality and sustainability, more studies are required [140]. As shown in Figure 6g, an integrated stretchable device for continuous health monitoring is composed of a kirigami-based stretchable, self-powered sensing component and a near-field communication (NFC) data transmission module. The as-fabricated devices can be mounted on different surfaces without mechanical irritation and can hence measure the surface strain of a deforming balloon or a pig heart. In terms of other types of energy harvesters, a perspiration-powered electronic skin (PPES) that harvests energy from human sweat through lactate biofuel cells (BFCs) has been developed [141], achieving continuous self-powered health monitoring with both multiplexed sensing and wireless data transmission [142]. It shows great capability for noninvasive metabolic monitoring and for human–machine interfacing in assistive robotic control.
Conventional heart rate and blood pressure monitoring in wearable devices uses a photoplethysmogram (PPG) sensor consisting of an LED and a photodetector. Alternatively, this function can be performed by piezoresistive sensors [143,144]. One proposed device responds quickly enough to capture the precise pulse waveform. Compared with PPG-based devices, which consume 10–100 mW of power, this piezoresistive patch requires only about 3 nW (Figure 6h). Such ultralow power consumption directly benefits on-board edge analytics, making continuous health monitoring possible.
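Once a clean pulse waveform is available, extracting the heart rate on-device is exactly the kind of lightweight computation such ultralow-power patches enable. Below is a minimal sketch using simple peak detection on a synthetic 72 bpm waveform; the sampling rate and thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

# Minimal on-device heart-rate estimate from a pulse waveform, the kind
# of lightweight edge analytics such ultralow-power patches enable.
# The waveform here is synthetic (72 bpm plus noise); thresholds are
# illustrative and would be tuned per sensor.

fs = 100                                        # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)                    # 10 s window
pulse = np.sin(2 * np.pi * 1.2 * t) ** 21       # sharp beats at 1.2 Hz
pulse += 0.05 * np.random.randn(len(t))         # sensor noise

# Require physiologic spacing (>0.4 s, i.e. <150 bpm) between beats.
peaks, _ = find_peaks(pulse, height=0.5, distance=int(0.4 * fs))
bpm = 60 * fs / np.median(np.diff(peaks))
print(f"estimated heart rate: {bpm:.0f} bpm")
```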
Clothing is an important platform for wearable systems. Various fabrication processes are utilized [145], including weaving and knitting, and functionalizing the fabric or textile can further improve the level of integration [146,147,148]. In Figure 6i, using a conductive polymer-coated cotton sock, a smart sock with walking pattern recognition and motion tracking functions is fabricated based on the output of a fabric-based TENG [149]. The multi-segmental pattern of the coating even enables gait analysis. The fusion of PZT chip-based PENGs with the TENG sock enables the qualitative detection of sweat-level variations via the TENG's output response to the sweat absorbed by the sock. In terms of sweat analysis, a wearable microfluidic patch technology is introduced in Figure 6j [150]. The roll-to-roll processed microfluidic channels with hydrophobic materials guide sweat via the natural pressure associated with eccrine sweat excretion. The colorimetric sensors, together with a smartphone image processing platform, can measure the regional sweating rate and analyze the sweat.
Optical sensors have emerged as an alternative solution for health monitoring, avoiding some drawbacks of electrical sensors [151]. Among these, optical fiber-based wearable sensing systems have gained strong interest [152]. A highly flexible and intelligent wearable device based on a wavy-shaped polymer optical microfiber is reported in [153]. The device can perform cardiorespiratory and behavioral monitoring of the user, and, powered by neural networks, voice-based word recognition is also achieved. Furthermore, optical sensors can be used for sweat sensing, with analytical methods transducing chemical information into optical signals [154]. A skin-interfaced, wearable sensing system that can detect the concentrations of vitamin C, calcium, zinc, and iron in sweat has been introduced [155]. The sweat is collected and stored with passive microvalves, microchannels, and microreservoirs, and the concentrations of the nutrients are determined by colorimetry.
In addition to physical sensors, electrochemical sensors also play an important role in wearable healthcare and sports monitoring. As a promising approach to non-invasive health monitoring, electrochemical sensor-based wearable sweat sensing systems have been intensively investigated [156,157,158,159,160]. For instance, a fully in-ear flexible wearable system, which can be attached to an earphone, has been proposed [157]. With multimodal electrochemical and electrophysiological sensors, the device monitors the lactate concentration via the ear's exocrine sweat glands and the brain states via multiple electrophysiological signals. Apart from lactate, other important biomarkers can also be tracked by wearable systems. A fingertip biosensing system is introduced in [160]. The energy is generated by enzymatic biofuel cells, which are fueled by lactate, and is stored in AgCl-Zn batteries. With a flexible and compact design, the device can monitor glucose, vitamin C, lactate, and levodopa.
Figure 6. Wearable sensing applications for human–machine interaction, sports and healthcare. (a) Data glove with inertial sensor integrated on flexible print circuit board. Reproduced with permission [127]. Copyright 2013, IEEE. (b) Flexible capacitive iontronic tactile sensor for ultrasensitive pressure detection. Reproduced with permission [128]. Copyright 2022, Wiley-VCH. (c) Strain-insensitive self-powered tactile sensor arrays based on a graphene elastomer. Reproduced with permission [129]. Copyright 2022, Wiley-VCH. (d) Wearable microfluidic pressure sensor. Reproduced with permission [131]. Copyright 2017, Wiley-VCH. (e) Bimodal tactile sensor with tactile and temperature sensing information. Reproduced with permission [132]. Copyright 2022, American Chemical Society. (f) Exoskeleton manipulator using bidirectional triboelectric sensors. Reproduced with permission [134]. Copyright 2021, Springer Nature. (g) Stretchable piezoelectric sensing systems for health monitoring. Reproduced with permission [142]. Copyright 2019, Wiley-VCH. (h) Piezoresistive sensor patch enabling ultralow power cuffless blood pressure measurement. Reproduced with permission [144]. Copyright 2016, Wiley-VCH. (i) Smart sock using piezoelectric and triboelectric hybrid mechanism for healthcare. Reproduced with permission [149]. Copyright 2019, American Chemical Society. (j) Skin-interfaced microfluidic system with personalized sweat analytics. Reproduced with permission [150]. Copyright 2020, American Association for the Advancement of Science.
Table 3. Comparison of performances among different sensing mechanisms.

| Mechanism | Materials | Sensitivity | Sensing Range | Response Time | Application | Ref. |
|-----------|-----------|-------------|---------------|---------------|-------------|------|
| Piezoresistive | Liquid metal | 0.0835 kPa⁻¹ | 100 Pa–50 kPa | 90 ms | Tactile, bending, and pulse sensor | [131] |
| Piezoresistive | Carbon | −1.10 kPa⁻¹ | <21 kPa | 29 ms | Tactile sensor array | [161] |
| Piezoresistive | Silicon | 10.3 kPa⁻¹ | 0.37–5.9 kPa | 200 ms | Pressure sensors | [162] |
| Capacitive | Elastomer | 0.55 kPa⁻¹ | <15 kPa | N.A. | Pressure sensor array | [163] |
| Capacitive | Air gap | 0.068 fF/mN | <1.7 N | 200 ms | Tactile sensor | [164] |
| Capacitive | Ionic solution | 29.8 nF/N | <4.2 N | 12 ms | Three-dimensional force sensor | [165] |
| Triboelectric | Polymer | 2.82 V MPa⁻¹ | 0.3–612.5 kPa | 40 ms | Tactile sensor array | [166] |
| Triboelectric | Elastomer | 0.013 kPa⁻¹ | 1.3–70 kPa | N.A. | Tactile sensor | [167] |
| Triboelectric | Fabric | N.A. | 10–160% | N.A. | Smart clothes | [168] |
| Piezoelectric | PVDF | 0.21 V kPa⁻¹ | <1 kPa | 20–40 ms | Mimic somatic cutaneous sensor | [169] |
| Piezoelectric | PZT | 0.018 kPa⁻¹ | 1–30 kPa | 60 ms | Pulse monitoring | [170] |
| Piezoelectric | BaTiO₃ | 37.1–257.9 mV N⁻¹ | 5–60 N | N.A. | Detecting air pressure and human vital signs | [171] |

5. Wearable Feedback Systems

In the real world, human interactive events rely heavily on the perception of receptors in the skin and the muscle spindles [172]. As a result, applied forces, textures, temperature, etc., can be perceived for better recognition and manipulation (Table 4). Although many current HMIs are equipped only with sensing units, adding feedback components can greatly enrich awareness of the interactive scene, especially for wearable systems [173,174]. When controlling a robot or virtual character, comprehensive feedback functions replicate a more realistic sensation than pure vision-based monitoring, allowing better adjustment [175]. For rehabilitation purposes, the feedback system can act not only as an assistive tool to increase the physical power of patients, but also as a stimulator or massager to facilitate the recovery process.
Table 4. Advantages and disadvantages of different feedback techniques.

5.1. Cutaneous Feedback

Human skin can sense the shape, texture, and softness of a touched object via mechanoreceptors and can differentiate vibration from static pressing through fast adapting (FA) and slow adapting (SA) receptors, respectively. Additionally, temperature receptors in our skin provide even more complex sensations. However, the densities of those receptors vary across the human body.
Apart from the frequently used vibrators, pneumatic actuation is one of the most popular feedback techniques. To minimize the number of air inlets and pathways, several switch array techniques have been studied to increase the number of controllable air chambers. A reconfigurable pneumatic haptic array controlled by a shape memory polymer (SMP) membrane is shown in Figure 7a [176]. The Young's modulus of the membrane can be changed by a heater, so that the membrane deforms under varied air pressure to realize a reconfigurable tactile system. Since a voltage applied across two electrodes produces electrostatic attraction, electrostatic force-based feedback has also been studied. To improve the feedback force, the electrostatic actuator can be hydraulically amplified, as illustrated in Figure 7b [177]. Here, the outer region of the chamber is compressed under electrostatic force, and the central membrane then expands under hydraulic pressure. Notably, electrical discharge can also be considered a form of feedback, and TENGs, with their high triboelectrification-generated output voltage, have thus been utilized for discharge feedback, as shown in Figure 7c [178]. The direct contact of the ball electrodes with human skin delivers the TENG discharge as electrical discharge feedback.
Another issue regarding cutaneous feedback is how to accurately reconstruct spatial or textural stimuli at the mechanoreceptors so that recreated 3D shapes can be perceived within a single system. Dielectric elastomers provide mechanical deformation under an applied voltage. To fabricate an actuator with tunable feedback perception, a multi-layer PDMS-based dielectric elastomer actuator (DEA) sandwiched between carbon nanotube electrodes for fingertip cutaneous feedback is demonstrated in Figure 7d [179]. The surface area of the as-fabricated DEA can be modulated to cause a feeling of stretching or compression on the skin. Moving forward, a soft pneumatic actuator skin (SPA-skin) is presented as a low-profile soft interface containing a PZT sensor layer, an SPA layer, and a controller with a pneumatic system (Figure 7e) [180]. Owing to the integration of the sensing and feedback layers, the local environmental loading conditions can be acquired to tune the output into coherent feedback, so that the natural texture of an orange peel can be recreated. The high-bandwidth feedback capabilities enable two-stage actuation: low-frequency stimulation when approaching the contour shape, and high-frequency vibration when reaching the actual contour of the object.
To further explore piezoelectric actuators with flexible and lightweight substrates while maintaining the actuation amplitude, a low-cost mass printing approach has been reported for large-area multilayer piezoelectric actuators with strong haptic sensations. In Figure 7f, the proposed device shows extremely high deflection and generates a blocking force of 0.6 N, which is sufficient to produce an indentation on human skin [181]. It can also incorporate audible sound information thanks to its high sound pressure level, realizing a fusion of touch and sound sensation.

5.2. Kinesthetic Feedback

Kinesthetic feedback is perceived through the neurosensory pathways of the muscle spindles, sensing the movement or position of body parts. Because of its 3D nature, reconstructing motion that mimics this pathway becomes a grand challenge.
Tendon-driven actuation is commonly utilized in kinesthetic feedback, as well as in delivering physical assistance to impaired people. With well-designed wire routing and a miniaturized motor, a tendon-driven actuator can offer strong kinesthetic feedback force in a compact package. In Figure 7g, a wearable tendon-driven haptic device is demonstrated that provides multiple kinesthetic feedback modes in a small form factor for the stiffness rendering of virtual objects [182]. The tendon-locking unit modifies the stiffness of the tendon using a selective locking mechanism with either a rigid mode or an elastic mode, and the tension transmission component transfers the resistive force from the locking unit to the finger through the tendon. Interestingly, the previously mentioned electrostatic force can also be applied to provide a blocking force. In Figure 7h, a high-force-density electrostatic clutch is proposed for a kinesthetic feedback glove that reflects virtual events [183]. A conductive textile with a high-friction insulation layer forms a clutch that can generate a frictional shear stress of up to 21 N/cm² at 300 V to initiate the blocking action. To further quantify the feedback performance, a wearable hand system can measure the motion information of each finger during the interaction, recording and evaluating the effectiveness of the kinesthetic feedback, as shown in Figure 7i [184].
In addition, several researchers are studying the delivery of feedback to other body parts to further improve the experience. A jacket with contraction and extension actuators can be pneumatically actuated to enable kinesthetic motions of the arms [185]. A wearable haptic guidance system has also been developed to help visually impaired people navigate a running track (Figure 7j) [186]. With an RGB-D camera and a microprocessor detecting the lanes and calculating the steering angles, a belt stretches the skin around the waist according to the steering angle, generating navigation-related haptic feedback.

5.3. Temperature Feedback

Temperature sensation is an extremely important function of the skin, which helps protect us from potential hazards. Meanwhile, for human–machine interactions, temperature feedback allows a better perception of the working environment via a replicated immersive feeling. Moreover, it also assists medical rehabilitation via specific stimulations.
Most of the time, temperature and mechanical stimuli are exerted on the skin simultaneously during interactions in real space, so multi-modal feedback is drawing much attention. As depicted in Figure 7k, a sleeve-type soft haptic device is presented with the capability of reproducing a moving thermal sensation along with pressure stimulation in a single device [187]. The pneumatic and hydraulic systems generate the pressure and temperature stimuli, respectively. A microblower of the type used in small sphygmomanometers eliminates the need for a large air compressor or complicated tubing. Electroresistive heating is another main technical direction. A multi-modal sensing and feedback glove is developed as illustrated in Figure 7l [188]. The liquid metal is printed into a meandering shape to offer both strain sensing and thermal feedback functions via the power supply.
However, electroresistive units can only act as heaters to give thermal feedback. In contrast, a wearable ear hook with a Peltier module can provide both hot and cold stimuli on the auricular skin [189]. By adding multiple Peltier modules, multi-point auricular thermal patterns can be perceived by users with an average accuracy of 85.3%. Although thermoelectric modules based on the Peltier effect possess both heating and cooling capabilities, achieving flexibility or even stretchability remains an open issue. In Figure 7m, a skin-like thermo-haptic device with thermoelectric units shows a certain flexibility, incorporating serpentine Cu electrodes as interconnects between thermoelectric pellets. This device can create a temperature difference of 15 °C via the heating and cooling process [190].
Figure 7. Wearable feedback systems for cutaneous feedback, kinesthetic feedback, and temperature feedback. (a) Large reconfigurable arrays with shape memory feedback units. Reproduced with permission [176]. Copyright 2017, Wiley-VCH. (b) Hydraulically amplified electrostatic actuators for multi-modal feedback. Reproduced with permission [177]. Copyright 2020, Wiley-VCH. (c) Self-powered electro-tactile system for virtual reality. Reproduced with permission [178]. Copyright 2021, American Association for the Advancement of Science. (d) Feel-through feedback unit using a dielectric elastomer actuator. Reproduced with permission [179]. Copyright 2021, Wiley-VCH. (e) Finger touch sensation with soft pneumatic actuator. Reproduced with permission [180]. Copyright 2021, Wiley-VCH. (f) Printed multilayer piezoelectric actuators on paper for touch and sound sensation. Reproduced with permission [181]. Copyright 2022, MDPI. (g) Wearable haptic device for the stiffness rendering of virtual objects. Reproduced with permission [182]. Copyright 2021, MDPI. (h) Textile electrostatic clutch for virtual reality. Reproduced with permission [183]. Copyright 2020, Wiley-VCH. (i) Kinesthetic feedback evaluation system for virtual reality. Reproduced with permission [184]. Copyright 2019, IEEE. (j) Haptic guidance system based on skin stretch around the waist. Reproduced with permission [186]. Copyright 2022, IEEE. (k) Wearable temperature feedback device with fluidic thermal stimulation. Reproduced with permission [187]. Copyright 2021, Association for Computing Machinery. (l) Liquid metal-based multimodal sensor and haptic feedback glove for thermal and tactile sensation. Reproduced with permission [188]. Copyright 2021, Wiley-VCH. (m) Stretchable cooling/heating feedback device for artificial thermal sensation. Reproduced with permission [190]. Copyright 2020, Wiley-VCH.

6. Wireless Power and Signal Transmission and Energy Harvesting

Owing to portability concerns, the size and weight of the energy storage unit are strictly constrained to maintain comfort. The increasing number of integrated modules further reduces the operation time of the wearable system, even though the power density of batteries continues to increase (Table 5). This situation becomes worse when edge computing capability is introduced, as the computation in the wearable microprocessor consumes much more power. Scavenging energy from the human body or the ambient environment can extend the operation time, or even enable self-sustainability.
Table 5. Comparison of the typical power consumptions of common components in wearable systems.
Piezoelectric materials have a relatively high power density and small size based on the MEMS fabrication process. Hence, PENG devices are frequently reported for wearable and implantable applications. A broadband ultrasonic PENG (Figure 8a) is demonstrated to power implantable biomedical devices via an input ultrasound probe, while maintaining a broadband frequency response from 240 to 250 kHz [191]. However, such ceramic PENGs are not suitable for large-area flexible energy harvesters, and their output power is thus still limited. The TENG, with many more material options and design freedoms, is another potential technique. In Figure 8b, a body-integrated self-powered system (BISS) is developed to scavenge energy from human motions, through triboelectrification between the soles and the floor and electrification of the human body [192]. With an electrode attached to the skin, the human body, as a good conductor, can deliver the power to other wearable systems, such as smart glasses. However, many TENGs still suffer from limited power density. As a complementary solution, a fusion approach combining TENG and electromagnetic generator (EMG) mechanisms is utilized to make a self-powered electronic watch (Figure 8c) [193]. The EMG part consists of a magnetic ball with embedded coils, which can generate 2.8–4.0 mW of power; the arch-shaped TENG, made of nylon and PDMS, can provide 0.1 mW. On the other hand, power transmission throughout the whole body is also preferred, so that the energy harvesters can be placed separately at positions rich in mechanical energy.
Interestingly, humans now live in an environment full of electromagnetic (EM) waves. To convert these EM waves into electricity, human body-coupled power delivery and ambient energy harvesting chips are proposed, capable of harvesting ambient EM waves and delivering their power via the body medium for full-body power sustainability [194] (Figure 8d). Moreover, the application of new materials in fabricating smart textiles or e-skin further broadens the options for wireless wearable energy harvesters and transmitters (Figure 8e,f) [195,196].
Wireless signal transmission is one of the most important foundations for a wearable edge computing system. According to the required transmission range, speed, and data volume, various wireless communication protocols, such as WiFi, Bluetooth, ZigBee, XBee, etc., are adopted in the commercial market. As the wireless unit is a major power consumer, many designs adopt energy-saving strategies, such as sleep modes and event-based triggering. Together with wearable energy harvesters, the sustainability of the whole system can be improved. In addition, attempts to use a self-powered output as the data transmission signal itself have also been reported.
In order to monitor vital signs in real time, a wearable system integrating MXene-enhanced TENGs, pressure sensors, and multifunctional circuitry has been developed. The outstanding conductivity and mechanical flexibility of MXene enable conformable energy harvesting and pressure sensing with a low detection limit of ~9 Pa and a fast response time of ~80 ms [197]. The whole wearable monitoring system is powered by the TENG to continuously measure the peaks of the radial artery pulse at the wrist in real time. A capacitance-to-digital converter chip communicates with the pressure sensors and LEDs to visualize the valleys and peaks of the output waveforms with the “on” and “off” states, respectively.
By rectifying and converting the electric power from a magneto-mechano-triboelectric generator (MMTEG), a self-powered wireless indoor positioning system with continuous monitoring is proposed [198]. The device consists of a magnetic field harvester, a power control circuit, a storage element, and an IoT Bluetooth beacon. The MMTEG generates its output from the weak, nearly 60 Hz magnetic field around power cables connected to home appliances.
A hybrid vibration energy harvester combining a TENG and an EMG is presented in [199] to power an active RFID tag of a wearable system embedded in shoes, so that automatic long-distance identification for door access can be realized. Meanwhile, a fabric-based TENG with a superhydrophobic coating has also been fabricated (Figure 8g) [200]. The integration of a diode can significantly enhance the output used in the wireless transmission, and a tunable external capacitor sets the oscillation frequency of the response, enabling a coil-based wireless control system. In addition, an optical medium is proposed for wireless communication. By integrating a TENG with a microswitch, an LC resonant circuit, and a coupling inductor, a red laser and a photodetector can act as the transmitter and receiver, respectively, for wireless data communication, with channel identification provided by the external capacitors [201].
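The frequency selectivity underlying such capacitor-coded identification follows directly from the LC resonance relation f0 = 1/(2π√(LC)); the short Python sketch below evaluates it for hypothetical component values, which are not taken from [200,201].

```python
import math

def lc_resonant_frequency(L_henry, C_farad):
    """Resonant frequency f0 = 1 / (2*pi*sqrt(L*C)) of an LC tank,
    showing how swapping the external capacitor shifts the oscillation
    frequency used for identification. Component values are assumed."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

L = 10e-6  # 10 uH coil (assumed)
for C in [1e-9, 2.2e-9, 4.7e-9]:  # selectable capacitors (assumed)
    f0 = lc_resonant_frequency(L, C)
    print(f"C = {C*1e9:.1f} nF -> f0 = {f0/1e6:.2f} MHz")
```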
Figure 8. Self-powered power and data transmission. (a) Broadband piezoelectric ultrasonic energy harvester for powering implantable biomedical devices. Reproduced with permission [191]. Copyright 2016, Springer Nature. (b) Self-powered body sensory network for wearable and implantable applications. Reproduced with permission [192]. Copyright 2019, American Chemical Society. (c) An electronic watch powered by an electromagnetic–triboelectric nanogenerator. Reproduced with permission [193]. Copyright 2015, American Chemical Society. (d) Human body-coupled power delivery and ambient energy harvesting. Reproduced with permission [194]. Copyright 2020, IEEE. (e) Resonant inductive wireless power transfer glove using embroidered textile coils. Reproduced with permission [195]. Copyright 2020, IEEE. (f) Printed textile-based carbon antenna for wearable energy harvesting. Reproduced with permission [196]. Copyright 2022, IEEE. (g) Short-range self-powered wireless sensor network using triboelectric mechanism. Reproduced with permission [200]. Copyright 2020, Elsevier.

7. Machine Learning-Enabled Intelligent Wearable HMIs

The introduction of ML technology into current sensory systems is reshaping the concept of smart sensors in various fields, ranging from healthcare to industrial and environmental monitoring [202,203,204,205]. Extracting specific features from the respective dimensions of massive datasets enables a more comprehensive interpretation of the raw sensing signals [206]. Diversified ML algorithms are reported for better analyzing different types of information, such as vision, sound, chemical, and tactile signals [207]. Depending on whether the algorithms learn features from labeled or unlabeled data, ML techniques can be categorized into supervised and unsupervised learning. For instance, supervised learning is frequently adopted in wearable HMIs with classification functions. Owing to the edge computing capability, some of the primary ML processing can be undertaken locally to reduce the data volume in wireless communications.
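As a minimal illustration of such a supervised classification pipeline, the following Python sketch trains an SVM on windowed multichannel sensor features; the data are synthetic placeholders, not measurements from any of the cited systems.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for windowed multichannel sensor features:
# 300 windows x 24 features, 4 gesture classes (all mock data).
X = rng.normal(size=(300, 24))
y = rng.integers(0, 4, size=300)
X += y[:, None] * 0.5  # inject class-dependent structure so the demo learns

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```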
Speech recognition and translation are among the most popular applications of ML. A sign language recognition and communication system with a comfortable textile-based glove can assist the communication between non-signers and signers (Figure 9a) [208]. This glove not only offers the recognition of words (single gestures) but also enables the translation of sentences (continuous multiple gestures) with accuracies above 90% via a non-segmentation framework. To reduce the computing power, dimension reduction, such as principal component analysis (PCA) and linear discriminant analysis (LDA), is often used to preprocess the sensory data. To further expand the capability of recognizing new sentences without dataset training, a segmentation method is applied to divide all of the sentence signals into word fragments so that the ML model can learn them individually. Thus, the ML algorithm can inversely reconstruct and recognize a whole sentence through the established correlation between basic words and sentences. Furthermore, the segmentation approach enables new sentence recognition by recombining the trained word units in new orders. In Figure 9b, a TENG-based sensory floor mat fabricated by screen printing is shown, composed of a reference electrode and coding electrodes for position tracking and gait recognition [209]. The ratio of voltage outputs is used as the sensing data, which eliminates the humidity disturbance to the absolute amplitude of the triboelectric output voltage. As a scalable smart home application, the recognition of 20 users is undertaken by training the datasets with a one-dimensional convolutional neural network (1D CNN) model, and the preliminary recognition accuracy can reach above 90%. During human–machine interactions, the collaboration of recognition and manipulation functions can greatly improve the efficiency with better intelligence. A smart glove-based HMI is illustrated in Figure 9c [210]. With triboelectric finger-bending sensors and palm shear sensors, this glove is able to capture finger and hand motions with multiple degrees of freedom. More importantly, CNN and SVM algorithms are applied to further analyze the multichannel signals during different interactions to achieve object and gesture recognition. A smart walking stick for assisting the elderly has been developed using a hybridized unit and a rotational unit made of TENGs and an EMG, respectively (Figure 9d) [211]. The aluminum electrode is divided into five segments so that the bottom TENG can detect the entire contact process of the stick as the user walks. With the aid of a 1D CNN to analyze the multi-channel TENG outputs, different statuses of different users, such as sitting, walking, and falling, can be identified through the featured output patterns, and the obtained real-time status of the user can thus be projected for visual monitoring and immediate assistance. The linear-to-rotary energy harvesting function of the TENG and EMG can efficiently harvest ultralow-frequency (below 1 Hz) motions to power a self-sustainable system with GPS tracking and multifunctional monitoring.
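A minimal sketch of the kind of 1D CNN used for such multichannel triboelectric waveforms is given below in PyTorch; the channel count, window length, and class count are placeholders rather than the architectures of [209,210,211].

```python
import torch
import torch.nn as nn

class Gait1DCNN(nn.Module):
    """Minimal 1D CNN for classifying multichannel triboelectric
    waveforms (e.g., mat electrodes or glove channels). All layer sizes
    are illustrative assumptions."""
    def __init__(self, n_channels=5, n_classes=20):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # length-agnostic global pooling
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):            # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)
        return self.head(z)

model = Gait1DCNN()
dummy = torch.randn(8, 5, 500)       # 8 windows, 5 channels, 500 samples
print(model(dummy).shape)            # -> torch.Size([8, 20])
```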
Figure 9. AI-enabled wearable devices. (a) Sign language recognition using triboelectric smart glove. Reproduced with permission [208]. Copyright 2021, Springer Nature. (b) Floor monitoring system with AI for smart home applications. Reproduced with permission [209]. Copyright 2021, American Chemical Society. (c) Machine learning-enhanced smart glove for virtual/augmented reality applications. Reproduced with permission [210]. Copyright 2020, American Association for the Advancement of Science. (d) Caregiving walking stick for walking status monitoring. Reproduced with permission [211]. Copyright 2021, American Chemical Society. (e) Soft modular glove with AI-enabled augmented haptic feedback. Reproduced with permission [212]. Copyright 2022, American Chemical Society. (f) Augmented tactile-perception and haptic-feedback rings. Reproduced with permission [213]. Copyright 2022, Springer Nature.
On the other hand, most current ML-based applications focus simply on the analysis of sensing information; there is almost no research on the fusion of ML-enabled sensing and feedback, due to the spatial inconsistency between the two components. A modular soft glove functionalized by Tactile+ (tactile plus) units is proposed with the ability to provide both sensing and feedback functions from the same unit (Figure 9e) [212]. Specifically, the Tactile+ units on the finger modules and the palm module offer triboelectric tactile and strain sensing, pneumatic actuation, triboelectric-based monitoring of the pneumatic actuation, temperature sensing, and thermal feedback. In addition to the basic manipulation functions and ML-assisted object recognition, the recognition data can be applied to initiate multimodal-augmented haptic feedback via tunable feedback patterns, which indicates a potential direction for the intelligent fusion of sensing and feedback capabilities. For an ML-assisted wearable system, miniaturization is a constant research topic. The augmented tactile-perception and haptic-feedback rings with multimodal sensing and feedback capabilities shown in Figure 9f indicate a possible direction [213]. All of the thermal and tactile sensing and feedback functions are integrated into a minimalist ring design driven by a custom IoT module with the potential for edge computing. Integrating the raw voltage signals over time enables both dynamic and static detection by the TENG sensors. This type of signal also enables ML-based gesture recognition with an accuracy as high as 99.821%. The self-powered TENG and PENG sensing units further reduce the power consumption for long-term sustainability. An interactive metaverse platform with cross-space perception capability is demonstrated by projecting objects in real space into the virtual space, where they can be simultaneously felt by another virtual reality user in a remote real space for a more immersive experience.
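To illustrate why integrating the raw triboelectric voltage recovers quasi-static information, the following Python sketch cumulatively integrates a synthetic press-and-hold waveform; the signal and sampling rate are assumptions, not data from [213].

```python
import numpy as np

def integrate_teng_voltage(v, fs):
    """Cumulative time-integration of a raw triboelectric voltage. The
    open-circuit TENG output tracks the rate of charge change, so its
    integral yields a quasi-static quantity: dynamic events appear as
    raw peaks, while a sustained press appears as an integrated plateau."""
    return np.cumsum(v) / fs

fs = 1000.0                          # sampling rate (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
v = np.zeros_like(t)                 # synthetic press-and-hold waveform
v[100:110] = 1.0                     # positive transient at contact
v[800:810] = -1.0                    # negative transient at release
q = integrate_teng_voltage(v, fs)
print(f"plateau while holding: {q[500]:.3f} (nonzero => static press)")
```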

8. Wearable HMIs with Edge Computing

The increasing number of sensing terminals in sensor networks induces a severe problem of redundant data exchange between the sensing terminals and the computing units [214]. To mitigate the problem, a straightforward solution is to incorporate edge computing technology, placing the computing units close to or even inside the sensor itself. With this computing architecture, the excessive data exchange between the wearable system and external computing hardware or cloud computing infrastructure is eliminated. As a result, the energy consumption and latency of the system are reduced [215] and the user's privacy is reinforced [216]. The implementation of edge computing wearable HMI systems encompasses two approaches: on one hand, highly integrated systems based on the conventional von Neumann computing paradigm are built to ensure the computing unit's proximity to the sensing terminals [217,218,219,220]; on the other hand, novel neuromorphic computing paradigms are achieved via artificial sensing neurons such as memristors [221,222,223,224,225], memtransistors [226,227], and other innovative devices [228,229,230,231,232].
Many researchers have proposed wearable intelligent sensing HMIs with near-sensor data processing functionality based on the conventional computing paradigm. For instance, Moin et al. [217] reported a highly integrated, flexible bio-sensing system which can perform real-time gesture recognition locally (i.e., near the sensor) via surface electromyography (sEMG) (Figure 10a). A machine learning model is deployed on a miniaturized printed circuit board, reducing the latency and power consumption of the system. The model can adapt to changes in biological conditions (e.g., sweat, fatigue, etc.) as well as changes in forearm position. The high-density sensing unit encompasses a 4 × 16 electrode array to capture the sEMG signals. The sensing unit is interfaced with a custom PCB, which consists mainly of a system-on-chip (SoC) field-programmable gate array (FPGA) and a 2.4 GHz radio SoC for data transmission, and the system is powered by a lithium-ion battery. The classification of gestures is performed by a simple clustering algorithm, with each cluster centroid representing one gesture. This simple yet efficient algorithm facilitates the local deployment of the model and makes online adaptation possible. The throughput of the system is reported to be 20 predictions per second, illustrating the advantage of edge computing. An accuracy of 97.12% is reported for 13 gestures across 2 participants, and an accuracy of 92.87% is preserved for 21 gestures.
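The essence of this centroid-based scheme can be captured in a few lines of Python: nearest-centroid prediction plus an exponential-moving-average update for online adaptation. The feature dimensions, learning rate, and data below are assumptions for illustration, not details of [217].

```python
import numpy as np

class CentroidGestureClassifier:
    """Sketch of a centroid-per-gesture classifier: each gesture is a
    centroid in sEMG feature space; prediction picks the nearest
    centroid, and centroids can be adapted online to track drift from
    sweat, fatigue, or arm posture. All parameters are illustrative."""
    def __init__(self, n_classes, n_features, lr=0.05):
        self.centroids = np.zeros((n_classes, n_features))
        self.lr = lr

    def fit(self, X, y):
        for c in range(self.centroids.shape[0]):
            self.centroids[c] = X[y == c].mean(axis=0)

    def predict(self, x):
        return int(np.argmin(np.linalg.norm(self.centroids - x, axis=1)))

    def adapt(self, x, label):
        # Exponential moving average toward the newly observed sample.
        self.centroids[label] += self.lr * (x - self.centroids[label])

rng = np.random.default_rng(1)
offsets = rng.normal(size=(3, 64))           # class-specific structure
X = rng.normal(size=(60, 64)) + np.repeat(offsets, 20, axis=0)
y = np.repeat(np.arange(3), 20)
clf = CentroidGestureClassifier(n_classes=3, n_features=64)
clf.fit(X, y)
print("prediction for first sample:", clf.predict(X[0]))
clf.adapt(X[0], y[0])                        # online update after feedback
```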
Figure 10. Wearable HMIs with edge computing functionality. (a) A highly integrated, edge computing system for real-time gesture recognition. Reproduced with permission [217]. Copyright 2021, Springer Nature. (b) A textile-based smart glove for local gesture recognition. Reproduced with permission [219]. Copyright 2024, Wiley-VCH. (c) A wearable, near-sensor computing system for full-pose reconstruction. Reproduced with permission [220]. Copyright 2022, Springer Nature. (d) A wearable, neuromorphic computing system based on the memristor array. Reproduced with permission [221]. Copyright 2022, Wiley-VCH. (e) A tactile sensing system with large memristor array for hand-written digits recognition. Reproduced with permission [222]. Copyright 2022, American Chemical Society. (f) A wearable system based on an array of organic electrochemical transistors for electromyography (EMG) signal analysis. Reproduced with permission [229]. Copyright 2024, Springer Nature.
Apart from sensors based on sEMG, resistive tactile sensors can also be used to perform the gesture recognition task. Duan et al. [219] presented a textile-based smart glove that is capable of classifying gestures locally (Figure 10b). Ten highly sensitive tactile sensors (two per finger) capture the bending of the finger joints. When a finger bends, the resistance between the interdigitated electrodes changes because of a double contact effect: the contact between the TPU/Ag electrodes and the MXene/textile, and the contact among the MXene/textile fibers. Through silver wires sewn into the glove's textile, the sensing signal is transmitted to the MCU on a flexible PCB, where gesture recognition based on a nearest neighbor algorithm (measured in minimum Euclidean distance) is performed. The algorithm is simple but can easily be updated to recognize new gestures, although the scalability of the system can be a problem. For the classification performance, the accuracies for the 14-gesture and 20-gesture datasets are 99.5% and 98.1%, respectively. Due to the near-sensor computing ability of the system, only the results of gesture classification are transmitted wirelessly, minimizing the power consumption.
In addition to gesture recognition, full-body pose reconstruction can also be achieved with edge computing technology for enhanced latency and personal information security. Yang et al. [220] implemented a wearable, near-sensor computing system based on piezoresistive strain sensors to reconstruct the full-body pose of the user (Figure 10c). The sensing module of the system contains seven piezoresistive nanosheets made of Ti3C2Tx MXene, single-walled carbon nanotubes (SWCNTs), and polyvinyl alcohol (PVA), exhibiting high electrical conductivity and superior mechanical properties. With four different topographic structures (i.e., fully planar, fully wrinkled, and two hybrid structures), these sensors exhibit different linear sensing ranges of local strain at the joints and high sensitivity within their own working ranges. With respect to the edge computing module, an Arduino board is used to process the data and perform the pose reconstruction task with a convolutional neural network near the sensors. In the experiment, a reconstruction error of 3.5 cm is reported when compared with the results generated by a state-of-the-art vision-based pose tracking method (i.e., OpenPose [233]). Because the edge computing paradigm eliminates the huge data transmission between the wearable system and the external computing device, the power consumption is reported to be reduced by 71% compared with the non-edge computing paradigm.
For wearable devices, the energy efficiency of edge computing is of vital importance. Despite the low power consumption of the aforementioned devices, their conventional von Neumann computing paradigm prevents the achievement of higher computing efficiency [234]. To this end, researchers have combined the neuromorphic computing paradigm, which mimics the function of the human brain [235], with edge computing technology to further decrease the power consumption. For example, Wang et al. [221] proposed a wearable device equipped with a piezoresistive sensor array and a memristor array for neuromorphic computing (Figure 10d). The device can generate post-processed (noise reduction and edge detection) pressure maps with ultralow computation time and power consumption. The sensing module is an m × n piezoresistive pressure sensor array; the pressure sensors are based on Ag nanowires (AgNWs) and are fabricated with pyramid structures to enhance sensitivity. Each sensor is connected to a memristor element fabricated as an HfO2-based bipolar resistive switching unit. The conductance of each memristor element is pre-programmed to a fixed value by a pulse-height modulation method, and these conductance values act like the values of a convolutional filter. The neuromorphic computation is performed by an analog process realizing vector-matrix multiplication (VMM). By setting a uniform value for every memristor, the device can perform pressure sensing with a smoothing effect (i.e., denoising); by programming a column of memristors as a Laplacian convolutional filter, the device can detect edges in the pressure map. Owing to this purely analog, neuromorphic computing paradigm, only 400 ns is required to perform one sensing-computing operation. In addition, the power consumption for denoising and edge detection is reported to be 2 µW and 7.84 µW, respectively.
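The following Python sketch digitally emulates what the analog VMM computes: programming uniform conductances yields a smoothing (box) filter, while a Laplacian kernel extracts edges from the pressure map. The array size, kernels, and synthetic pressure map are illustrative assumptions, not parameters of [221].

```python
import numpy as np
from scipy.signal import convolve2d

# Synthetic 8x8 pressure map: a pressed square region plus sensor noise.
rng = np.random.default_rng(2)
pressure = np.zeros((8, 8))
pressure[2:6, 2:6] = 1.0
pressure += 0.05 * rng.normal(size=(8, 8))

# Memristor conductances programmed as convolution kernels.
smooth_kernel = np.full((3, 3), 1.0 / 9.0)      # uniform values -> denoising
laplacian = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)  # Laplacian -> edge detection

# Digital emulation of the analog vector-matrix multiplication.
denoised = convolve2d(pressure, smooth_kernel, mode="same")
edges = convolve2d(denoised, laplacian, mode="same")
print("max edge response:", float(np.abs(edges).max()))
```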
Unlike the small-scale memristor array fabricated in [221], a large-scale memristor array fabricated by a CMOS process has also been deployed in wearable devices. For instance, Zhao et al. [222] reported a large-scale tactile device with a 64 × 64 piezoresistive film-based sensor array and a neuromorphic computing chip with roughly 160,000 memristors (Figure 10e). To enable high spatial resolution, a 64 × 64 single-walled carbon nanotube thin-film transistor array is integrated with the piezoresistive film; specifically, the transistor array is coated with a piezoresistive film formed from an MWCNT/TPU precursor. The tactile sensor array achieves a spatial resolution of 0.9 mm (i.e., 28.2 pixels per inch). The tactile sensing data are processed by a TiN/HfOx/TaOy/TiN memristor array, which constitutes a multi-layer perceptron network using the roughly 160,000 memristors. The performance of the device is validated by experiments on hand-written digit and Chinese character recognition, with accuracies of 98.8% and 97.3%, respectively. Due to the neuromorphic computing technology, the device exhibits a fast inference speed of 77 μs per image.
Apart from the memristor, organic electrochemical transistors (OECTs) can also be employed to perform neuromorphic computing. Liu et al. [229] introduced an intrinsically stretchable organic electrochemical transistor (ISOECT) array, which can be integrated into a wearable system to perform gesture recognition based on electromyography (EMG) signals (Figure 10f). The ISOECT array is fabricated via a facile multichannel inkjet-printing approach and is suitable for scalable production. The array is printed on a styrene–ethylene/butylene–styrene (SEBS) substrate, and the channel between the source and drain electrodes is made of poly(3,4-ethylenedioxythiophene) polystyrene sulfonate (PEDOT:PSS) for better water stability and mixed ionic–electronic conductivity. The nonlinear output of the ISOECT under a four-digit pulse wave is exploited to build the neuromorphic computing module: the digitized EMG signal is fed into the ISOECT array, which serves as the block for reservoir computing, and the output of the ISOECT array is fed into a conventional fully connected layer executed on a traditional CPU to output the classification results. The computing paradigm is therefore a hybrid half-neuromorphic, half-conventional structure. Nonetheless, a significant reduction in data processing costs is reported, indicating the advantage of edge computing.
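A minimal software analogue of this reservoir-computing pipeline is sketched below: a fixed nonlinear expansion stands in for the ISOECT array's response, and only a linear readout is trained. The tanh nonlinearity, dimensions, and synthetic data are assumptions and do not model the device physics of [229].

```python
import numpy as np

rng = np.random.default_rng(3)
n_in, n_res, n_classes, n_samples = 16, 200, 4, 400

# Fixed, untrained input weights: the "reservoir" is never optimized.
W_in = rng.normal(size=(n_res, n_in))

def reservoir(X):
    """Fixed nonlinear expansion standing in for the ISOECT response."""
    return np.tanh(X @ W_in.T)

# Synthetic stand-in for digitized EMG features with class structure.
X = rng.normal(size=(n_samples, n_in))
y = rng.integers(0, n_classes, size=n_samples)
X[np.arange(n_samples), y] += 2.0

H = reservoir(X)
Y = np.eye(n_classes)[y]                      # one-hot targets
# Only the linear readout is trained (ridge regression, closed form).
W_out = np.linalg.solve(H.T @ H + 1e-2 * np.eye(n_res), H.T @ Y)
pred = (reservoir(X) @ W_out).argmax(axis=1)
print(f"training accuracy of the linear readout: {(pred == y).mean():.2f}")
```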

9. Conclusions and Perspectives

Wearable HMIs with edge computing capability are reshaping human lifestyles in the coming era. Based on the concept of "human in the loop", diverse sensors and feedback units are the most important interfaces, responsible for enabling digitized dual-way communication between the human and digital worlds. This necessitates the introduction of edge computing techniques, so that wearable devices can operate individually with a certain intelligence. The advancement of conformable and stretchable integrated circuits is thus considered a foundation of e-skin-like wearable edge computing systems, enabling devices to be functionalized over the whole body. To extend their working time, various energy harvesting technologies, self-powered sensors, and self-powered data or power transmission methods are devoted to demonstrating a self-sustainable system, which is the ultimate goal of wearable HMIs.
On the other hand, the deployment of edge computing in wearable devices is still confronted with some key hurdles, including power consumption control, privacy and data security issues, and material reliability. Currently, the application of many edge computing wearable devices is curbed by their short battery life. The power consumption of a wearable device mainly consists of sensing, computation, and communication power. For sensing power control, self-powered sensors (e.g., piezoelectric, triboelectric) and the energy harvesting technologies discussed in previous sections can serve as a promising solution. Reducing computational power, nonetheless, can be difficult due to the limitations of the conventional von Neumann computing paradigm commonly seen in wearable devices [234]. To resolve the dilemma, neuromorphic computing technology, which mimics the function of the human brain, has gained much interest [235]. Some wearable devices powered by neuromorphic computing hardware (e.g., memristors, organic electrochemical transistors) have reported ultralow computational power consumption. Furthermore, to constrain the power for inter-device communication, many mature protocols are optimized to save energy; for instance, a Bluetooth low energy device powered by a coin battery is reported to have a theoretical battery life of 14.1 years [236]. In addition to power consumption control, data security, data encryption, and the corresponding privacy issues still concern users. Common attacks threatening data confidentiality include eavesdropping attacks and property inference attacks targeting machine learning models. In eavesdropping attacks, malicious attackers attempt to intercept and decrypt data during transmission, while property inference attacks attempt to infer characteristics of the dataset, from which the behavior of the user can be exposed. To strengthen data security during transmission, signal-based cryptographic key generation can be a suitable solution for wearable devices [237]; for attacks targeting machine learning models, limiting the exposure of the models can be effective [238]. Moreover, for flexible neuromorphic edge computing devices, the uniformity and reliability of thin-film materials can be a challenge, for which small-molecule organic films fabricated through thermal evaporation can be a promising approach [239].
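A back-of-envelope estimate illustrates how such multi-year lifetimes arise from duty cycling; in the Python sketch below, all currents, the duty cycle, and the coin-cell capacity are assumed values, not figures from [236].

```python
# Battery-life estimate for a duty-cycled low-power wireless node:
# the average current is dominated by the sleep current when the radio
# is active for only a tiny fraction of each second.
capacity_mAh = 230.0      # typical CR2032 coin cell (assumed)
sleep_uA = 1.5            # deep-sleep current (assumed)
active_mA = 6.0           # radio-on current (assumed)
active_ms_per_s = 0.25    # one short advertising burst per second (assumed)

duty = active_ms_per_s / 1000.0
avg_mA = active_mA * duty + (sleep_uA / 1000.0) * (1.0 - duty)
hours = capacity_mAh / avg_mA
print(f"average current: {avg_mA*1000:.2f} uA -> ~{hours/24/365:.1f} years")
```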
Generally speaking, by incorporating the abovementioned techniques, wearable edge computing systems will eventually initiate a new revolution in human–machine interactions across smart home, smart industry, and smart city environments.

Author Contributions

Conceptualization, M.Z., S.H., T.C. and C.L.; methodology, M.Z. and C.L.; validation, M.Z., S.H. and C.L.; investigation, M.Z., S.H. and C.L.; writing—original draft preparation, M.Z. and S.H.; writing—review and editing, M.Z., S.H. and C.L.; visualization, M.Z. and S.H.; supervision, T.C. and C.L.; project administration, C.L.; funding acquisition, T.C. and C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (Grant numbers 62303340, 62573309); the Natural Science Foundation of Jiangsu Province of China (Grant number BK20230480); Research Platform for Biomedical and Health Technology, NUS (Suzhou) Research Institute (RP-BHT-Prof. LEE Chengkuo).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors acknowledge the funding support of this research.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wallin, T.J.; Pikul, J.; Shepherd, R.F. 3D Printing of Soft Robotic Systems. Nat. Rev. Mater. 2018, 3, 84–100. [Google Scholar] [CrossRef]
  2. Xiong, J.; Chen, J.; Lee, P.S. Functional Fibers and Fabrics for Soft Robotics, Wearables, and Human–Robot Interface. Adv. Mater. 2021, 33, 2002640. [Google Scholar] [CrossRef] [PubMed]
  3. Tan, Y.J.; Susanto, G.J.; Anwar Ali, H.P.; Tee, B.C.K. Progress and Roadmap for Intelligent Self-Healing Materials in Autonomous Robotics. Adv. Mater. 2021, 33, 2002800. [Google Scholar] [CrossRef]
  4. Xu, S.; Wu, W. Ink-Based Additive Nanomanufacturing of Functional Materials for Human-Integrated Smart Wearables. Adv. Intell. Syst. 2020, 2, 2000117. [Google Scholar] [CrossRef]
  5. Zhao, X.; Askari, H.; Chen, J. Nanogenerators for Smart Cities in the Era of 5G and Internet of Things. Joule 2021, 5, 1391–1431. [Google Scholar] [CrossRef]
  6. Ahmed, A.; Hassan, I.; El-Kady, M.F.; Radhi, A.; Jeong, C.K.; Selvaganapathy, P.R.; Zu, J.; Ren, S.; Wang, Q.; Kaner, R.B. Integrated Triboelectric Nanogenerators in the Era of the Internet of Things. Adv. Sci. 2019, 6, 1802230. [Google Scholar] [CrossRef] [PubMed]
  7. Ometov, A.; Shubina, V.; Klus, L.; Skibińska, J.; Saafi, S.; Pascacio, P.; Flueratoru, L.; Gaibor, D.Q.; Chukhno, N.; Chukhno, O.; et al. A Survey on Wearable Technology: History, State-of-the-Art and Current Challenges. Comput. Netw. 2021, 193, 108074. [Google Scholar] [CrossRef]
  8. He, T.; Lee, C. Evolving Flexible Sensors, Wearable and Implantable Technologies Towards BodyNET for Advanced Healthcare and Reinforced Life Quality. IEEE Open J. Circuits Syst. 2021, 2, 702–720. [Google Scholar] [CrossRef]
  9. Meng, K.; Zhao, S.; Zhou, Y.; Wu, Y.; Zhang, S.; He, Q.; Wang, X.; Zhou, Z.; Fan, W.; Tan, X.; et al. A Wireless Textile-Based Sensor System for Self-Powered Personalized Health Care. Matter 2020, 2, 896–907. [Google Scholar] [CrossRef]
  10. Sun, Z.; Zhu, M.; Lee, C. Progress in the Triboelectric Human–Machine Interfaces (HMIs)-Moving from Smart Gloves to AI/Haptic Enabled HMI in the 5G/IoT Era. Nanoenergy Adv. 2021, 1, 81–120. [Google Scholar] [CrossRef]
  11. Kim, H.; Kwon, Y.-T.; Lim, H.-R.; Kim, J.-H.; Kim, Y.-S.; Yeo, W.-H. Recent Advances in Wearable Sensors and Integrated Functional Devices for Virtual and Augmented Reality Applications. Adv. Funct. Mater. 2021, 31, 2005692. [Google Scholar] [CrossRef]
  12. Lin, Y.; Bariya, M.; Javey, A. Wearable Biosensors for Body Computing. Adv. Funct. Mater. 2021, 31, 2008087. [Google Scholar] [CrossRef]
  13. Kang, S.; Cho, S.; Shanker, R.; Lee, H.; Park, J.; Um, D.-S.; Lee, Y.; Ko, H. Transparent and Conductive Nanomembranes with Orthogonal Silver Nanowire Arrays for Skin-Attachable Loudspeakers and Microphones. Sci. Adv. 2018, 4, eaas8772. [Google Scholar] [CrossRef]
  14. Hartmann, F.; Baumgartner, M.; Kaltenbrunner, M. Becoming Sustainable, The New Frontier in Soft Robotics. Adv. Mater. 2021, 33, 2004413. [Google Scholar] [CrossRef] [PubMed]
  15. Polygerinos, P.; Correll, N.; Morin, S.A.; Mosadegh, B.; Onal, C.D.; Petersen, K.; Cianchetti, M.; Tolley, M.T.; Shepherd, R.F. Soft Robotics: Review of Fluid-Driven Intrinsically Soft Devices; Manufacturing, Sensing, Control, and Applications in Human-Robot Interaction. Adv. Eng. Mater. 2017, 19, 1700016. [Google Scholar] [CrossRef]
  16. Zhang, Z.; He, T.; Zhu, M.; Sun, Z.; Shi, Q.; Zhu, J.; Dong, B.; Yuce, M.R.; Lee, C. Deep Learning-Enabled Triboelectric Smart Socks for IoT-Based Gait Analysis and VR Applications. npj Flex. Electron. 2020, 4, 29. [Google Scholar] [CrossRef]
  17. Jeong, J.-W.; Yeo, W.-H.; Akhtar, A.; Norton, J.J.S.; Kwack, Y.-J.; Li, S.; Jung, S.-Y.; Su, Y.; Lee, W.; Xia, J.; et al. Materials and Optimized Designs for Human-Machine Interfaces Via Epidermal Electronics. Adv. Mater. 2013, 25, 6839–6846. [Google Scholar] [CrossRef] [PubMed]
  18. Yu, A.; Zhu, Y.; Wang, W.; Zhai, J. Progress in Triboelectric Materials: Toward High Performance and Widespread Applications. Adv. Funct. Mater. 2019, 29, 1900098. [Google Scholar] [CrossRef]
  19. Liu, Z.; Li, H.; Shi, B.; Fan, Y.; Wang, Z.L.; Li, Z. Wearable and Implantable Triboelectric Nanogenerators. Adv. Funct. Mater. 2019, 29, 1808820. [Google Scholar] [CrossRef]
  20. An, B.W.; Shin, J.H.; Kim, S.-Y.; Kim, J.; Ji, S.; Park, J.; Lee, Y.; Jang, J.; Park, Y.-G.; Cho, E.; et al. Smart Sensor Systems for Wearable Electronic Devices. Polymers 2017, 9, 303. [Google Scholar] [CrossRef]
  21. Kwak, S.S.; Yoon, H.-J.; Kim, S.-W. Textile-Based Triboelectric Nanogenerators for Self-Powered Wearable Electronics. Adv. Funct. Mater. 2019, 29, 1804533. [Google Scholar] [CrossRef]
  22. Gao, S.; He, T.; Zhang, Z.; Ao, H.; Jiang, H.; Lee, C. A Motion Capturing and Energy Harvesting Hybridized Lower-Limb System for Rehabilitation and Sports Applications. Adv. Sci. 2021, 8, 2101834. [Google Scholar] [CrossRef]
  23. He, T.; Guo, X.; Lee, C. Flourishing Energy Harvesters for Future Body Sensor Network: From Single to Multiple Energy Sources. iScience 2021, 24, 101934. [Google Scholar] [CrossRef]
  24. Lee, J.-H.; Kim, J.; Kim, T.Y.; Hossain, M.S.A.; Kim, S.-W.; Kim, J.H. All-in-One Energy Harvesting and Storage Devices. J. Mater. Chem. A 2016, 4, 7983–7999. [Google Scholar] [CrossRef]
  25. Zhu, M.; Yi, Z.; Yang, B.; Lee, C. Making Use of Nanoenergy from Human – Nanogenerator and Self-Powered Sensor Enabled Sustainable Wireless IoT Sensory Systems. Nano Today 2021, 36, 101016. [Google Scholar] [CrossRef]
  26. Liu, H.; Fu, H.; Sun, L.; Lee, C.; Yeatman, E.M. Hybrid Energy Harvesting Technology: From Materials, Structural Design, System Integration to Applications. Renew. Sustain. Energy Rev. 2021, 137, 110473. [Google Scholar] [CrossRef]
  27. Guo, X.; Liu, L.; Zhang, Z.; Gao, S.; He, T.; Shi, Q.; Lee, C. Technology Evolution from Micro-Scale Energy Harvesters to Nanogenerators. J. Micromech. Microeng. 2021, 31, 093002. [Google Scholar] [CrossRef]
  28. Wang, H.; Han, M.; Song, Y.; Zhang, H. Design, Manufacturing and Applications of Wearable Triboelectric Nanogenerators. Nano Energy 2021, 81, 105627. [Google Scholar] [CrossRef]
  29. Fan, F.R.; Tang, W.; Wang, Z.L. Flexible Nanogenerators for Energy Harvesting and Self-Powered Electronics. Adv. Mater. 2016, 28, 4283–4305. [Google Scholar] [CrossRef]
  30. Feng, H.; Zhao, C.; Tan, P.; Liu, R.; Chen, X.; Li, Z. Nanogenerator for Biomedical Applications. Adv. Healthc. Mater. 2018, 7, 1701298. [Google Scholar] [CrossRef]
  31. Yang, T.-H.; Kim, J.R.; Jin, H.; Gil, H.; Koo, J.-H.; Kim, H.J. Recent Advances and Opportunities of Active Materials for Haptic Technologies in Virtual and Augmented Reality. Adv. Funct. Mater. 2021, 31, 2008831. [Google Scholar] [CrossRef]
  32. Huang, Y.; Yao, K.; Li, J.; Li, D.; Jia, H.; Liu, Y.; Yiu, C.K.; Park, W.; Yu, X. Recent Advances in Multi-Mode Haptic Feedback Technologies towards Wearable Interfaces. Mater. Today Phys. 2022, 22, 100602. [Google Scholar] [CrossRef]
  33. Jung, Y.H.; Kim, J.-H.; Rogers, J.A. Skin-Integrated Vibrohaptic Interfaces for Virtual and Augmented Reality. Adv. Funct. Mater. 2021, 31, 2008805. [Google Scholar] [CrossRef]
  34. Ilami, M.; Bagheri, H.; Ahmed, R.; Skowronek, E.O.; Marvi, H. Materials, Actuators, and Sensors for Soft Bioinspired Robots. Adv. Mater. 2021, 33, 2003139. [Google Scholar] [CrossRef]
  35. Lee, J.; Kim, D.; Sul, H.; Ko, S.H. Thermo-Haptic Materials and Devices for Wearable Virtual and Augmented Reality. Adv. Funct. Mater. 2021, 31, 2007376. [Google Scholar] [CrossRef]
  36. Parida, K.; Bark, H.; Lee, P.S. Emerging Thermal Technology Enabled Augmented Reality. Adv. Funct. Mater. 2021, 31, 2007952. [Google Scholar] [CrossRef]
  37. Liu, S.; Li, Y.; Guo, W.; Huang, X.; Xu, L.; Lai, Y.-C.; Zhang, C.; Wu, H. Triboelectric Nanogenerators Enabled Sensing and Actuation for Robotics. Nano Energy 2019, 65, 104005. [Google Scholar] [CrossRef]
  38. Zhou, Y.; Shen, M.; Cui, X.; Shao, Y.; Li, L.; Zhang, Y. Triboelectric Nanogenerator Based Self-Powered Sensor for Artificial Intelligence. Nano Energy 2021, 84, 105887. [Google Scholar] [CrossRef]
  39. Zhang, Z.; Wen, F.; Sun, Z.; Guo, X.; He, T.; Lee, C. Artificial Intelligence-Enabled Sensing Technologies in the 5G/Internet of Things Era: From Virtual Reality/Augmented Reality to the Digital Twin. Adv. Intell. Syst. 2022, 4, 2100228. [Google Scholar] [CrossRef]
  40. Rhodin, H.; Richardt, C.; Casas, D.; Insafutdinov, E.; Shafiei, M.; Seidel, H.-P.; Schiele, B.; Theobalt, C. EgoCap: Egocentric Marker-Less Motion Capture with Two Fisheye Cameras. ACM Trans. Graph. 2016, 35, 1–11. [Google Scholar] [CrossRef]
  41. Xu, W.; Chatterjee, A.; Zollhöfer, M.; Rhodin, H.; Fua, P.; Seidel, H.-P.; Theobalt, C. Mo2Cap2: Real-Time Mobile 3D Motion Capture with a Cap-Mounted Fisheye Camera. IEEE Trans. Vis. Comput. Graph. 2019, 25, 2093–2101. [Google Scholar] [CrossRef] [PubMed]
  42. Tome, D.; Alldieck, T.; Peluse, P.; Pons-Moll, G.; Agapito, L.; Badino, H.; de la Torre, F. SelfPose: 3D Egocentric Pose Estimation From a Headset Mounted Camera. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 45, 6794–6806. [Google Scholar] [CrossRef]
  43. Ng, E.; Xiang, D.; Joo, H.; Grauman, K. You2Me: Inferring Body Pose in Egocentric Video via First and Second Person Interactions. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; pp. 9887–9897. [Google Scholar]
  44. Hickson, S.; Dufour, N.; Sud, A.; Kwatra, V.; Essa, I. Eyemotion: Classifying Facial Expressions in VR Using Eye-Tracking Cameras. In Proceedings of the 2019 IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 7–11 January 2019; pp. 1626–1635. [Google Scholar]
  45. Wu, H.; Feng, J.; Tian, X.; Sun, E.; Liu, Y.; Dong, B.; Xu, F.; Zhong, S. EMO: Real-Time Emotion Recognition from Single-Eye Images for Resource-Constrained Eyewear Devices. In Proceedings of the 18th International Conference on Mobile Systems, Applications, and Services, Toronto, ON, Canada, 15–19 June 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 448–461. [Google Scholar]
  46. Kwon, J.; Ha, J.; Kim, D.-H.; Choi, J.W.; Kim, L. Emotion Recognition Using a Glasses-Type Wearable Device via Multi-Channel Facial Responses. IEEE Access 2021, 9, 146392–146403. [Google Scholar] [CrossRef]
  47. Nie, J.; Hu, Y.; Wang, Y.; Xia, S.; Jiang, X. SPIDERS: Low-Cost Wireless Glasses for Continuous In-Situ Bio-Signal Acquisition and Emotion Recognition. In Proceedings of the 2020 IEEE/ACM Fifth International Conference on Internet-of-Things Design and Implementation (IoTDI), Sydney, Australia, 21–24 April 2020; pp. 27–39. [Google Scholar]
  48. Nie, J.; Liu, Y.; Hu, Y.; Wang, Y.; Xia, S.; Preindl, M.; Jiang, X. SPIDERS+: A Light-Weight, Wireless, and Low-Cost Glasses-Based Wearable Platform for Emotion Sensing and Bio-Signal Acquisition. Pervasive Mob. Comput. 2021, 75, 101424. [Google Scholar] [CrossRef]
  49. Yan, Z.; Wu, Y.; Zhang, Y.; Chen, X. “Anthony” EmoGlass: An End-to-End AI-Enabled Wearable Platform for Enhancing Self-Awareness of Emotional Health. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 1–19. [Google Scholar]
  50. Cha, J.; Kim, J.; Kim, S. An IR-Based Facial Expression Tracking Sensor for Head-Mounted Displays. In Proceedings of the 2016 IEEE SENSORS, Orlando, FL, USA, 30 October–3 November 2016; pp. 1–3. [Google Scholar]
  51. Suzuki, K.; Nakamura, F.; Otsuka, J.; Masai, K.; Itoh, Y.; Sugiura, Y.; Sugimoto, M. Recognition and Mapping of Facial Expressions to Avatar by Embedded Photo Reflective Sensors in Head Mounted Display. In Proceedings of the 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 18–22 March 2017; pp. 177–185. [Google Scholar]
  52. Thies, J.; Zollhöfer, M.; Stamminger, M.; Theobalt, C.; Nießner, M. FaceVR: Real-Time Gaze-Aware Facial Reenactment in Virtual Reality. ACM Trans. Graph. 2018, 37, 25. [Google Scholar] [CrossRef]
  53. Chen, T.; Steeper, B.; Alsheikh, K.; Tao, S.; Guimbretière, F.; Zhang, C. C-Face: Continuously Reconstructing Facial Expressions by Deep Learning Contours of the Face with Ear-Mounted Miniature Cameras. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual, 20–23 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 112–125. [Google Scholar]
  54. Chen, T.; Li, Y.; Tao, S.; Lim, H.; Sakashita, M.; Zhang, R.; Guimbretiere, F.; Zhang, C. NeckFace: Continuously Tracking Full Facial Expressions on Neck-Mounted Wearables. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 1–31. [Google Scholar] [CrossRef]
  55. Li, H.; Wu, L.; Wang, H.; Han, C.; Quan, W.; Zhao, J. Hand Gesture Recognition Enhancement Based on Spatial Fuzzy Matching in Leap Motion. IEEE Trans. Ind. Inform. 2020, 16, 1885–1894. [Google Scholar] [CrossRef]
  56. Weng, Y.; Yu, C.; Shi, Y.; Zhao, Y.; Yan, Y.; Shi, Y. FaceSight: Enabling Hand-to-Face Gesture Interaction on AR Glasses with a Downward-Facing Camera Vision. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 1–14. [Google Scholar] [CrossRef]
  57. Hu, F.; He, P.; Xu, S.; Li, Y.; Zhang, C. FingerTrak: Continuous 3D Hand Pose Tracking by Deep Learning Hand Silhouettes Captured by Miniature Thermal Cameras on Wrist. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 71. [Google Scholar] [CrossRef]
  58. Wu, E.; Yuan, Y.; Yeo, H.-S.; Quigley, A.; Koike, H.; Kitani, K.M. Back-Hand-Pose: 3D Hand Pose Estimation for a Wrist-Worn Camera via Dorsum Deformation Network. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual Event, 20–23 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1147–1160. [Google Scholar]
  59. Zhang, R.; Chen, M.; Steeper, B.; Li, Y.; Yan, Z.; Chen, Y.; Tao, S.; Chen, T.; Lim, H.; Zhang, C. SpeeChin: A Smart Necklace for Silent Speech Recognition. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2022, 5, 1–23. [Google Scholar] [CrossRef]
  60. Zhang, Y.; Yang, J.; Liu, Z.; Wang, R.; Chen, G.; Tong, X.; Guo, B. VirtualCube: An Immersive 3D Video Communication System. IEEE Trans. Vis. Comput. Graph. 2022, 28, 2146–2156. [Google Scholar] [CrossRef]
  61. Shiratori, T.; Park, H.S.; Sigal, L.; Sheikh, Y.; Hodgins, J.K. Motion Capture from Body-Mounted Cameras. ACM Trans. Graph. 2011, 30, 1–10. [Google Scholar] [CrossRef]
  62. Award Winning Motion Capture Systems. Available online: https://www.vicon.com/ (accessed on 10 August 2024).
  63. Motion Capture Systems. Available online: http://optitrack.com/index.html (accessed on 10 August 2024).
  64. Hwang, D.-H.; Aso, K.; Yuan, Y.; Kitani, K.; Koike, H. MonoEye: Multimodal Human Motion Capture System Using A Single Ultra-Wide Fisheye Camera. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual Event, 20–23 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 98–111. [Google Scholar]
  65. Lim, H.; Li, Y.; Dressa, M.; Hu, F.; Kim, J.H.; Zhang, R.; Zhang, C. BodyTrak: Inferring Full-Body Poses from Body Silhouettes Using a Miniature Camera on a Wristband. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2022, 6, 1–21. [Google Scholar] [CrossRef]
  66. Ahuja, K.; Shen, V.; Fang, C.M.; Riopelle, N.; Kong, A.; Harrison, C. ControllerPose: Inside-Out Body Capture with VR Controller Cameras. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 30 April–5 May 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 1–13. [Google Scholar]
  67. Joo, H.; Liu, H.; Tan, L.; Gui, L.; Nabbe, B.; Matthews, I.; Kanade, T.; Nobuhara, S.; Sheikh, Y. Panoptic Studio: A Massively Multiview System for Social Motion Capture. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 13–16 December 2015; pp. 3334–3342. [Google Scholar]
  68. Fang, B.; Sun, F.; Liu, H.; Liu, C. 3D Human Gesture Capturing and Recognition by the IMMU-Based Data Glove. Neurocomputing 2018, 277, 198–207. [Google Scholar] [CrossRef]
  69. Kang, P.; Li, J.; Fan, B.; Jiang, S.; Shull, P.B. Wrist-Worn Hand Gesture Recognition While Walking via Transfer Learning. IEEE J. Biomed. Health Inform. 2022, 26, 952–961. [Google Scholar] [CrossRef] [PubMed]
  70. Hou, J.; Li, X.-Y.; Zhu, P.; Wang, Z.; Wang, Y.; Qian, J.; Yang, P. SignSpeaker: A Real-Time, High-Precision SmartWatch-Based Sign Language Translator. In Proceedings of the 25th Annual International Conference on Mobile Computing and Networking, Baja California Sur, Mexico, 21–25 October 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–15. [Google Scholar]
  71. Zhang, Q.; Jing, J.; Wang, D.; Zhao, R. WearSign: Pushing the Limit of Sign Language Translation Using Inertial and EMG Wearables. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2022, 6, 1–27. [Google Scholar] [CrossRef]
  72. Zhang, J.; Wang, Q.; Wang, Q.; Zheng, Z. Multimodal Fusion Framework Based on Statistical Attention and Contrastive Attention for Sign Language Recognition. IEEE Trans. Mob. Comput. 2024, 23, 1431–1443. [Google Scholar] [CrossRef]
  73. Pan, T.-Y.; Tsai, W.-L.; Chang, C.-Y.; Yeh, C.-W.; Hu, M.-C. A Hierarchical Hand Gesture Recognition Framework for Sports Referee Training-Based EMG and Accelerometer Sensors. IEEE Trans. Cybern. 2022, 52, 3172–3183. [Google Scholar] [CrossRef]
  74. Monoli, C.; Fuentez-Perez, J.F.; Cau, N.; Capodaglio, P.; Galli, M.; Tuhtan, J.A. Land and Underwater Gait Analysis Using Wearable IMU. IEEE Sens. J. 2021, 21, 11192–11202. [Google Scholar] [CrossRef]
  75. Golestani, N.; Moghaddam, M. Human Activity Recognition Using Magnetic Induction-Based Motion Signals and Deep Recurrent Neural Networks. Nat. Commun. 2020, 11, 1551. [Google Scholar] [CrossRef]
  76. Huang, G.-B. What Are Extreme Learning Machines? Filling the Gap Between Frank Rosenblatt’s Dream and John von Neumann’s Puzzle. Cogn. Comput. 2015, 7, 263–278. [Google Scholar] [CrossRef]
  77. Vásconez, J.P.; Barona López, L.I.; Valdivieso Caraguay, Á.L.; Benalcázar, M.E. Hand Gesture Recognition Using EMG-IMU Signals and Deep Q-Networks. Sensors 2022, 22, 9613. [Google Scholar] [CrossRef]
  78. Liu, Y.; Jiang, F.; Gowda, M. Finger Gesture Tracking for Interactive Applications: A Pilot Study with Sign Languages. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 112. [Google Scholar] [CrossRef]
  79. Kumari, P.; Mathew, L.; Syal, P. Increasing Trend of Wearables and Multimodal Interface for Human Activity Monitoring: A Review. Biosens. Bioelectron. 2017, 90, 298–307. [Google Scholar] [CrossRef] [PubMed]
  80. Hua, A.; Quicksall, Z.; Di, C.; Motl, R.; LaCroix, A.Z.; Schatz, B.; Buchner, D.M. Accelerometer-Based Predictive Models of Fall Risk in Older Women: A Pilot Study. npj Digit. Med. 2018, 1, 1–8. [Google Scholar] [CrossRef] [PubMed]
  81. Nweke, H.F.; Teh, Y.W.; Al-garadi, M.A.; Alo, U.R. Deep Learning Algorithms for Human Activity Recognition Using Mobile and Wearable Sensor Networks: State of the Art and Research Challenges. Expert. Syst. Appl. 2018, 105, 233–261. [Google Scholar] [CrossRef]
  82. Kim, J.; Kim, M.; Lee, M.-S.; Kim, K.; Ji, S.; Kim, Y.-T.; Park, J.; Na, K.; Bae, K.-H.; Kyun Kim, H.; et al. Wearable Smart Sensor Systems Integrated on Soft Contact Lenses for Wireless Ocular Diagnostics. Nat. Commun. 2017, 8, 14997. [Google Scholar] [CrossRef]