Open Access Article
A Flexible Wearable Sensor Based on Laser-Induced Graphene for High-Precision Fine Motion Capture for Pilots
by Xiaoqing Xing, Yao Zou, Mian Zhong, Shichen Li, Hongyun Fan, Xia Lei, Juhang Yin, Jiaqing Shen, Xinyi Liu, Man Xu, Yong Jiang, Tao Tang, Yu Qian and Chao Zhou
Abstract
In recent years, research attention has shifted markedly toward laser-induced graphene (LIG), a high-performance material with considerable potential for energy storage, superhydrophobic surfaces, and electronic devices. In particular, LIG has shown great promise as a flexible sensing material for high-precision capture of human motion and posture. In this study, we investigated how the surface morphology and performance of LIG evolve with the number of laser energy accumulations. We then evaluated the performance of highly accurate, LIG-based flexible wearable sensors for capturing human motion and posture. The experimental results showed that sensors prepared with LIG formed by three laser energy accumulations exhibited exceptional flexibility and mechanical performance. They combined high sensitivity (~41.4), a low detection limit (0.05%), a rapid response (response time of ~150 ms; relaxation time of ~100 ms), and excellent response stability over 2000 s at strains of 1.0% and 8.0%. These findings demonstrate that LIG-based flexible wearable sensors hold significant potential for capturing human motion and posture, wrist pulse, and eye-blinking patterns. Moreover, the sensors can acquire various physiological signals from pilots in real time.
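For context, the dimensionless sensitivity of a resistive strain sensor is commonly reported as a gauge factor, i.e., the relative change in resistance per unit applied strain. The abstract does not define the ~41.4 figure explicitly, so the sketch below assumes it is a gauge factor; the symbols GF, \Delta R / R_{0}, and \varepsilon are introduced here only for illustration.

% Gauge factor of a resistive strain sensor (assumed interpretation of the ~41.4 value):
% relative resistance change divided by applied strain. Under that assumption, a 1.0%
% strain would correspond to a relative resistance change of roughly 41.4 x 0.010 = 0.41.
\[
  \mathrm{GF} \;=\; \frac{\Delta R / R_{0}}{\varepsilon},
  \qquad
  \frac{\Delta R}{R_{0}} \;\approx\; 41.4 \times 0.010 \;\approx\; 0.41
\]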