Article

A Hybrid Microstructure Piezoresistive Sensor with Machine Learning Approach for Gesture Recognition

Yousef Al-Handarish, Olatunji Mumini Omisore, Jing Chen, Xiuqi Cao, Toluwanimi Oluwadara Akinyemi, Yan Yan and Lei Wang

1 Research Centre for Medical Robotics and Minimally Invasive Surgical Devices, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China
2 Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen 518055, China
3 Shenzhen Institute of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen 518055, China
4 CAS Key Laboratory for Health Informatics, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(16), 7264; https://doi.org/10.3390/app11167264
Submission received: 28 June 2021 / Revised: 31 July 2021 / Accepted: 2 August 2021 / Published: 6 August 2021

Abstract

Developments in flexible electronics have adopted various approaches that have enhanced their applicability in human–machine interface fields. Recently, microstructural integration and hybrid functional materials have been designed for realizing human somatosensation. Nonetheless, designing tactile sensors with smart structures using facile and low-cost fabrication processes remains challenging. Furthermore, the use of such sensors for stimulus recognition and feedback applications remains poorly validated. In this study, a highly flexible piezoresistive tactile sensor was developed by homogeneously dispersing carbon black (CB) in a microstructured porous sugar/PDMS-based sponge. Owing to its high flexibility and softness, the sensor can be mounted on human or robotic systems for different clinical applications. We validated the applicability of the proposed sensor by applying it to recognizing grasp and release forces in an open setting and to classifying the hand motions that surgeons apply on the master interface of a robotic system during intravascular catheterization. For this purpose, we implemented a long short-term memory (LSTM)-dense classification model and four traditional machine learning methods, namely, support vector machine, multilayer perceptron, decision tree, and k-nearest neighbor. The models were used to classify the different hand gestures obtained in an open-setting experiment. Amongst all, the LSTM-dense method yielded the highest overall recognition accuracy (87.38%). Nevertheless, the performance of the other models was in a similar range, showing that our sensor structure can be applied in intelligent sensing or tactile feedback systems. Secondly, the sensor prototype was applied to analyze the motions made while manipulating an interventional robot. We analyzed the displacement and velocity of the master interface during typical axial (push/pull) and radial operations with the robot. The results show that the sensor is capable of recording unique patterns during different operations. Thus, a combination of flexible wearable sensors and machine learning could yield a future generation of flexible materials and artificial intelligence of things (AIoT) devices.

1. Introduction

The rapid development of artificial intelligence (AI) in the field of wearable electronics has received increased research attention due to promising applications in personal healthcare devices, electronic skin (e-skin), human–machine interfaces, and haptic devices [1,2,3,4]. Flexible tactile sensors are considered key candidates for achieving intelligent sensing systems; this can be attributed to their integration of higher information-processing capabilities [5]. In recent years, great efforts have been made to enhance the flexibility and sensitivity of wearable tactile sensors to meet the essential requirements of intelligent devices [6]. In terms of functionality, piezoresistivity, piezoelectricity, capacitance, and triboelectricity are the four main sensing mechanisms that have been utilized [7,8,9,10]. These sensing mechanisms mainly involve converting external forces and mechanical stimuli into electrical signals. Among them, piezoresistive-type sensors are favored owing to their cost-effectiveness, simple design and structure, facile fabrication, and simple read-out signals; thus, they are widely adopted. In addition, the flexibility, stability, and sensitivity of piezoresistive sensors are highly desirable compared with sensors using other sensing mechanisms.
Conventional designs of high-performance flexible piezoresistive sensors are usually based on polymer matrices (such as elastomers like Ecoflex rubber and polydimethylsiloxane (PDMS) [11,12]) and conductive filler networks including graphene, nanowires, nanofibers, carbon black, and other carbon nanomaterials [13,14,15,16,17]. In addition, a variety of microstructures, such as pyramids [18,19], nanofibers, and two-dimensional electrodes, are widely used for developing flexible piezoresistive sensors, which have shown high sensitivity and reliable linear responses. Despite the enormous research output describing sensitivity and stretchability, sensors built on these strategies still exhibit several limitations related to the detectable pressure range, fabrication process, response time, and biocompatibility. Moreover, in some practical applications such as bionic robots [20,21,22], cost-effective production, data processing, and smartness are also major concerns. Hence, new fabrication techniques with intelligent data processing methods are required to develop low-cost smart sensory systems [23,24].
Advances in AI techniques have leveraged the functionalities of flexible tactile sensors [25]. For instance, machine learning algorithms enable ways of analyzing the modalities of a desired output pattern, from touching or grasping objects to gesture recognition, in developing human–machine interfaces [26,27]. Furthermore, the intelligent integration of low-cost flexible materials with stimulus realization and quantification can help tactile piezoresistive devices achieve performance that follows a working principle similar to the human somatosensory system [28]. Recently, various approaches have been reported on the basis of combining AI with low-cost materials at small scales. For instance, Liu et al. [29] developed a smart e-skin based on a capacitive sensing array, with the data processed by a long short-term memory (LSTM) neural network. Furthermore, Sohn et al. [30] reported a smart sensory e-skin employing a multiwalled carbon nanotube/polydimethylsiloxane (MWCNT/PDMS) composite film and a deep neural network (DNN) to realize applied pressure levels, as well as obtain position data, by processing changes in the electrical resistance. Therefore, developing fully or semi-automated flexible sensory systems based on machine learning has become a research hotspot in recent years [31].
In this study, an intelligent recognition system was developed by combining a facile, cost-effective piezoresistive tactile sensor with a machine learning model for human–machine interfaces. A microporous PDMS-based sponge was fabricated using cost-effective and commercially available sugar cubes. Retrospective studies were analyzed to obtain object manipulation data, and the data recorded by several sensor units were used to train machine learning algorithms to recognize the different tactile signals produced by hand gestures. The remainder of this paper is structured as follows: Section 2 presents the sensor fabrication process and data measurement procedures; Section 3 includes details of the experimental setup and an analysis of the results; Section 4 presents the discussion and conclusions.

2. System Design and Method

2.1. Sensor Manufacturing

The sensor fabrication and design processes followed the route proposed in our previous study [32]. The sensor characterization results based on its chemical, physical, mechanical, and electrical properties revealed that the sensor model is strongly dependent on the material preparation and fabrication process conditions. However, to turn the sponge from highly resistive to highly conductive, only carbon black was used here as the conductive filler instead of a hybrid synergistic network. The microporous sponge skeleton was fabricated by immersing a commercially available sugar cube (15 × 15 × 10) in a PDMS solution prepared by mixing the prepolymer with its curing agent in a weight ratio of 10:1. Then, the mixture was kept on a heating platform for 2 h at 80 °C to solidify the prepolymer solution around and inside the cube's structure. Next, the solid cubes were cut into slices with a thickness of 400 μm and soaked in a glass container full of hot water. Subsequently, the container was placed in an oven at 60 °C for 1 h to dissolve the sugar particles out of the inner porous sponge. The resulting flexible skeleton, with its inner 3D porous structure, was then treated with plasma. At the same time, a conductive CB–alcohol solution was prepared by homogeneously dispersing CB powder in an alcohol solvent at a mass ratio of 1:9. The conductive solution was dispersed into the PDMS sponge by ultrasonication (45 min at a 30% rate) to firmly anchor the conductive fillers onto the inner pore walls of the sponge. Finally, the conductive sponge was dried in the oven to evaporate the alcohol, yielding the tactile sensor.

2.2. Design and Measurements

The fabricated sensor was sandwiched between wire electrodes and fixed with a polyimide tape film, which served as a soft surface. The piezoresistive tactile sensors were first characterized by demonstrating that they could record data from human fingers making different hand gestures. Hence, to obtain stable signals, three sensors, made using the above-presented process, were fixed to the thumb, index, and middle fingers of five subjects. The piezoresistive tactile sensor obtained is shown in Figure 1a. The morphological characterization in Figure 1b shows the unique porous microstructure filled with conductive nanoparticles. We evaluated the effect of deformation on the sensor performance by subjecting the central area of the sensor to different loadings on an electrochemical workstation connected to a Mark-10 force gauge and a multimeter converter. The pressure loading was increased gradually from 10 to 100 kPa at intervals of 10 kPa, i.e., a range of values that fits both human physiological signals and robotics applications [33].
The piezoresistive sensor response values were recorded with real-time sensor feedback using computer software, namely, KBD-24. Additionally, the testing conditions were repeatedly applied to 10 different sensor samples to obtain data from different units and properly analyze the stability and accuracy of the sensor prototype. Figure 2a shows the gradual response obtained during each iteration, where it can be seen that the sensor produced measurable output even at tiny pressure values. Notably, the response curve was very close to linear when loading values higher than 70 kPa were applied, as shown in Figure 2b. This demonstrates the relationship between the applied deformation and the response outputs. Consistent with the piezoresistive effect, the electrical resistance decreased as the pressure loading increased.
We measured the performance using the sensitivity (S) metric defined in Equation (1), where ΔR is the difference between the resistance R and the initial resistance R0, and ΔP is the difference between the applied pressure P and the initial pressure P0. The statistical results reveal three linear sensitivity regimes for the force sensor: S1 = 0.0311 kPa⁻¹ in the 10–30 kPa loading range, S2 = 0.038 kPa⁻¹ in the 30–70 kPa range, and S3 = 0.075 kPa⁻¹ in the 70–100 kPa range.
S = Δ R ( R 0 × Δ P ) .
Additionally, we investigated the sensor's response to normal and shear forces. A normal force of 15.75 N was applied for 300 s. As shown in Figure 2c, the piezoresistive effect decreased the electrical resistance under the normal force due to the continuous deformation of the sensor, which decreased the diameter of the inner pores and increased the contact between the conductive nanoparticles located on the pore walls. Conversely, a shear force was applied to the top surface parallel to the XY plane without applying any normal force. The sensor showed an obvious resistance variation in response to any changes at the surface, as shown in Figure 2d, owing to the unique elastic porous structure, which enabled a fast interaction between the conductive fillers located on the pore walls.
Lastly, to further characterize the electrical properties and explain the voltage distribution, the sensor was powered through a constant resistor, and a loading force of 15.75 N was applied at different frequencies between 1 kHz and 10 kHz. The voltage outputs produced by the sensor under the loading forces were recorded as a function of time, as shown in Figure 3a,b. The results indicate superelastic behavior characterized by a fast response of the sensor at a frequency of 1 kHz, while the output peaks decreased as the frequency increased (up to 10 kHz). In other words, the amplitudes produced by the sensor declined as the frequency values increased. This shows that the tactile sensor is capable of a rapid response and can track loading/unloading forces at frequencies within the range that human skin can sense [34]. These results demonstrate the sensor's potential for a wide range of application areas with different requirements.

2.3. Gesture Recognition Procedure

One of the key functions of smart sensory gesture systems is their ability to respond instantaneously to external stimuli, as well as to accurately identify force and position information during somatosensory stimulation [35]. Three tactile sensors fabricated and characterized as described above were used for training a machine learning model. Based on the piezoresistive outputs, the system was first used for touch force recognition and later deployed within a robotic system for kinesthetic analysis.
In both cases, the sensors were assembled and fixed with adhesive tape on the thumb, index, and middle fingers of the right hand. Then, open-setting experiments were conducted in which five volunteers were directed to perform typical hand motions used during intravascular catheterization. The outputs from the tactile sensors were collected simultaneously to monitor the original gesture signals by converting hand force motions into resistance variation values. During the process, an Arduino Uno R3 (ATmega328P-PU) was used as the data recording platform. The corresponding resistance changes were transferred to the host computer via USB and used as the input data; a host-side acquisition sketch is given below. To ensure high recognition accuracy of hand movements, a large dataset was acquired from the different sensors over multiple sessions, whereby the five volunteers were guided to repeatedly perform a single hand motion. In each trial, a minimum of 11,500 data points were acquired for each movement, resulting in a total of 45,000 data points from all experiments. We further argue that the new sensor structure has stable characteristics in terms of data acquisition; that is, each sensor unit produces statistical data that are learnable by both traditional machine learning and deep learning approaches. For this purpose, we developed a long short-term memory (LSTM)-dense classification model and used it to classify the different hand gestures obtained in the open-setting experiment. The LSTM-dense network was implemented in Python using the Keras/TensorFlow framework on a desktop with a 2.2 GHz Intel i7 processor and an NVIDIA graphics processing unit. Recurrent networks have relatively limited performance on some time-series datasets due to the vanishing gradient problem; LSTM serves as an alternative since it regulates information flow through input, forget, and output gates together with a memory cell. Furthermore, dense layers were appended to the LSTM-based network to classify four different types of hand gestures. The flow of the gesture recognition process is illustrated in Figure 4.
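For concreteness, the following sketch shows one possible host-side routine for logging the streamed resistance values; the serial port name, baud rate, and the comma-separated "thumb,index,middle" line format are assumptions for illustration and are not specified in the paper.

```python
import serial  # pyserial

SAMPLES_PER_MOTION = 11500  # per-movement sample count reported in the text

# Read comma-separated sensor lines streamed by the Arduino over USB (port/baud assumed).
with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    samples = []
    while len(samples) < SAMPLES_PER_MOTION:
        line = port.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            samples.append([float(v) for v in line.split(",")])  # [thumb, index, middle]
        except ValueError:
            continue  # skip malformed or partial lines
```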
The LSTM layers integrated into the network were separated by an attention unit, while two consecutive dense layers were appended for the actual motion classification based on the signals received from the tactile sensors. The deep network consisted of three main parts with different functional layers: the first consisted mainly of the input and preprocessing layers; the hidden module included a variety of LSTM, attention, and dense layers; and the last was a single-layered module reporting the classification output. During training, the network parameters were configured arbitrarily, and the feature learning strategy of the network was based on distinct training (60%), validation (20%), and test (20%) datasets. Training used a dynamic learning rate initialized at 10⁻⁴, an adaptive moment (Adam) optimizer with a decay value of 10⁻⁶, and a recurrent dropout of 10% added for regularization.
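A minimal Keras sketch of such an LSTM-dense classifier is shown below, assuming fixed-length windows of readings from the three finger-mounted sensors; the window length, layer widths, and the omission of the attention unit are simplifications for illustration rather than the authors' exact architecture.

```python
from tensorflow.keras import layers, models, optimizers

TIMESTEPS = 50   # assumed window length (samples per gesture segment)
N_SENSORS = 3    # thumb, index, and middle finger channels
N_CLASSES = 4    # push, pull, rotate clockwise, rotate counterclockwise

model = models.Sequential([
    # Stacked LSTM layers with 10% recurrent dropout, as used for regularization.
    layers.LSTM(64, return_sequences=True, recurrent_dropout=0.1,
                input_shape=(TIMESTEPS, N_SENSORS)),
    layers.LSTM(32, recurrent_dropout=0.1),
    # Dense layers appended for the actual motion classification.
    layers.Dense(32, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),
])

# Adam optimizer with the paper's initial learning rate of 1e-4; the 1e-6 decay
# would be configured through a learning-rate schedule in practice.
model.compile(optimizer=optimizers.Adam(learning_rate=1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Training for 1000 epochs with a batch size of 5, as reported below, would then correspond to a model.fit call on the 60% training split with the 20% validation split passed as validation data.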
The performance of the final model was obtained after 1000 training epochs with a batch size of 5. Network performance varied across runs, but only the best test accuracies obtained are reported in Figure 5. The prediction performance of the learning-based network was analyzed using the 20% test dataset, as shown in Figure 5. The confusion matrix shows that the misclassifications made by the learning-based network were mostly intra-axial and intra-radial. For instance, while 81.21% of the 2960 push data samples were correctly classified, 7.25% were misclassified as pull motions, whereas none were misclassified as either of the radial motions. A similar result was obtained for the two radial motions.
In contrast, cross-category misclassifications were observed only at very low percentages; for example, 0.08% of the pull motions were predicted to be rotate-clockwise samples, and 0.09% of the rotate-clockwise motions were recognized as pull samples by the learning-based network. Overall, the deep learning algorithm exhibited an acceptably high test accuracy of 87.38%. Furthermore, the usability of data from the piezoresistive tactile sensor with existing classification methods was investigated to assess whether reliable hand motion recognition with high accuracy could be obtained. We compared the learning-based network with four commonly used data classification methods, namely, support vector machine (SVM), decision tree (DT), multilayer perceptron (MLP), and k-nearest neighbor (KNN); a sketch of this baseline comparison is given below. The recognition accuracies obtained using each method for the four hand motions are listed in Table 1. It can be seen from the table that the deep learning algorithm demonstrated values in the same range as the conventional learning algorithms, while the learning-based network introduced in this study showed the highest recognition rate. It should be noted that proper tuning of the parameters could further improve the prediction performance of the models on the hand motion data.
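The classical baselines could be evaluated on the same windowed data with scikit-learn, as in the minimal sketch below; the flattened-window feature representation and the synthetic placeholder data are assumptions, since the paper does not describe the exact feature preparation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data standing in for flattened sensor windows (50 samples x 3 sensors)
# and their motion labels (0 = push, 1 = pull, 2 = rotate CW, 3 = rotate CCW).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 150))
y = rng.integers(0, 4, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

baselines = {
    "SVM": SVC(kernel="rbf"),
    "DT": DecisionTreeClassifier(max_depth=10),
    "MLP": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in baselines.items():
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy = {clf.score(X_test, y_test):.4f}")
```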
In summary, the contacting and releasing actions of the piezoresistive tactile sensor against an object can instantaneously convert finger tactile stimuli into notable real-time electrical signals, which can be used to train machine learning and deep learning networks for prediction tasks. These outputs vary according to the loading/holding of the contacting fingers and the release of the fingers. We demonstrated how our sensory system distinguishes four complex tasks, specifically, pushing, pulling, left rotation, and right rotation.

3. Application in Interventional Surgical Robot Training

Next, the sensor prototype was further validated by applying it to analyze the hand motions made while manipulating the master interface of an interventional robot. The robotic system, shown in Figure 6a, was used to navigate tools such as a guidewire during intravascular interventions.

3.1. Master Interface

The robotic system included a suite of master and slave devices that served as the interface between the surgeons and the tools. The master interface was a portable 2-DoF robotic device that can be used to issue translation (push/pull) and rotation actions during interventions. It is located outside the operating room; hence, the interventionist can sit outside the radiation area to issue control commands. The device has a knob with a magnetostrictive electromagnetic bar that provides hand motion control data. Meanwhile, the usability of the tactile sensor during intravascular interventions was validated by fixing the sensors on the thumb and index fingers to obtain tactile information as surgeons manipulate the knob. The master device also has a display unit from the control station, which is used to guide the bedside (i.e., slave) robotic device during intravascular cardiac interventions. As reported in our previous study [36], data from the potentiometer sensor bar are transmitted to the slave robotic device, which is positioned close to the patient in the operating room.

3.2. Sensing Performance

Piezoresistive tactile sensors are mainly designed to mimic the sense of touch in humans; thus, they can enhance control and feedback studies. In this evaluation study, the developed sensor was used for tracking several hand motions during robot-assisted procedures, i.e., pushing, pulling, clockwise rotation, and anticlockwise rotation, which were analyzed using machine learning in the previous section. Usually, surgeons are aware of the motions (push, pull, and rotate) they make at any time, while they anticipate the sensory touch (stimuli) only through experience with haptic feedback. These cues are transmitted to the brain and interpreted to decide the next hand motion needed during an intervention. Thus, it has become vital to design systems that can predict hand movements for robot intelligence and surgical safety purposes.
To mimic the surgeon's capability of determining the hand motion executed at each time, an open-setting experiment was carried out using two units of the piezoresistive tactile sensor mounted on a volunteer's thumb and index fingers to assess sensor feedback. The sensors were fixed on the volunteer's fingers while manipulating the robotic control interface. The electrical resistance showed the sensor's ability to respond swiftly as the robot's control interface was moved rapidly along the different axes. In this scenario, the control knob of the robot hand was glided arbitrarily to different positions for several seconds. During all procedures, the sensors were held in a fixed position on a wearable glove; thus, the applied pressure could be correctly recorded from the sensor's surface. One key feature of the piezoresistive tactile sensor is the viscoelastic behavior of the elastomer structure and the conductive filler network. During the process, the sensors were in contact with the robot arm for periods of 0.45 s and then released for 0.15 s. The results (Figure 6b,c) show that the sensors exhibited stable behavior. The apparent resistance signals showed a fast response to the hand's touch, while stability was maintained throughout the testing time of the axial and radial processes. Furthermore, we noticed that the outputs from the two sensors matched one another closely. This result highlights the potential of the piezoresistive tactile sensor for force sensing in surgical robots, smart gloves, and e-skin.

3.3. Motion Analysis

The piezoresistive sensor was further demonstrated by analyzing the displacement and velocity of the operator's hand while carrying out an interventional procedure. As the basic operations are, in the broadest sense, translational and rotational navigation, the displacement was calculated as the linear or angular distance traveled for each hand movement, while the velocity was obtained as the first derivative of the displacement with respect to time, as sketched below. Motion analysis was carried out with respect to the relative resistance change during the push and rotate operations (Figure 7), with the displacement and velocity obtained at the robot's control knob plotted for visualization.
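A minimal sketch of this derivative computation is shown below; the sampling rate and the placeholder displacement trace are assumptions for illustration, not recorded data.

```python
import numpy as np

# Assumed 100 Hz sampling of the knob displacement over a 10 s push/pull segment.
t = np.linspace(0.0, 10.0, 1001)               # time stamps, s
displacement = 20.0 * np.sin(0.2 * np.pi * t)  # placeholder linear displacement, mm

# Velocity as the first derivative of displacement with respect to time.
velocity = np.gradient(displacement, t)        # mm/s

print(f"peak speed = {np.max(np.abs(velocity)):.2f} mm/s")
```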
The position and velocity of the robot's master control for the axial and radial operations are given in Figure 7a,b and Figure 7c,d, respectively. Both the position and the velocity signals were recorded alongside the relative resistance change during the push and rotate hand motions. Some inferences can be made. For instance, integrating the sensing components with the robot did not cause motion damping or an unsmooth trajectory when handling the robot. Thus, it can be concluded that the sensor placement did not interfere with the procedure while operating the interventional robot across the range of hand motions used for tool manipulation during intravascular catheterization procedures.

4. Conclusions

This work reports the design of a low-cost, easily fabricated piezoresistive tactile sensor that is capable of imitating the human sense of touch. The resistance changes of our novel microstructural design were presented, and the results reveal how forces applied to the sensor are transformed into a measurable signal that can support pattern recognition. The tactile sensor was utilized in a smart glove application during interventional surgery, with an acquisition unit collecting the real-time resistance variations. The acquired data were processed using machine learning models to analyze the sensed signals and recognize different types of hand gestures. The piezoresistive tactile sensors were able to recognize hand gestures with an accuracy of about 87.38% and proved applicable to analyzing hand motions while operating a robot. In general, this work paves the way for the combination of flexible materials with intelligent robots, as well as highlights the possibilities of realizing advanced functions in human–machine interfaces.

Author Contributions

Conceptualization, Y.A.-H. and O.M.O.; methodology, Y.A.-H.; software, Y.A.-H. and O.M.O.; investigation, J.C. and T.O.A.; data curation, Y.A.-H. and Y.Y.; visualization, Y.A.-H. and X.C.; writing-original draft preparation, Y.A.-H.; writing-review and editing, Y.A.-H., O.M.O., and L.W.; supervision, Y.A.-H. and L.W.; project administration, Y.Y. and L.W.; funding acquisition, L.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Key Research and Development program of China (#2019YFB1311700), the National Natural Science Foundation of China (#U1505251, #U1713219, #61950410618), the Shenzhen Natural Science Foundation of China (#JCYJ20190812173205538, #JSGG2019111816401741), and the CAS President’s Postdoctoral Fellowship.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Amjadi, M.; Pichitpajongkit, A.; Lee, S.; Ryu, S.; Park, I. Highly Stretchable and Sensitive Strain Sensor Based on Silver Nanowire–Elastomer Nanocomposite. ACS Nano 2014, 8, 5154–5163. [Google Scholar] [CrossRef] [PubMed]
  2. Ho, D.H.; Sun, Q.; Kim, S.Y.; Han, J.T.; Kim, D.H.; Cho, J.H. Stretchable and Multimodal All Graphene Electronic Skin. Adv. Mater. 2016, 28, 2601–2608. [Google Scholar] [CrossRef] [PubMed]
  3. Al-Handarish, Y.; Omisore, O.M.; Igbe, T.; Han, S.; Li, H.; Du, W.; Zhang, J.; Wang, L. A Survey of Tactile-Sensing Systems and Their Applications in Biomedical Engineering. Adv. Mater. Sci. Eng. 2020, 2020, 1–17. [Google Scholar] [CrossRef] [Green Version]
  4. Schiavullo, R. Antilatency positional tracking brings 6 degrees of freedom to standalone vr headsets. Virtual Real. 2018, 15, 10. [Google Scholar]
  5. Chang, J.; Dommer, M.; Chang, C.; Lin, L. Piezoelectric nanofibers for energy scavenging applications. Nano Energy 2012, 1, 356–371. [Google Scholar] [CrossRef]
  6. Atalay, O.; Atalay, A.; Gafford, J.; Walsh, C. A Highly Sensitive Capacitive-Based Soft Pressure Sensor Based on a Conductive Fabric and a Microporous Dielectric Layer. Adv. Mater. Technol. 2018, 3, 1700237. [Google Scholar] [CrossRef]
  7. Zhu, G.; Yang, W.; Zhang, T.; Jing, Q.; Chen, J.; Zhou, Y.; Bai, P.; Wang, Z.L. Self-Powered, Ultrasensitive, Flexible Tactile Sensors Based on Contact Electrification. Nano Lett. 2014, 14, 3208–3213. [Google Scholar] [CrossRef]
  8. Dong, K.; Wu, Z.; Deng, J.; Wang, A.; Zou, H.; Chen, C.; Hu, D.; Gu, B.; Sun, B.; Wang, Z.L. A Stretchable Yarn Embedded Triboelectric Nanogenerator as Electronic Skin for Biomechanical Energy Harvesting and Multifunctional Pressure Sensing. Adv. Mater. 2018, 30, e1804944. [Google Scholar] [CrossRef]
  9. Sun, Q.; Seung, W.; Kim, B.J.; Seo, S.; Kim, S.-W.; Cho, J.H. Active Matrix Electronic Skin Strain Sensor Based on Piezopotential-Powered Graphene Transistors. Adv. Mater. 2015, 27, 3411–3417. [Google Scholar] [CrossRef]
  10. Lee, K.Y.; Yoon, H.; Jiang, T.; Wen, X.; Seung, W.; Kim, S.-W.; Wang, Z.L. Fully Packaged Self-Powered Triboelectric Pressure Sensor Using Hemispheres-Array. Adv. Energy Mater. 2016, 6, 1502566. [Google Scholar] [CrossRef]
  11. Amjadi, M.; Yoon, Y.J.; Park, I. Ultra-stretchable and skin-mountable strain sensors using carbon nanotubes–Ecoflex nanocomposites. Nanotechnology 2015, 26, 375501. [Google Scholar] [CrossRef] [PubMed]
  12. Zheng, Q.; Liu, X.; Xu, H.; Cheung, M.-S.; Choi, Y.-W.; Huang, H.-C.; Lei, H.-Y.; Shen, X.; Wang, Z.; Wu, Y.; et al. Sliced graphene foam films for dual-functional wearable strain sensors and switches. Nanoscale Horiz. 2017, 3, 35–44. [Google Scholar] [CrossRef] [PubMed]
  13. Liu, X.; Tang, C.; Du, X.; Xiong, S.; Xi, S.; Liu, Y.; Shen, X.; Zheng, Q.; Wang, Z.; Wu, Y.; et al. A highly sensitive graphene woven fabric strain sensor for wearable wireless musical instruments. Mater. Horiz. 2017, 4, 477–486. [Google Scholar] [CrossRef]
  14. Pan, L.; Chortos, A.; Yu, G.; Wang, Y.; Isaacson, S.; Allen, R.; Shi, Y.; Dauskardt, R.; Bao, Z. An ultra-sensitive resistive pressure sensor based on hollow-sphere microstructure induced elasticity in conducting polymer film. Nat. Commun. 2014, 5, 3002. [Google Scholar] [CrossRef] [Green Version]
  15. Pang, C.; Lee, G.-Y.; Kim, T.-I.; Kim, S.M.; Kim, H.N.; Ahn, S.-H.; Suh, K.-Y. A flexible and highly sensitive strain-gauge sensor using reversible interlocking of nanofibres. Nat. Mater. 2012, 11, 795–801. [Google Scholar] [CrossRef] [PubMed]
  16. Wu, X.; Han, Y.; Zhang, X.; Zhou, Z.; Lu, C. Large-Area Compliant, Low-Cost, and Versatile Pressure-Sensing Platform Based on Microcrack-Designed Carbon Black@Polyurethane Sponge for Human-Machine Interfacing. Adv. Funct. Mater. 2016, 26, 6246–6256. [Google Scholar] [CrossRef]
  17. Wu, W.; Wen, X.; Wang, Z.L. Taxel-Addressable Matrix of Vertical-Nanowire Piezotronic Transistors for Active and Adaptive Tactile Imaging. Science 2013, 340, 952–957. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Gong, S.; Schwalb, W.; Wang, Y.; Chen, Y.; Tang, Y.; Si, J.; Shirinzadeh, B.; Cheng, W. A wearable and highly sensitive pressure sensor with ultrathin gold nanowires. Nat. Commun. 2014, 5, 3132. [Google Scholar] [CrossRef] [Green Version]
  19. Guo, H.; Tan, Y.J.; Chen, G.; Wang, Z.; Susanto, G.J.; See, H.H.; Yang, Z.; Lim, Z.W.; Yang, L.; Tee, B.C.K. Artificially innervated self-healing foams as synthetic piezo-impedance sensor skins. Nat. Commun. 2020, 11, 1–10. [Google Scholar] [CrossRef]
  20. Cheng, Y.; Wang, R.; Zhai, H.; Sun, J. Stretchable electronic skin based on silver nanowire composite fiber electrodes for sensing pressure, proximity, and multidirectional strain. Nanoscale 2017, 9, 3834–3842. [Google Scholar] [CrossRef] [PubMed]
  21. Yogeswaran, N.; Dang, W.; Navaraj, W.; Shakthivel, D.; Khan, S.; Polat, E.; Gupta, S.; Heidari, H.; Kaboli, M.; Lorenzelli, L.; et al. New materials and advances in making electronic skin for interactive robots. Adv. Robot. 2015, 29, 1359–1373. [Google Scholar] [CrossRef] [Green Version]
  22. Alsamhi, S.H.; Ma, O.; Ansari, M.S. Survey on artificial intelligence-based techniques for emerging robotic communication. Telecommun. Syst. 2019, 72, 483–503. [Google Scholar] [CrossRef]
  23. Gil, B.; Li, B.; Gao, A.; Yang, G.-Z. Miniaturized Piezo Force Sensor for a Medical Catheter and Implantable Device. ACS Appl. Electron. Mater. 2020, 2, 2669–2677. [Google Scholar] [CrossRef] [PubMed]
  24. Alsamhi, S.H.; Ma, O.; Ansari, M.S. Convergence of Machine Learning and Robotics Communication in Collaborative Assembly: Mobility, Connectivity and Future Perspectives. J. Intell. Robot. Syst. 2019, 98, 541–566. [Google Scholar] [CrossRef]
  25. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
  26. Kim, S.; Zhang, X.; Daugherty, R.; Lee, E.; Kunnen, G.; Allee, D.R.; Forsythe, E.; Chae, J. Microelectromechanical Systems (MEMS) Based-Ultrasonic Electrostatic Actuators on a Flexible Substrate. IEEE Electron Device Lett. 2012, 33, 1072–1074. [Google Scholar] [CrossRef]
  27. Molchanov, P.; Gupta, S.; Kim, K.; Kautz, J. Hand gesture recognition with 3D convolutional neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Boston, MA, USA, 7–12 June 2015; pp. 1–7. [Google Scholar]
  28. Chortos, A.; Liu, J.; Bao, J.L.Z. Pursuing prosthetic electronic skin. Nat. Mater. 2016, 15, 937–950. [Google Scholar] [CrossRef] [PubMed]
  29. Liu, G.Y.; Kong, D.Y.; Hu, S.G.; Yu, Q.; Liu, Z.; Chen, T.; Yin, Y.; Hosaka, S.; Liu, Y. Smart electronic skin having gesture recognition function by LSTM neural network. Appl. Phys. Lett. 2018, 113, 084102. [Google Scholar] [CrossRef]
  30. Sohn, K.-S.; Chung, J.; Cho, M.-Y.; Timilsina, S.; Park, W.B.; Pyo, M.; Shin, N.; Sohn, K.; Kim, J.S. An extremely simple macroscale electronic skin realized by deep machine learning. Sci. Rep. 2017, 7, 11061. [Google Scholar] [CrossRef] [PubMed]
  31. Yang, G.-Z.; Bellingham, J.; Dupont, P.E.; Fischer, P.; Floridi, L.; Full, R.; Jacobstein, N.; Kumar, V.; McNutt, M.; Merrifield, R.; et al. The grand challenges of Science Robotics. Sci. Robot. 2018, 3, eaar7650. [Google Scholar] [CrossRef] [PubMed]
  32. Al-Handarish, Y.; Omisore, O.M.; Duan, W.; Chen, J.; Zebang, L.; Akinyemi, T.; Du, W.; Li, H.; Wang, L. Facile Fabrication of 3D Porous Sponges Coated with Synergistic Carbon Black/Multiwalled Carbon Nanotubes for Tactile Sensing Applications. Nanomaterials 2020, 10, 1941. [Google Scholar] [CrossRef] [PubMed]
  33. Xiao, X.; Yuan, L.; Zhong, J.; Ding, T.; Liu, Y.; Cai, Z.; Rong, Y.; Han, H.; Zhou, J.; Wang, Z.L. High-Strain Sensors Based on ZnO Nanowire/Polystyrene Hybridized Flexible Films. Adv. Mater. 2011, 23, 5440–5444. [Google Scholar] [CrossRef] [PubMed]
  34. Johansson, R.S.; Flanagan, J.R. Coding and use of tactile signals from the fingertips in object manipulation tasks. Nat. Rev. Neurosci. 2009, 10, 345–359. [Google Scholar] [CrossRef] [PubMed]
  35. Arakeri, T.J.; Hasse, B.A.; Fuglevand, A.J. Object discrimination using electrotactile feedback. J. Neural Eng. 2018, 15, 046007. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Omisore, O.; Duan, W.; Du, W.; Li, W.; Zheng, Y.; Al-Handarish, Y.; Akinyemi, T.; Liu, Y.; Xiong, J.; Wang, L. Automatic tool segmentation and tracking during robotic intravascular catheterization for cardiac interventions. Quant. Imaging Med. Surg. 2021, 11, 2688–2710. [Google Scholar] [CrossRef]
Figure 1. (a) Optical image of the piezoresistive sponge before and after coating with the conductive network; (b) scanning electron microscope (SEM) images of the tactile sensor.
Figure 2. Resistance variations in response to the tiny stimulus pressures (a) and the large loading range from 10 to 100 kPa; (b–d) resistance response of the piezoresistive tactile sensor to an applied normal force of 15.75 N and to a shear force.
Figure 3. Rapid response of the output voltage of the piezoresistive tactile sensor under different loading pressures at frequencies of (a) 1 kHz and (b) 10 kHz.
Figure 4. Learning-based network for hand motion classification.
Figure 5. Prediction performance of the deep learning network with four typical hand motions used during intravascular catheterization.
Figure 6. Application of the piezoresistive tactile sensor on intravascular intervention robot; (a) positioning of two tactile sensors on index and thumb fingers; (b,c) relative resistance change during push and rotate operations, respectively.
Figure 7. Robotic response to the application of the 3D porous tactile sensor: (a,b) position and velocity changes during the pushing process; (c,d) rotation and angular velocity changes during the rotation process.
Table 1. Recognition rate of different classification methods (recognition accuracy on test data, %).

Classification Method    PUSH     PULL     RTCL     RTCC     Aggregate
LSTM + Dense             81.21    88.55    89.94    93.97    87.38
SVM                      78.66    83.81    89.21    91.13    84.82
MLP                      82.78    83.81    90.19    94.23    87.37
DT                       84.04    75.41    87.18    89.88    84.52
KNN                      85.33    80.70    90.58    92.08    87.31
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
