
Real-Time Identification of Knee Joint Walking Gait as Preliminary Signal for Developing Lower Limb Exoskeleton

by Susanto Susanto 1, Ipensius Tua Simorangkir 1, Riska Analia 1, Daniel Sutopo Pamungkas 1,*, Hendawan Soebhakti 1, Abdullah Sani 1 and Wahyu Caesarendra 2,*

1 Department of Electrical Engineering, Politeknik Negeri Batam, Kepulauan Riau 29461, Indonesia
2 Faculty of Integrated Technologies, Universiti Brunei Darussalam, Jalan Tungku Link, Gadong BE1410, Brunei
* Authors to whom correspondence should be addressed.
Electronics 2021, 10(17), 2117; https://doi.org/10.3390/electronics10172117
Submission received: 27 July 2021 / Revised: 27 August 2021 / Accepted: 28 August 2021 / Published: 31 August 2021
(This article belongs to the Section Systems & Control Engineering)

Abstract

An exoskeleton is a device used for walking rehabilitation. To develop a proper rehabilitation exoskeleton, the user's walking intention needs to be captured as the initial step of the work; moreover, every human has a unique walking gait. This work introduces a wearable sensor aimed at recognizing the walking gait phase as a fundamental step before applying it to a rehabilitation exoskeleton. An IMU sensor was used to measure the pitch angle generated by the knee joint while the user walks, providing information about the walking gait cycle. A neural network was proposed as the method to identify the walking gait cycle at the knee joint. To verify the performance of the proposed method, experiments were carried out in a real-time application under different conditions: walking on a flat floor, climbing stairs, and walking down stairs. Five subjects were trained and tested with the system. The experiments showed that the proposed method was able to recognize each gait cycle for all users while they wore the sensor on their knee joints. This study has the potential to be applied to an exoskeleton rehabilitation robot in further research.

1. Introduction

Walking is a very important activity for humans in carrying out their daily lives. Without the ability to walk, humans find it difficult to carry out their activities; this applies to those who have had a stroke, those recovering from a stroke, the elderly who have lost the muscle power to walk, and those suffering from spinal cord injuries. Thanks to the rapid development of medical device technologies, walking difficulties are increasingly being tackled with the help of technology. One solution is an exoskeleton robot intended to help humans walk. Such robots can be classified into two types, namely lower limb and upper limb, which have been reviewed in [1,2,3,4]. The lower limb exoskeleton is used not only for medical purposes, such as assisting human walking [5] and training or rehabilitation after stroke, post-stroke conditions, and spinal cord injury [6,7], but also for military purposes, lifting heavy objects [8], and gaining the ability to walk long distances [9]. However, in developing a lower limb exoskeleton, the most challenging part is to understand the human walking intention: whether the system can predict a movement or only react to an existing one.
To understand the human walking intention, several methods have been proposed by researchers. Castagneri et al. [10] introduced Statistical Gait Analysis (SGA) to process muscle cycle activation patterns extracted from functional walking using an EMG sensor. Meng et al. [11] used EMG sensors and a hidden Markov model (HMM) to recognize the gait phases. In another study, the gait cycle was recognized using sensors placed on the plantar surface of the feet: Pawin et al. [12] calculated the locus of the Zero Moment Point (ZMP) of the walking gait cycle estimated with Force Sensitive Resistors (FSRs), and implemented a neural network to classify the normal gait cycle. Kim et al. [13] proposed a simple gait segmentation from two sensorized insoles and developed eight flexible capacitive custom-made sensors to calculate the gait parameters and segment the gait cycle phases and subphases. Foot plantar sensing for gait recognition was also proposed in [14,15] with different approaches. Moreover, Luu et al. [16] used an EEG sensor to recognize the walking condition, adding independent component analysis and k-means clustering. In addition, Villarreal et al. [17] derived the gait phase pattern from the parameters of a mechanical variable.
The gait cycle can also be analyzed from gait flow images thanks to current technological developments. As proposed in [18], Kinect depth cameras were used for human gait recognition by modeling the body parts of a human to identify a walking person. Gait images can also be analyzed with different approaches, such as the Lucas-Kanade approach [19], extracting features from the human body [20,21], generating 3D dynamics using the active triangulation principle [22], using the Kohonen Self-Organizing Mapping (KSOM) neural network [23], and deriving a full model of a 7-link system, then applying the Denavit-Hartenberg method for the kinematic analysis and the Euler-Lagrange equation for the dynamic analysis [24].
Nevertheless, using a camera or gait images to recognize the human walking gait cycle may be complicated due to the need to set up cameras to collect data. When using the gait cycle as the preliminary signal for activating the exoskeleton robot, the signal needs to be identified while the suit is worn on the human body. Therefore, the use of a wearable sensor is necessary for this work. Tiny wearable sensors, namely an accelerometer, a gyroscope, or a combination of the two (commonly called an IMU sensor), can be used. As shown in [25], a single accelerometer was used for Parkinson's disease (PD) patients by fitting a model equation to the relationships of the subject's walking behavior, whereas Hu et al. [26] used a tri-axial accelerometer on the waist to collect the body acceleration and derive the kinematic equations of a human-walking model. Mota et al. [27] placed the sensor on the shank and used a complementary filter to obtain the body motion. As for the IMU sensor, several researchers have used it to recognize human walking phases with different sensor placements and approaches: the sensor can be placed in the user's trouser pocket [28], on the pelvis [29], on the hip and knee joints [30,31], or on the chest [32]. Besides sensors, the actuators also play an important role [33].
The aim of this work is to identify the walking gait cycle with an IMU sensor that can be attached to the human body, so that it can be integrated into the exoskeleton robot. In contrast with our previous work [32], in this work we identified the human walking gait cycle by looking only at the information provided by the IMU sensors (the pitch angle alone) placed on both thighs for the knee joint. A number of methods to identify the gait cycle of an exoskeleton user have been presented; the recent literature is summarized in Table 1. This work focuses on identifying the human walking gait cycle using an IMU sensor placed on the thigh to determine the gait phase produced by the knee joint. To recognize the gait phase from the IMU pitch angle produced by the knee joint, the neural network method is implemented in the proposed system as a real-time application. The experiments carried out in this work include walking on a flat surface and climbing up and walking down stairs with multiple subjects.
To deliver a complete discussion, this paper is organized as follows: Section 2 provides the information about the sensors and their placement. Section 3 describes the proposed method and experiments. Section 4 presents the results of the experiments and Section 5 provides the concluding remarks and the future work of this investigation.

2. Sensor Placement

The sensor used in this work was the IMU 6050 sensor, which measured the pitch angle produced by the knee while the user was walking. The reason for using the IMU sensor is that it responds immediately to any movement, and it is convenient to place close to the knee joint so that the signal is more accurate. This differs from previous work [13], where a foot plantar sensor was used, requiring considerable sensor configuration, and the signal had to wait until the user stepped their foot in the right position. The sensor placement in this work is presented in Figure 1: Figure 1a shows the mechanical equipment of the lower limb exoskeleton for the knee joints, and Figure 1b shows the position of the sensor at the center of the thigh, about 5 cm above the center point of the knee for each leg. In the IMU sensor's orientation on the knee joint, X+ pointed downward from the user and X− in the opposite direction; Y+ pointed to the left side of the user and Y− in the opposite direction; Z+ was directed to the rear of the user and Z− in the opposite direction. The complete mechanical design of the lower limb exoskeleton can be seen in Figure 2, where Figure 2a–c show the prototype from different points of view. In this work, the IMU sensor was used to provide gait cycle recognition of the leg in order to generate assistance from the actuator mounted on the robot. The mechanism was designed to assist the knee joint for rehabilitation purposes. As presented in Figure 2c, the robot was equipped with two servo motors as the actuators, and a mini-PC Intel NUC as the main controller to handle the motion control and the gait cycle identification.
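The knee-joint pitch angle read from the IMU 6050 can be derived from the raw accelerometer axes. The following is an illustrative sketch only, not the paper's implementation (the paper does not state its filtering details); it assumes gravity is the only acceleration acting on the sensor:

```python
import math

def pitch_from_accel(ax, ay, az):
    """Static pitch estimate (degrees) from raw accelerometer axes of
    an MPU-6050-class IMU; gravity is assumed to be the only
    acceleration acting on the sensor."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

# Sensor level (gravity along Z): zero pitch.
print(pitch_from_accel(0.0, 0.0, 1.0))  # 0.0
# Sensor tilted so gravity lies along X: 90 degrees of pitch.
print(pitch_from_accel(1.0, 0.0, 0.0))  # 90.0
```

In practice this estimate is noisy during the swing phase, which is why gyroscope fusion (e.g., a complementary filter, as in [27]) is normally added on top.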

3. Methods

The block diagram of the system is shown in Figure 3. The IMU sensors were connected to an Arduino microcontroller. Prior to gait cycle recognition, the IMU sensor first needed to capture the pitch signal generated by the knee joint when the user was walking. Afterward, the signal was forwarded to the main controller, i.e., the mini-PC, via serial communication. In the main controller, the signal was processed to produce the whole gait signal; these signals were generated while the user was walking. Furthermore, the signal given by the signal processing was used as the reference signal for the gait cycle identification process. In this work, a Neural Network (NN) was used as the proposed method to recognize the gait phases produced by the users when they were walking. The architecture of the proposed method is presented in Figure 4, where the left and right pitch IMU sensor data are used as the reference signal. The output of the NN can be modeled by Equation (1); the back-propagation method is used to determine the weights of the model.
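The serial link between the Arduino and the mini-PC is not specified in detail in the paper. Assuming one comma-separated line per sample (a hypothetical wire format, not taken from the source), the main controller's reading step could be sketched as:

```python
def parse_imu_line(line):
    """Parse one serial line into (left_pitch, right_pitch).
    The 'left,right' line format is an assumption, not something
    the paper specifies."""
    left, right = line.strip().split(",")
    return float(left), float(right)

print(parse_imu_line("58.0,121.5"))  # (58.0, 121.5)
```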
O = H(3)[W(3)H(2)[W(2)H(1)[W(1)I]]]
where:
  • O = Output
  • I = Input
  • H(1) = Activation function in layer 1
  • H(2) = Activation function in layer 2
  • H(3) = Activation function in layer 3
  • W(1) = Weight matrix for layer 1
  • W(2) = Weight matrix for layer 2
  • W(3) = Weight matrix for layer 3
The NN architecture consisted of two nodes in the input layer and two hidden layers, with ten nodes in the first hidden layer and six nodes in the second. The output layer had three nodes, representing each gait phase as shown in Figure 5. The walking phases in Figure 5 consist of heel strike (HS), contralateral toe off (CTO), mid stance, contralateral heel strike (CHS), toe off (TO), mid swing, and finally a return to heel strike (HS). In reality, the initial gait phase of every user will differ according to the first foot's initial contact. If the signal recognized the HS after initial contact of the first foot, then the other foot would present the CHS signal, and so on. Moreover, the binary output digits from the NN processing can be seen in Table 2, where each phase possesses its own binary code.
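Equation (1) and the 2-10-6-3 layer sizes can be sketched as a forward pass. The sigmoid activations and the 0.5 threshold are assumptions (the paper does not state its activation functions), and the weights below are random placeholders for the values that back-propagation would learn:

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder weights for the 2-10-6-3 architecture; in the paper
# these are learned by the back-propagation method.
W1, W2, W3 = (rng.normal(size=s) for s in [(10, 2), (6, 10), (3, 6)])

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(pitch_left, pitch_right):
    """Equation (1): O = H(3)[W(3) H(2)[W(2) H(1)[W(1) I]]],
    thresholded to the 3-bit code of Table 2."""
    i = np.array([pitch_left, pitch_right])
    o = sigmoid(W3 @ sigmoid(W2 @ sigmoid(W1 @ i)))
    return tuple(int(v > 0.5) for v in o)

PHASES = {  # Table 2
    (0, 0, 0): "Initial",
    (0, 0, 1): "Heel strike",
    (0, 1, 0): "Contralateral toe off",
    (0, 1, 1): "Mid stance",
    (1, 0, 0): "Contralateral heel strike",
    (1, 0, 1): "Toe off",
    (1, 1, 0): "Mid swing",
}
code = forward(0.45, 0.93)  # normalized left/right pitch angles
print(code, PHASES.get(code, "unused code"))
```

With trained weights, the output code would land on one of the seven entries of Table 2; the eighth code (111) is unused.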
The signal monitoring was developed in this work using C# programming, as can be seen in Figure 6a. When the proposed method was implemented in the prototype, the input angles needed to be normalized to a suitable range. For the normalization, the input angle could be no more than 130°, because the maximum stride angle produced by normal human walking is 130°, as can be seen in Figure 6b. The maximum stride angles were collected from the sample users' posture and the maximum pitch angle of the IMU while the user was walking, and the normalization value was generated from the IMU data read during walking.
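The 130° ceiling gives a simple normalization rule. A minimal sketch follows; the clipping of out-of-range readings is an assumed safeguard, not stated in the paper:

```python
MAX_STRIDE_DEG = 130.0  # maximum stride angle of normal human walking (Figure 6b)

def normalize_pitch(angle_deg):
    """Map a knee-joint pitch angle onto [0, 1] relative to the
    130-degree maximum stride; out-of-range readings are clipped
    (an assumption, not taken from the paper)."""
    return min(max(angle_deg, 0.0), MAX_STRIDE_DEG) / MAX_STRIDE_DEG

print(normalize_pitch(65.0))   # 0.5
print(normalize_pitch(140.0))  # 1.0
```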

4. Results

This section discusses the experiments, which were completed in a real-time application. To verify the performance of the proposed method in identifying the gait cycle phase, the experiment was carried out with three different procedures: walking on a flat floor, ascending stairs, and descending stairs, which can be seen in Figure 7a–c, respectively. The experiment involved the users listed in Table 3: five users for the flat-floor procedure and three users for the ascending and descending stairs procedures. It involved male and female users of different ages, weights, and heights, as described in Table 3. In the training phase, every user had to walk two gait cycles; the purpose of the training phase was to obtain the weights of the NN algorithm using the back-propagation method. After this process, the test phase was performed. The tests comprised experiments of walking on a flat surface, followed by experiments of walking up and down stairs.
The first experiment was walking on a flat surface (floor), recorded for two walking cycles; the results for each user are presented in Figure 8. In Figure 8, all the users used their right leg for the initial step, as denoted by the higher signal produced by the right leg compared to the left. At the heel strike phase, User E generated the lowest angle while striding with the left leg, and User B with the right leg. On the other hand, User D generated the highest angle while striding with the right leg, and User C and User A with the left leg. Furthermore, when the contralateral toe off (CTO) phase occurred, the right leg was in a supporting position in front while the other leg began to lift off the floor. In this phase, User E produced the lowest signal for both the right and left legs, while the highest signal was produced by User A. Meanwhile, in the mid stance phase, User A generated the highest angle for both legs; the lowest angle was produced by User B for the left leg and User D for the right leg. In the contralateral heel strike (CHS) phase, both feet were placed on the floor, with the left foot in front of the right foot. In this phase, the lowest angle was generated by User C for the left leg and User B for the right leg, while the highest signal was produced by User D for the left leg and User A for the right leg. Moreover, in the toe off position, the highest angle was produced by User B for the left leg and User D for the right leg, and the lowest angle by User B for the right leg and User D for the left leg. The last phase was the mid swing phase, where the left leg was stepping behind the right leg while the right leg was in a swinging position in front of the left leg. In this phase, User E produced the lowest angle for the left leg and User D for the right leg, while the highest signal was generated by User A for the left leg and User C for the right leg.
The summary of the angles produced by each user can be seen in Table 4. From this experiment, it can be concluded that each user has their own habits in their daily walking activities. Table 5 shows the confusion matrix between the predicted and the real gait cycle on the flat surface. The results presented in Table 5 are the average values for the five users.
The second experiment verified the system when the users were climbing up stairs. To record the signal, the users ascended four steps, each with a height of about 20 cm. Information on the users can be seen in Table 3; the subjects consisted of two women and a man of different ages and heights. As seen in Figure 9, each user generated the same trend when climbing the stairs. The gait phases generated by the users were mid stance, mid swing, toe off (TO), and contralateral toe off (CTO). In Figure 9, User B generated the highest signal in each phase. From the experiment, it can be concluded that the most common phases generated by the users were TO for the right leg and CTO for the left leg. The confusion matrix for the climbing-up test with three subjects is presented in Table 6.
Furthermore, when the users were going down a staircase, the signals generated can be seen in Figure 10. In Figure 10, the gait phases that occurred when the users walked down the stairs were also mid stance, mid swing, toe off (TO), and contralateral toe off (CTO) for all users. In addition, the confusion matrix of the test when all subjects were going down is presented in Table 7. Compared with the actual gait movement of the users, the system was able to identify the gait phases of the users on every surface with a success rate of 98%.
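The reported 98% success rate is consistent with the confusion matrices: since each row of Table 5 sums to (approximately) 100%, the diagonal entries are per-phase recalls, and their unweighted mean gives the overall rate. A quick check on the flat-surface matrix:

```python
# Diagonal of Table 5 (flat surface), in percent:
diag = [100.0, 98.1, 97.6, 98.6, 96.7, 97.2, 98.4]

# Unweighted mean of the per-phase recalls (assumes the phases are
# weighted equally, which the paper does not state explicitly).
print(round(sum(diag) / len(diag), 1))  # 98.1
```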

5. Conclusions

A neural network has been proposed for gait cycle recognition in this work, aimed at recognizing the gait phase at the knee joint of the users. Experiments have been completed in a real-time application. From the experimental results, the proposed method was found to be able to recognize the gait phases of human walking, whether the user walked on a flat floor, ascended stairs, or descended stairs. The results of each user showed different signals, depending on their walking habits in daily activities. In the future, we will integrate this sensor into the prototype of an exoskeleton robot for use as a rehabilitation robot for the knee joint. Nevertheless, the prototype of the exoskeleton to be developed should suit the sensor's characteristics and the chosen actuator, and the mechanical design should be emphasized in the next project. Selecting a proper actuator for the exoskeleton has already been achieved, so it can be used for further study. Moreover, due to the development of artificial intelligence, it will be possible to predict human walking intention by establishing the human–machine coupling relationship in the future.

Author Contributions

Conceptualization, S.S., R.A. and D.S.P.; methodology, H.S. and A.S.; validation, W.C. and D.S.P.; formal analysis, W.C. and H.S.; investigation, W.C. and I.T.S.; resources, W.C., I.T.S. and R.A.; data curation, W.C.; writing—original draft, S.S., W.C. and D.S.P.; writing—review and editing, W.C., D.S.P. and R.A.; visualization, W.C. and D.S.P.; supervision, D.S.P.; project administration, A.S. and W.C.; funding acquisition, D.S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was fully funded by the Ministry of Research, Technology and Higher Education of Republic of Indonesia.

Data Availability Statement

Data supporting reported results can be found in https://bit.ly/38G3VvT (accessed on 30 August 2021).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gull, M.A.; Bai, S.; Bak, T. A Review on Design of Upper Limb Exoskeletons. Robotics 2020, 9, 16. [Google Scholar] [CrossRef] [Green Version]
  2. Pamungkas, D.S.; Caesarendra, W.; Soebakti, H.; Analia, R.; Susanto, S. Overview: Types of Lower Limb Exoskeletons. Electronics 2019, 8, 1283. [Google Scholar] [CrossRef] [Green Version]
  3. Zhou, J.; Yang, S.; Xue, Q. Lower limb rehabilitation exoskeleton robot: A review. Adv. Mech. Eng. 2021, 13. [Google Scholar] [CrossRef]
  4. Shi, D.; Zhang, W.; Zhang, W.; Ding, X. A Review on Lower Limb Rehabilitation Exoskeleton Robots. Chin. J. Mech. Eng. 2019, 32, 74. [Google Scholar] [CrossRef] [Green Version]
  5. Rewalk by ARGO Medical Technologies, Inc. Available online: http://www.rewalk.com (accessed on 19 December 2019).
  6. Sankai, Y. Leading Edge of Cybernics: Robot Suit HAL. In Proceedings of the 2006 SICE-ICASE International Joint Conference, Busan, Korea, 18–21 October 2006; pp. P-1–P-2. [Google Scholar] [CrossRef]
  7. Kazerooni, H.; Steger, R.; Huang, L. Hybrid Control of the Berkeley Lower Extremity Exoskeleton (BLEEX). Int. J. Robot. Res. 2006, 25, 561–573. [Google Scholar] [CrossRef]
  8. Kazerooni, H.; Racine, J.; Huang, L.; Steger, R. On the Control of the Berkeley Lower Extremity Exoskeleton (BLEEX). In Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005; pp. 4353–4360. [Google Scholar] [CrossRef]
  9. Malcolm, P.; Derave, W.; Galle, S.; de Clercq, D. A Simple Exoskeleton That Assists Plantarflexion Can Reduce the Metabolic Cost of Human Walking. PLoS ONE 2013, 8, e56137. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Castagneri, C.; Agostini, V.; Balestra, G.; Knaflitz, M.; Carlone, M.; Massazza, G. EMG Asymmetry Index in Cyclic Movements. In Proceedings of the 2018 IEEE Life Sciences Conference (LSC), Montreal, QC, Canada, 28–30 October 2018; pp. 223–226. [Google Scholar] [CrossRef]
  11. Meng, M.; She, Q.; Gao, Y.; Luo, Z. EMG signals based gait phases recognition using hidden Markov models. In Proceedings of the 2010 IEEE International Conference on Information and Automation, Harbin, China, 20–23 June 2010; pp. 852–856. [Google Scholar] [CrossRef]
  12. Pawin, J.; Khaorapapong, T.; Chawalit, S. Neural-based human’s abnormal gait detection using Force Sensitive Resistors. In Proceedings of the Fourth International Workshop on Advanced Computational Intelligence, Wuhan, China, 19–21 October 2011; pp. 224–229. [Google Scholar] [CrossRef]
  13. Chandel, V.; Singhal, S.; Sharma, V.; Ahmed, N.; Ghose, A. PI-Sole: A Low-Cost Solution for Gait Monitoring Using Off-The-Shelf Piezoelectric Sensors and IMU. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 3290–3296. [Google Scholar] [CrossRef]
  14. Aqueveque, P.; Germany, E.; Osorio, R.; Pastene, F. Gait Segmentation Method Using a Plantar Pressure Measurement System with Custom-Made Capacitive Sensors. Sensors 2020, 20, 656. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Susanto, R.A.; Song, K. Design of assistive torque for a lower limb exoskeleton based on motion prediction. In Proceedings of the 2016 55th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), Tsukuba, Japan, 20–23 September 2016; pp. 172–177. [Google Scholar] [CrossRef]
  16. Luu, T.P.; Brantley, J.A.; Zhu, F.; Contreras-Vidal, J.L. Electrocortical amplitude modulations of human level-ground, slope, and stair walking. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Seogwipo, Korea, 11–15 July 2017; pp. 1913–1916. [Google Scholar] [CrossRef]
  17. Villarreal, D.J.; Poonawala, H.A.; Gregg, R.D. A Robust Parameterization of Human Gait Patterns Across Phase-Shifting Perturbations. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 265–278. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Kim, W.; Kim, Y.; Lee, K.Y. Human Gait Recognition Based on Integrated Gait Features using Kinect Depth Cameras. In Proceedings of the 2020 IEEE 44th Annual Computers, Software, and Applications Conference (COMPSAC), Madrid, Spain, 13–17 July 2020; pp. 328–333. [Google Scholar] [CrossRef]
  19. Wang, L.; Jia, S.; Li, X.; Wang, S. Human gait recognition based on gait flow image considering walking direction. In Proceedings of the 2012 IEEE International Conference on Mechatronics and Automation, Chengdu, China, 5–8 August 2012; pp. 1990–1995. [Google Scholar] [CrossRef]
  20. Yusuf, S.I.; Adeshina, S.; Boukar, M.M. Parameters for Human Gait Analysis: A Review. In Proceedings of the 2019 15th International Conference on Electronics, Computer and Computation (ICECCO), Abuja, Nigeria, 10–12 December 2019; pp. 1–4. [Google Scholar] [CrossRef]
  21. Roberts, M.; Mongeon, D.; Prince, F. Biomechanical parameters for gait analysis: A systematic review of healthy human gait. Phys. Ther. Rehabil. 2017, 4, 6. [Google Scholar] [CrossRef]
  22. Posada-Gomez, R.; Martinez, M.A.G.; Aguila-Rodriguez, G.; Daul, C.; Salas, L.L. Conception and realization of a 3D dynamic sensor as a tool in human walking study. In Proceedings of the 2005 2nd International Conference on Electrical and Electronics Engineering, Mexico City, Mexico, 7–9 September 2005; pp. 178–181. [Google Scholar] [CrossRef]
  23. Bakchy, S.C.; Islam, M.R.; Sayeed, A. Human identification on the basis of gait analysis using Kohonen self-organizing mapping technique. In Proceedings of the 2016 2nd International Conference on Electrical, Computer & Telecommunication Engineering (ICECTE), Rajshahi, Bangladesh, 8–10 December 2016; pp. 1–4. [Google Scholar] [CrossRef]
  24. Miranda-Pereira, P.A.; Milián-Ccopa, L.P. Brief biomechanical analysis on the walking for a lower-limb rehabilitation exoskeleton. In Proceedings of the 2016 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), Tokyo, Japan, 17–20 December 2016; pp. 67–72. [Google Scholar] [CrossRef]
  25. Yoneyama, M.; Kurihara, Y.; Watanabe, K.; Mitoma, H. Accelerometry-Based Gait Analysis and Its Application to Parkinson’s Disease Assessment— Part 2: A New Measure for Quantifying Walking Behavior. IEEE Trans. Neural Syst. Rehabil. Eng. 2013, 21, 999–1005. [Google Scholar] [CrossRef] [PubMed]
  26. Hu, J.-S.; Sun, K.-C.; Cheng, C.-Y. A Kinematic Human-Walking Model for the Normal-Gait-Speed Estimation Using Tri-Axial Acceleration Signals at Waist Location. IEEE Trans. Biomed. Eng. 2013, 60, 2271–2279. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  27. Mota, F.A.O.; Biajo, V.H.M.; Mota, H.O.; Vasconcelos, F.H. A wireless sensor network for the biomechanical analysis of the gait. In Proceedings of the 2017 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Turin, Italy, 22–25 May 2017; pp. 1–6. [Google Scholar] [CrossRef]
  28. Abhayasinghe, N.; Murray, I. Human gait phase recognition based on thigh movement computed using IMUs. In Proceedings of the 2014 IEEE Ninth International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP), Singapore, 21–24 April 2014; pp. 1–4. [Google Scholar] [CrossRef] [Green Version]
  29. Jang, J.; Lee, J.; Lim, B.; Shim, Y. Natural gait event-based level walking assistance with a robotic hip exoskeleton. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 17–21 July 2018; pp. 1–5. [Google Scholar] [CrossRef]
  30. Wang, X.; Ristic-Durrant, D.; Spranger, M.; Gräser, A. Gait assessment system based on novel gait variability measures. In Proceedings of the 2017 International Conference on Rehabilitation Robotics (ICORR), London, UK, 17–20 July 2017; pp. 467–472. [Google Scholar] [CrossRef]
  31. Chen, J.; Huang, Y.; Guo, X.; Zhou, S.; Jia, L. Parameter identification and adaptive compliant control of rehabilitation exoskeleton based on multiple sensors. Measurement 2020, 159, 107765. [Google Scholar] [CrossRef]
  32. Analia, R.; Sutopo, P.D.; Soebhakti, H.; Sani, A. Walking Classification of Hip Joint Lower Limb Exoskeleton. In Proceedings of the 2019 2nd International Conference on Applied Engineering (ICAE), Batam, Indonesia, 2–3 October 2019; pp. 1–5. [Google Scholar] [CrossRef]
  33. Barjuei, E.S.; Ardakani, M.M.G.; Caldwell, D.G.; Sanguineti, M.; Ortiz, J. Optimal Selection of Motors and Transmissions in Back-Support Exoskeleton Applications. IEEE Trans. Med. Robot. Bionics 2020, 2, 320–330. [Google Scholar] [CrossRef]
Figure 1. (a) The IMU sensor placement, (b) the distance of the IMU to each knee joint.
Figure 2. The prototype of lower limb exoskeleton (a) frontal view, (b) side view, (c) rear view.
Figure 3. The block diagram system.
Figure 4. The neural network architecture of gait cycle recognition.
Figure 5. The walking gait cycle phase.
Figure 6. (a) The GUI real-time signal monitoring, (b) the maximum angle produced by IMU when user walking, (c) the GUI monitoring system when the user activated the device.
Figure 7. The experiment procedure of collecting the gait cycle when the user was: (a) walking on a flat surface, (b) climbing upstairs, (c) going downstairs.
Figure 8. The walking gait cycle signal of the users when walking on the flat surface. (a) Gait Cycle-User A; (b) Gait Cycle-User B; (c) Gait Cycle-User C; (d) Gait Cycle-User D; (e) Gait Cycle-User E.
Figure 9. The walking gait signal of the users climbing up stairs. (a) Gait Cycle-User A; (b) Gait Cycle-User B; (c) Gait Cycle-User C.
Figure 10. The walking gait cycle of users going down stairs. (a) Gait Cycle-User A; (b) Gait Cycle-User B; (c) Gait Cycle-User C.
Table 1. A summary of recent methods on gait cycle identification.
No | Sensors | Reference | Description
1 | EMG | [10] | Statistical Gait Analysis algorithm
2 | EMG | [11] | Hidden Markov model algorithm
3 | Force Sensitive Resistor | [12] | Placed on the plantar
4 | Kinect depth cameras | [19] | Captured human walking, then performed body-part modeling
5 | IMU | [24] | Sensor placed on the waist
6 | IMU | [26] | Sensor placed in a trouser pocket
7 | IMU | [27] | Sensor placed on the pelvis
8 | IMU | [28] | Sensor placed on the hip joints
9 | IMU | [32] | Sensor placed on the chest
Table 2. The binary output digits from the neural network result.
No | Binary Output | Gait Cycle
1 | 000 | Initial
2 | 001 | Heel strike
3 | 010 | Contralateral toe off
4 | 011 | Mid stance
5 | 100 | Contralateral heel strike
6 | 101 | Toe off
7 | 110 | Mid swing
Table 3. Experimental users’ data.
Walking on a Flat Floor
No | User | Age (Years) | Height (cm) | *F/M | Weight (kg)
1 | User A | 50 | 160 | M | 60
2 | User B | 17 | 155 | F | 50
3 | User C | 21 | 168 | M | 60
4 | User D | 21 | 165 | M | 65
5 | User E | 45 | 150 | F | 65

Climbing Up and Down the Stairs
No | User | Age (Years) | Height (cm) | *F/M | Weight (kg)
1 | User A | 17 | 155 | F | 50
2 | User B | 21 | 168 | M | 60
3 | User C | 45 | 150 | F | 65

*F/M: Female/Male.
Table 4. The angle of each gait phase generated by users.
Pitch angles are given as left/right, in degrees.

User | Heel Strike (HS) | Contralateral Toe Off (CTO) | Mid Stance | Contralateral Heel Strike (CHS) | Toe Off (TO) | Mid Swing
User A | 58 / 121 | 63 / 108 | 108 / 80 | 116 / 73 | 106 / 75 | 63 / 116
User B | 51 / 117 | 56 / 101 | 92 / 77 | 109 / 66 | 110 / 63 | 56 / 113
User C | 58 / 120 | 59 / 95 | 103 / 77 | 107 / 70 | 89 / 76 | 61 / 117
User D | 49 / 128 | 62 / 106 | 98 / 75 | 122 / 68 | 77 / 77 | 60 / 87
User E | 43 / 121 | 49 / 89 | 106 / 76 | 112 / 68 | 79 / 64 | 54 / 111
Table 5. Confusion matrix for the flat surface (in percentage). Rows are the real phases, columns the predicted phases.

Real \ Prediction | Initial | Heel Strike | Contralateral Toe Off | Mid Stance | Contralateral Heel Strike | Toe Off | Mid Swing
Initial | 100 | 0 | 0 | 0 | 0 | 0 | 0
Heel strike | 0.2 | 98.1 | 1.6 | 0 | 0 | 0.1 | 0
Contralateral toe off | 0 | 0.8 | 97.6 | 0.3 | 0 | 1.2 | 0
Mid stance | 0 | 0 | 0.4 | 98.6 | 0.2 | 0 | 0.7
Contralateral heel strike | 0 | 0 | 0 | 0.7 | 96.7 | 1.7 | 0.8
Toe off | 0 | 0 | 0.7 | 0 | 0.2 | 97.2 | 1.8
Mid swing | 0 | 0 | 0 | 1.3 | 0.1 | 0.1 | 98.4
Table 6. Confusion matrix for climbing up the stairs (in percentage). Rows are the real phases, columns the predicted phases.

Real \ Prediction | Initial | Heel Strike | Contralateral Toe Off | Mid Stance | Contralateral Heel Strike | Toe Off | Mid Swing
Initial | 100 | 0 | 0 | 0 | 0 | 0 | 0
Heel strike | 0.4 | 96.3 | 1.9 | 0.2 | 0.8 | 0.3 | 0
Contralateral toe off | 0 | 0.8 | 97.6 | 0.3 | 0 | 1.2 | 0
Mid stance | 0 | 0 | 0.6 | 98.1 | 0.6 | 0.2 | 0.5
Contralateral heel strike | 0 | 0.2 | 0.3 | 0.7 | 96.2 | 1.6 | 0.9
Toe off | 0 | 0 | 0.6 | 0.4 | 0.4 | 97.1 | 1.5
Mid swing | 0 | 0 | 0.1 | 0.8 | 0.6 | 0.2 | 98.2
Table 7. Confusion matrix for climbing down the stairs (in percentage). Rows are the real phases, columns the predicted phases.

Real \ Prediction | Initial | Heel Strike | Contralateral Toe Off | Mid Stance | Contralateral Heel Strike | Toe Off | Mid Swing
Initial | 100 | 0 | 0 | 0 | 0 | 0 | 0
Heel strike | 0.2 | 97.2 | 0.9 | 0 | 0.7 | 0.5 | 0.3
Contralateral toe off | 0 | 0.5 | 97.5 | 0.2 | 0.1 | 1.5 | 0
Mid stance | 0 | 0 | 0.4 | 98.6 | 0.3 | 0.4 | 0.2
Contralateral heel strike | 0 | 0 | 0.2 | 0.4 | 97.2 | 1.3 | 0.8
Toe off | 0 | 0 | 0.5 | 0.6 | 0.5 | 96.8 | 1.4
Mid swing | 0 | 0 | 0.3 | 0.6 | 0.8 | 0.3 | 97.9
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
