Article

Mapping Three Electromyography Signals Generated by Human Elbow and Shoulder Movements to Two Degree of Freedom Upper-Limb Robot Control

by Pringgo Widyo Laksono 1,2, Kojiro Matsushita 1,3, Muhammad Syaiful Amri bin Suhaimi 4, Takahide Kitamura 1, Waweru Njeri 3,5, Joseph Muguro 1,5 and Minoru Sasaki 1,3,*

1 Graduate School of Engineering, Gifu University, Gifu 501-1193, Japan
2 Industrial Engineering, Faculty of Engineering, Universitas Sebelas Maret, Surakarta 57126, Indonesia
3 Intelligent Production Technology Research & Development Center for Aerospace (IPTeCA), Tokai National Higher Education and Research System, Gifu 501-1193, Japan
4 National Institute of Technology, Gifu College, Gifu 501-0495, Japan
5 School of Engineering, Dedan Kimathi University of Technology, Private Bag, Nyeri 10143, Kenya
* Author to whom correspondence should be addressed.
Robotics 2020, 9(4), 83; https://doi.org/10.3390/robotics9040083
Submission received: 20 August 2020 / Revised: 5 October 2020 / Accepted: 7 October 2020 / Published: 9 October 2020
(This article belongs to the Section Sensors and Control in Robotics)

Abstract

This article addresses human-robot cooperation tasks, focusing on robotic operation using bio-signals. In particular, we propose a control scheme for a robot arm based on electromyography (EMG) signals that allows cooperative tasks between humans and robots and enables teleoperation. A basic framework for achieving the task and analyzing the EMG signals of upper-limb muscle motion for mapping the hand motion is presented. The objective of this work is to investigate the application of a wearable EMG device to control a robot arm in real time. Three EMG sensors were attached to the brachioradialis, biceps brachii, and anterior deltoid as the target muscles. Three motions were performed by moving the arm about the elbow joint, the shoulder joint, and a combination of the two joints, giving two degrees of freedom. Five subjects participated in the experiments. The results indicate that the overall accuracy of the system varied from 50% to 100% across the three motions for all subjects. This study further shows that upper-limb motion discrimination can be used to control a robotic manipulator arm with simplicity and low computational cost.

1. Introduction

In the recent past, robots have become an integral part of society, with applications in industrial processes and manufacturing, the military, welfare and healthcare systems, transportation, and autonomous vehicles, to name a few [1,2]. Robots in automation are fueled by the inherent ability of machines to perform monotonous tasks with repeatable precision over long durations. In contrast to human labor, robots require fewer safety precautions, which makes them ideal for handling dangerous materials and disasters [1,2]. As an application area, the COVID-19 pandemic that has hit the world is a good case for the use of robots to monitor and deliver essential services safely without compromising the safety of medical staff [3].
Robots have been used in various aspects of human life, as mentioned earlier. Regarding the control mechanism, robot applications can be broadly categorized into autonomous, cooperative, and hybrid modes [4]. Autonomous mode, in this case, is defined to capture all control schemes that do not require human intervention. This is the case in most industrial robots, as well as the highly anticipated autonomous cars. In cooperative mode, the robot is driven by human input, where the robot acts to respond to or mimic that input. Examples of this scheme include crane operation and robot control using joysticks. The hybrid mode covers cases where the robot has an element of autonomy as well as input control initiated by a human. In both the hybrid and cooperative modes, an element of human-robot collaboration (HRC) is present [5,6,7,8,9].
HRC focuses on the cooperative use of a workspace, a control scheme, or task completion. From the literature, much more fine-tuning of current robot systems is needed to integrate robots into daily life without being overly intrusive [2,10,11]. Attempts such as the miniaturization of robots to fit in human workspaces are steps pursued by developers to achieve this integration. This context, called agent autonomy, closely considers leader-follower relationships that express how much of the robot motion is directly determined by humans when conducting tasks [4,12].
One of the significant aspects of HRC is the control mechanism employed. Conventionally, interaction with a robot is achieved with physical joysticks, keyboards, and other hardware systems [5,6,7,8,13]. The limitation of these input systems is the level and quality of interactivity, since they need to be physically attached. An alternative is the use of wireless and wearable devices, which open the control scheme to being versatile and user-friendly. In this research, a wearable control system was employed because of the advantages discussed below.
With the advent of computing technologies, fine-tuned wearable devices have hit the market. Of particular interest to this discussion are devices that can record physiological signals and process them to give meaningful information such as heart rate and muscle activity. Such signals emanating from the human body, jointly referred to as biopotential signals, can be exploited to enhance the quality of life [14,15,16,17]. In particular, in cases of physical disability, biosignals have been applied to restore control to patients and disabled individuals, for example through prosthetics. Integrating such high-end devices with an HRC system would be advantageous for the versatility and universality of the control scheme. In this paper, we use one of the signals readily available from the skin surface, electromyography (EMG).
Biopotential signals have been proposed for robot control in the literature. Fukuda et al. [18] conducted teleoperation of a human-assisting robotic arm using EMG signals; their system used six EMG channels and a position sensor to capture grasping and manipulation signals. Artemiadis and Kyriakopoulos [6,19] proposed a methodology for controlling an anthropomorphic robot arm in real time with high accuracy using EMG from eleven upper-limb muscles and two position-tracker measurements. Benchabane, Saadia, and Ramdane-Cherif [20] introduced a new algorithm for real-time control of five prosthetic finger motions and developed a simple, wearable myoelectric interface. Liu and Young [5] proposed a practical and simple adaptive method for robot control using two channels for upper-arm motion. Junior et al. [21] proposed a surface EMG control system for a robotic arm based on a threshold-analysis strategy; the EMG signal was acquired and processed by a conditioning system in LabVIEW software, which gives flexibility and a fast way to reconfigure the settings for controlling an actuated device.
EMG signals are prone to interference from power-line noise, electromagnetic radiation, cable movement, and skin impedance, among others [5,22,23]. For this reason, signal processing is an indispensable step in the control algorithm. The general outline of the control algorithm is as follows: the raw EMG signal acquired from the targeted muscle is conditioned to eliminate noise, relevant features are extracted, and finally, control is performed based on the resulting signal.
EMG control has been applied in various areas of robot control. The control systems in the literature can be categorized as pattern-recognition and non-pattern-recognition systems [8]. A pattern-recognition system detects patterns in the EMG signals associated with a task. This is the case in hand pattern recognition, finger patterns, and other systems proposed to recognize the current state of the hand [1,11,13,24,25,26,27,28]. On the other hand, non-pattern-recognition controls are practical and often used as control schemes. The objective, in this case, is to characterize motion, gripping force, rotation angles, and so on [5,11,18,27,29,30,31,32].
The main objective of this experiment is to investigate the application of a wearable EMG device to control a robot arm in real time. The focus is on EMG signal control corresponding to upper-limb motions of the arm and on analyzing their relations. The control scheme is non-pattern recognition in nature, specifically ON/OFF control with threshold-level control. The use of this scheme was motivated by its simplicity and low computational cost. However, the method has been reported to have reduced accuracy compared with pattern-based methods [24].
The authors in [33] discussed pattern-recognition control mechanisms for multifunctional upper-limb prostheses. In conventional schemes, only a single degree of freedom (DOF) movement (hand open/close or wrist flexion/extension) is supported at any one time. This paper targets multiple DOF with fewer electrodes on the upper limb. Besides the DOF limitation, the conventional amplitude-based control method has a slow response and takes time for users to learn to contract/co-contract the targeted upper-limb muscles [1,25,26,33,34].
In this research, EMG signals corresponding to upper-limb motions of the elbow and shoulder joints are discriminated to control a robot arm applicable to a teleoperated robot cooperation system. A low-cost wearable embedded system was customized to map three motions involving the elbow and shoulder joints to the robot manipulator in real time. The robot manipulator is a compact serial-communication robot suitable for use in home or laboratory environments. Three pairs of surface EMG sensors were mounted on the anterior deltoid, biceps brachii, and brachioradialis as the target muscles.
Three different motions were proposed for analysis: elbow flexion as motion 1, shoulder flexion as motion 2, and a combined elbow and shoulder flexion movement, called the uppercut motion, as motion 3. In recording the EMG signal, the range of upper-limb motion for motion 1 and motion 2 was limited to 90 degrees for modeling the relation between EMG and joint angle. The contribution of this study is the development of an EMG-based control scheme for a robot arm and an analysis of the influence of the targeted EMG muscle positions and their relationship to upper-limb movements. The targeted muscles are those that play an active role in movements of the upper arm involving the elbow and shoulder joints. The initial hypothesis of this research is that the brachioradialis (EMG channel 1/CH1) and biceps brachii (CH2) will play a role in motion 1, while the anterior deltoid (CH3) will play a major role in motion 2 and motion 3.
The rest of this document is organized as follows: Section 2 covers the methodology, setup, and materials used; Section 3 summarizes the results from the experiment, highlighting their significance in the discussion; and Section 4 offers the conclusion.

2. Materials and Methods

2.1. Proposed System Overview

Figure 1 shows an overview of the proposed control scheme. The system comprises an EMG signal acquisition system, a processing unit, a motion discrimination model/algorithm, and a robot control mechanism. First, three target muscles representative of arm muscle activity during motion were selected. Data acquisition was performed on the muscle surface using silver chloride electrodes (Ag/AgCl, size: 57 × 48 mm, Biorode, Japan). A combination of three different motions comprising single- and double-DOF movements was recorded.
The flowchart in Figure 2 describes the control flow from signal acquisition to the end of the robot control.
The EMG measurement system featured a sensor circuit comprising an instrumentation amplifier and an operational amplifier; the detailed schematic circuit can be found in reference [14]. The system employed an analog band-pass filter with a lower cut-off frequency of approximately 7 Hz and an upper cut-off frequency of 589 Hz, with the differential amplifiers’ adjustable gain set to around 59–65 dB for each channel. The data acquisition unit comprised a National Instruments (NI) Corporation USB-6008 for analog-to-digital conversion and a personal computer (PC, Panasonic Let’s note, i5 2.7 GHz). In the offline mode, signals were acquired at a sampling rate of 2 kHz. We used MATLAB® software for subsequent signal processing.
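As an illustration of this acquisition step, the sketch below reads three analog channels at 2 kHz over a 2 s trial from a USB-6008 using the Python nidaqmx driver. The authors performed acquisition and processing in MATLAB, so this is only an assumed, equivalent setup; the device name "Dev1" and the choice of the Python API are assumptions, not part of the original system.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 2000          # Hz, as used in the offline mode
DURATION_S = 2              # one motion trial lasts 2 s
N_SAMPLES = SAMPLE_RATE * DURATION_S

with nidaqmx.Task() as task:
    # Three analog inputs for CH1-CH3 (physical channel names are assumptions).
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:2")
    task.timing.cfg_samp_clk_timing(
        SAMPLE_RATE,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=N_SAMPLES,
    )
    # Returns a list of three per-channel sample lists.
    raw_emg = task.read(number_of_samples_per_channel=N_SAMPLES)
```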

2.1.1. EMG Analysis

The experimental setup involved multichannel EMG signal detection. Control was achieved by introducing a threshold to discriminate the state of muscle activation. The signals were processed with a conventional signal-processing pipeline: EMG signal acquisition, EMG feature extraction, and EMG motion mapping/modeling.
The acquired raw EMG signal was first processed to remove the zero-offset, rectified, and filtered to smooth the signal. Although the necessary analog filtering was already performed in the EMG measurement device, digital filtering was also conducted, as recommended by previous researchers [31,35,36]. A band-pass filter with a bandwidth of 10–400 Hz was applied using the MATLAB Signal Processing Toolbox to remove high-frequency random interference, noise introduced in the digitization process, and remaining low-frequency noise from motion artifacts.
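The authors applied this filter in MATLAB; as a rough equivalent, the sketch below applies a zero-phase 10–400 Hz Butterworth band-pass filter with SciPy. The filter order and the use of zero-phase filtering (filtfilt) are assumptions, since the paper does not specify them.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000.0  # sampling rate in Hz (offline mode)

def bandpass_10_400(x, fs=FS, order=4):
    """Zero-phase 10-400 Hz band-pass, analogous to the digital filtering step."""
    nyq = 0.5 * fs
    b, a = butter(order, [10.0 / nyq, 400.0 / nyq], btype="band")
    return filtfilt(b, a, x)

# Usage on one channel sampled at 2 kHz:
# emg_filtered = bandpass_10_400(np.asarray(emg_raw_ch1))
```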
Feature extraction in this research was performed in the time domain. This approach is often used because of its quick and simple implementation: time-domain features are computed without any transformation of the raw EMG signal and are evaluated based on the signal amplitude, which varies over time [1,22,28,30,32,33,37]. Processing of the raw data is critical to remove baseline noise, motion-artifact noise, and so on. An essential part of this paper is the analysis of the EMG signal. The acquired signal is rectified and smoothed (by a moving average) according to Equation (1):
$$Y[n] = \frac{1}{M}\sum_{i=1}^{M}\left|x[n+i]\right| \quad (1)$$
where M is the smoothing window size, n is the current sampling point, and x is the raw EMG signal from the DAQ. The output Y[n] represents the processed EMG signal of the anterior deltoid, biceps brachii, and brachioradialis muscles, respectively [14]. The processed EMG signal was then normalized to obtain a uniform distribution discernible by a specified threshold, as shown in Equation (2):
$$\mathrm{EMG}_{\mathrm{norm}} = \frac{Y[n] - Y[n]_{\min}}{Y[n]_{\max} - Y[n]_{\min}} \quad (2)$$
where Y[n]_min is the minimum value of the processed signal and Y[n]_max is the maximum.
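A minimal sketch of Equations (1) and (2) is given below: full-wave rectification followed by an M-point moving average, then min-max normalization. The window length (200 samples, i.e., 100 ms at 2 kHz) is an assumption, since the paper does not state the value of M.

```python
import numpy as np

def envelope(x, window=200):
    """Equation (1): rectify, then smooth with an M-point moving average.
    window = M is an assumed value (100 ms at 2 kHz)."""
    rectified = np.abs(x)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def normalize(y):
    """Equation (2): min-max normalization of the envelope to [0, 1]."""
    return (y - y.min()) / (y.max() - y.min())

# Usage on a filtered channel:
# emg_norm = normalize(envelope(emg_filtered))
```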
Discrimination of the active motion was done by identifying features of the three EMG channels (CH1, CH2, and CH3). In this case, we used these features as the control parameters. Three control parameters were considered: the mean of the envelope signal, the maximum value of the amplitude, and the area under the curve. These parameters determine how the desired output is obtained from the signal features. As mentioned earlier, to discriminate the activation state of the muscles in each channel, we used a threshold method; the control parameters determine whether the threshold is exceeded or not. The threshold method determined the muscle activation state (MS), expressed as muscle activation (ON) or muscle deactivation (OFF). The ON state was returned if the envelope signal was above the baseline threshold, whereas an OFF state resulted whenever the rectified signal was below the baseline threshold. The choice of a threshold was advantageous in aiding the removal of any residual noise in the envelope signal [38]. The muscle state is defined as in Equation (3):
$$MS(n) = \begin{cases} 1\ (\mathrm{ON}) & \text{if } \mathrm{EMG}_{\mathrm{norm}} > Th \\ 0\ (\mathrm{OFF}) & \text{otherwise} \end{cases} \quad (3)$$
where EMGnorm is the normalized, filtered EMG signal and n denotes EMG channel 1, 2, or 3. To evaluate the performance of the control parameters, the successful mapping rate, i.e., the number of trials in which the robot arm correctly mimicked the subject’s upper-limb motion out of the total number of trials, was determined.
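The sketch below shows one way Equation (3) could be evaluated on a normalized envelope. The example threshold value is an assumption; in the experiments the threshold was obtained per subject during calibration.

```python
import numpy as np

def muscle_state(emg_norm, threshold):
    """Equation (3): 1 (ON) where the normalized envelope exceeds the threshold,
    0 (OFF) otherwise. The threshold comes from individual calibration."""
    return (np.asarray(emg_norm) > threshold).astype(int)

# Example with an assumed calibration threshold of 0.3 for CH1:
# ms_ch1 = muscle_state(emg_norm_ch1, 0.3)
# ch1_active = bool(ms_ch1.any())   # channel considered active during the trial
```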

2.1.2. Robot Control

A custom-assembled R-R-R (three revolute joints) robot arm, shown in Figure 3c and built from Dynamixel units, was used for the experiments in this research. The Dynamixel AX-12A is a smart actuator with a fully integrated DC servo motor module. Its input voltage rating is around 9.0–12 V, and it can produce a speed of 59 rpm with a maximum rotation angle of 300 degrees and a resolution of 0.293 deg/pulse. Four actuators were assembled into a robotic arm with two links and three joints, as shown in Figure 3b. In this research, two motors are needed to move the robot joints according to the movements of the shoulder and elbow joints.
Robot-PC communication was achieved using a serial connection between MATLAB and the motor controller connected to the computer via USB. The servo motor controller is a small universal serial bus (USB) communication converter that enables interfacing with and operating the actuators from the computer. It also provides the 3-pin TTL connectors used to link up with the Dynamixel motors.
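For illustration, the sketch below sends joint angles to two AX-12A actuators using the open-source Dynamixel SDK for Python. The authors drove the motors from MATLAB over the serial converter, so this is only an assumed, equivalent interface; the serial port name, baud rate, servo IDs, and the mapping from arm angle to servo position are assumptions.

```python
from dynamixel_sdk import PortHandler, PacketHandler  # Dynamixel SDK (Python)

ADDR_GOAL_POSITION = 30          # AX-12A control-table address for goal position
DEG_PER_UNIT = 300.0 / 1023.0    # ~0.293 deg per position unit (AX-12A)

def deg_to_units(angle_deg):
    """Convert an angle in degrees to AX-12A position units (offset ignored here)."""
    return int(round(angle_deg / DEG_PER_UNIT))

# Port name, baud rate, and IDs are assumptions for a typical USB-to-TTL setup.
port = PortHandler("/dev/ttyUSB0")
packet = PacketHandler(1.0)      # AX-12A uses protocol 1.0
port.openPort()
port.setBaudRate(1000000)

def move_joint(servo_id, angle_deg):
    """Send a goal position to one joint of the R-R-R arm."""
    packet.write2ByteTxRx(port, servo_id, ADDR_GOAL_POSITION, deg_to_units(angle_deg))

# e.g., motion 3: shoulder (assumed ID 1) and elbow (assumed ID 2) to 90 degrees
# move_joint(1, 90)
# move_joint(2, 90)
```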

2.2. Target Upper Limb Motion

Human upper-limb movement is one of the most complex motions and involves many components, such as the musculoskeletal system and nerves, to support multiple degrees of freedom. The upper limb conducts many motions that require coordination of the joints, spanning a wide range of motions for daily-life tasks. To make the scope of the research more specific, we focus on the shoulder and elbow joints moving separately to represent single-DOF movements, and on a combination of the two joints to achieve multiple DOF. The three motion gestures are illustrated in Figure 4a. Shoulder motion allows three DOFs (i.e., abduction/adduction, flexion/extension, and internal/external rotation), while elbow motion has two DOFs (i.e., flexion/extension and supination/pronation) [36,39].
The target muscles were selected according to the motions performed. EMG signals were captured from the brachioradialis, biceps brachii, and anterior deltoid muscles using bipolar electrode placement, with one common electrode as reference (ground) placed on the bony part of the elbow, as shown in Figure 4b.

2.3. Experimental Design

Five healthy, right-handed subjects (all males), with ages ranging from 20 to 40 years, participated as volunteers in the experiment. All of them provided written informed consent following the approval procedures (number 27–226) issued by the Gifu University ethics committee. The approved protocol covers motion intention estimation and device control based on biological signal measurements, considering user-specific physical characteristics and environmental characteristics. The experiment was conducted with subjects seated comfortably in a chair with the right arm at rest (0 degrees). For familiarization, the participants performed several arm motions prior to recording, which also served to obtain a proper threshold for individual calibration.
During recording, the participants were instructed to raise and lower the arm, as shown in Figure 4a, within 2 s. Each motion was repeated 10 times in offline mode. In online mode, the robot was moved with successive arm motions to provide visual feedback. The robot arm control was initially conducted in offline mode, with the recorded EMG data loaded as input for controlling the robotic arm.
From Equation (3), if we represent MS(CH1) as A, MS(CH2) as B, and MS(CH3) as C, discrimination of the active motion is handled by the control algorithm with the conditions shown in (4). Motion 1, motion 2, and motion 3 are deduced from conditional manipulations of the signals from the three channels. In particular, when the EMG signals from both CH1 and CH2 are above the baseline threshold (Th) and CH3 is below the threshold, the command to activate motion 1 is classified as ON. Similarly, if either CH1 or CH2 is lower than the threshold and CH3 is higher than the threshold, then motion 2 is ON. Finally, motion 3 requires all channels to be above the threshold.
$$\begin{aligned} &\text{If } A > Th \text{ and } B > Th \text{ and } C < Th, \text{ then Condition 1} \\ &\text{If } (A < Th \text{ or } B < Th) \text{ and } C > Th, \text{ then Condition 2} \\ &\text{If } A > Th \text{ and } B > Th \text{ and } C > Th, \text{ then Condition 3} \end{aligned} \quad (4)$$
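A minimal sketch of this decision logic is given below, with the per-channel activation values compared against the calibrated threshold as in (4). The ordering of the checks and the return convention (0 for "do nothing") follow Table 1; the example threshold is an assumption.

```python
def discriminate_motion(a, b, c, th):
    """Equation (4): map the per-channel activation values of CH1 (a), CH2 (b),
    and CH3 (c) against the calibrated threshold th to a motion command."""
    if a > th and b > th and c < th:
        return 1          # motion 1: elbow flexion
    if (a < th or b < th) and c > th:
        return 2          # motion 2: shoulder flexion
    if a > th and b > th and c > th:
        return 3          # motion 3: uppercut (elbow + shoulder flexion)
    return 0              # no motion detected: do nothing

# Example with an assumed threshold of 0.3:
# discriminate_motion(0.8, 0.7, 0.1, 0.3)  # -> 1 (motion 1)
```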
Table 1 shows the discrimination status for each EMG signal in relation to the movements of the robot arm, where θ1 (shoulder joint) and θ2 (elbow joint) are the robot joint angles.

3. Results and Discussion

The following section describes the experiment, signal processing, and robot control model results. The output of the processed EMG is used for controlling the robotic manipulator.
Figure 5 shows a matrix of the raw EMG signals from the three channels captured over 2 s. From the figure, motion 1 produces a higher EMG signal on CH1 (column 1) and CH2 (column 2) than on CH3 (column 3). The second row shows the results for motion 2: the muscles most active in producing the EMG signal are the anterior deltoid (CH3) and biceps brachii (CH2), while the brachioradialis tends to produce minimal tension. Motion 3 is shown in the third row, where all channels generate EMG signal activity.
Figure 6 shows the EMG signals after rectification. Rectification yields the magnitude of the signal without its polarity. The full-wave rectified results show the oscillatory nature of the EMG signal arising from the neural activation of the muscles. This is inherent in all EMG signals, and hence further processing is necessary to arrive at a usable, processed signal.
The results of the normalized and processed signals are shown in Figure 7. From the figure, the onsets of the arm-raising and arm-lowering motions are clearly discernible. From the design of the experiment, 2 s was found sufficient to capture all the motions. Motion 3 had the strictest time budget, while motion 1 had excess time that resulted in the capture of motion not related to the research; this excess motion included wrist flexion, as will be discussed later.
Figure 8 shows the comparison of the three control parameters: the mean of the envelope (shown as Mean in the figure), the maximum amplitude (Max), and the area under the curve (AUC) of the envelope signal (Area). From the results, the AUC of the signal shows more consistent accuracy of successful control for each motion than the Mean and Max methods. Besides the comparison of accuracy, the consistency of the results across motions was an important factor in choosing the model for the robot arm control. As seen in Figure 8, the Area shows the highest accuracy among the parameters for each movement. Also, the consistency, calculated as the average error for each control parameter, was best for the AUC method: the average error of AUC was 10.1%, compared with 13.1% and 14.3% for the Mean and Max methods, respectively. For this reason, the AUC parameter is used in the rest of the document for the inter-subject evaluation of performance.
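As an illustration, the sketch below computes the three candidate control parameters from a normalized envelope segment. The way the area is computed (a simple sum scaled by the sampling interval) is an assumption, since the paper does not specify the numerical integration used for the AUC.

```python
import numpy as np

FS = 2000.0  # sampling rate in Hz

def control_parameters(envelope_segment, fs=FS):
    """The three candidate control parameters compared in Figure 8:
    mean of the envelope, maximum amplitude, and area under the curve (AUC)."""
    seg = np.asarray(envelope_segment)
    mean_val = float(seg.mean())
    max_val = float(seg.max())
    auc_val = float(seg.sum() / fs)   # assumed rectangular approximation of the AUC
    return mean_val, max_val, auc_val
```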
Figure 9 shows the robotic arm motions corresponding to the discriminated motions. In Figure 9a, which corresponds to motion 1, the brachioradialis (CH1) and biceps brachii (CH2) surpass the baseline threshold. In Figure 9b, for motion 2, CH2 and CH3 surpass the threshold, with CH2 being more dominant. In motion 3, all the channels surpass the threshold. From this, it can be seen that the anterior deltoid (CH3) plays a major role in motions 2 and 3. The biceps brachii (CH2) is significant in both motions 2 and 3 but more pronounced in motion 3. This can be understood intuitively from the extended arm in motion 2, compared with the bent elbow joint in motion 3. A visualization of the motions and the corresponding robot manipulation can be found at this link: https://rb.gy/lfq1iv. The video illustrates controlling a robotic arm using EMG signals in an offline system.
The comparison of inter-subject performance carried out in the experiments is shown in Figure 10, which gives the rate of successful robot control out of the repetitions for the five subjects. The accuracy for each subject varied from 50% to 100% over the three motions, with motion 1 showing the highest consistency.
Subjects 1, 2, and 4 controlled the robot with more than 70% accuracy, while subjects 3 and 5 reported trouble, particularly in motions 2 and 3, for which accuracy ranged between 50% and 60%. This was attributed to inconsistent muscle activity during recording. The muscle inconsistency resulted from excessive force exerted in motions that are not part of the target muscle activity, including wrist motions (fist clenching, flexing, or rotations), among others. The discrepancy in muscle activation made proper threshold determination difficult. Additionally, we noted timing errors in motion 3 to be a challenge for these two subjects. Besides the motion artifacts, noise and other crosstalk artifacts affected the quality of the signal and thereby the prediction of intention from the signal, as is expected in EMG processing [1,22,25]. This presented as overshoots and saturation of the amplification gain whenever the users overexerted the motions, and was remedied by familiarization repetitions and feedback from the experimenter during the preparatory steps.
From the above, the implementation of upper-arm control using the two main joints of the elbow and shoulder is possible: the EMG signals obtained from the movement of the upper arm can be observed and the movement mapped. Based on Farina et al. [40], there are several criteria for an ideal prosthetic arm, such as accuracy, intuitiveness, robustness, adaptability to the user, a minimum number of electrodes, short and easy training/calibration, feedback on relevant functions/closed-loop control, limited computational complexity, low power consumption, and fast response time. No system to date meets all of these criteria, but several researchers have addressed at least some of them. In this research, we have attempted to optimize the target muscles and characterize the upper-limb motion for 2-DOF robot control. The challenge with this approach is the a priori knowledge of the sensor data required to inform the choice of threshold. An adaptive threshold-determination algorithm and pattern recognition will be explored to further the research. Overall, this control scheme can serve as the basis for a measurement and control system with improved accuracy, robustness, and response time.

4. Conclusions

This paper presented an EMG investigation of an upper-limb motion-based interface for human-robot interaction in teleoperation tasks. The research demonstrates that three EMG signals corresponding to upper-limb motions can be discriminated and applied successfully to control a robotic arm. The area-under-the-curve control parameter yielded more consistent performance than the mean and maximum-amplitude control parameters. Based on the discrimination model generated from the EMG signal envelope using moving-average processing, the results show that the model can be used by all subjects to control the robotic arm for single- and two-DOF movements with simplicity and low computational cost. The performance of the system, in terms of accuracy over all motions and all subjects, varied from 50% to 100%. Further developments will include an improved control scheme using pattern recognition and better methods to tackle the limitations of the system.

Author Contributions

P.W.L., M.S., and K.M. conceived and designed the study. P.W.L., T.K., M.S.A.b.S., J.M., and K.M. conducted the experiments and analyzed the data. P.W.L., J.M., and W.N. wrote and edited this paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Feleke, A.G.; Bi, L.; Guan, C. A review on EMG-based motor intention prediction of continuous human upper limb motion for human-robot collaboration. Biomed. Signal Process. Control 2019, 51, 1–17. [Google Scholar] [CrossRef]
  2. Bodenhagen, L.; Suvei, S.D.; Juel, W.K.; Brander, E.; Krüger, N. Robot technology for future welfare: Meeting upcoming societal challenges—An outlook with offset in the development in Scandinavia. Health Technol. 2019, 9, 197–218. [Google Scholar] [CrossRef]
  3. Javaid, M.; Haleem, A.; Vaishya, R.; Bahl, S.; Suman, R.; Vaish, A. Industry 4.0 technologies and their applications in fighting COVID-19 pandemic. Diabetes Metab. Syndr. Clin. Res. Rev. 2020, 14, 419–422. [Google Scholar] [CrossRef] [PubMed]
  4. Wang, X.V.; Kemény, Z.; Váncza, J.; Wang, L. Human-robot collaborative assembly in cyber-physical production: Classification framework and implementation. CIRP Ann. Manuf. Technol. 2017, 66, 5–8. [Google Scholar] [CrossRef] [Green Version]
  5. Liu, H.J.; Young, K.Y. An adaptive upper-arm EMG-based robot control system. Int. J. Fuzzy Syst. 2010, 12, 181–189. [Google Scholar]
  6. Artemiadis, P.K.; Kyriakopoulos, K.J. A switching regime model for the emg-based control of a robot arm. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2011, 41, 53–63. [Google Scholar] [CrossRef]
  7. Campeau-Lecours, A.; Cote-Allard, U.; Vu, D.S.; Routhier, F.; Gosselin, B.; Gosselin, C. Intuitive Adaptive Orientation Control for Enhanced Human-Robot Interaction. IEEE Trans. Robot. 2019, 35, 509–520. [Google Scholar] [CrossRef]
  8. Artemiadis, P.K.; Kyriakopoulos, K.J. An EMG-based robot control scheme robust to time-varying EMG signal features. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 582–588. [Google Scholar] [CrossRef]
  9. Tsuji, T.; Shibanoki, T.; Shima, K. EMG-Based Control of a Multi-Joint Robot for Operating a Glovebox. Handb. Res. Adv. Robot. Mechatronics 2015, 36–52. [Google Scholar] [CrossRef]
  10. Dai, H.; Song, S.; Hu, C.; Sun, B.; Lin, Z. A Novel 6-D Tracking Method by Fusion of 5-D Magnetic Tracking and 3-D Inertial Sensing. IEEE Sens. J. 2018, 18, 9640–9648. [Google Scholar] [CrossRef]
  11. Meattini, R.; Benatti, S.; Scarcia, U.; de Gregorio, D.; Benini, L.; Melchiorri, C. An sEMG-Based Human-Robot Interface for Robotic Hands Using Machine Learning and Synergies. IEEE Trans. Compon. Packaging Manuf. Technol. 2018, 1–10. [Google Scholar] [CrossRef]
  12. Cherubini, A.; Passama, R.; Crosnier, A.; Lasnier, A.; Fraisse, P. Collaborative manufacturing with physical human-robot interaction. Robot. Comput. Integr. Manuf. 2016, 40, 1–13. [Google Scholar] [CrossRef] [Green Version]
  13. Shin, S.; Tafreshi, R.; Langari, R. EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm. J. Intell. Fuzzy Syst. 2018, 35, 861–876. [Google Scholar] [CrossRef]
  14. Laksono, P.W.; Sasaki, M.; Matsushita, K.; bin Suhaimi, M.S.A.; Muguro, J. Preliminary Research of Surface Electromyogram (sEMG) Signal Analysis for Robotic Arm Control. AIP Conf. Proc. 2020, 2217, 030034. [Google Scholar]
  15. Sasaki, M.; Matsushita, K.; Rusydi, M.I.; Laksono, P.W.; Muguro, J.; Bin Suhaimi, M.S.A.; Njeri, P.W. Robot Control Systems Using Bio-Potential Signals. AIP Conf. Proc. 2020, 2217, 020008. [Google Scholar]
  16. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 2018, 55, 248–266. [Google Scholar] [CrossRef]
  17. Benatti, S.; Milosevic, B.; Farella, E.; Gruppioni, E.; Benini, L. A Prosthetic Hand Body Area Controller Based on Efficient Pattern Recognition Control Strategies. Sensors 2017, 17, 869. [Google Scholar] [CrossRef] [Green Version]
  18. Fukuda, O.; Tsuji, T.; Kaneko, M.; Otsuka, A. A human-assisting manipulator teleoperated by EMG signals and arm motions. IEEE Trans. Robot. Autom. 2003. [Google Scholar] [CrossRef] [Green Version]
  19. Artemiadis, P.K.; Kyriakopoulos, K.J. EMG-based control of a robot arm using low-dimensional embeddings. IEEE Trans. Robot. 2010, 26, 393–398. [Google Scholar] [CrossRef]
  20. Benchabane, S.I.; Saadia, N.; Ramdane-Cherif, A. Novel algorithm for conventional myocontrol of upper limbs prosthetics. Biomed. Signal Process. Control. 2020, 57, 101791. [Google Scholar] [CrossRef]
  21. Junior, J.J.A.M.; Pires, M.B.; Okida, S.; Stevan, S.L. Robotic Arm Activation using Surface Electromyography with LABVIEW. IEEE Lat. Am. Trans. 2016, 14, 3597–3605. [Google Scholar] [CrossRef]
  22. Nazmi, N.; Rahman, M.A.A.; Yamamoto, S.I.; Ahmad, S.A.; Zamzuri, H.; Mazlan, S.A. A review of classification techniques of EMG signals during isotonic and isometric contractions. Sensors 2016, 16, 1304. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Parajuli, N.; Sreenivasan, N.; Bifulco, P.; Cesarelli, M.; Savino, S.; Niola, V.; Esposito, D.; Hamilton, T.J.; Naik, G.R.; Gunawardana, U.; et al. Real-time EMG based pattern recognition control for hand prostheses: A review on existing methods, challenges and future implementation. Sensors 2019, 19, 4596. [Google Scholar] [CrossRef] [Green Version]
  24. Gopura, R.C.; Bandara, S.V.; Gunasekara, M.P. Recent Trends in EMG-Based Control Methods for Assistive Robots. In Electrodiagnosis in New Frontiers of Clinical Research; Turker, H., Ed.; 2013; Chapter 12; pp. 237–268. Available online: https://www.intechopen.com/books/electrodiagnosis-in-new-frontiers-of-clinical-research (accessed on 18 August 2020).
  25. Simao, M.; Mendes, N.; Gibaru, O.; Neto, P. A Review on Electromyography Decoding and Pattern Recognition for Human-Machine Interaction. IEEE Access 2019, 7, 39564–39582. [Google Scholar] [CrossRef]
  26. Young, A.J.; Smith, L.H.; Rouse, E.J.; Hargrove, L.J. A comparison of the real-time controllability of pattern recognition to conventional myoelectric control for discrete and simultaneous movements. J. Neuroeng. Rehabil. 2014, 11, 1–10. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  27. Hassan, H.F.; Abou-Loukh, S.J.; Ibraheem, I.K. Teleoperated robotic arm movement using electromyography signal with wearable Myo armband. J. King Saud. Univ. Eng. Sci. 2019, 32, 378–387. [Google Scholar] [CrossRef]
  28. Phinyomark, A.; Phukpattaranont, P.; Limsakul, C. Feature reduction and selection for EMG signal classification. Expert Syst. Appl. 2012, 39, 7420–7431. [Google Scholar] [CrossRef]
  29. Rasool, G.; Iqbal, K.; Bouaynaya, N.; White, G. Real-time task discrimination for myoelectric control employing task-specific muscle synergies. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 24, 98–108. [Google Scholar] [CrossRef] [PubMed]
  30. Jaramillo-Yánez, A.; Benalcázar, M.E.; Mena-Maldonado, E. Real-time hand gesture recognition using surface electromyography and machine learning: A systematic literature review. Sensors 2020, 20, 2467. [Google Scholar] [CrossRef]
  31. Sharma, S.; Dubey, A.K. Movement control of robot in real time using EMG signal. In Proceedings of the 2012 2nd International Conference on Power, Control and Embedded Systems (ICPCES 2012), Allahabad, India, 17–19 December 2012. [Google Scholar] [CrossRef]
  32. Triwiyanto, T.; Rahmawati, T.; Yulianto, E.; Mak’ruf, M.R.; Nugraha, P.C. Dynamic feature for an effective elbow-joint angle estimation based on electromyography signals. Indones. J. Electr. Eng. Comput. Sci. 2020, 19, 178–187. [Google Scholar] [CrossRef]
  33. Samuel, W.O.; Asogbon, M.G.; Geng, Y.; Al-Timemy, A.H.; Pirbhulal, S.; Ji, N.; Chen, S.; Fang, P.; Li, G. Intelligent EMG pattern recognition control method for upper-limb multifunctional prostheses: Advances, current challenges, and future prospects. IEEE Access 2019, 7, 10150–10165. [Google Scholar] [CrossRef]
  34. Hargrove, L.J.; Englehart, K.; Hudgins, B. A comparison of surface and intramuscular myoelectric signal classification. IEEE Trans. Biomed. Eng. 2007, 54, 847–853. [Google Scholar] [CrossRef] [PubMed]
  35. Pons, J.L. Wearable Robots: Biomechatronic Exoskeletons; John Wiley & Sons: Hoboken, NJ, USA, 2008. [Google Scholar]
  36. Jang, G.; Kim, J.; Choi, Y.; Yim, J. Human shoulder motion extraction using EMG signals. Int. J. Precis. Eng. Manuf. 2014, 15, 2185–2192. [Google Scholar] [CrossRef]
  37. Phinyomark, A.; Khushaba, R.N.; Scheme, E. Feature extraction and selection for myoelectric control based on wearable EMG sensors. Sensors 2018, 18, 1615. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  38. Bin Suhaimi, M.S.A.; Matsushita, K.; Sasaki, M.; Njeri, W. 24-Gaze-Point Calibration Method for Improving the Precision of Ac-Eog Gaze Estimation. Sensors 2019, 19, 3650. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  39. Duprey, S.; Naaim, A.; Moissenet, F.; Begon, M.; Chèze, L. Kinematic models of the upper limb joints for multibody kinematics optimisation: An overview. J. Biomech. 2017, 62, 87–94. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Farina, D.; Jiang, N.; Rehbaum, H.; Holobar, A.; Graimann, B.; Dietl, H.; Aszmann, O.C. The extraction of neural information from the surface EMG for the control of upper-limb prostheses: Emerging avenues and challenges. IEEE Trans. Neural Syst. Rehabil. Eng. 2014, 22, 797–809. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Overview of the experimental setup.
Figure 2. Flow chart of the offline robot control.
Figure 3. Robot arm configuration.
Figure 4. (a) Motion illustration and (b) electrode placement for the three target muscles.
Figure 5. Sample electromyography (EMG) signals for the different motions.
Figure 6. Rectified EMG signals.
Figure 7. Normalized and processed EMG signals.
Figure 8. Comparison of the three control parameters.
Figure 9. The upper pictures in subfigures (a–c) show the robot arm motions, while the lower pictures show the discrimination of the EMG signals for motion 1, motion 2, and motion 3, respectively.
Figure 10. Percentage of successful control of the robot arm.
Table 1. Discrimination of EMG signals and robot arm joint angles.

EMG CH1   EMG CH2   EMG CH3   Upper-Limb Status   Robot Arm Angle θ1   Robot Arm Angle θ2
ON        ON        OFF       Motion 1            –                    90°
OFF       ON        ON        Motion 2            90°                  –
ON        ON        ON        Motion 3            90°                  90°
OFF       OFF       OFF       Do nothing          –                    –
