Article

Head and Voice-Controlled Human-Machine Interface System for Transhumeral Prosthesis

Department of Biocybernetics and Biomedical Engineering, AGH University of Krakow, al. A. Mickiewicza 30, 30-059 Kraków, Poland
* Author to whom correspondence should be addressed.
Electronics 2023, 12(23), 4770; https://doi.org/10.3390/electronics12234770
Submission received: 27 October 2023 / Revised: 18 November 2023 / Accepted: 22 November 2023 / Published: 24 November 2023

Abstract

The design of artificial limbs is a research topic that has, over time, attracted considerable interest from researchers in various fields of study, such as mechanics, electronics, robotics, and neuroscience. Continuous efforts are being made to build electromechanical systems functionally equivalent to the original limbs and to develop strategies to control them appropriately according to the intentions of the user. The development of Human–Machine Interfaces (HMIs) is a key point in the development of upper limb prostheses, since the actions carried out with the upper limbs lack fixed patterns, in contrast to the more predictable nature of lower limb movements. This paper presents the development of an HMI system for the control of a transhumeral prosthesis. The HMI is based on a hybrid control strategy that uses voice commands to trigger prosthesis movements and regulates the applied grip strength when the user turns their head. A prototype prosthesis was built using 3D printing technology, and trials were conducted to test the proposed control strategy under laboratory conditions. Numerical simulations were also performed to estimate the grip strength generated. The results obtained show that the proposed prosthesis with the dedicated HMI is a promising low-cost alternative to current solutions. The proposed hybrid control system is capable of recognizing the user's voice with an accuracy of up to 90%, controlling the prosthesis joints and adjusting the grip strength according to the user's wishes.

1. Introduction

Reproducing the mechanical functionalities of the human arm is a constant challenge for scientists and engineers. Human upper limbs are indispensable tools, the functions of which include the execution of gross movements and hand gestures, the grasping of objects, and their fine manipulation. Along with the loss of an upper limb, amputees lose not only the ability to grasp and manipulate objects but also some of their ability to express themselves through body language. A viable solution to this issue is the development of an upper extremity prosthesis: an electromechanical device that aims to functionally replace the amputated limb.
Depending on the power source, prostheses are classified as body-powered and externally powered [1]. Body-powered prostheses consist of mechanisms that are usually controlled by ropes or strings and are characterized by their relatively simple operation. When the string or prosthetic pusher is pulled using a given part of the body, the kinematic chain, which replaces the hand and is normally closed, opens, allowing the user to grasp objects when the string is released. Although this design provides simplicity and a relatively low cost, the main disadvantage of a body-powered prosthesis is the requirement to apply significant body force to move the device [2]. Externally powered prostheses, on the other hand, are powered by batteries and are frequently controlled using biological signals, such as electromyography (EMG), electrocorticography (ECoG), and electroencephalography (EEG) signals [3].
A key point in the development of an upper limb prosthesis is the proper design of Human–Machine Interfaces (HMIs) that offer intuitive and natural control to the user of the prosthesis [2]. Efforts are underway to develop HMIs that process biosignals to control the mobility of electromechanical elements. The term biosignal is defined as any source of information from a living being that can be continuously measured and monitored [4]. Typically, the biosignals used in most applications are electrical in nature, as numerous transducers are based on changes in electrical current or voltage. EMG signals are widely used for the control of prostheses due to their inherent non-invasiveness [5,6]. Myoelectric prostheses are capable of identifying the intention of the user to move by decoding electrical signals in the nervous system. The acquired signals must be detected accurately and classified in real time to properly control the operation of the prostheses [7]. The main limitation of myoelectric prostheses is the difficulty involved in handling EMG signals, which requires expensive processing systems and is highly affected by noise [8]. Furthermore, in high-level amputations or cases in which the stump is in poor condition, the use of EMG signals is an inefficient solution for controlling the prosthesis. Another alternative for controlling powered prostheses is the use of EEG signals acquired directly from the scalp. However, this signal also presents some challenges, such as low reliability, a complex acquisition process, and low user adaptation [9].
Controlling the movement of the prosthesis could be addressed by taking advantage of nonelectric signals from healthy parts of the body, such as force signals obtained from the muscles [10], mechanical displacements measured through inertial sensors [11], voice commands recorded through microphones [12,13], and respiratory signals [14]. In the literature, different studies have been conducted in which nonelectric signals are used to control limb prostheses. The foot controller approach, which has been widely reported, uses force sensors or Inertial Measurement Units (IMUs) to extract control signals from the foot. When force sensors are used, strain gauges are commonly placed in an insole inside a user-worn shoe [15] and electrical signals are activated when the wearer presses the sensor with the foot. In the case of using IMUs, the control signals are quantities related to foot movement, such as accelerations or linear or angular velocities [16].
This method is suitable for most levels of amputation, provided that the amputee has a healthy foot. However, since it is a signal that is not related to the function of the amputated limb, its application does not allow for a wide range of movements. Another approach to controlling upper limb prostheses is the use of voice recognition devices [17]. Voice recognition is an easy way to interact and control electromechanical devices in real time [18]. Currently, several prostheses exploit voice recognition technology to perform simple movements [13,18]. However, the main drawback of voice control systems is the limited number of voice commands stored in the memory and the difficulty of implementing them to perform precise and complex movements [19].
These limitations have led to the development of hybrid control systems [8,20,21], in which one control system attempts to compensate for the shortcomings of the other. This approach has been especially implemented to enhance the performance of the myoelectric prosthesis [22,23]. To our knowledge, few studies have attempted to combine voice recognition with inertial measurements to control upper limb prostheses, despite this being a promising low-cost control strategy for the design of HMIs. Consequently, this paper proposes the development of a universal HMI system and its dedicated application to the control of a transhumeral prosthesis that tries to overcome the current limitations discussed above.
The proposed HMI recognizes voice commands and inertial signals for the execution of arm movements and the regulation of grip strength, respectively. A voice recognition device and an IMU are used for signal recording. The inertial sensor is mounted on a headband worn on the user's head, with the intention of reading the orientation of the head around the vertical axis, while the voice recognizer is placed near the user's chest. A prototype of the prosthesis was built and tested on a user under true-to-life conditions by performing many repetitions of preset hand grips and gestures.

2. System Description

Figure 1 presents a block diagram showing the structure and general idea behind the proposed system. The main component of the system is a six-degree-of-freedom transhumeral prosthesis, the design of which was adapted from [24]. The main task of any healthy muscle is to contract in order to move the joints of the skeletal system, and the designed transhumeral prosthesis works on the basis of this principle: the muscles are functionally replaced by a system of servomechanisms, which drive the artificial joints of the prosthesis and thus cause its movement. A hybrid-control HMI consisting of two subsystems identifies the user's demands and commands the servomotors to drive the prosthesis joints to the user's desired position/configuration. The primary subsystem receives the user's voice commands and assigns new setpoints to each servomotor to achieve a new prosthesis configuration. The complementary subsystem registers the angular displacement of the head around the vertical axis, also known as the yaw axis, and regulates the grip strength accordingly. The orientation of the head around the yaw axis was selected to regulate the grip strength, as it does not affect the user's ability to perform actions involving both hands or legs.
The HMI controls the prosthesis by integrating voice control with head-orientation-based control. The voice control system uses the concept of a Finite-State Machine (FSM), which executes a set of instructions that place each of the prosthetic joints in a given position (action) based on the recognized voice command (input). Adjusting the grip strength requires intervention from the head-orientation-based control system, which regulates the position of the servomotor shaft driving the finger mechanism according to the movement of the user's head. This hybrid HMI offers the user the ability to (1) set and activate a discretely predefined configuration of the prosthesis to achieve a predefined grip and (2) continuously adjust the configuration of the finger mechanism to increase grip strength.
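For illustration, a minimal sketch of the FSM stage is given below in Python (the actual firmware runs on the Arduino Uno described in Section 2.2); the command numbering follows Table 1, while the setpoint values and the write_servo helper are hypothetical placeholders rather than the configurations used in the prototype.

    # Each voice command (Table 1) maps to six servo setpoints in degrees (0-180):
    # [thumb, index, middle, ring+little, wrist, elbow].
    COMMANDS = {
        0: ("G1_Open", [0, 0, 0, 0, 90, 90]),
        3: ("G1_Fist", [180, 180, 180, 180, 90, 90]),
        6: ("G1_Cylinder", [150, 150, 150, 150, 90, 90]),
        7: ("G1_Sphere", [140, 140, 140, 140, 90, 90]),
        # ...commands 1-2, 4-5, and 8-13 are defined analogously
    }

    def on_voice_command(cmd_id, write_servo):
        """FSM transition: a recognized command drives every joint to its
        predefined setpoint; an unrecognized one leaves the state unchanged."""
        if cmd_id not in COMMANDS:
            return False                      # stay in the current state
        _, setpoints = COMMANDS[cmd_id]
        for channel, angle in enumerate(setpoints):
            write_servo(channel, angle)       # on Arduino: Servo.write(angle)
        return True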

2.1. Prosthetic Arm

The considered prosthesis is made up of 36 elements and has six degrees of freedom (DoF), each driven by a servomotor. Each finger has one DoF and consists of three pieces joined by pins, except for the ring and the little finger, whose mechanisms are driven by the same servomotor. Inside the fingers, there are tunnels through which an artificial tendon is threaded, creating a closed loop. In each finger, the artificial tendon passes through the inner tunnel and loops around the fingertip, creating a locking point. When the tendon is stretched, as there is a locking at the fingertip, the joint rotates, which, in turn, causes the fingers to close. Similarly, the application of force at the other end of the tendon opens the fingers from the closed position. The four servomotors located inside the forearm are the end points on the path of the artificial tendons. Figure 2 shows the CAD model of the prosthesis under consideration, in which the path of each artificial tendon is illustrated using different colors.
A kinematic analysis of the finger mechanism is performed to examine the resulting finger motion and determine its dependence on the servomotor motion. The three pieces that make up a prosthetic finger (artificial phalanges) are treated as rigid bodies. The movement of an artificial phalanx is a rotation around a point located in the anterior phalanx, which is caused by the movement of a linear actuator (the artificial tendon). Therefore, each phalanx is represented by a three-bar mechanism, in which the anterior phalanx is the base of the mechanism, except for the most proximal phalanx, in which case the base is the palm of the prosthetic hand. This mechanism is the same for each phalanx, so it is sufficient to analyze one mechanism and then link the resulting movements, considering the dimensions of each phalanx, to obtain the resulting movement of the finger.
Four points of interest are identified in each phalanx mechanism: a joint point of the analyzed phalanx with the anterior phalanx (B), a joint point of the analyzed phalanx with the posterior phalanx (A), a joint point of the analyzed phalanx with the artificial tendon (C), and a joint point of the artificial tendon with the anterior phalanx (D). All connection points function as rotational joints, while the artificial tendon is represented as a linear actuator. A closed kinematic chain is drawn connecting the points A, B, C, and D.
Figure 3a presents the drive mechanism of the prosthetic fingers, showing the kinematic chain analyzed. The vectors that make up the kinematic chain represent the relative positions between the points of interest of the analyzed mechanism. Since the considered chain is closed, the algebraic sum of its component vectors must be equal to zero.
$$ \mathbf{R}_2 + \mathbf{R}_3 - \mathbf{R}_4 - \mathbf{R}_1 = 0 \qquad (1) $$
where $\mathbf{R}_2$ is the relative position vector from point B to point A, $\mathbf{R}_3$ is the relative position vector from point A to point C, $\mathbf{R}_4$ is the relative position vector from point D to point C, and $\mathbf{R}_1$ is the relative position vector from point B to point D.
Each of the vectors making up the kinematic chain is presented in polar coordinates using the notation of complex numbers, which results in the following expression:
$$ L_2 e^{j\theta_2} + L_3 e^{j\theta_3} - L_4 e^{j\theta_4} - L_1 e^{j\theta_1} = 0 \qquad (2) $$
where $L_i$ is the length of vector $i$ and $\theta_i$ is its angle measured with respect to the positive horizontal axis.
Since the vectors $\mathbf{R}_2$ and $\mathbf{R}_3$ represent the relative positions of points of the same rigid body, the angle formed between them is constant and equal to 180° in our application (point C lies on the line through B and A); therefore, we can relate the angles $\theta_2$ and $\theta_3$ by means of the following expression:
$$ \theta_3 = \theta_2 + \pi \qquad (3) $$
Finally, substituting (3) into (2) and decomposing the vector sum into its horizontal and vertical components using Euler's identity ($e^{\pm j\theta} = \cos\theta \pm j\sin\theta$), the following kinematic equations are obtained:
$$ L_2\cos\theta_2 + L_3\cos(\theta_2 + \pi) - L_4\cos\theta_4 - L_1\cos\theta_1 = 0 \qquad (4) $$
$$ L_2\sin\theta_2 + L_3\sin(\theta_2 + \pi) - L_4\sin\theta_4 - L_1\sin\theta_1 = 0 \qquad (5) $$
In (4) and (5), the vector lengths are known constants related to the size of the finger prosthesis elements, except for the length $L_4$, which is the new length of the vector $\mathbf{R}_4$ once the artificial tendon is stretched. The elongation of vector $\mathbf{R}_4$ is related to the servo motion through the following kinematic expression:
$$ L_4 = \theta_s\, r_p \qquad (6) $$
where $\theta_s$ is the orientation of the servomotor shaft and $r_p$ is the radius of the servo-driven pulley.
Solving (4) and (5) for each phalanx and using (6) determines the orientation of the phalanges ($\theta_2$) for each orientation of the servomotor shaft. Since the system of equations to be solved is nonlinear, the motion of the finger mechanism is calculated using numerical methods. For this purpose, the Linkage Mechanism Designer and Simulator computational tool is used [25]. Figure 3b,c show the configurations achieved by the finger mechanism at different orientations of the motor shaft.
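To make the procedure concrete, the sketch below solves the loop-closure equations (4) and (5) for $\theta_2$ and $\theta_4$ at a given servo angle using SciPy instead of the Linkage tool; the link lengths, pulley radius, and rest length of $\mathbf{R}_4$ are hypothetical values chosen only so that the chain closes.

    import numpy as np
    from scipy.optimize import fsolve

    L1, L2, L3 = 10.0, 30.0, 8.0     # hypothetical link lengths, mm
    theta1 = 0.0                      # fixed angle of R1 in the base frame, rad
    r_p = 6.0                         # assumed servo pulley radius, mm
    L4_rest = 24.0                    # assumed length of R4 at rest, mm

    def loop_closure(x, L4):
        """Horizontal and vertical components of Eqs. (4)-(5)."""
        th2, th4 = x
        return [
            L2*np.cos(th2) + L3*np.cos(th2 + np.pi) - L4*np.cos(th4) - L1*np.cos(theta1),
            L2*np.sin(th2) + L3*np.sin(th2 + np.pi) - L4*np.sin(th4) - L1*np.sin(theta1),
        ]

    theta_servo = np.deg2rad(30.0)       # commanded servo shaft rotation
    L4 = L4_rest + theta_servo * r_p     # Eq. (6), read here as an elongation of R4
    th2, th4 = fsolve(loop_closure, x0=[1.5, 2.0], args=(L4,))
    print(f"phalanx angle: {np.degrees(th2):.1f} deg, tendon angle: {np.degrees(th4):.1f} deg")

Sweeping theta_servo over the servo range then traces the closing motion of the finger illustrated in Figure 3b,c.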
The remaining two servomotors perform the prosthetic movements of wrist rotation and biceps flexion. In the case of wrist rotation, the wrist is connected directly to the servomotor shaft, while in the case of biceps flexion, a spur gear is used to increase the torque of the servomotor. The motion of these two joints is directly proportional to the motion of the servomotor, whose constant of proportionality depends on the gear ratio. Therefore, a thorough kinematic analysis is not necessary.

2.2. Voice Control System

The voice control system takes advantage of voice recognition to detect whether the user has pronounced one of the voice commands stored in the memory. If the HMI detects one of the predefined voice commands, the servomotors move the prosthesis joints to the desired position, as previously determined by the set of instructions created for each voice command. The voice control system consists of three main components: a Horn EM9745P microphone, an EasyVR Shield 3 Plus module, and an Arduino Uno board. The flow of data from the microphone to the servomotors, shown in Figure 4, is realized through wired connections.
The first component of the data flow is a Horn EM9745P-382 omnidirectional electret condenser microphone, which has an audio sensitivity of −38 dB and operates on a 3 V supply voltage. This element is connected directly to the EasyVR Shield 3 Plus module. The selection of EasyVR to perform voice command recognition is based on the simplicity of its integration with Arduino boards and its low power consumption. This device uses Speaker-Dependent (SD) triggers defined by users for system activation and action execution, allowing users to define and use up to 256 voice commands in any language. Its principle of operation uses a Hidden Markov Model (HMM) to train the system commands [26]. Once a voice command is recognized, an Arduino Uno board assigns the corresponding setpoint to each of the six servomotors that drive the prosthesis to a new position.

2.3. Head-Orientation-Based Control System

The strategy for regulating the grip strength consists of driving the prosthetic finger servomechanisms to a position in which the hand is more closed (increased grip strength) or more relaxed (decreased grip strength). This effect is achieved by setting a new setpoint for the servomotors, whose value is proportional to the angular displacement of the user's head around the yaw axis. Therefore, an angular displacement of the user's head around the yaw axis corresponds to an angular displacement of the fingers of the prosthesis. To record the angular displacement of the user's head, an IMU is mounted on a headband, which must be worn by the user. The IMU used in this application is an MPU-6050, a six-degree-of-freedom Micro-Electromechanical System (MEMS) that combines a three-axis accelerometer with a three-axis gyroscope.
MEMS accelerometers measure the linear acceleration of bodies along a selected axis. If there are no external forces acting on the body, the measured acceleration is a component of the acceleration due to gravity, the magnitude of which depends on the orientation of the body. On the other hand, MEMS gyroscopes take advantage of the appearance of Coriolis acceleration to measure the angular velocity of the body. Subsequently, through the numerical integration of the angular velocity, the orientation of the body can be estimated.
In this application, both the accelerometers and gyroscopes integrated in the MPU-6050 are used to obtain estimates of the user’s head orientation around the yaw axis.
From the accelerometer measurements, the orientation of the user’s head is estimated using trigonometric identities that relate the three components of the linear acceleration measured by the sensor.
$$ \theta_{accel} = \arctan\!\left(\frac{a_y}{\sqrt{a_x^2 + a_z^2}}\right) \qquad (7) $$
where $\theta_{accel}$ is the orientation estimated from the accelerometer, and $a_x$, $a_y$, and $a_z$ denote the linear acceleration measured along the x, y, and z axes, respectively.
From gyroscope measurements, the orientation of the user’s head is estimated via numerical integration using the midpoint rule [27].
$$ \theta_{gyro}(t_i) = \theta_{gyro}(t_{i-1}) + \omega_{gyro}(t_i)\, dt \qquad (8) $$
where $\theta_{gyro}$ is the orientation estimated from the gyroscope, $\omega_{gyro}$ is the measured angular velocity, and $dt$ is the time between consecutive samples.
Although accelerometers and gyroscopes can be used individually to estimate the orientation of an object, their estimation is not error-free. The presence of external forces, the drift error introduced by numerical integration, and the inevitable noise in the measurement reduce the accuracy of the signal [28]. To improve signal quality, different filter techniques are frequently used in IMU applications, among which the complementary filter, the Kalman filter, and the Madgwick filter [29] stand out.
Due to its simplicity and effectiveness, as documented in multiple studies [30,31], the complementary filter was used to process the IMU signals in this application.
While the error in the estimation of body orientation by the gyroscope is mainly low-frequency, the error contribution of the accelerometer can be considered high-frequency. Therefore, it is possible to use the accelerometer to estimate when the body motion is slow (i.e., low frequency), and to use the gyroscope to estimate when the body motion is fast (i.e., high frequency). This is achieved by applying a high pass filter (HPF) to the gyroscope estimate and a low pass filter (LPF) to the accelerometer estimate and combining the two estimates. Since LPF and HPF are complementary to each other in the frequency domain, this sensor fusion approach is known as a complementary filter [32]. Figure 5 shows the operation of the complementary filter in the frequency domain.
In the discrete-time domain, integrating the estimated body orientation from the accelerometer and gyroscope using complementary filtering yields the following expression:
$$ \theta_{est} = \alpha\, \theta_{gyro} + (1 - \alpha)\, \theta_{accel} \qquad (9) $$
where the parameter α depends on the chosen cutoff frequency and the IMU sampling rate.
In this application, MPU-6050 measurements were recorded at a sampling rate close to 1 kHz, and a value of α of 0.98 was selected, which is one of the typical values found in applications that use IMUs combined with complementary filtering to estimate object orientation [33].
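A minimal sketch of the resulting estimator is given below, assuming raw MPU-6050 samples are already available as arrays (accelerometer in g, gyroscope in rad/s) and omitting the sensor readout itself:

    import numpy as np

    def estimate_orientation(acc, gyro, dt=0.001, alpha=0.98):
        """Fuse the accelerometer tilt estimate (7) with the integrated gyroscope
        rate (8) through the complementary filter (9).
        acc: (N, 3) array of [ax, ay, az]; gyro: (N,) angular rates."""
        theta = 0.0
        out = np.empty(len(gyro))
        for i in range(len(gyro)):
            ax, ay, az = acc[i]
            # Eq. (7); arctan2 equals arctan(ay / sqrt(ax^2 + az^2)) for a positive denominator
            theta_accel = np.arctan2(ay, np.hypot(ax, az))
            theta_gyro = theta + gyro[i] * dt                      # Eq. (8)
            theta = alpha * theta_gyro + (1.0 - alpha) * theta_accel  # Eq. (9)
            out[i] = theta
        return out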

3. Control System Algorithm

User-defined voice commands are stored in the EasyVR 3 Plus memory in groups of up to 64 commands. Voice recognition only works for one group of commands at a time; therefore, all voice commands intended for use by the same user must be registered in the same group. A list of user-defined voice commands is presented in Table 1. The prefix G1 refers to the commands stored in group 1.
Fourteen voice commands were defined and stored in the module memory, corresponding to eleven hand gestures and grips, one elbow rotation, one wrist rotation, and one command to put the device into sleep mode. Data transmission from the EasyVR module takes place through the serial port, at a fixed speed of 9600 bits per second. The module's waiting time for issuing a voice command is set to 5 s, but this setting can easily be adjusted to suit user preferences. Each voice command executes a set of instructions, defined based on biomechanical rules, that determines a particular grip or gesture of the prosthesis. Since the servomotors have an internal control system for position regulation, setting a grip consists of adjusting the setpoints of the servomotors. A number from 0 to 13 was assigned to each voice command, and the Servo library available for Arduino boards was used to drive the servomotors. Table 1 shows the number assigned to each voice command and the corresponding action performed by the prosthesis.
The number of DoFs of the prosthesis determines the number of servomotor setpoints to be assigned by each voice command. Since the servomotors are of the rotary type, the setpoint of their internal position control corresponds to an angular position. A range of motion from 0° to 180° was established for each DoF of the prosthesis, whereby each voice command consists of six values from 0° to 180° that modify the positions of the servomotors.
The head-orientation-based control system is the complementary system that regulates the grip strength. Two grips have been selected for which grip strength can be continuously regulated: the spherical grip and the cylindrical grip. By controlling the servomotor angular position, the head-orientation-based control system drives the prosthesis finger mechanism, allowing the user to grasp objects of different thicknesses and even deform them if desired.
The control system algorithm for the proposed hybrid HMI is presented in Figure 6. The user initiates the operation of the prosthesis by issuing a voice command that the EasyVR module attempts to recognize. Once the voice command is recognized, the servomotors drive each joint of the prosthesis to a specific position, according to the set of instructions defined for the given grip, gesture, or movement. This first stage corresponds to the operation of an FSM, in which the voice command is the input and the transition to a new state is the movement of a certain DoF of the prosthesis. If the voice command is not recognized, the prosthesis remains in its current state. In the second stage, the HMI detects whether there is a change in the orientation of the user's head around the yaw axis. If a change in orientation is detected, the grip strength regulation system is activated, which produces a change in the position of the servomotors proportional to the detected movement. A change in the user's head orientation has no influence on the operation of the prosthesis if the command concerning the cylindrical or spherical grip has not been recognized.
The head-orientation-based control system is a closed-loop control in which the user wearing the prosthesis intervenes directly by observing the effect of the resulting grip strength and adjusting the orientation of his or her head to obtain the desired effect.
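As an illustration, this regulation stage can be sketched as a proportional mapping from head yaw displacement to the finger-servo setpoints; the gain K_HEAD below is a hypothetical value, as the paper does not report the gain used.

    K_HEAD = 0.8   # hypothetical gain: servo degrees per degree of head yaw

    def regulate_grip(base_setpoints, yaw_now, yaw_at_grip, adjustable):
        """Trim the four finger-servo setpoints in proportion to the head yaw
        displacement accumulated since the grip command was recognized.
        Only the cylindrical and spherical grips enable this stage."""
        if not adjustable:
            return list(base_setpoints)
        delta = K_HEAD * (yaw_now - yaw_at_grip)
        trimmed = list(base_setpoints)
        for i in range(4):                 # finger channels; wrist and elbow untouched
            trimmed[i] = max(0.0, min(180.0, trimmed[i] + delta))
        return trimmed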

4. System Assembly

The components of the prosthetic arm were manufactured using the fused deposition modeling (FDM) technique with an Ender-3 Pro 3D printer. PLA filament was employed as the printing material. Polypropylene pins were utilized to assemble the three pieces that comprise the finger mechanism and to attach the fingers to the metacarpus component, while thin polypropylene strings served as artificial tendons.
To avoid the entanglement of the strings and errors in finger movement due to the lack of space in the narrow wrist isthmus, a heat-shrink sleeve was applied to the tendons to prevent the strings from rubbing against the sharp edges of the printed elements. The microphone was placed in a plastic housing with a protective sponge and connected to the recognition system using a shielded cable.
The weight of the prototype is about 800 g, without considering the weight of the electronics and instrumentation. Six Tower Pro MG995 servos were chosen to operate the finger mechanisms, wrist, and elbow joints. These actuators were selected due to their capability to deliver torque of up to 10 kg·cm (0.98 N·m) while operating within a relatively low voltage range of 4.8 V to 6 V. The use of MG995 servos in the design of limb prostheses has been documented in several studies [34,35].
In the case of the elbow rotation mechanism, the input torque must support the weight of the entire forearm and hand, plus the possible weight of the object being lifted. To build a functional elbow joint, a gearbox was designed and connected to the output of the corresponding servomotor. This gear reduction increases the maximum torque of the servo to 21.6 kg·cm (2.1 N·m), which is sufficient for this application.
The power source consists of two 18650 lithium-ion cells, each with a capacity of 10.8 Ah and a voltage of 4.2 V. The serial connection of the cells satisfies the power requirement of the six servomotors, the Arduino board, and the voice control module. Due to the high power involved, a battery protection module was added to the system. A BMS 2S 10 A module was selected and combined with the 18650 Li-ion cells; it protects the system against short circuits, overload, and the excessive discharge of a single cell.

5. Performance Test

The functionality of the designed arm prosthesis and its HMI was tested under controlled conditions. The artificial fingers open and close without major issues. However, the fact that the same servomotor drives the mechanisms of the little and ring fingers prevents gestures or grips that require the independent movement of these fingers. The battery life during continuous operation was about 3 h.
The programmed grips allow the user to perform common activities of daily living that require the involvement of a functional arm. The gesture “Indicator Finger” allows the user to use the computer keyboard and operate buttons and switches (Figure 7), while tweezers, spherical, and cylindrical grips allow for holding objects of different shapes (Figure 8). In addition to basic grips, the HMI prosthesis is also designed to express emotions through simple gestures, such as the “Like Gesture”, which can be used to express agreement.
The implemented HMI system was tested by three different users (aged 20–30 years) with fully functional arms. The tested users wore the headband with the IMU attached to it, while the prosthesis was placed on a set-up that facilitated the observation of the prosthetic movements by the experimenter. The voice recognition system was trained to recognize the voice commands issued by each user. Participants were asked to position the microphone 5 to 10 cm from their lips and execute each of the voice commands.
The voice control algorithm test consisted of repeating all voice commands in random order and registering the response of the device. Each of the 14 commands was repeated 15 times. If the command was recognized, a value of 1 was registered; otherwise, a value of 0 was recorded. The accuracy of command recognition was calculated by averaging the results of all tests performed. The results are presented in Table 2. During the voice control system testing, there was no situation in which one voice command was confused with another. This is particularly important, as the misinterpretation of a voice command by the HMI could damage the objects the user manipulates, express an incorrect emotion, or, in the worst case, harm the user or other people.
The head-orientation-based control system was also tested by performing repetitive actions. Users were asked to issue the voice command for the cylindrical grip so that the prosthesis grasped a cylindrical roll of cardboard wrapping paper. Once the object was held by the prosthesis, the user was asked to slightly turn their head to activate the complementary control system. The same test was performed while holding a plastic cup to test the performance of the prosthesis when grasping objects of different shapes. The algorithm worked appropriately in both cases, as shown in Figure 9. However, it was not an easy task to find the precise orientation of the head that would grip the object firmly enough to manipulate it without damaging it.
The designed prosthesis does not have a system to measure grip strength. The only feedback present in the grip strength regulation system is the user's visual perception of what happens to the grasped object when they turn their head.
To estimate the grip strength produced by the prosthesis at different user head orientations, a numerical simulation was performed based on the dynamic modeling of the MG995 servomotor.
The dynamic elements that comprise the servomotor are shown in Figure 10. In the formulated model, the rotation of the servomotor causes the deformation of an elastic element, which represents the object gripped by the prosthesis. This element is modeled as a rotational spring, added at the servomotor output, whose restoring torque is linearly proportional to the angular displacement of the motor. The comprehensive electromechanical system under consideration includes the electrical circuit of the armature, the internal motor position control, and the mechanical system consisting of the motor shaft, the internal gear drive, and the external load.
Employing Newton’s second law and Kirchhoff’s voltage law, the governing equations for the servomotor are derived as follows:
$$ u = Ri + L\frac{di}{dt} + e \qquad (10) $$
$$ \tau_l = \eta N \tau_m - \eta N^2 b_m \dot{\theta}_l - c_l \theta_l - \eta N^2 J_m \ddot{\theta}_l = J_l \ddot{\theta}_l \qquad (11) $$
where $u$ is the input voltage, $i$ is the armature current, $R$ is the armature electric resistance, $L$ is the armature electric inductance, $N$ is the internal gear transmission ratio, $e$ is the back Electromotive Force (EMF), $\theta_l$ is the angular displacement of the servomotor shaft, $\tau_l$ is the load torque, $\tau_m$ is the servomotor torque, $b_m$ is the servomotor viscous friction constant, $c_l$ is the rotational stiffness of the element gripped by the prosthesis, $J_m$ is the moment of inertia of the rotor, $J_l$ is the moment of inertia of the external load (finger mechanism), and $\eta$ is the internal gear efficiency.
The back EMF and motor torque are linked to the rotor angular velocity and armature current, respectively, via the following expressions:
$$ \tau_m = K_t\, i \qquad (12) $$
$$ e = K_\omega\, \dot{\theta}_m \qquad (13) $$
where $K_t$ is the motor torque constant and $K_\omega$ is the back EMF constant.
In SI units, the motor torque constant and the back EMF constant are equal, i.e., $K_t = K_\omega$; therefore, the term $K$ is used to represent both the motor torque constant and the back EMF constant.
The expressions (12) and (13) are substituted into (10) and (11), followed by the application of the Laplace transform. The resulting equations are then combined and subjected to algebraic manipulation to derive the transfer function of the open-loop system, $G(s)$, establishing the relationship between the input voltage $U(s)$ and the angular position of the servomotor $\Theta_l(s)$.
$$ G(s) = \frac{\Theta_l(s)}{U(s)} = \frac{\eta N K}{(R + Ls)\left[(J_m \eta N^2 + J_l)s^2 + b_m \eta N^2 s + c_l\right] + \eta N^2 K^2 s} \qquad (14) $$
To integrate the operation of the controller into the derived transfer function of the open-loop system, it is assumed that the servomotor controller operates solely on the basis of proportional control. The transfer function relating the commanded servomotor setpoint to the servo angular position is the closed-loop transfer function of the system, given by the following expression:
$$ \frac{\Theta_l(s)}{\Theta_r(s)} = \frac{K_p G(s)}{1 + K_p G(s)} \qquad (15) $$
where $K_p$ is the proportional gain of the controller.
Using (15), the closed-loop transfer function of the servomotor is obtained, thus establishing the relationship between the servomotor setpoint $\Theta_r(s)$, which is proportional to the orientation of the user's head, and the angular position of the servomotor $\Theta_l(s)$.
$$ \frac{\Theta_l(s)}{\Theta_r(s)} = \frac{\eta N K K_p}{(R + Ls)\left[(J_m \eta N^2 + J_l)s^2 + b_m \eta N^2 s + c_l\right] + \eta N^2 K^2 s + \eta N K K_p} \qquad (16) $$
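The step response of Eq. (16) can be checked numerically; the sketch below uses scipy.signal as a stand-in for the MATLAB Simulink model used in the paper, and every parameter value is a hypothetical placeholder rather than an identified MG995 constant from [36].

    import numpy as np
    from scipy import signal

    R, L = 8.0, 0.005     # armature resistance (ohm) and inductance (H), assumed
    K, Kp = 0.01, 50.0    # torque/back-EMF constant and proportional gain, assumed
    N, eta = 300.0, 0.8   # internal gear ratio and efficiency, assumed
    Jm, Jl = 1e-7, 1e-4   # rotor and load inertias (kg m^2), assumed
    bm = 1e-6             # viscous friction constant (N m s/rad), assumed
    cl = 0.2              # assumed rotational stiffness of the gripped element (N m/rad)

    # Denominator of Eq. (16):
    # (R + Ls)[(Jm eta N^2 + Jl)s^2 + bm eta N^2 s + cl] + eta N^2 K^2 s + eta N K Kp
    mech = [Jm*eta*N**2 + Jl, bm*eta*N**2, cl]
    den = np.polyadd(np.polymul([L, R], mech), [eta*N**2*K**2, 0.0])
    den = np.polyadd(den, [eta*N*K*Kp])
    sys = signal.TransferFunction([eta*N*K*Kp], den)

    t, theta_l = signal.step(sys)    # response to a unit (1 rad) setpoint step
    grip_torque = cl * theta_l       # torque transmitted to the gripped element
    print(f"steady-state angle: {theta_l[-1]:.3f} rad")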
To estimate the rotational stiffness $c_l$ of the element gripped by the prosthesis, a simplified model of its deformation is formulated. Figure 11 presents the proposed model.
The rotation of the servomotor produces the rotation of the three joints of the finger mechanism. Assuming that the rotation of each joint produces an axial deformation (represented in Figure 11 by linear springs) of the element gripped by the prosthesis, the net torque generated by the servomotor is given by the following expression:
$$ \tau_l = \sum_{i=1}^{3} F_i\, r_i \qquad (17) $$
where $F_i$ is the axial force produced by each finger joint and $r_i$ is its corresponding moment arm.
The rotation of the finger mechanism joints is related to the rotation of the servomotor by (4)–(6). To simplify the estimation of $c_l$, a simplified case is considered, in which the angle of rotation of each phalanx is equal to the angular rotation of the servomotor. Under this condition, the net torque is given by the following expression:
$$ \tau_l = \sum_{i=1}^{3} F_i\, l_i \sin\theta_l \qquad (18) $$
where $l_i$ refers to the distance between the axis of rotation (servomotor shaft) and the corresponding spring attachment point $i$.
The force acting on each linear spring is given by:
$$ F_i = k\, l_i \cos\theta_l \qquad (19) $$
where $k$ is the axial stiffness of the gripped element.
Combining (18) and (19) and applying the trigonometric identity $\sin 2\theta = 2\cos\theta\sin\theta$ results in the following expression:
$$ c_l\, \theta_l = \sum_{i=1}^{3} k\, l_i^2\, \frac{\sin 2\theta_l}{2} \qquad (20) $$
Finally, assuming that the angular displacements of the servomotor when deforming the element are small ($\sin\theta \approx \theta$), the rotational stiffness of the element deformed by the finger mechanism is given by the following expression:
$$ c_l = k\, L_{eq} \qquad (21) $$
where $L_{eq}$ is the equivalent length of the finger mechanism and is given by $\sum_{i=1}^{3} l_i^2$.
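A quick numeric check of Eq. (21) is given below, with hypothetical joint distances $l_i$ and an axial stiffness $k$ chosen so that $c_l$ lands near the 0.2 N·m/rad value used in the simulations:

    k = 20.0                        # assumed axial stiffness of the gripped element, N/m
    l = [0.03, 0.06, 0.09]          # assumed spring attachment distances l_i, m
    L_eq = sum(li**2 for li in l)   # equivalent length term, sum of l_i^2 (m^2)
    c_l = k * L_eq                  # Eq. (21): rotational stiffness, N m/rad
    print(f"L_eq = {L_eq:.4f} m^2, c_l = {c_l:.3f} N m/rad")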
The equations derived from the servomotor model are solved using the MATLAB Simulink environment. The dynamic parameters of the servomotor are taken from the study carried out in [36], where an MG995 servomotor is modeled considering that the external load is a cylinder coupled at a distance r from the rotor axis. Similarly, the rotational stiffness of the element gripped by the prosthesis is estimated from the results of the research carried out in [37], where the deformation of disposable cups made of different materials is studied.
During the designed simulations, different rotational speeds of the user’s head around the yaw axis and rotational stiffness coefficients of the gripped element were evaluated. It has been assumed that the user wearing the prosthesis performs an angular displacement with their head at a constant speed until the desired angular position is reached. Figure 12 and Figure 13 show the torque generated by the servomotor and the resulting grip torque applied to the element gripped by the prosthetic hand.
From the results obtained, it can be seen that the grip torque (a measure of grip strength) reaches higher peaks during the transient state, when the user rotates their head at a higher speed. When a steady state is reached, the grip torque maintains its value as long as the user does not make another movement. It is also observed that the required grip torque increases in proportion to the stiffness of the deformed element.

6. Discussion

Based on the tests performed on the designed transhumeral prosthesis, it can be seen that the developed HMI is a feasible solution for the interaction between the user and the prosthesis. The main advantage of the HMI is its relatively high reliability, although its performance is still far from that of more advanced limb prostheses based on electrical biosignals. The components used for the design of the proposed control system are accessible at a low cost, and programming the necessary software does not involve high complexity for people who want to adapt the system to their own needs. The integration of discrete voice commands with a linear control system based on inertial measurements allows selected DoFs to be regulated continuously and without interruption, which is an advantage over control strategies based solely on voice commands. The ability to continuously regulate selected movements opens up a wide range of possibilities in terms of the activities that the prosthesis user can perform. The developed HMI was tested on a low-cost prosthesis that allows for the movement of six DoFs. More features can be expected when it is tested on a more robust prototype prosthesis.
Despite the good performance exhibited by the developed HMI, there is certainly room for improvement. In this application, the complementary system only works to regulate the grip strength; however, the prosthesis movements assigned to each of the set commands could be executed continuously and not discretely, as in the current version. A possible improvement could be to incorporate into the complementary system the option of controlling the position of the elbow joint upon the detection of the voice command related to this joint. In this way, this movement of the prosthesis would gain a wider range of operation. Another possible improvement would be to add some kind of feedback, since in the current version the user receives no information from the environment other than the information received by their eyes. The implementation of force sensors in the fingers could be used to predict whether the head movement is sufficient or if a little more is needed, without the risk of damaging the object to be manipulated.
Through numerical simulations, an estimate of the grip torque generated by the prosthesis finger mechanism was obtained. In this analysis, it is assumed that the prosthetic hand tries to deform an element whose torque–strain relationship is linear. The simulation allows us to observe not only the resulting steady-state torque but also the time it takes for the system to reach that state. The servomotor used does not exceed its maximum torque under the conditions evaluated; however, a high-speed rotation of the head while trying to deform a resistant material can cause the system to exceed its limit. In this case, the protection module is activated.
From a biomechanical point of view, it is important to note that the uniform contribution of the six tendons to the grip strength is a limitation of the proposed system, since the real hand has the ability to individually adjust the force of each tendon. The proposed HMI adjusts the grip strength by applying the same relationship between the angular displacement of the user's head and the tendon force to every artificial tendon.

7. Conclusions

In this article, the development of an HMI based on a hybrid control system was carried out for the control of a transhumeral prosthesis. The control strategy consists of the implementation of voice commands for the execution of predefined movements and inertial measurements for the regulation of the grip strength. The developed HMI was implemented to control a six-degree-of-freedom low-cost upper limb prosthesis, whose DoFs correspond to the rotation of the four finger mechanisms, the wrist, and the elbow. The primary control system uses a voice recognition device, which triggers a specific set of movements when a predefined voice command is detected. The complementary system adjusts the grip strength using a continuous proportional control strategy based on measurements from an IMU placed in a headband worn by the user.
The novel HMI allows the user to perform many different grips, the only limitations being the DoFs of the prototype upper limb prosthesis in which it is implemented. Fourteen voice commands were programmed, including grasps and hand gestures. Each was configured to execute a set of predefined instructions corresponding to a specific grasp or gesture. This number can be increased in the future, depending on additional needs and broader purposes. The complete system was tested under normal conditions by users with a healthy motor control system. The tests performed consisted of performing repetitive movements and observing the performance of the prototype when manipulating objects of different shapes.
Numerical simulations based on the electromechanical model of servomotors have been carried out to estimate the grip torque generated when deforming an element whose torque–strain relationship is linear. The simulations performed make it possible to measure not only the value of the grip torque generated, but also the time it takes for the system to reach a steady state. Both characteristics depend both on the movement performed by the user with their head and on the element grasped; therefore, it can be concluded that the operation of the grip strength regulation system is highly dependent on the operating conditions of the prosthesis.
From the tests performed, from which a voice command recognition accuracy of 90% was obtained, it is concluded that the proposed HMI based on voice recognition and inertial measurement is a feasible and promising technical solution for the management of a low-cost upper limb prosthesis.

Author Contributions

Conceptualization, P.P. and M.I.; methodology, L.M.A. and M.I.; software, P.P. and L.M.A.; validation, P.P.; formal analysis, L.M.A., M.S. and M.I.; investigation, L.M.A. and P.P.; resources, M.I. and P.A.; data curation, P.P. and L.M.A.; writing—original draft preparation, L.M.A.; writing—review and editing, M.S., P.A. and M.I.; visualization, L.M.A.; supervision, M.I. and P.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research has been funded by AGH University of Krakow in 2023, as part of the “Initiative of Excellence—Research University” program under grant No. 6350, and as research project No. 501.696.7996.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Abbreviations

The following abbreviations are used in this manuscript:
DoF: Degree of Freedom
ECoG: Electrocorticography
EMF: Electromotive Force
EEG: Electroencephalography
EMG: Electromyography
FSM: Finite-State Machine
HMI: Human–Machine Interface
IMU: Inertial Measurement Unit
MEMS: Micro-Electromechanical System
SD: Speaker-Dependent

References

  1. Beckerle, P.; Willwacher, S.; Liarokapis, M.; Bowers, M.P.; Popovic, M.B. Prosthetic Limbs; Elsevier: Amsterdam, The Netherlands, 2019; pp. 235–278. [Google Scholar] [CrossRef]
  2. Castellini, C. Upper Limb Active Prosthetic Systems-Overview; Elsevier: Amsterdam, The Netherlands, 2019; pp. 365–376. [Google Scholar] [CrossRef]
  3. Osborn, L.E.; Iskarous, M.M.; Thakor, N.V. Sensing and Control for Prosthetic Hands in Clinical and Research Applications; Elsevier: Amsterdam, The Netherlands, 2019; pp. 445–468. [Google Scholar] [CrossRef]
  4. Kaniusas, E. Fundamentals of Biosignals; Springer: Berlin/Heidelberg, Germany, 2012; pp. 1–26. [Google Scholar] [CrossRef]
  5. Zecca, M.; Micera, S.; Carrozza, M.C.; Dario, P. Control of multifunctional prosthetic hands by processing the electromyographic signal. Crit. Rev. Biomed. Eng. 2002, 30, 459–485. [Google Scholar] [CrossRef] [PubMed]
  6. Jafarzadeh, M.; Hussey, D.C.; Tadesse, Y. Deep learning approach to control of prosthetic hands with electromyography signals. In Proceedings of the 2019 22nd IEEE International Symposium on Measurement and Control in Robotics: Robotics for the Benefit of Humanity, ISMCR 2019, Houston, TX, USA, 19–21 September 2019; pp. A1-4-1–A1-4-11. [Google Scholar] [CrossRef]
  7. Hye, N.M.; Hany, U.; Chakravarty, S.; Akter, L.; Ahmed, I. Artificial Intelligence for sEMG-based Muscular Movement Recognition for Hand Prosthesis. IEEE Access 2023, 11, 38850–38863. [Google Scholar] [CrossRef]
  8. Krasoulis, A.; Kyranou, I.; Erden, M.S.; Nazarpour, K.; Vijayakumar, S. Improved prosthetic hand control with concurrent use of myoelectric and inertial measurements. J. Neuroeng. Rehabil. 2017, 14, 71. [Google Scholar] [CrossRef] [PubMed]
  9. Beyrouthy, T.; Al Kork, S.K.; Korbane, J.A.; Abdulmonem, A. EEG Mind controlled Smart Prosthetic Arm. In Proceedings of the 2016 IEEE International Conference on Emerging Technologies and Innovative Business Practices for the Transformation of Societies (EmergiTech 2016), Balaclava, Mauritius, 3–6 August 2016; pp. 404–409. [Google Scholar] [CrossRef]
  10. Cho, E.; Chen, R.; Merhi, L.K.; Xiao, Z.; Pousett, B.; Menon, C. Force myography to control robotic upper extremity prostheses: A feasibility study. Front. Bioeng. Biotechnol. 2016, 4, 18. [Google Scholar] [CrossRef] [PubMed]
  11. Bennett, D.A.; Goldfarb, M. IMU-Based Wrist Rotation Control of a Transradial Myoelectric Prosthesis. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 419–427. [Google Scholar] [CrossRef]
  12. Ángel-López, J.P.; Arzola de la Peña, N. Voice Controlled Prosthetic Hand with Predefined Grasps and Movements. In International Federation for Medical and Biological Engineering (IFMBE), Proceedings of the VII Latin American Congress on Biomedical Engineering CLAIB 2016, Bucaramanga, Colombia, 26–28 October 2016; Springer: Singapore, 2017; Volume 60, pp. 520–523. [Google Scholar] [CrossRef]
  13. Asyali, M.H.; Yilmaz, M.; Tokmakçi, M.; Sedef, K.; Aksebzeci, B.H.; Mittal, R. Design and implementation of a voice-controlled prosthetic hand. Turk. J. Electr. Eng. Comput. Sci. 2011, 19, 33–46. [Google Scholar] [CrossRef]
  14. Nagaraja, V.H.; Moulic, S.G.; D’Souza, J.V.; Limesh, M.; Walters, P.; Bergmann, J.H. A Novel Respiratory Control and Actuation System for Upper-Limb Prosthesis Users: Clinical Evaluation Study. IEEE Access 2022, 10, 128764–128778. [Google Scholar] [CrossRef]
  15. Navaraj, W.T.; Heidari, H.; Polishchuk, A.; Shakthivel, D.; Bhatia, D.; Dahiya, R. Upper limb prosthetic control using toe gesture sensors. In Proceedings of the 2015 IEEE SENSORS, Busan, Republic of Korea, 1–4 November 2015. [Google Scholar] [CrossRef]
  16. Resnik, L.; Klinger, S.L.; Etter, K.; Fantini, C. Controlling a multi-degree of freedom upper limb prosthesis using foot controls: User experience. Disabil. Rehabil. Assist. Technol. 2014, 9, 318–329. [Google Scholar] [CrossRef]
  17. Chaudhry, A.; Batra, M.; Gupta, P.; Lamba, S.; Gupta, S. Arduino Based Voice Controlled Robot. In Proceedings of the 2019 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS 2019), Greater Noida, India, 18–19 October 2019; pp. 415–417. [Google Scholar] [CrossRef]
  18. Samant, P.; Agarwal, R. Real-time speech recognition system for prosthetic arm control. Int. J. Sens. Comput. Control 2015, 5, 39–46. [Google Scholar]
  19. Anwer, S.; Waris, A.; Sultan, H.; Butt, S.I.; Zafar, M.H.; Sarwar, M.; Niazi, I.K.; Shafique, M.; Pujari, A.N. Eye and Voice-Controlled Human Machine Interface System for Wheelchairs Using Image Gradient Approach. Sensors 2020, 20, 5510. [Google Scholar] [CrossRef]
  20. Fezari, M.; Atoui, H.; Lakel, R. Implementation of a Hybrid Voice guiding System for Robot Arm. In Proceedings of the 4th International Conference on Computer Integrated Manufacturing (CIP 2007), Sétif, Algeria, 3–4 November 2007. [Google Scholar]
  21. Alkhafaf, O.S.; Wali, M.K.; Al-Timemy, A.H. Improved Prosthetic Hand Control with Synchronous Use of Voice Recognition and Inertial Measurements. IOP Conf. Ser. Mater. Sci. Eng. 2020, 745, 012088. [Google Scholar] [CrossRef]
  22. He, Y.; Shima, R.; Fukuda, O.; Bu, N.; Yamaguchi, N.; Okumura, H. Development of distributed control system for vision-based myoelectric prosthetic hand. IEEE Access 2019, 7, 54542–54549. [Google Scholar] [CrossRef]
  23. Sattar, N.Y.; Kausar, Z.; Usama, S.A.; Naseer, N.; Farooq, U.; Abdullah, A.; Hussain, S.Z.; Khan, U.S.; Khan, H.; Mirtaheri, P. Enhancing classification accuracy of transhumeral prosthesis: A hybrid sEMG and fNIRS approach. IEEE Access 2021, 9, 113246–113257. [Google Scholar] [CrossRef]
  24. Hussein, M.E. 3D Printed Myoelectric Prosthetic Arm. Ph.D. Thesis, University of Sydney, Camperdown, Australia, 2014. [Google Scholar]
  25. Rector, D. Linkage Mechanism Designer and Simulator. Available online: https://blog.rectorsquid.com/linkage-mechanism-designer-and-simulator (accessed on 12 November 2023).
  26. Hasan, N.F.; Ruzaimi, M.; Rejab, M.; Sapar, N.H. Implementation of Speech Recognition Home Control System Using Arduino. ARPN J. Eng. Appl. Sci. 2015, 10, 17492–17498. [Google Scholar]
  27. Chapra, S.C.; Canale, R.P. Numerical Methods for Engineers, 7th ed.; McGraw-Hill Education: New York, NY, USA, 2015. [Google Scholar]
  28. Edwards, T.S. An improved wavelet correction for zero shifted accelerometer data. Shock Vib. 2003, 10, 159–167. [Google Scholar] [CrossRef]
  29. Lin, C.-L.; Chiu, W.-C.; Chu, T.-C.; Ho, Y.-H.; Chen, F.-H.; Hsu, C.-C.; Hsieh, P.-H.; Chen, C.-H.; Lin, C.-C.K.; Sung, P.-S.; et al. Innovative head-mounted system based on inertial sensors and magnetometer for detecting falling movements. Sensors 2020, 20, 5774. [Google Scholar] [CrossRef] [PubMed]
  30. Zhang, H.; Lee, S. Robot Bionic Eye Motion Posture Control System. Electronics 2023, 12, 698. [Google Scholar] [CrossRef]
  31. Shi, Y.; Zhang, Y.; Li, Z.; Yuan, S.; Zhu, S. IMU/UWB Fusion Method Using a Complementary Filter and a Kalman Filter for Hybrid Upper Limb Motion Estimation. Sensors 2023, 23, 6700. [Google Scholar] [CrossRef]
  32. Ngo, H.Q.T.; Nguyen, T.P.; Son, H.; Le, T.S.; Nguyen, C.T. Experimental comparison of Complementary filter and Kalman filter design for low-cost sensor in quadcopter. In Proceedings of the 2017 International Conference on System Science and Engineering (ICSSE), Ho Chi Minh City, Vietnam, 21–23 July 2017; pp. 488–493. [Google Scholar] [CrossRef]
  33. Albaghdadi, A.; Ali, A. An Optimized Complementary Filter For An Inertial Measurement Unit Contain MPU6050 Sensor. Iraqi J. Electr. Electron. Eng. 2019, 15, 71–77. [Google Scholar] [CrossRef]
  34. Koudelkova, Z.; Mizera, A.; Karhankova, M.; Mach, V.; Stoklasek, P.; Krupciak, M.; Minarcik, J.; Jasek, R. Verification of Finger Positioning Accuracy of an Affordable Transradial Prosthesis. Designs 2023, 7, 14. [Google Scholar] [CrossRef]
  35. Das, S.; Nandi, D.; Neogi, B. Design Analysis of Prosthetic Unilateral Transtibial Lower Limb with Gait Coordination. Prosthesis 2023, 5, 575–586. [Google Scholar] [CrossRef]
  36. Santos Santana, L.M.; Maximo, M.; Goes, L.C.S. Physical Modeling and Parameters Identification of the MG995 Servomotor. In Proceedings of the 26th International Congress of Mechanical Engineering, Florianopolis, Brazil, 22–26 November 2021. [Google Scholar] [CrossRef]
  37. Da Silva, W.A.; Luna, C.B.B.; de Melo, J.B.d.C.A.; Araújo, E.M.; Filho, E.A.d.S.; Duarte, R.N.C. Feasibility of Manufacturing Disposable Cups using PLA/PCL Composites Reinforced with Wood Powder. J. Polym. Environ. 2021, 29, 2932–2951. [Google Scholar] [CrossRef]
Figure 1. Block diagram presenting the components of the proposed system.
Figure 2. (a) CAD model of the prosthetic arm. The prosthetic arm has 6 DoFs that correspond to the rotation of the elbow, the rotation of the wrist, and the movement of each finger. The ring and little fingers are driven by the same servomotor. (b) Finger mechanism. The servomotor stretches the string, causing the rotation of the three finger joints.
Figure 3. (a) Finger mechanism at resting position/configuration. Each phalanx is represented by a three-bar mechanism and analyzed by drawing a closed kinematic chain. (b,c) Selected positions/configurations of the finger mechanism during the closing movement of the hand.
Figure 4. Block diagram presenting the main components of the voice control system.
Figure 5. Principle of operation of the complementary filter in the frequency domain.
Figure 6. Flow chart describing the operation of the hybrid control system.
Figure 7. Gesture “Indicator Finger” used for operating keyboard buttons.
Figure 8. (a) Spherical and (b) tweezers grip, used for holding objects.
Figure 9. Cylindrical grip tested on a plastic cup and a cardboard cylinder (a) before and (b) after the intervention of the grip strength regulation system.
Figure 10. Dynamic model of the servomotor. A rotational spring has been added to the servo output to simulate the grip of an element by the prosthesis.
Figure 11. Model of the deformation of the element gripped by the prosthesis. Each joint of the finger mechanism produces an axial deformation of the gripped element.
Figure 12. (a) Servomotor torque ($\tau_m$) and (b) grip torque, considering a rotational stiffness coefficient ($c_l$) of 0.2 and different angular velocities of the user's head.
Figure 13. (a) Servomotor torque ($\tau_m$) and (b) grip torque, considering an angular velocity of the user's head ($\dot{\theta}_r$) of 4 rad/s and different rotational stiffness coefficients of the element gripped by the prosthesis.
Table 1. List of user-defined voice commands.

Command Name | Number | Action Executed by the Prosthesis
G1_Open      | 0      | Opens the hand
G1_Hello     | 1      | Performs gesture “Hello”
G1_Welcome   | 2      | Performs gesture “Welcome”
G1_Fist      | 3      | Performs gesture “Fist”
G1_Indicate  | 4      | Performs gesture “Indicator Finger”
G1_Hook      | 5      | Performs hook grip
G1_Cylinder  | 6      | Performs cylindrical grip
G1_Sphere    | 7      | Performs spherical grip
G1_Tweezer   | 8      | Performs tweezers grip
G1_Like      | 9      | Performs gesture “Like”
G1_Victory   | 10     | Performs gesture “Victory”
G1_Sleep     | 11     | Puts the device into sleep mode
G1_Elbow     | 12     | Rotates the elbow joint θ degrees
G1_Wrist     | 13     | Rotates the wrist joint θ degrees
Table 2. Accuracy of the voice command recognition system.

Voice Command | Accuracy
G1_Open       | 80%
G1_Hello      | 73.3%
G1_Welcome    | 86.7%
G1_Fist       | 86.7%
G1_Indicate   | 93.3%
G1_Hook       | 86.7%
G1_Cylinder   | 93.3%
G1_Sphere     | 93.3%
G1_Tweezer    | 86.7%
G1_Like       | 100%
G1_Victory    | 80%
G1_Sleep      | 100%
G1_Elbow      | 93.3%
G1_Wrist      | 100%