Article

A Highly Efficient HMI Algorithm for Controlling a Multi-Degree-of-Freedom Prosthetic Hand Using Sonomyography

by Vaheh Nazari 1 and Yong-Ping Zheng 1,2,*
1 Department of Biomedical Engineering, The Hong Kong Polytechnic University, Hong Kong 999077, China
2 Research Institute for Smart Ageing, The Hong Kong Polytechnic University, Hong Kong 999077, China
* Author to whom correspondence should be addressed.
Sensors 2025, 25(13), 3968; https://doi.org/10.3390/s25133968
Submission received: 14 April 2025 / Revised: 2 June 2025 / Accepted: 24 June 2025 / Published: 26 June 2025
(This article belongs to the Section Sensors and Robotics)

Abstract

Sonomyography (SMG) is a method of controlling upper-limb prostheses through an innovative human–machine interface that monitors forearm muscle activity via ultrasound imaging. Over the past two decades, SMG has shown promise, achieving over 90% accuracy in classifying hand gestures when combined with artificial intelligence, making it a viable alternative to electromyography (EMG). However, there have so far been few reports of a system integrating SMG with a prosthesis and testing it on amputee subjects to demonstrate its capability in daily activities. In this study, we developed a highly efficient human–machine interface algorithm for controlling a 6-DOF prosthetic hand using a wireless, wearable ultrasound imaging probe. We first evaluated the accuracy of our model in classifying nine different hand gestures to determine its reliability and precision. The results of the offline study, which included ten healthy participants, indicated that the nine hand gestures could be classified with a success rate of 100%. Additionally, the developed control system was tested in real-time experiments on two amputees using a variety of hand function test kits. The results of the hand function tests confirmed that the prosthesis, controlled by the SMG system, could assist amputees in performing a variety of hand movements needed in daily activities.

1. Introduction

Hands perform the majority of human activities in daily living, and losing one or both hands results in a substantial loss of independence [1]. Even though most artificial limbs used today are either purely cosmetic or serve a practical purpose with limited functionality (such as a hook-like gripper), various multi-fingered prosthetic hands have been developed and commercialized [2,3,4], including the i-Limb Hand, KIT hand, Michelangelo Hand, Bebionic Hand, and Vincent Hand, all of which depend on electric motors and complex mechanical components. Moreover, the advent of additive manufacturing revolutionized production methods by decreasing the cost and weight of robots and speeding up product development. This has also affected the prosthetics industry, encouraging researchers and engineers to create numerous 3D-printed prosthetic hands [5,6,7,8,9,10].
Despite advances in developing novel, dexterous, state-of-the-art prosthetic hands that can assist amputees in performing daily activities [11,12,13], around 50–70% of patients refuse to wear current prosthetic hands due to their poor functionality, high cost [8,14,15,16,17,18], low comfort, lack of sensory feedback, and, most importantly, inaccurate control systems that cannot effectively predict users' intended movements or provide natural-like control over the prosthesis [11,19].
To identify the most important features of upper-limb prostheses, several studies have been conducted. The key factors can be listed as anthropomorphic characteristics (kinematics, size, weight, and appearance) [3,20], performance (speed, force, and dexterity) [21,22,23], and strong and integrative grasping [14,24,25]. Bioinspired motion speeds and adequate grip force are necessary for the device to be useful for carrying out the activities of daily living (ADLs) [26]. However, among the most fundamental needs for a robotic prosthesis is the capability to control the robot with sufficient precision and responsiveness of the fingers so that it may be used effectively and with sufficient dexterity [14,27,28,29].
Despite the study of various human–machine interfaces (HMIs), prosthetics still lack reliable control of multiple degrees of freedom [11]. For instance, non-invasive biological signals such as electromyography (EMG) and electroencephalography (EEG) have been studied and proposed as popular HMIs, enabling users to control not only prostheses but also rehabilitation robots [30] and exoskeletons [31,32]. However, these signals are very noisy, and the recordings can be affected by electrode movement as well as sweating [33]. Moreover, EMG sensors cannot monitor deep muscle activity, which prevents this control approach from predicting more complex hand gestures with acceptable accuracy. For EEG control, the response time is still relatively slow [34,35,36]. In addition, the hand gestures that robots can reproduce remain limited: most commercialized EMG-controlled prostheses still offer only open and close functions, although various approaches for controlling robots with high dexterity have been proposed at the research level.
In recent years, in order to improve the quality of signals recorded from sensors as well as decrease the amount of noise, invasive techniques such as implanted EMG, targeted muscle reinnervation, myoelectric implantable recording arrays (MIRAs) [37], magnetomicrometry (MM) [38], and others have been proposed. However, invasive approaches raise numerous questions regarding safety and efficacy since the electrodes need to be implanted into the body [33]. The field has been searching for a signal which can represent individual muscle activation and be collected non-invasively.
Over the last two decades, using signals extracted from ultrasound images of muscle during contraction to control prosthetic hands has been a popular research topic. Zheng et al. first studied the feasibility of controlling robotic hands using an ultrasound device in 2006, coining the term "sonomyography" (SMG) for this non-invasive HMI approach [39]. SMG refers to the signal representing architectural changes in a muscle detected via real-time ultrasound imaging during contraction [40]. Since ultrasound imaging can inherently differentiate the activities of deep and superficial muscles, as well as groups of neighboring muscles, simultaneously and non-invasively, SMG has attracted many researchers since it was proposed [41,42,43,44,45,46,47]. Recently, Ma et al. (2019) evaluated the reliability and validity of a unique mobile SMG system for monitoring muscle activity, paving the way for real-time monitoring during both indoor and outdoor activities, and especially for controlling prostheses with a wireless SMG system [48].
A number of SMG-based prosthesis control systems have already been reported in the literature, which mainly focus on the demonstration of feasibility, including using single-element transducers [49]. A low-power SMG system designed for wearable use with a prosthetic hand was proposed by Engdahl et al. in 2020 [50]. Using AI to classify intended hand gestures, the authors demonstrated that their suggested technique successfully classified nine distinct finger motions with an accuracy of around 95%. In 2020, Yang et al. [51] advocated for the use of wearable 1D SMG (A-mode ultrasound transducer) equipment in combination with subclass discriminant analysis (SDA) and principal component analysis (PCA) to predict wrist rotation (pronation/supination) and finger movements. This research demonstrated that the SDA machine learning method could be used to identify both finger gesture and wrist rotation concurrently with accuracies of around 99.89% and 95.2%, respectively.
To overcome the difficulties caused by single-element transducers, a number of studies reported the use of B-mode imaging transducers [49]. In a study published in 2019, Akhlaghi et al. [52] evaluated the effect of using a sparse set of ultrasound scanlines to determine the optimal location on the forearm for capturing the maximal deformation of the primary forearm muscles during finger motions and classifying various types of hand gestures and finger movements. The results indicated that the ultrasonic probe should be placed over around 40–50% of the forearm’s length in order to identify distinct hand motions with greater precision. This is because the largest muscle activation occurs in this region. In addition, the categorization result demonstrated that employing B-mode ultrasound to operate a prosthetic hand was a viable option, since the accuracy was almost 95%. In 2019, Li et al. [53] tested the capabilities of M-mode and B-mode ultrasound to detect 13 various hand and finger movements in eight able-bodied subjects. Using the Support Vector Machine (SVM) algorithm to classify various hand gestures, the accuracy of the M-mode classification was determined to be 98.83 ± 1.03%, and that of the B-mode classification was determined to be 98.77 ± 1.02%. On the other hand, the accuracy of the Backpropagation Artificial Neural Network (BP-ANN) classifier was 98.77% in M-mode and 98.76% in B-mode. They discovered that M-mode SMG transducers were equally as accurate as B-mode SMG signals when it came to detecting wrist and finger movements, as well as in differentiating between a variety of hand gestures, which suggests their possible utility in human–machine interfaces.
Zheng et al. in 2006 [39] and Guo et al. in 2008 [54] conducted the first experiments to evaluate the relationship between morphological changes in forearm muscles and wrist angle. The results of their studies showed that muscle deformation measured by ultrasound correlated linearly with wrist angle. Moreover, in 2011 and 2012, Castellini et al. [55,56] conducted exciting experiments to assess the potential of an SMG system in predicting the position of the fingers using ultrasound images collected from human forearms. The results of their studies, by discovering a linear relationship between finger position and the extracted features from ultrasound images, showed that this novel controlling system had great potential not only for predicting intended hand gestures, but also for providing information regarding finger position and the amount of flexion, enabling the SMG controlling system to provide a proportional and natural-like control experience to people with amputations.
For a more complete understanding of the various systems and techniques that use the ultrasound imaging of muscle, or SMG, to control upper-limb prostheses, readers can refer to a review paper published by Nazari et al. in 2023 [57], which comprehensively evaluated and compared the results and findings of previously published work on SMG systems as novel human–machine interfaces. The outcomes of this review demonstrated the promise of ultrasonic sensing as a practical human–machine interface for controlling bionic hands with multiple degrees of freedom. In addition, the review showed that a variety of machine learning algorithms combined with feature extraction models could correctly classify various hand gestures with an accuracy of about 95% [57].
Building on these insights, and considering other mainstream technologies including EMG, we prepared a comparison table (Table S1) that summarized the key factors—accuracy, latency, weight, and cost—across these leading technologies used in prosthetic hand control [57,58,59,60,61,62,63,64,65].
Despite all the demonstrated feasibility of using SMG together with machine learning or deep learning methods to detect hand gestures for potential prosthesis control, there are few reports of testing an SMG-controlled robot on actual amputees [66]. Considering that residual muscles after amputation surgery differ greatly from those of able-bodied subjects, the promising results demonstrated in earlier papers may not necessarily hold for residual limbs. In addition, to date there is still no report of a system integrating SMG with a prosthesis and testing it on amputee subjects to demonstrate its performance in daily activities.
In this study, we report the design and performance of a novel SMG control system and a lightweight (360 g), functional, cost-effective 6-DOF prosthetic hand called ProRuka (Figure 1). ProRuka was developed and tested with anthropomorphism, functionality, safety, and comfort in mind, all inspired by the structure of the human hand. To evaluate the accuracy of the proposed machine learning model in classifying the different hand gestures needed in daily activities, ten able-bodied volunteers were recruited for our first experiment. For the offline evaluation, data were first collected from the ten able-bodied participants and then used to train the model and assess its accuracy; around 70% of the data were used for training and the rest for validation. For the amputee subjects, data were collected from the individual residual limbs and used to train their individual models, a step very similar to the training session for conventional EMG-controlled prostheses. The trained model, together with the prosthesis and the control system, was evaluated with two amputee subjects, who performed standardized hand function tests including the Box and Blocks (B&B) test, Targeted Box and Blocks (TB&B) test, and Action Research Arm Test (ARAT).

2. System and AI Model Development

2.1. Programming Environment and Tools

In this study, the primary programming language used was Python 3.10. The main libraries utilized in our code included TensorFlow for deep learning and NumPy for numerical computations. Additionally, the scikit-learn (sklearn) library was employed for implementing various machine learning algorithms and model evaluation. All the key parameters for the models have been specified to facilitate the replication of the results.

2.2. Classification of Different Hand Gestures Using Ultrasound Imaging

For the control component reported in this paper, different classification methods were studied. The participants were divided into able-bodied and amputee groups. Each volunteer was asked to sit in a comfortable position and rest their hand on a cushion. The muscle activity in different hand gestures was then captured using a palm-sized wireless ultrasound probe. The collected images were first resized from 912 × 912 to 32 × 32 pixels and then normalized from a [0, 255] to a [0, 1] pixel intensity range. A convolutional neural network (CNN) was then used to extract features from each image, and these features were used to train a model with a machine learning or regression algorithm. We used Random Forest (RF) with 100 estimators and a random state of 100, k-nearest neighbors (KNN) with 10 neighbors, SVM, and a Decision Tree Classifier (DTC) with a maximum depth of 10 and a minimum sample split of 2 as the machine learning algorithms, while for the regression algorithms we used decision tree regression (max_depth = 3), nearest neighbor regression (n_neighbors = 5), and support vector regression (C = 100, gamma = 0.1, epsilon = 0.1) with two kernels, linear (SVR-L) and polynomial (SVR-P). The accuracy of each algorithm was then examined and compared.
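The preprocessing and classifier setup described above can be sketched as follows with scikit-learn; the hyperparameters are those stated in the text, while the block-average downsampling is an assumed stand-in for the unspecified resizing method:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def preprocess(frame: np.ndarray, out: int = 32) -> np.ndarray:
    """Downsample a square grayscale frame (e.g., 912 x 912) to out x out
    by block averaging, then rescale pixel intensities [0, 255] -> [0, 1]."""
    side = (frame.shape[0] // out) * out            # crop to a multiple of out
    f = frame[:side, :side].astype(np.float64)
    f = f.reshape(out, side // out, out, side // out).mean(axis=(1, 3))
    return f / 255.0

# Classifiers with the hyperparameters stated in the text.
classifiers = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=100),
    "KNN": KNeighborsClassifier(n_neighbors=10),
    "SVM": SVC(),
    "DTC": DecisionTreeClassifier(max_depth=10, min_samples_split=2),
}
```

Each classifier would then be fitted on the CNN features extracted from the preprocessed frames.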

2.2.1. Feature Extraction

Since the machine learning algorithms could not process all the raw information contained in the images, a CNN algorithm with pretrained weights was used to extract the features from the collected data. Then, these extracted features were utilized to train the AI model. Three different pretrained models including VGG16, VGG19, and InceptionResNetV2 were individually used for feature extraction. To select and extract features, 64 filters from the first convolutional layer were utilized. The features extracted from the training data were then used for classification. It is important to note that using more filters increases the number of extracted features, which can increase the accuracy of AI models. However, more time and GPU memory are required to train models with more extracted features.
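A minimal Keras sketch of this feature-extraction step is shown below. The paper's setup uses pretrained ImageNet weights; `weights=None` is used here only to keep the sketch download-free, and `block1_conv1` is the standard Keras name for the first 64-filter convolutional layer of VGG16 (how the feature maps are flattened into a vector is our assumption):

```python
import numpy as np
import tensorflow as tf

# Truncate VGG16 at its first convolutional layer ("block1_conv1", 64 filters).
# The paper uses pretrained weights; weights=None here only avoids the download
# (use weights="imagenet" to reproduce the pretrained setup).
base = tf.keras.applications.VGG16(weights=None, include_top=False,
                                   input_shape=(32, 32, 3))
extractor = tf.keras.Model(inputs=base.input,
                           outputs=base.get_layer("block1_conv1").output)

def extract_features(images: np.ndarray) -> np.ndarray:
    """images: (N, 32, 32) grayscale in [0, 1] -> (N, 32*32*64) feature rows."""
    rgb = np.repeat(images[..., None], 3, axis=-1)        # grayscale -> 3-channel
    x = tf.keras.applications.vgg16.preprocess_input(rgb * 255.0)
    fmaps = extractor.predict(x, verbose=0)               # (N, 32, 32, 64)
    return fmaps.reshape(len(images), -1)                 # one row per image
```

Swapping `VGG16` for `VGG19` or `InceptionResNetV2` changes only the `base` model and layer name.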

2.2.2. Classification

Figure 2 shows the overall schematic of the classification process. After feature extraction, the data were used to train four machine learning classification algorithms (RF, KNN, DTC, and SVM) as well as four regression algorithms (decision tree regression, nearest neighbor regression, SVR-L, and SVR-P) to classify the different hand gestures and finger movements. Two-thirds of the collected data were used for training and the remainder for validation.
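Using a regression algorithm as a gesture classifier requires mapping its continuous output back to a discrete class label; a sketch of this, with the two-thirds/one-third split, is shown below. The synthetic features stand in for the CNN features, the rounding step is our assumption, and the hyperparameters are those listed in Section 2.2:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

# Stand-in features; in the paper these are CNN features from ultrasound frames.
X, y = make_classification(n_samples=600, n_features=64, n_informative=20,
                           n_classes=9, random_state=0)
# Two-thirds for training, one-third for validation, as in the text.
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=1 / 3, random_state=0)

regressors = {
    "DTR": DecisionTreeRegressor(max_depth=3),
    "NNR": KNeighborsRegressor(n_neighbors=5),
    "SVR-L": SVR(kernel="linear", C=100, gamma=0.1, epsilon=0.1),
    "SVR-P": SVR(kernel="poly", C=100, gamma=0.1, epsilon=0.1),
}
scores = {}
for name, reg in regressors.items():
    reg.fit(X_tr, y_tr)
    # Regressors emit continuous values; round and clip to a gesture label 0..8.
    pred = np.clip(np.rint(reg.predict(X_va)), 0, 8).astype(int)
    scores[name] = accuracy_score(y_va, pred)
```

The classification algorithms (RF, KNN, DTC, SVM) are fitted on the same split but predict class labels directly, with no rounding step.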

2.3. Replacing Ultrasound Gel and Gel Pad with Sticky Silicone Pad

For the sticky silicone pad, biocompatible silicone liquid (Deping, Guangdong Province, China) was used to create a pad using a molding technique. Two silicones with hardness ratings of Shore 00-00 and Shore 00-05 were utilized, and three different pads were created for the experiment. The first was a pad made from the Shore 00-00 silicone; ultrasound images had good resolution with this pad, but it was too sticky, making it difficult to put on the hand with the prosthesis. A second pad was made from the Shore 00-05 silicone; the resolution was good enough for controlling the prosthesis, but the pad was fragile and easily damaged during application and removal. For the third pad, the 00-00 and 00-05 silicones were therefore mixed in a 3:1 ratio. Testing demonstrated that the image quality provided by this pad was sufficient for controlling the robot, that it was sticky enough to minimize transducer movement, and that it was flexible enough to be used with a socket without damage (Figure 3).

2.4. Designing a Novel Prosthetic Hand

To evaluate the novel SMG control system in this study, we developed a low-cost, lightweight, user-friendly, dexterous, multiple-degree-of-freedom, functional prosthetic hand (Figure 4A). To make the prosthetic hand resemble a normal human hand, a 3D model of a normal human hand was first prepared using a portable industrial 3D scanner (EinScan Pro 2x, Shining 3D, Hangzhou, Zhejiang, China). The hand joints of the scanned model were then replaced with mechanical joints to make the prosthesis functional (Figure 4B). Moreover, considering the important role of thumb abduction and adduction in grasping different types of objects and performing 80% of daily living hand activities, we incorporated rotational movement of the MCP joint into the design (Figure 4B). In addition, to increase the friction between objects and the prosthesis and decrease the chance of objects slipping and falling, the fingertip of each finger as well as the palm of the prosthesis was made of silicone (Platinum Cure Silicone Rubber Compound with a Shore hardness of 00-50, Smooth-On, Macungie, PA, USA). Furthermore, 3D printing with black nylon material (VPrint 3D, Hong Kong, China) was utilized to make the prosthesis cost-effective and lightweight.

3. Experiment and Results

3.1. Participants

Since musculoskeletal anatomy is different between able-bodied people and those with transradial limb loss, it was important to assess the accuracy of the proposed classification method for both groups. Consequently, we separated the participants into able-bodied and amputee groups. The study was approved by the Human Subjects Ethics Sub-committee of The Hong Kong Polytechnic University (HSEARS20220720001).
Ten able-bodied volunteers (five males and five females, aged 22–33) were recruited for experiment 1 (the healthy group), and two amputee subjects (both male, aged 45 and 69, referred to as A1 and A2) were recruited for experiment 2 (the amputee group). Both had undergone left transradial amputation, 26 and 45 years before the study, respectively. Each participant completed an informed consent form after receiving information about the research and the experimental design.

3.2. Experimental Setup

The volunteers were asked individually to sit in a comfortable position, put their hand on a cushion, and keep their palm upwards. A B-mode lightweight (only 67 g) wireless ultrasound module (Model UL-1C, Beijing SonopTek Limited, Beijing, China) was fixed on the forearm using a customized case (Figure 5A,B). To collect the maximum amount of muscle activity, the probe was placed perpendicular (transverse) to the forearm within 30% to 50% of the length of the forearm from the elbow (Figure 5B). Moreover, to minimize the effect of transducer relocation on accuracy, data were collected at different transducer locations.

3.3. Experiment 1: Performance of Offline Classification

In the first experiment, offline classification was conducted in the able-bodied group to evaluate the potential of SMG as a novel HMI method. The accuracy of the classification method with different machine learning classification/regression algorithms, including DTC, nearest neighbor regression (NNR), decision tree regression (DTR), KNN, SVR-L, SVR-P, SVM, and RF, was compared after training and validation data were collected from the 10 able-bodied participants. In the final stage, nested and non-nested cross-validation were utilized to further evaluate the developed model.

Data Collection for Offline Testing

In the offline test, the able-bodied group were asked to sit comfortably on a chair and place their elbow on a pillow, with the palm facing upward. Before collecting data for training and validation, the position of the ultrasound transducer was first defined and fixed, making sure that key muscles, including the flexor digitorum superficialis (FDS), flexor pollicis longus (FPL), and flexor digitorum profundus (FDP), were covered by the transducer (Figure 5A). Each subject was then asked to perform one of nine different hand gestures, including rest, individual finger flexion (index, middle, ring, little, and thumb), fist, pinch, and key pinch, and hold it for 5 s. All nine hand gestures were repeated three times. To avoid fatigue and spasm in the muscles, there were 15 s of rest between each hand gesture. In the offline testing of the able-bodied group, in total, 11,625 images were collected, and 8350 of them were used for training, while 3275 images (384 × 400 pixels) were used for validation.

3.4. Experiment 2: Real-Time Functional Performance

To evaluate the functionality and performance of ProRuka, the developed controlling system and prosthetic hand were tested in real-time experiments with two amputees, using a variety of hand function test kits. In experiment 2, the B&B test as well as the TB&B test, which is a modified version of the B&B test, and Action Research Arm Test (ARAT) were utilized to evaluate the functional performance of the prosthesis in daily activities. Before the evaluation session, the two participants with transradial amputation were asked to attend two training sessions to improve their skills in controlling the robot as well as become familiar with the prosthetic hand and the process of the evaluation session.
Box and Blocks test: Gross manual dexterity is often evaluated using the B&B test [67]. The evaluation kit consists of a box with two square compartments separated by a partition (Figure S1). One compartment was filled with 150 wooden cubes (25 mm per side), arranged so that the blocks rest in a wide variety of positions. The test was scored by the number of blocks moved over the partition within the allotted 60 s. The participants were free to carry the blocks in any order, provided that their fingers passed the partition between the two compartments before releasing each block into the desired location.
Targeted box and blocks test: The TB&B test was performed with 16 (for 4 × 4 TB&B Test [68]) and 9 (for 3 × 3 TB&B Test [69]) blocks. The TB&B Test is an upper-limb functional task designed to elicit ecologically meaningful activities such as movement initiation and the grasping, transporting, and controlled releasing of items. In addition to its use in assessing patients’ functional improvement after undergoing rehabilitation, this test may also be used as an outcome measure in clinical studies of upper-limb transradial prosthetic devices [70]. A standard grid was placed on both sides of the compartment, and the volunteers were asked to move each block to the other side of the compartment into its mirrored location. The box was turned upside down so that the outside area could be used for the assessment, which made it simpler to complete and also enabled the prosthetic hand to avoid colliding with the box’s walls (Figure S2).
Action Research Arm Test: The ARAT, which is extensively used to measure arm function, is one of the most prominent hand function evaluation kits. The testing kit consists of 19 different items to assess the different grasping types and arm movement (Figure S3). The whole assessment process takes approximately 10 min, scores are given based on the participants’ arm movement and functionality, and for each item, the score is rated between 0 (no movement) and 3 (normal movement) [71,72]. ARAT scores vary from 0 to 57, with 57 indicating higher performance. The final score indicates weak (less than 10), moderate (10–56), or excellent (57) hand function [73].

Data Collection for Real-Time Classification Testing

In the real-time classification experiment (experiment 2), to evaluate the whole SMG control system in the last session, different functional hand gestures, including rest, pinch, key pinch, and cylindrical grip (fist), were classified as useful grasp types to help the participants use the robot in their ADLs. It is worth mentioning that, of the available power grips, the AI model was trained only with the cylindrical grip, since the robot was not able to perform the others.
During the experiment for real-time classification, data were collected using one static and two different dynamic strategies, as the ultrasound image for each hand gesture varied due to hand movements while performing different tasks. In the static strategy, the same as in the previous experiment, ultrasound images from forearm muscles were collected while the participant’s hand was placed on the table with the palm upward (Figure 5C). In the first dynamic strategy, the participants were first asked to extend their hands and keep their palm in a supination position, then flip their hand without trying to move their wrist (flipping 90 degrees) while performing and holding one of the hand gestures (Figure S4). This activity was performed at a moderate speed and repeated three times for each hand gesture (rest, pinch, key grip, and fist). In the second dynamic strategy, the volunteers were asked to extend their elbow and then rotate their forearm 180 degrees three times while performing and holding one of the four hand gestures. They were instructed to perform the rotations at a moderate speed, defined as 1–2 s per complete rotation (180 degrees). This speed was monitored to ensure consistency across trials, with a target rotation speed of approximately 90 degrees per second. The amputee subjects were asked to repeat this process twice. The whole process for each hand gesture took 120 s, with a total of 480 s for the four hand gestures.

3.5. Results

3.5.1. Offline Classification Results

The offline classification results showed that combining a transfer learning model with one of the machine learning classification algorithms (KNN, RF, SVM, DTC), or with the nearest neighbor and decision tree regression methods, could classify the nine different hand gestures with an accuracy of more than 91% (Figure 6). However, models trained using SVR-L, SVR-P, and MLP performed significantly worse, with accuracies of 55.96%, 55.38%, and 23%, respectively. Table 1 and Table 2 summarize the offline classification results obtained using the various machine learning and regression algorithms and transfer learning approaches.
Figure 7 shows the 2D t-SNE visualization of the extracted features from the transfer learning model with decision boundaries learned by different classifiers. The t-SNE projection reveals well-formed clusters corresponding to each hand gesture class, indicating effective feature extraction. The KNN classifier produces smooth, well-separated decision regions, reflecting its strong ability to discriminate classes in this reduced space. In contrast, the Random Forest shows more fragmented boundaries due to its ensemble nature, while the Decision Tree Classifier creates axis-aligned, blocky partitions that result in less-smooth class separation. These visualizations provide intuitive insights into the classifiers’ differing strengths in handling complex feature distributions, with smoother boundaries suggesting better generalization and more fragmented or blocky boundaries indicating sensitivity to local patterns or abrupt transitions in the data.
To evaluate the performance and generalizability of our model, we applied a cross-validation (CV) scheme to the offline experiment. We tested the CV on models trained using the RF, DTC, or KNN machine learning algorithms, since these models showed the maximum accuracy in the offline test (100% accuracy). The statistical analysis of the cross-validation scores revealed a mean performance of approximately 99.8, indicating strong model effectiveness. The median score of 99.5 and mode of 99.5 further emphasize consistent performance across evaluations. With a range of 0.6, variance of 0.0712, and standard deviation of 0.2667, the scores exhibited minimal variability, suggesting the reliability of the results. The calculated margin of error for a 95% confidence level was ±0.197, leading to a confidence interval of [99.54, 99.94], which indicates that the true mean performance is likely to fall within this range. These findings collectively highlight the model’s robustness, with a reliable performance.
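The summary statistics and confidence interval reported above can be reproduced with a normal-approximation calculation of this kind; this is a sketch using only the standard library, and since the text does not state whether the population or sample standard deviation was used, `pstdev` is an assumption:

```python
import math
import statistics as st

def summarize_cv(scores, z=1.96):
    """Mean, spread, and a 95% normal-approximation confidence interval
    for a list of cross-validation scores (in percent)."""
    mean = st.mean(scores)
    sd = st.pstdev(scores)                  # population SD; variance = sd ** 2
    moe = z * sd / math.sqrt(len(scores))   # margin of error at 95% confidence
    return {
        "mean": mean,
        "median": st.median(scores),
        "range": max(scores) - min(scores),
        "std": sd,
        "ci": (mean - moe, mean + moe),
    }
```

Feeding in the per-fold cross-validation scores yields the mean, median, range, standard deviation, and confidence interval in one pass.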
It is worth mentioning that more time was needed to train the models using SVR-L, SVR-P, decision tree regression, nearest neighbor regression, and SVM, while the RF and KNN models were the fastest to train on the collected datasets (around 205 s for the ten able-bodied volunteers, with 8350 training images and 3275 validation images).

3.5.2. Real-Time Performance Results

Based on the offline test results, VGG16 was used to extract features and an RF machine learning algorithm was utilized to train the model (the accuracy of classifying the different hand gestures using this method was the highest). The two volunteers were invited to attend the experiment conducted to evaluate the functionality of the developed prosthesis. They were asked to complete the different hand function tests with the prosthesis in addition to their healthy hand to compare the results.
The final scores and results of the hand function tests are summarized in Table 3. During the experiment, it was observed that a minimum of 120 s of training data per hand gesture was needed to reach a classification accuracy of 100%. It was also observed that the classification accuracy was only minimally reduced after transducer relocation caused by putting on and taking off the prosthesis; during data collection for training, data were collected at different transducer locations to minimize the effect of transducer relocation on accuracy.
The results of the B&B and TB&B tests demonstrated that the volunteers were able to pick up the blocks via pinching and successfully move them to the other side of the box without any misclassification during hand movements. Both of the participants were able to easily transfer around 13 blocks within 60 s without any prior training. However, the prosthetic hand exhibited limited fine dexterity, as evidenced by the relatively low number of blocks transferred compared to typical healthy hand performance. These limitations are likely attributable to the lack of sensory feedback and the rigid structure of the prosthetic hand, which reduce the user’s ability to modulate their grip force and adapt movements dynamically, thereby contributing to reduced efficiency and increased user fatigue during task execution.
The outcome of the ARAT demonstrated good performance in grasping and gripping objects of different sizes, indicating reliable power and precision grips. Nonetheless, ProRuka struggled with fine motor tasks, such as picking up small objects, which require a delicate pinch gesture. Based on the scores earned by the volunteers, the prosthetic hand’s overall functional performance corresponds to that of a hand with moderate function; that is, individuals can perform basic tasks but may struggle with more complex or fine motor activities.

3.5.3. Evaluating the Potential of Using a Silicone Pad Instead of Ultrasound Gel or a Gel Pad

In the experiment conducted to evaluate the potential of the silicone pad to replace ultrasound gel when controlling the prosthesis using ultrasound imaging, we observed that the silicone pad provided real-time images of the muscle with good image quality and that the captured data could be used for real-time control of the prosthetic hand. Moreover, the sticky silicone pad not only prevented transducer relocation but also reduced stress on the skin by dampening the transducer’s reaction force.

4. Discussion

SMG is a novel HMI method that allows users to control a prosthetic hand by capturing the residual muscles’ activities using ultrasound imaging. Figure 1 illustrates the SMG method as a new HMI technique for controlling prostheses with multiple degrees of freedom. In this study, the potential to control a prosthetic hand using SMG was evaluated. To classify different hand gestures, a combination of transfer learning models (VGG16, VGG19, and InceptionResNetV2) and machine learning algorithms was utilized. The results show that this new method has high potential for use in the control of prosthetic hands.
The offline classification results showed that combining one of the transfer learning models with algorithms such as Random Forest, k-nearest neighbors, Decision Tree Classifier, and Support Vector Machine, as well as with regression methods, yielded accuracies exceeding 91% for hand gesture recognition. Conversely, models based on support vector regression (linear and polynomial) and a multi-layer perceptron demonstrated substantially lower accuracy, between 23% and 56%. Cross-validation further confirmed the robustness of the top-performing models, with an average accuracy of approximately 99.8% and minimal variability across evaluations.
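The model comparison described above can be sketched with cross-validated scoring of each classifier on the same feature set. This is an illustrative sketch on synthetic data (the study's extracted features are not assumed available), mirroring how RF, KNN, and DTC were ranked:

```python
# Minimal sketch of the offline comparison: the same features are handed to
# several classifiers and scored with 5-fold cross-validation. The synthetic
# dataset and hyperparameters are assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Nine classes, stand-in for nine hand gestures.
X, y = make_classification(n_samples=450, n_features=64, n_informative=20,
                           n_classes=9, n_clusters_per_class=1, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "DTC": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```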
In the functional evaluation test, we found that the volunteers who attended our study were able to control the prosthesis and execute the different hand gestures needed for ADLs without any previous training. We found that collecting data from participants’ hands while they performed movements and rotated their wrists (see Figure S1) significantly reduced the misclassification errors during changes in arm position. This approach enhances the reliability of the control system, making it suitable for use in real-world settings beyond the laboratory. Moreover, the scores achieved by the two volunteers in the ARAT show that the developed SMG system has the potential to assist people with transradial amputation in performing the different hand gestures needed for ADLs (Figure 8), and they also demonstrate that the functionality of the prosthesis matches that of a hand with moderate function. The B&B and TB&B tests showed the functionality of the developed robot with this novel controlling system in manipulating objects by pinching. Moreover, no misclassification during hand movements was observed while the volunteers transferred blocks (Supplementary Videos S1–S3).
During data collection and the testing of the SMG controlling system, we noticed that gel pads and ultrasonic gels increased the possibility of probe movement, which significantly lowered the precision and reliability of the SMG controlling system. In addition, prolonged contact with moisture puts the skin at risk, and gels contaminate the housing in which the ultrasound probe is mounted. Several potential solutions to these problems have been proposed and evaluated by researchers in recent years. For instance, Wang et al. recently created a bioadhesive ultrasound (BAUS) device that can provide continuous imaging of organs for 48 h. To securely adhere an array of piezoelectric elements to the skin without ultrasound gel, they utilized a soft, tough, anti-drying, and bioadhesive hydrogel–elastomer hybrid couplant layer [74].
In this study, we proposed the utilization of a biocompatible sticky silicone pad as an alternative to ultrasound gel. It was discovered that a silicone pad can be used instead of ultrasound gel or gel pads, avoiding skin contact with moisture and thereby preventing serious skin problems. We also observed that sticky silicone pads not only help to capture images of muscles with good resolution but also, by increasing the friction between the transducer and the skin, prevent the relocation of the transducer, resulting in less misclassification during real-time control. In addition, during the offline test, the accuracy of hand gesture classification in the able-bodied group was around 99% when ultrasound gel was used to collect the data. However, when the ultrasound gel was replaced with a silicone pad, the accuracy increased to 100%.
It is well known that muscles become fatigued under continuous contraction. The effect of this phenomenon on prosthesis control using electromyography has been reported in previous studies [75,76]. The EMG signal magnitude increases during muscle fatigue, while the center frequency of the EMG decreases. Therefore, the change in frequency can be used to compensate for the change in EMG magnitude, as in Park and Meek’s study in 1993 [75]. In a more recent study in 2019 [76], the training data also included EMG signals collected under fatigue, so that the prediction model providing the prosthesis control signals could work with both unfatigued and fatigued muscle. Shi et al. (2007) demonstrated that SMG signals can also be used to evaluate muscle fatigue; i.e., muscle thickness increases during muscle fatigue [77]. This finding also indicates that muscle architecture changes under fatigue. In future studies on SMG prosthesis control, it will be important to include training images collected under fatigue when developing the model.
In addition, after the prosthesis has been used for a period, the residual muscle of the amputee may change over time. For example, the residual muscle may become stronger, leading to morphological changes, after the user continuously uses the prosthesis for a certain period. Under such circumstances, the originally trained model may no longer achieve very high accuracy. We propose two possible solutions for future investigation. Firstly, users can periodically collect new images of their forearm muscles, update their training dataset, and retrain the controlling system. Alternatively, the AI model can be designed to automatically update its dataset by capturing new images while the user is using the prosthetic hand; the model can then retrain itself with the updated dataset while the prosthetic hand is charging.
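The proposed retraining loop can be sketched as below. The class name, the simple list-based sample store, and the KNN classifier are illustrative assumptions, not the study's implementation: newly captured feature vectors are buffered during use and the classifier is refit on the combined dataset, e.g. while the prosthesis charges.

```python
# Minimal sketch of periodic retraining with user-collected data; all names
# and data here are hypothetical.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

class GestureModel:
    def __init__(self):
        self.X, self.y = [], []
        self.clf = KNeighborsClassifier(n_neighbors=3)

    def add_samples(self, feats, labels):
        """Buffer newly captured feature vectors for the next retrain."""
        self.X.extend(feats)
        self.y.extend(labels)

    def retrain(self):
        """Refit on the full (old + new) dataset, e.g. during charging."""
        X, y = np.asarray(self.X), np.asarray(self.y)
        self.clf.fit(X, y)
        return self.clf.score(X, y)

rng = np.random.default_rng(1)
model = GestureModel()
model.add_samples(rng.normal(0, 1, (20, 16)), [0] * 20)   # initial gesture data
model.add_samples(rng.normal(5, 1, (20, 16)), [1] * 20)   # newly captured data
print("accuracy after retrain:", model.retrain())
```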

Limitations and Future Works

In this study, we used a large number of images collected from each amputee’s residual limb to train the model for real-time evaluation. To focus on the functionality of the prosthetic hand and the capabilities of the controlling system, we trained the model with the essential, functional hand gestures and devoted more time to evaluating the functionality of the hand. In future studies, the control of more complete sets of hand gestures can be evaluated.
Even though the volunteers in this study were able to complete the various tasks, they found it difficult to pick up small objects due to the lack of sensory feedback. In the hand function tests, they could control the prosthesis only by watching the hand movements, without sensing the location of each finger, which made it difficult for them to exercise fine control over the prosthesis. Moreover, to develop a cost-effective prosthetic hand, a minimum number of actuators and electronic components was used. However, in the hand function tests, it was observed that the participants had difficulty performing some daily activities due to the lack of wrist rotation. They could pick up blocks, but they needed to move their entire body to grasp and hold a glass, especially when simulating pouring water from one glass to another. Furthermore, the participants sometimes complained that the prosthesis blocked their view, making it difficult to see the objects they wanted to pick up. In addition, based on the test results, the prosthesis could perform the pinch gesture, but it was difficult for the subjects to pick up small objects such as coins, paper clips, and ball bearings via pinching. Finally, to control the robot using the SMG technique, a wireless ultrasound transducer was mounted in the socket. The ultrasound device used in this study measured 110 × 56 × 10 mm and weighed approximately 80 g. To accommodate this device in the socket, we had to make the entire prosthesis slightly thicker than typical myoelectric prosthetics.
One of the main concerns regarding the SMG system is its long-term feasibility and power consumption. Although the SMG control system demonstrated promising accuracy, this performance can be influenced by changes in muscle structure due to fatigue or alterations in muscle tone over time. Additionally, ultrasound devices typically have higher power consumption compared to other sensing methods such as electromyography (EMG), which may limit their battery life and continuous use duration. These factors could impact the practicality and user comfort of the prosthesis during extended daily use, underscoring the need for the ongoing optimization of sensor design and improvements in energy efficiency.
In the future, different non-invasive methods for giving sensory feedback to amputees will be studied to not only increase the functionality of the hand but also decrease the occurrence of phantom pain in people with hand amputations [78,79,80,81]. Moreover, by increasing the DOF of the prosthesis and adding one more rotational joint in the thumb and one in the wrist, we will improve the dexterity, pinching, and wrist rotation of the prosthetic hand [82,83,84]. To remedy the limitations caused by the rigidity of the prosthetic hand, a combination of rigid components and soft materials will be utilized in the future to make the prosthesis more like a human hand, with higher dexterity and flexibility [10,13,85,86,87]. Finally, different AI methods will be used to predict not only the intended hand gestures but also the amount of intended finger flexion. This will provide proportional and natural control over prosthetic hands.

5. Patents

US patent application No. 18/305,4415, publication No. US2024/0225861 A1, title: PROSTHETIC HAND DEVICE USING A WEARABLE ULTRASOUND MODULE AS A HUMAN MACHINE INTERFACE [88].

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/s25133968/s1. Figure S1: The box and blocks test kit; Figure S2: The targeted box and blocks test; Figure S3: The Action Research Arm Test kit; Figure S4: Flipping hand without moving wrist; Video S1: Assessing the functionality of the ProRuka using the box and blocks (B&B) test; Video S2: Evaluating the functionality of the ProRuka using the 4 × 4 targeted box and blocks (TB&B) test; Video S3: Assessment of ProRuka’s functionality using 3 × 3 TB&B test; Table S1: Comparison of mainstream prosthetic hand control technologies including SMG and EMG [57,58,59,60,61,62,63,64,65].

Author Contributions

Y.-P.Z. and V.N. conducted the mechatronic integration, tested the device, and performed all the clinical trials. V.N. conducted the mechanical development of the systems. V.N. developed the electronics. V.N. wrote the manuscript. Y.-P.Z. and V.N. contributed to the writing of the paper. V.N. prepared the figures and videos. Y.-P.Z. supervised the teams involved in the study and collected the funding to perform the study. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the Telefield Charitable Fund (ZH3V) and The Research Grant Council of Hong Kong (15217224).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved on 20 July 2022 by the Institutional Review Board of The Hong Kong Polytechnic University (reference number: HSEARS20220720001).

Informed Consent Statement

Informed consent was obtained from all the subjects involved in the study.

Data Availability Statement

All data and materials are available in the main text or the Supplementary Materials.

Acknowledgments

We are sincerely grateful to the subjects who participated in the research for their patience and commitment to our study. In addition, we would like to acknowledge and express our gratitude to May Wai Yoyo Lau for her help in facilitating the training sessions and Lyn Wong for her administrative support, as well as Yan To Ling, Tsung Kwan Shea, and Ka Shing Lee for their significant contributions to this project.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
HMI: Human–Machine Interface
SMG: Sonomyography
EMG: Electromyography
EEG: Electroencephalography
MIRA: Myoelectric Implantable Recording Array
MM: Magnetomicrometry
SDA: Subclass Discriminant Analysis
PCA: Principal Component Analysis
SVM: Support Vector Machine
BP-ANN: Backpropagation Artificial Neural Network
DOF: Degree of Freedom
B&B: Box and Blocks
TB&B: Targeted Box and Blocks
ARAT: Action Research Arm Test
CNN: Convolutional Neural Network
RF: Random Forest
KNN: K-Nearest Neighbors
DTC: Decision Tree Classifier
SVR: Support Vector Regression
NNR: Nearest Neighbor Regression
DTR: Decision Tree Regression
FDS: Flexor Digitorum Superficialis
FPL: Flexor Pollicis Longus
FDP: Flexor Digitorum Profundus
ADL: Activity of Daily Living

References

  1. Damerla, R.; Qiu, Y.; Sun, T.M.; Awtar, S. A review of the performance of extrinsically powered prosthetic hands. IEEE Trans. Med. Robot. Bionics 2021, 3, 640–660. [Google Scholar] [CrossRef]
  2. Cordella, F.; Ciancio, A.L.; Sacchetti, R.; Davalli, A.; Cutti, A.G.; Guglielmelli, E.; Zollo, L. Literature review on needs of upper limb prosthesis users. Front. Neurosci. 2016, 10, 209. [Google Scholar] [CrossRef] [PubMed]
  3. Belter, J.T.; Segil, J.L.; Dollar, A.M.; Weir, R.F. Mechanical design and performance specifications of anthropomorphic prosthetic hands: A review. J. Rehabil. Res. Dev. 2013, 50, 599. [Google Scholar] [CrossRef]
  4. Xu, K.; Guo, W.; Hua, L.; Sheng, X.; Zhu, X. A prosthetic arm based on EMG pattern recognition. In Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), Qingdao, China, 3–7 December 2016; pp. 1179–1184. [Google Scholar]
  5. O’Neill, C. An advanced, low cost prosthetic arm. In Proceedings of the 2014 IEEE Sensors Conference, Valencia, Spain, 2–5 November 2014; pp. 494–498. [Google Scholar]
  6. Gretsch, K.F.; Lather, H.D.; Peddada, K.V.; Deeken, C.R.; Wall, L.B.; Goldfarb, C.A. Development of novel 3D-printed robotic prosthetic for transradial amputees. Prosthet. Orthot. Int. 2016, 40, 400–403. [Google Scholar] [CrossRef]
  7. Kontoudis, G.P.; Liarokapis, M.V.; Zisimatos, A.G.; Mavrogiannis, C.I.; Kyriakopoulos, K.J. Open-source, anthropomorphic, underactuated robot hands with a selectively lockable differential mechanism: Towards affordable prostheses. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 5857–5862. [Google Scholar]
  8. Ng, K.H.; Nazari, V.; Alam, M. Can Prosthetic Hands Mimic a Healthy Human Hand? Prosthesis 2021, 3, 11–23. [Google Scholar] [CrossRef]
  9. Ke, A.; Huang, J.; Wang, J.; Xiong, C.; He, J. Optimal design of dexterous prosthetic hand with five-joint thumb and fingertip tactile sensors based on novel precision grasp metric. Mech. Mach. Theory 2022, 171, 104759. [Google Scholar] [CrossRef]
  10. Mohammadi, A.; Lavranos, J.; Zhou, H.; Mutlu, R.; Alici, G.; Tan, Y.; Choong, P.; Oetomo, D. A practical 3D-printed soft robotic prosthetic hand with multi-articulating capabilities. PLoS ONE 2020, 15, e0232766. [Google Scholar] [CrossRef]
  11. Yang, D.; Liu, H. Human-machine shared control: New avenue to dexterous prosthetic hand manipulation. Sci. China Technol. Sci. 2021, 64, 767–773. [Google Scholar] [CrossRef]
  12. Furui, A.; Eto, S.; Nakagaki, K.; Shimada, K.; Nakamura, G.; Masuda, A.; Chin, T.; Tsuji, T. A myoelectric prosthetic hand with muscle synergy–based motion determination and impedance model–based biomimetic control. Sci. Robot. 2019, 4, eaaw6339. [Google Scholar] [CrossRef]
  13. Gu, G.; Zhang, N.; Xu, H.; Lin, S.; Yu, Y.; Chai, G.; Ge, L.; Yang, H.; Shao, Q.; Sheng, X. A soft neuroprosthetic hand providing simultaneous myoelectric control and tactile feedback. Nat. Biomed. Eng. 2021, 7, 589–598. [Google Scholar] [CrossRef]
  14. Laffranchi, M.; Boccardo, N.; Traverso, S.; Lombardi, L.; Canepa, M.; Lince, A.; Semprini, M.; Saglia, J.A.; Naceri, A.; Sacchetti, R. The Hannes hand prosthesis replicates the key biological properties of the human hand. Sci. Robot. 2020, 5, eabb0467. [Google Scholar] [CrossRef] [PubMed]
  15. Biddiss, E.A.; Chau, T.T. Upper limb prosthesis use and abandonment: A survey of the last 25 years. Prosthet. Orthot. Int. 2007, 31, 236–257. [Google Scholar] [CrossRef] [PubMed]
  16. Resnik, L.; Meucci, M.R.; Lieberman-Klinger, S.; Fantini, C.; Kelty, D.L.; Disla, R.; Sasson, N. Advanced upper limb prosthetic devices: Implications for upper limb prosthetic rehabilitation. Arch. Phys. Med. Rehabil. 2012, 93, 710–717. [Google Scholar] [CrossRef]
  17. Lewis, S.; Russold, M.; Dietl, H.; Kaniusas, E. Satisfaction of prosthesis users with electrical hand prostheses and their suggested improvements. Biomed. Eng./Biomed. Tech. 2013, 58, 000010151520134385. [Google Scholar] [CrossRef]
  18. Weiner, P.; Starke, J.; Rader, S.; Hundhausen, F.; Asfour, T. Designing prosthetic hands with embodied intelligence: The kit prosthetic hands. Front. Neurorobotics 2022, 16, 815716. [Google Scholar] [CrossRef] [PubMed]
  19. Nazarpour, K. A more human prosthetic hand. Sci. Robot. 2020, 5, eabd9341. [Google Scholar] [CrossRef]
  20. Balasubramanian, R.; Santos, V.J. The Human Hand as an Inspiration for Robot Hand Development; Springer: Berlin/Heidelberg, Germany, 2014; Volume 95. [Google Scholar]
  21. Varol, H.A.; Dalley, S.A.; Wiste, T.E.; Goldfarb, M. Biomimicry and the design of multigrasp transradial prostheses. In The Human Hand as an Inspiration for Robot Hand Development; Springer: Berlin/Heidelberg, Germany, 2014; pp. 431–451. [Google Scholar]
  22. Controzzi, M.; Cipriani, C.; Carrozza, M.C. Design of artificial hands: A review. In The Human Hand as an Inspiration for Robot Hand Development; Springer: Berlin/Heidelberg, Germany, 2014; pp. 219–246. [Google Scholar]
  23. Marinelli, A.; Boccardo, N.; Tessari, F.; Di Domenico, D.; Caserta, G.; Canepa, M.; Gini, G.; Barresi, G.; Laffranchi, M.; De Michieli, L. Active upper limb prostheses: A review on current state and upcoming breakthroughs. Prog. Biomed. Eng. 2023, 5, 012001. [Google Scholar] [CrossRef]
  24. Bicchi, A.; Gabiccini, M.; Santello, M. Modelling natural and artificial hands with synergies. Philos. Trans. R. Soc. B Biol. Sci. 2011, 366, 3153–3161. [Google Scholar] [CrossRef]
  25. Leo, A.; Handjaras, G.; Bianchi, M.; Marino, H.; Gabiccini, M.; Guidi, A.; Scilingo, E.P.; Pietrini, P.; Bicchi, A.; Santello, M. A synergy-based hand control is encoded in human motor cortical areas. Elife 2016, 5, e13420. [Google Scholar] [CrossRef]
  26. Weiner, P.; Starke, J.; Hundhausen, F.; Beil, J.; Asfour, T. The KIT prosthetic hand: Design and control. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 3328–3334. [Google Scholar]
  27. Murray, C.D. Embodiment and prosthetics. In Psychoprosthetics; Springer: Berlin/Heidelberg, Germany, 2008; pp. 119–129. [Google Scholar]
  28. De Vignemont, F. Embodiment, ownership and disownership. Conscious. Cogn. 2011, 20, 82–93. [Google Scholar] [CrossRef]
  29. D’Anna, E.; Valle, G.; Mazzoni, A.; Strauss, I.; Iberite, F.; Patton, J.; Petrini, F.M.; Raspopovic, S.; Granata, G.; Di Iorio, R. A closed-loop hand prosthesis with simultaneous intraneural tactile and position feedback. Sci. Robot. 2019, 4, eaau8892. [Google Scholar] [CrossRef] [PubMed]
  30. Krebs, H.I.; Palazzolo, J.J.; Dipietro, L.; Ferraro, M.; Krol, J.; Rannekleiv, K.; Volpe, B.T.; Hogan, N. Rehabilitation robotics: Performance-based progressive robot-assisted therapy. Auton. Robot. 2003, 15, 7–20. [Google Scholar] [CrossRef]
  31. Nazari, V.; Pouladian, M.; Zheng, Y.-P.; Alam, M. A compact and lightweight rehabilitative exoskeleton to restore grasping functions for people with hand paralysis. Sensors 2021, 21, 6900. [Google Scholar] [CrossRef] [PubMed]
  32. Qassim, H.M.; Wan Hasan, W. A review on upper limb rehabilitation robots. Appl. Sci. 2020, 10, 6976. [Google Scholar] [CrossRef]
  33. Ribeiro, J.; Mota, F.; Cavalcante, T.; Nogueira, I.; Gondim, V.; Albuquerque, V.; Alexandria, A. Analysis of man-machine interfaces in upper-limb prosthesis: A review. Robotics 2019, 8, 16. [Google Scholar] [CrossRef]
  34. Begovic, H.; Zhou, G.-Q.; Li, T.; Wang, Y.; Zheng, Y.-P. Detection of the electromechanical delay and its components during voluntary isometric contraction of the quadriceps femoris muscle. Front. Physiol. 2014, 5, 494. [Google Scholar] [CrossRef] [PubMed]
  35. Setiawan, J.D.; Alwy, F.; Ariyanto, M.; Samudro, L.; Ismail, R. Flexion and Extension Motion for Prosthetic Hand Controlled by Single-Channel EEG. In Proceedings of the 2021 8th International Conference on Information Technology, Computer and Electrical Engineering (ICITACEE), Semarang, Indonesia, 23–24 September 2021; pp. 47–52. [Google Scholar]
  36. Ahmadian, P.; Cagnoni, S.; Ascari, L. How capable is non-invasive EEG data of predicting the next movement? A mini review. Front. Hum. Neurosci. 2013, 7, 124. [Google Scholar] [CrossRef]
  37. Gstoettner, C.; Festin, C.; Prahm, C.; Bergmeister, K.D.; Salminger, S.; Sturma, A.; Hofer, C.; Russold, M.F.; Howard, C.L.; McDonnall, D. Feasibility of a wireless implantable multi-electrode system for high-bandwidth prosthetic interfacing: Animal and cadaver study. Clin. Orthop. Relat. Res. 2022, 480, 1191–1204. [Google Scholar] [CrossRef]
  38. Taylor, C.R.; Srinivasan, S.S.; Yeon, S.H.; O’Donnell, M.; Roberts, T.; Herr, H.M. Magnetomicrometry. Sci. Robot. 2021, 6, eabg0656. [Google Scholar] [CrossRef]
  39. Zheng, Y.-P.; Chan, M.; Shi, J.; Chen, X.; Huang, Q.-H. Sonomyography: Monitoring morphological changes of forearm muscles in actions with the feasibility for the control of powered prosthesis. Med. Eng. Phys. 2006, 28, 405–415. [Google Scholar] [CrossRef]
  40. Zhou, Y.; Zheng, Y.P. Sonomyography: Dynamic and Functional Assessment of Muscle Using Ultrasound Imaging, 1st ed.; Springer: Singapore, 2021; pp. 1–252. [Google Scholar]
  41. Guo, J.Y.; Zheng, Y.P.; Huang, Q.H.; Chen, X.; He, J.F.; Chan, H.L. Performances of one-dimensional sonomyography and surface electromyography in tracking guided patterns of wrist extension. Ultrasound Med. Biol. 2009, 35, 894–902. [Google Scholar] [CrossRef] [PubMed]
  42. Chen, X.; Zheng, Y.P.; Guo, J.Y.; Shi, J. Sonomyography (SMG) control for powered prosthetic hand: A study with normal subjects. Ultrasound Med. Biol. 2010, 36, 1076–1088. [Google Scholar] [CrossRef]
  43. Shi, J.; Chang, Q.; Zheng, Y.P. Feasibility of controlling prosthetic hand using sonomyography signal in real time: Preliminary study. J. Rehabil. Res. Dev. 2010, 47, 87–98. [Google Scholar] [CrossRef]
  44. Shi, J.; Guo, J.Y.; Hu, S.X.; Zheng, Y.P. Recognition of finger flexion motion from ultrasound image: A feasibility study. Ultrasound Med. Biol. 2012, 38, 1695–1704. [Google Scholar] [CrossRef]
  45. Guo, J.Y.; Zheng, Y.P.; Xie, H.B.; Koo, T.K. Towards the application of one-dimensional sonomyography for powered upper-limb prosthetic control using machine learning models. Prosthet. Orthot. Int. 2013, 37, 43–49. [Google Scholar] [CrossRef]
  46. Kamatham, A.T.; Mukherjee, B. Design and optimization of a wearable sonomyography sensor for dynamic muscle activity monitoring. In Proceedings of the 2023 IEEE Applied Sensing Conference (APSCON), Bengaluru, India, 23–25 January 2023; pp. 1–3. [Google Scholar]
  47. Manikandan, S.; Prasad, A.; Mukherjee, B.; Sridar, P. Towards Deep Learning-Based Classification of Multiple Gestures using Sonomyography for Prosthetic Control Applications. In Proceedings of the 2023 3rd International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME), Tenerife, Spain, 19–21 July 2023; pp. 1–5. [Google Scholar]
  48. Ma, C.Z.-H.; Ling, Y.T.; Shea, Q.T.K.; Wang, L.-K.; Wang, X.-Y.; Zheng, Y.-P. Towards wearable comprehensive capture and analysis of skeletal muscle activity during human locomotion. Sensors 2019, 19, 195. [Google Scholar] [CrossRef] [PubMed]
  49. Fifer, M.S.; McMullen, D.P.; Osborn, L.E.; Thomas, T.M.; Christie, B.; Nickl, R.W.; Candrea, D.N.; Pohlmeyer, E.A.; Thompson, M.C.; Anaya, M.A.; et al. Intracortical Somatosensory Stimulation to Elicit Fingertip Sensations in an Individual with Spinal Cord Injury. Neurology 2022, 98, e679–e687. [Google Scholar] [CrossRef] [PubMed]
  50. Engdahl, S.; Mukherjee, B.; Akhlaghi, N.; Dhawan, A.; Bashatah, A.; Patwardhan, S.; Holley, R.; Kaliki, R.; Monroe, B.; Sikdar, S. A novel method for achieving dexterous, proportional prosthetic control using sonomyography. In Proceedings of the MEC20 Symposium, Fredericton, NB, Canada, 10–13 August 2020; The Institute of Biomedical Engineering, University of New Brunswick: Fredericton, NB, Canada, 2020. [Google Scholar]
  51. Yang, X.; Yan, J.; Fang, Y.; Zhou, D.; Liu, H. Simultaneous prediction of wrist/hand motion via wearable ultrasound sensing. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 970–977. [Google Scholar] [CrossRef]
  52. Akhlaghi, N.; Dhawan, A.; Khan, A.A.; Mukherjee, B.; Diao, G.; Truong, C.; Sikdar, S. Sparsity analysis of a sonomyographic muscle–computer interface. IEEE Trans. Biomed. Eng. 2019, 67, 688–696. [Google Scholar] [CrossRef]
  53. Li, J.; Zhu, K.; Pan, L. Wrist and finger motion recognition via M-mode ultrasound signal: A feasibility study. Biomed. Signal Process. Control. 2022, 71, 103112. [Google Scholar] [CrossRef]
  54. Guo, J.-Y.; Zheng, Y.-P.; Huang, Q.-H.; Chen, X. Dynamic monitoring of forearm muscles using one-dimensional sonomyography system. J. Rehabil. Res. Dev. 2008, 45, 187–195. [Google Scholar] [CrossRef] [PubMed]
  55. Castellini, C.; Passig, G.; Zarka, E. Using ultrasound images of the forearm to predict finger positions. IEEE Trans. Neural Syst. Rehabil. Eng. 2012, 20, 788–797. [Google Scholar] [CrossRef]
  56. Castellini, C.; Passig, G. Ultrasound image features of the wrist are linearly related to finger positions. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 2108–2114. [Google Scholar]
  57. Nazari, V.; Zheng, Y.-P. Controlling Upper Limb Prostheses Using Sonomyography (SMG): A Review. Sensors 2023, 23, 1885. [Google Scholar] [CrossRef]
  58. Maibam, P.C.; Pei, D.; Olikkal, P.; Vinjamuri, R.K.; Kakoty, N.M. Enhancing prosthetic hand control: A synergistic multi-channel electroencephalogram. Wearable Technol. 2024, 5, e18. [Google Scholar] [CrossRef] [PubMed]
  59. Prakash, A.; Sharma, S.; Sharma, N. A compact-sized surface EMG sensor for myoelectric hand prosthesis. Biomed. Eng. Lett. 2019, 9, 467–479. [Google Scholar] [CrossRef]
  60. Dunai, L.; Segui, V.I.; Tsurcanu, D.; Bostan, V. Prosthetic Hand Based on Human Hand Anatomy Controlled by Surface Electromyography and Artificial Neural Network. Technologies 2025, 13, 21. [Google Scholar] [CrossRef]
Figure 1. Sonomyography as a novel HMI method: The overall schematic of the sonomyography (SMG) setup and control of the prosthetic hand using an ultrasound probe.
Figure 2. The overall schematic of the classification process: a transfer learning model was used to extract features from the images, and the extracted features were then used to train machine learning classifiers.
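The two-stage pipeline in the Figure 2 caption (pretrained CNN for feature extraction, classical classifier on top) can be sketched as follows. This is a minimal illustration, not the authors' code: loading VGG16 requires TensorFlow/Keras, so synthetic 512-dimensional feature vectors stand in for the CNN output, and the gesture clusters are simulated as well-separated Gaussians.

```python
# Sketch of the Figure 2 pipeline: features extracted per ultrasound frame
# (by VGG16 in the paper; simulated here) feed a classical classifier (RF).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_gestures, per_gesture, n_features = 9, 40, 512  # 512-dim stand-in for VGG16 features

# Synthetic "extracted features": one well-separated cluster per hand gesture.
centers = rng.normal(0.0, 5.0, size=(n_gestures, n_features))
X = np.vstack([c + rng.normal(0.0, 0.5, size=(per_gesture, n_features)) for c in centers])
y = np.repeat(np.arange(n_gestures), per_gesture)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"9-gesture accuracy on synthetic features: {acc:.2f}")
```

Swapping `RandomForestClassifier` for `KNeighborsClassifier` or `DecisionTreeClassifier` reproduces the other classifier variants compared in Table 1.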
Figure 3. Utilizing a silicone pad instead of ultrasound gel: image quality using a silicone pad (A) and ultrasound gel (B).
Figure 4. The design of ProRuka: (A) The design of ProRuka, a 3D-printed, lightweight, cost-effective, and multi-degree-of-freedom prosthetic hand; (B) an exploded view of the prosthesis.
Figure 5. The experimental setup: (A) The setup for data collection to test the performance of offline classification (experiment 1) and collect ultrasound images of the main muscles responsible for finger flexion. (B) The area on the forearm used to capture the best muscle activities to control the robot. (C) The data collection setup used to train the model for functional testing (experiment 2).
Figure 6. The results of the offline test: offline classification results for nine different hand gestures in the group of 10 able-bodied participants. VGG16 was used to extract features from the collected data, and these features were used to train an RF, KNN, or DTC model. The figure shows confusion matrices for hand gesture classification using the (A) Random Forest, (B) KNN, and (C) decision tree algorithms.
Figure 7. Two-dimensional t-SNE visualization: 2D t-SNE visualization of extracted features from transfer learning model with decision boundaries learned by (A) KNN, (B) RF, (C) DTC.
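The Figure 7 visualization (projecting high-dimensional transfer-learning features to 2D with t-SNE so gesture clusters can be inspected) can be sketched with scikit-learn. Again, synthetic feature vectors stand in for the real VGG16 output; this is an illustrative sketch, not the authors' implementation.

```python
# Sketch of the Figure 7 step: embed per-frame CNN features into 2D with t-SNE.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
n_gestures, per_gesture, n_features = 9, 30, 512
centers = rng.normal(0.0, 5.0, size=(n_gestures, n_features))
X = np.vstack([c + rng.normal(0.0, 0.5, size=(per_gesture, n_features)) for c in centers])
labels = np.repeat(np.arange(n_gestures), per_gesture)  # gesture label per frame

# perplexity must be below the sample count; the default of 30 fits 270 points.
emb = TSNE(n_components=2, perplexity=30, random_state=1).fit_transform(X)
print(emb.shape)  # (270, 2): one 2D point per ultrasound frame
```

Plotting `emb` colored by `labels` (e.g., with matplotlib's `scatter`) yields the kind of cluster map shown in Figure 7; well-separated clusters indicate features that classical classifiers can separate easily.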
Figure 8. ProRuka in activities of daily living: the novel SMG system enables a multi-degree-of-freedom prosthetic hand to be used in daily activities.
Table 1. This table summarizes the performance of various machine learning algorithms using different transfer learning models in classifying nine different hand gestures. Accuracy is presented as a percentage, indicating the effectiveness of each method.
Machine Learning Algorithm       | Transfer Learning Model | Accuracy
---------------------------------|-------------------------|---------
Random Forest (RF)               | InceptionResNetV2       | 100%
K-Nearest Neighbors (KNN)        | InceptionResNetV2       | 100%
Decision Tree Classifier (DTC)   | InceptionResNetV2       | 100%
Support Vector Machine (SVM)     | InceptionResNetV2       | 100%
Random Forest (RF)               | VGG19                   | 100%
K-Nearest Neighbors (KNN)        | VGG19                   | 100%
Decision Tree Classifier (DTC)   | VGG19                   | 100%
Support Vector Machine (SVM)     | VGG19                   | 100%
Random Forest (RF)               | VGG16                   | 100%
K-Nearest Neighbors (KNN)        | VGG16                   | 100%
Decision Tree Classifier (DTC)   | VGG16                   | 100%
Support Vector Machine (SVM)     | VGG16                   | 100%
Multi-Layer Perceptron (MLP)     | VGG16                   | 23%
Table 2. This table displays the performance of various regression algorithms using VGG16 for transfer learning in classifying nine different hand gestures. Accuracy is shown as a percentage, reflecting the effectiveness of each regression method.
Regression Algorithm              | Accuracy
----------------------------------|---------
Neural Network Regression (NNR)   | 100%
Decision Tree Regression (DTR)    | 91.72%
Support Vector Regression (SVR-L) | 55.96%
Support Vector Regression (SVR-P) | 55.38%
Table 3. Results of the hand function evaluation using the B&B test, TB&B (4 × 4) test, TB&B (3 × 3) test, and ARAT. Both volunteers were right-handed and had left transradial amputations.
Test         | Metric           | Hand  | A1    | A2
-------------|------------------|-------|-------|-------
B&B          | Number of blocks | Left  | 12    | 8
             |                  | Right | 45    | 47
TB&B (4 × 4) | Time (seconds)   | Left  | 86.66 | 136.79
             |                  | Right | 31.31 | 21.23
TB&B (3 × 3) | Time (seconds)   | Left  | 41.40 | 67.18
             |                  | Right | 17.00 | 12.28
ARAT         | Score (total)    | Left  | 40    | 40
             |                  | Right | 57    | 57
B&B: Box and Blocks; TB&B: Targeted Box and Blocks; ARAT: Action Research Arm Test.