Article

Flex Sensor Compensator via Hammerstein–Wiener Modeling Approach for Improved Dynamic Goniometry and Constrained Control of a Bionic Hand

by Syed Afdar Ali Syed Mubarak Ali, Nur Syazreen Ahmad * and Patrick Goh
School of Electrical and Electronic Engineering, Engineering Campus, Universiti Sains Malaysia, 14300 Nibong Tebal, Pulau Pinang, Malaysia
* Author to whom correspondence should be addressed.
Sensors 2019, 19(18), 3896; https://doi.org/10.3390/s19183896
Submission received: 2 August 2019 / Revised: 30 August 2019 / Accepted: 7 September 2019 / Published: 10 September 2019
(This article belongs to the Section Physical Sensors)

Abstract:
In this paper, a new control-centric approach is introduced to model the characteristics of flex sensors on a goniometric glove, which is designed to capture the user's hand gestures for wireless control of a bionic hand. The main technique employs an inverse dynamic model strategy along with black-box identification for the compensator design, which aims to provide an approximately linear mapping between the raw sensor output and the dynamic finger goniometry. To smoothly recover the goniometry on the bionic hand's side during wireless transmission, the compensator is restructured into a Hammerstein–Wiener model, which consists of a linear dynamic system and two static nonlinearities. A series of real-time experiments involving several hand gestures was conducted to analyze the performance of the proposed method. The associated temporal and spatial gesture data from both the glove and the bionic hand were recorded, and the performance is evaluated in terms of the integral of absolute error between the glove's and the bionic hand's dynamic goniometry. The proposed method is also compared with the raw sensor data, which were preliminarily calibrated against the finger goniometry, and with a Wiener model based on the initial inverse dynamic design strategy. Experimental results over several trials for each gesture show that a great improvement is obtained via the Hammerstein–Wiener compensator, whose resulting average errors are significantly smaller than those of the other two methods. This indicates that the proposed strategy can markedly improve the dynamic goniometry of the glove, and thus provides smooth human–robot collaboration with the bionic hand.

Graphical Abstract

1. Introduction

Hand gesture recognition refers to the process by which a computing device derives a mathematical interpretation of the hand's movement [1]. It has been a popular research topic over the past few decades owing to rapid advancements in sensor and smart-device technologies [2,3,4]. Its applications are not limited to human–computer or human–machine interaction [5], but also span a diverse range of fields such as sign language recognition [6], clinical rehabilitation [7,8], human–robot collaboration [9,10], and gaming and virtual reality control [11]. In a typical hand gesture recognition system, the initial stage is data acquisition, which can be performed via either vision-based [12] or non-vision-based [13] techniques. Each has its own advantages and disadvantages depending on the application. A hybrid approach combining the two has also been employed in areas that require high speed and precision, such as augmented reality technologies [14].
The vision-based technique, as the name suggests, uses cameras or optical sensors to capture hand gestures. A notable advantage of this technique is that it eliminates the need for wearable interfaces or multi-sensory devices, providing a natural interaction with the computer. In other words, the minimal interference allows the user to perform any hand motion in the most convenient way possible. Nonetheless, the tracking performance is greatly affected by environmental factors such as illumination variations, cluttered backgrounds, and interruption from other moving objects in the scene, particularly those with the same color as the hand's skin. Other major problems caused by the user's motion include out-of-range detection, high-speed movement, and self-occlusion. Such circumstances call for high-specification cameras [15,16] or optical sensors such as the Leap Motion controller [17,18,19,20,21,22] and Microsoft Kinect sensors [23] to enhance performance and robustness. In some cases, multiple cameras at different view angles and positions are required, which increases the computational burden and associated costs [24,25]. This can also lead to bulkiness and may be undesirable for mobile applications.
The non-vision-based method, on the other hand, has the advantage of not being susceptible to occlusion or environmental factors. This approach is also frequently termed the sensor-based method, although strictly it encompasses any kind of sensing device other than image or vision sensors. The most commonly used interface is the glove-based gestural system, which comprises several sensors attached to a cloth glove, a computing device for data processing and transmission, and a power supply. A major benefit of this method over the vision-based technique is its fast response and precise tracking [10]. The performance can be boosted by increasing the number of sensing devices, but this poses a trade-off with size, power consumption, building costs, and, most importantly, user convenience. In the non-vision-based technique, different sensors capture different types of hand gestures, such as palm orientation, wrist movement, hand rotation, and finger goniometry. The first three can be registered simply using accelerometers, gyroscopes, or inertial measurement units (IMUs). With proper calibration, the combination of these gestures allows the overall 3D motion to be captured with good performance.
The predominant gesture-recognition target that is of interest in much research, and yet remains a significant technical challenge, is finger goniometry [6,8,26,27]. Goniometry in general refers to the measurement of angles created at human joints by the bones of the body [28]. A crucial stage before the goniometry can be acquired is modeling the hand, where the variations can be recorded temporally, spatially, or both, depending on the target application. Via a non-image-based data acquisition approach, 2D hand modeling for temporal or spatial pattern assessment can be accomplished using the flex sensor, which has the prominent advantage of changing its resistance when bent. This characteristic makes it suitable for positioning on the finger joints, where the goniometry can be correlated with the sensor's bend angle. In addition, this type of sensor requires only a simple electronic interface to translate the resistance into a voltage output. For applications that need only a nonlinear mapping of gestures to discrete interpretations [29], for instance sign language recognition [24,30], the task is simpler because a static analysis suffices: only the initial and final postures of the fingers need be considered for extraction and classification. For others where a linear gesture mapping is the main design requirement, such as monitoring a patient's hand functionality or motor performance [31,32], the task is considerably more demanding. In this scenario, if the flex sensor is to be the primary tool, a dynamic analysis with a high degree of accuracy within the sensor data acquisition subsystem is needed to provide a precise mapping to the target application.
One major challenge in using the flex sensor for dynamic goniometry is correlating the sensor's bending angle with the goniometry, as the resistance tends to be time-varying and prone to uncertainties, particularly when the sensor is sewn or attached to a cloth glove [33,34]. Moreover, some cloth materials introduce stiffness or are subject to wear and tear, which causes erroneous representation of the bending angle [32]. Owing to this, several data processing algorithms that operate on the raw sensor output have been introduced in the literature to improve gesture tracking performance. One of the most common approaches is the use of machine learning tools such as artificial neural networks [35,36] and hidden Markov models [37,38]. While these provide flexibility in training and verification and are useful for detecting nonlinear relationships between variables, a large training data set is required and many free parameters must be optimized to obtain an accurate model [39]. In cases where a high sample rate is used in data acquisition, or the sensor output is frequently perturbed by Gaussian noise, a Kalman filtering approach is better suited, provided that the sensor bending angle has been well calibrated against the static goniometry [26]. Another recent technique, via first-principles modeling, is proposed in [27], where the mathematical representation relating the flex sensor's resistance to the bending angle is derived from the system's underlying physics. The advantage of this approach is the detailed insight into the system's behavior, which leads to better performance prediction; the disadvantage is the difficulty of determining the phenomenological parameters in the presence of internal and external disturbances.
It is also worth noting that even though similar types of sensors are used in glove-based gestural systems, a straightforward comparison between the methods introduced over the years may not be appropriate, owing to different application-specific tasks and design requirements, particularly those concerning the types and speed of gestures, sampling time, sensor locations, and data mapping [13]. In this work, a new control-centric approach is introduced to model the characteristics of flex sensors on a goniometric glove, which is designed to capture the user's hand gestures for wireless control of a bionic hand. The main technique employs an inverse dynamic model strategy along with black-box identification for the compensator design, which aims to provide an approximately linear mapping between the raw sensor output and the dynamic finger goniometry. To smoothly recover the goniometry on the bionic hand's side during wireless transmission, the linearity of the mapping needs to be improved. Hence, we propose a Hammerstein–Wiener model to enhance the structure of the compensator, which consists of a linear dynamic system and two static nonlinearities. The linear system is constructed by simplifying the dynamic model from the inverse dynamic design technique, while the static nonlinearities are introduced based on the constraints of the bionic hand and to account for the uncertain behavior of the sensors as well as the unmodeled dynamics from the black-box identification method. A series of real-time experiments involving several hand gestures was conducted to analyze the performance of the proposed method. In the experiments, the goniometric speed for each finger is controlled at approximately 83°/s for all gestures.
The associated temporal and spatial data from both the glove and the bionic hand are recorded via an image processing technique, and the performance is evaluated in terms of the integral of absolute error between the glove's and the bionic hand's dynamic goniometry. The proposed method is also compared with the raw sensor data, which were preliminarily calibrated against the finger goniometry, and with a Wiener model based on the initial inverse dynamic design strategy. Experimental results over several trials for each gesture show that the raw sensor data result in average errors between 515°·s and 1347°·s, whereas for the Wiener model the average errors lie between 186°·s and 370°·s, well below the range from the raw data. A significant improvement is obtained via the Hammerstein–Wiener compensator, where the resulting average errors are no greater than 102°·s. This indicates that the proposed strategy can markedly improve the dynamic goniometry of the glove, and thus provides smooth human–robot collaboration with the bionic hand.
The remainder of the paper proceeds as follows: Section 2 discusses the background and statement of the problem concerning the nonlinear characteristics of the flex sensor from a preliminary analysis. The bionic hand description, the goniometric glove structure, the proposed compensator design method, and the experimental setup are explained in detail in Section 3. Section 4 presents the experimental results from several hand gesture tests, and the average error for each method. The results are finally concluded and discussed in Section 5, together with future work.

2. Background and Problem Statement

A flex sensor is basically a variable resistor that reacts to bends, i.e., its resistance increases as the bending angle increases. The flex sensor considered in this work is of the unidirectional type, as shown in Figure 1. In the default position (i.e., flat/relaxed), the resistance measures around 25 kΩ, and it may increase up to 125 kΩ as the sensor bends towards 180°. This is illustrated in Figure 2, where θ and R1 denote the bending angle and resistance, respectively.
The sensor can be configured to act as a voltage divider, where the corresponding output, Vout, is simply
$$V_{out} = \frac{R_1}{R_1 + R_2}\, V_{in}. \tag{1}$$
Theoretically, the value of R1 (in kΩ) relates to the bending angle as follows:
$$R_1 = \frac{100\,\theta}{180} + 25. \tag{2}$$
When the value of R2 is fixed, we have the relation
$$V_{out} = \frac{100\,\theta + 4500}{100\,\theta + 180(25 + R_2)}\, V_{in}, \tag{3}$$
which implies a monotonic, approximately linear relationship between θ and Vout. Nevertheless, when the sensor is attached to a moving finger, the exact value of θ will not be smoothly recovered due to the non-smooth movement of the finger. The position of the sensor with respect to the finger may additionally affect the resistance, leading to unpredictable behavior. A preliminary analysis was conducted to investigate the correlation between the sensor output voltage and the bending angle when the sensor is tied to a cloth glove, as shown in Figure 3. Results from four tests with R2 = 35 kΩ and Vin = 5 V are recorded in Figure 4, where they are also compared with the theoretical values from Equation (3). From the figure, the inconsistencies of the sensor output and the mismatch with the theoretical values reflect the existence of nonlinearities and uncertainties in the sensor model itself.
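As a quick sanity check of the theoretical model, the divider relations in Equations (1)–(3) can be sketched in a few lines of Python. The component values R2 = 35 kΩ and Vin = 5 V follow the preliminary analysis; the function names are illustrative conventions of ours.

```python
# Sketch of the theoretical flex-sensor model: R1 (kOhm) grows linearly
# with the bend angle theta (degrees), and the voltage divider maps R1
# to the output voltage Vout.

def flex_resistance(theta_deg: float) -> float:
    """R1 in kOhm for a bend angle in [0, 180] degrees (Eq. (2))."""
    return 100.0 * theta_deg / 180.0 + 25.0

def divider_vout(theta_deg: float, r2_kohm: float = 35.0, vin: float = 5.0) -> float:
    """Voltage-divider output for a given bend angle (Eqs. (1) and (3))."""
    r1 = flex_resistance(theta_deg)
    return r1 / (r1 + r2_kohm) * vin

if __name__ == "__main__":
    for theta in (0, 45, 90, 135, 180):
        print(f"theta = {theta:3d} deg -> Vout = {divider_vout(theta):.3f} V")
```

Plotting this curve against the measured responses in Figure 4 makes the mismatch attributed to glove-mounting effects easy to visualize.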
The focus of this research is to propose a compensator for the goniometric glove with the aforementioned flex sensors which can dynamically recover the gesture of each finger. To wirelessly control a bionic hand using the recovered gesture, the glove has also been preliminarily designed by taking into account the constraints of the bionic hand. The main strategy to achieve this objective is explained in detail in the next section.

3. Methodology and Experimental Setup

3.1. Bionic Hand Description

Throughout this paper, the index i = 1, 2, 3, 4, 5 will represent signals associated with the thumb, pointer, middle, ring, and pinky fingers, respectively. The bionic hand system used in this work is controlled by five DC motors whose input, β̃ = [β1, β2, β3, β4, β5]ᵀ, is supplied by the signals from an ATmega microcontroller (denoted μC1). The overall closed-loop system is illustrated in Figure 5, where C(s) and H(s) denote the proportional–integral motor controller and the bionic hand, respectively. Each motor is assigned to control the flexion or extension of one finger from the metacarpophalangeal (MCP) joint, and the system is subject to possible bounded disturbances, d_in. In this work, the effects of the disturbance are assumed to be limited to slow and slightly nonlinear movements due to deadzones or friction, and do not lead to instability of the system.
The bionic hand has also been designed to mimic the normal behavior of finger movements which can be mathematically described by
$$\theta_{iD} \approx \tfrac{2}{3}\,\theta_i \quad \text{for } i = 2, 3, 4, 5; \tag{4}$$
$$\theta_{iP} \approx \tfrac{3}{4}\,\theta_i \quad \text{for } i = 2, 3, 4, 5, \qquad \theta_{iP} \approx \tfrac{1}{2}\,\theta_i \quad \text{for } i = 1, \tag{5}$$
where θiD, θiP and θi are the goniometry measured at the distal interphalangeal (DIP), proximal interphalangeal (PIP) and MCP joints, respectively (see the corresponding counterparts in Figure 6). Equations (4) and (5) imply that the fingers' bending angles at the DIP and PIP joints are largely driven by the movement of the MCP joints [26]. For i = 2, 3, 4, 5, the goniometry shares the same reference line, as illustrated by the pointer finger in Figure 6, while for i = 1 the reference line is 90° below that of the other fingers (shown in Figure 7), and only the MCP and interphalangeal (IP) joints exist.
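The inter-joint coupling above can be sketched as follows. This is a hypothetical helper of ours, not code from the paper; angles are in degrees, and the dictionary output is an illustrative convention.

```python
# Sketch of the inter-joint coupling: the DIP and PIP (or IP, for the
# thumb) angles follow the MCP angle theta_i of finger i.

def coupled_joint_angles(theta_mcp: float, i: int) -> dict:
    """Angles at the distal joints implied by the MCP angle of finger i."""
    if i == 1:  # thumb: only MCP and IP joints exist
        return {"IP": theta_mcp / 2.0}
    return {"DIP": (2.0 * theta_mcp) / 3.0, "PIP": (3.0 * theta_mcp) / 4.0}

print(coupled_joint_angles(90.0, 2))  # pointer finger at a 90-degree MCP angle
```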
The overall movement is controlled by the motors attached at the MCP joints, which automatically changes θiD and θiP when θi is changed. Ideally, θ̃ = [θ1, θ2, θ3, θ4, θ5]ᵀ should follow the reference command β̃, but this may not always be the case due to the presence of d_in, which can enter the system at any time instant. Apart from that, a constraint, Ψ = diag(ψ1, ψ2, ψ3, ψ4, ψ5), is imposed on θ̃ to resemble the typical range of joint motion, described as
$$\rho_i = \Psi_i(\theta_i) = \begin{cases} \theta_L & \text{if } \theta_i < \theta_L \\ \theta_i & \text{if } \theta_L \le \theta_i \le \theta_U \\ \theta_U & \text{if } \theta_i > \theta_U \end{cases}$$
where θL and θU denote the lower and upper bounds, respectively. For i = 1 (i.e., the thumb), the movement is limited by θL = 90° and θU = 170°, whereas for i = 2, 3, 4, 5 the bounds are θL = 28° and θU = 113°.
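The constraint Ψ is a simple per-finger saturation; a minimal sketch (our own helper, with the bounds quoted above; angles in degrees):

```python
# Sketch of the range-of-motion constraint: each MCP angle is clamped to
# the bionic hand's feasible range. The bounds follow the text; the
# function name is ours.

def constrain_mcp(theta_i: float, i: int) -> float:
    """rho_i = Psi_i(theta_i): clamp to [theta_L, theta_U] for finger i."""
    theta_l, theta_u = (90.0, 170.0) if i == 1 else (28.0, 113.0)
    return min(max(theta_i, theta_l), theta_u)
```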

3.2. Goniometric Glove with Compensators

In this research, the goniometric glove is made of cotton with a thickness of around 2 mm. Preliminary analyses with a grab-and-release movement were conducted to investigate suitable positions for the sensors on the goniometric glove. Let ρ̃r = [ρr1, ρr2, ρr3, ρr4, ρr5]ᵀ be the angles measured at the MCP joints of the goniometric glove. An image processing technique in MATLAB is used to capture ρ̃r. Figure 8 (left) shows suitable positions of the sensors with respect to the MCP and PIP joints, which register an acceptable and predictable goniometry for each finger. The outputs of the sensors are connected to the analog pins of an ATmega microcontroller (denoted μC2) with a sample rate of 10 Hz. We denote the raw sensor values read by μC2 as α̃ = [α1, α2, α3, α4, α5]ᵀ, where each αi ranges from 0 to 1023. The goniometric glove response in one of the tests is shown in Figure 9.
From Figure 9, it can be observed that when the sensors are positioned as depicted in Figure 8, the response does not deviate far from the fingers’ goniometry. On the other hand, it also suggests that the bionic hand requires a good precompensator to minimize the error between the goniometry and the sensors’ response.
To this end, we propose a dynamic compensator, P(s), constructed by the inverse dynamic approach, where the structure is designed using the inverse of the internal model that characterizes the relationship between ρ̃r and α̃. A black-box system identification technique is used to estimate the dynamics of the model, with α̃ as the input and ρ̃r as the output. The highest accuracy over several datasets from the black-box identification is approximately 70%, and the corresponding model is a linear time-invariant (LTI) model, P(Ap, Bp, Cp, Dp), where Ap = diag(A1, A2, A3, A4, A5), Bp = diag(B1, B2, B3, B4, B5), Cp = diag(C1, C2, C3, C4, C5), with
$$A_1 = \begin{bmatrix} -84.51 & -56.92 & -21.02 \\ 32 & 0 & 0 \\ 0 & 16 & 0 \end{bmatrix}, \quad A_i = \begin{bmatrix} -31.6 & -14.7 \\ 16 & 0 \end{bmatrix} \text{ for } i = 2, 3, 4, \quad A_5 = \begin{bmatrix} -7863 & -673.9 & -81.33 \\ 512 & 0 & 0 \\ 0 & 64 & 0 \end{bmatrix},$$
$$B_1 = \begin{bmatrix} 4 \\ 0 \\ 0 \end{bmatrix}, \quad B_i = \begin{bmatrix} 4 \\ 0 \end{bmatrix} \text{ for } i = 2, 3, 4, \quad B_5 = \begin{bmatrix} 8 \\ 0 \\ 0 \end{bmatrix},$$
$$C_1 = \begin{bmatrix} 0 & 0 & 5.073 \end{bmatrix}, \quad C_2 = \begin{bmatrix} 0 & 3.252 \end{bmatrix}, \quad C_3 = \begin{bmatrix} 0 & 4.666 \end{bmatrix}, \quad C_4 = \begin{bmatrix} 0 & 4.424 \end{bmatrix}, \quad C_5 = \begin{bmatrix} 0 & 0 & 11.27 \end{bmatrix},$$
and Dp = diag(0, 0, 0, 0, 0).
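As a rough plausibility check of the identified model, one can compute the steady-state (DC) gain G(0) = −C A⁻¹ B of, say, the pointer-finger subsystem. The sketch below is our own analysis, not a computation from the paper; it uses the (A2, B2, C2) values quoted above.

```python
# DC gain of the identified second-order model for the pointer finger
# (i = 2): G(0) = -C A^{-1} B for the state-space triple (A2, B2, C2).

A2 = [[-31.6, -14.7],
      [16.0, 0.0]]
B2 = [4.0, 0.0]
C2 = [0.0, 3.252]

def dc_gain_2x2(A, B, C) -> float:
    """G(0) = -C A^{-1} B for a 2-state single-input single-output model."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    # x = A^{-1} B via the 2x2 adjugate formula
    x0 = (A[1][1] * B[0] - A[0][1] * B[1]) / det
    x1 = (-A[1][0] * B[0] + A[0][0] * B[1]) / det
    return -(C[0] * x0 + C[1] * x1)

print(f"DC gain of P_2(s): {dc_gain_2x2(A2, B2, C2):.3f}")
```

A DC gain close to unity is consistent with the compensator's goal of an approximately linear mapping between the raw sensor output and the goniometry.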
As the output of the bionic hand system is constrained, a static nonlinearity block, Φw = [ϕw1, ϕw2, ϕw3, ϕw4, ϕw5]ᵀ, is included in the compensator to ensure that the input to the bionic hand system stays within range. The nonlinearity is described as
$$\phi_{wi}(\sigma_{wi}) = \begin{cases} \sigma_{wl} & \text{if } \sigma_{wi} < \sigma_{wl} \\ \sigma_{wi} & \text{if } \sigma_{wl} \le \sigma_{wi} \le \sigma_{wu} \\ \sigma_{wu} & \text{if } \sigma_{wi} > \sigma_{wu} \end{cases}$$
where σwl = 90° and σwu = 170° for i = 1, and σwl = 29° and σwu = 112° for i = 2, 3, 4, 5. The combination of P(s) and Φw in series forms a Wiener-type compensator, as illustrated in Figure 10.
The 70% accuracy from the black-box system identification reflects the effects of nonlinearities in the model. To further enhance the tracking performance of the compensator, these effects need to be suppressed. Based on the variations of resistance in the preliminary analysis shown in Figure 4, we propose a slight modification of the compensator where P is partitioned into two blocks, as depicted in Figure 11: a simplified LTI model, Phw, and another static nonlinearity, Φh = [ϕh1, ϕh2, ϕh3, ϕh4, ϕh5]ᵀ. Phw is constructed from the estimated dominant pole of P, which yields a first-order linear model for each finger. The new configuration of the compensator is the well-known Hammerstein–Wiener structure, which in general is an LTI system in series with two static nonlinearities.
The simplified dynamic block of the compensator, Phw(Aph, Bph, Cph, Dph), is constructed as follows:
$$A_{ph} = \mathrm{diag}(54, 19, 19, 21, 15), \quad B_{ph} = \mathrm{diag}(69, 24, 27, 28, 21), \quad C_{ph} = \mathrm{diag}(1, 1, 1, 1, 1),$$
and Dph = diag(0, 0, 0, 0, 0), whereas the static nonlinearity is described as
$$\phi_{hi}(\alpha_i) = \begin{cases} \alpha_i + \varepsilon_l & \text{for } \alpha_i < -\varepsilon_l \\ 0 & \text{for } -\varepsilon_l \le \alpha_i \le \varepsilon_u \\ \alpha_i - \varepsilon_u & \text{for } \alpha_i > \varepsilon_u \end{cases}$$
where εl = 10 and εu = 20 for i = 1, 2, 3, 4, and εl = 10 and εu = 30 for i = 5.
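Putting the pieces together, the Hammerstein–Wiener chain for a single finger is deadzone → first-order linear dynamics → saturation. The sketch below is ours: the exact (zero-order-hold) discretization at the 10 Hz sample rate and the sign convention dx/dt = −a·x + b·u (reading the diagonal entries of Aph and Bph as parameters of stable first-order lags) are assumptions, while the deadzone and saturation parameters follow the text.

```python
import math

def deadzone(alpha: float, eps_l: float = 10.0, eps_u: float = 20.0) -> float:
    """Input nonlinearity phi_h: suppress small raw-sensor fluctuations."""
    if alpha < -eps_l:
        return alpha + eps_l
    if alpha > eps_u:
        return alpha - eps_u
    return 0.0

def saturate(sigma: float, lo: float = 28.0, hi: float = 113.0) -> float:
    """Output nonlinearity phi_w: keep the command within the hand's range."""
    return min(max(sigma, lo), hi)

def hw_compensate(samples, a: float = 54.0, b: float = 69.0, dt: float = 0.1):
    """Run raw samples through deadzone -> first-order lag -> saturation."""
    ad = math.exp(-a * dt)          # exact discretization of dx/dt = -a x + b u
    bd = (b / a) * (1.0 - ad)
    x, out = 0.0, []
    for alpha in samples:
        x = ad * x + bd * deadzone(alpha)
        out.append(saturate(x))
    return out
```

For a constant raw input, the chain settles to (b/a) times the deadzoned value, clipped to the feasible joint range, which mirrors the steady-state behavior intended for the compensator.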

3.3. Experimental Setup

For wireless communication between the goniometric glove and the bionic hand, an HC-12 serial communication module is connected to μC1 as a receiver, and another module of the same type is connected to μC2 and configured as a transmitter. Six different sets of gestures are considered for the experiments:
  • Gesture 1: Grab-release-grab
  • Gesture 2: Number two sign
  • Gesture 3: Call sign
  • Gesture 4: Okay sign
  • Gesture 5: Mixed Gestures A
  • Gesture 6: Mixed Gestures B
These gestures are illustrated in Figure 12. The first four involve at most three hand movement transitions, while the last two involve six. The experimental setup is depicted in Figure 13, where the performance of the overall system is evaluated based on the temporal mismatch between the glove's and the bionic hand's goniometry, both registered through cameras connected to a desktop PC via USB cables. The values of ρ̃r and ρ̃ are extracted via image processing techniques in MATLAB.

4. Experimental Results and Performance Evaluations

In the experiments, the bionic hand response, ρ̃, is compared under three conditions: no compensator at all, and with the Wiener and Hammerstein–Wiener compensators implemented, denoted "Raw", "W", and "HW", respectively, in all figures. The response is also compared with the reference, ρ̃r, obtained from the glove goniometry. The experiments focus on temporal analysis, and the goniometric speed for each finger is controlled at approximately 83°/s. A series of postures from the bionic hand is also recorded for a simple spatial analysis.
For the first experiment, i.e., Gesture 1, the hand gesture starts from the grab position; all fingers slowly stretch at t ≈ 4 s, remain in this position between t ≈ 4.5 s and t ≈ 5.5 s, and finally return to the grab position at t ≈ 6 s. The values of ρi for i = 1, 2, 3, 4, 5 are plotted in Figure 14, and it can be clearly observed that without any compensator, the bionic hand's fingers move slightly even when there is no movement from the glove. The movement worsens for certain fingers, as shown by the large fluctuations of ρ2, ρ3 and ρ5 between t = 4 s and t = 6 s. Also, ρ1 shows unexpected behavior after t = 6 s, when the thumb is supposed to bend. These undesired responses can, however, be alleviated using both Compensators W and HW. It is also clear that the best response is obtained when the goniometric glove is controlled by Compensator HW, particularly during the "grab" instances. To show the error response, we define ei = ρri − ρi, which represents the mismatch between the glove's and the bionic hand's goniometry. The corresponding ei for the Gesture 1 experiment is presented in Figure 15.
For the Gesture 2 experiment, which shows the number two sign, the values of ρi for i = 1, 2, 3, 4, 5 are plotted in Figure 16. The hand gesture starts with all fingers vertically stretched, and the thumb, middle and pinky fingers slowly flex between t ≈ 2.6 s and t ≈ 4 s. In this experiment, ρr2 and ρr3 are not supposed to vary much, but large fluctuations in ρ2 and ρ3 can be seen when no compensator is applied. A similar behavior is also observed after t = 4 s for ρ4 and ρ5, resulting in a large error. In this case, Compensator HW provides a significant improvement over Compensator W owing to the erratic readings seen in ρ1, ρ4 and ρ5 before t = 4 s. The error can also be clearly seen in the plot of the corresponding ei in Figure 17.
The bionic hand response for the Gesture 3 experiment is shown in Figure 18, where the gesture starts with all fingers vertically stretched. The pointer, middle and ring fingers start to flex at t ≈ 3.5 s, and the hand stays in the "call sign" gesture after t ≈ 4 s. The figure excludes ρ1, as the thumb stays stationary in this gesture and all responses from "Raw", "W" and "HW" are very close to ρr1. Similar to the behavior seen in the Gesture 2 experiment, Compensator HW outperforms the rest, as the fluctuations and the error are minimized, as can be observed from ρi (i = 2, 3, 4, 5) in Figure 19.
For Gesture 4, the response is shown in Figure 20, where the gesture starts with all fingers vertically stretched. The thumb and pointer fingers start to flex at t ≈ 3.5 s, and the hand completely forms the "okay sign" after t ≈ 4 s. In this gesture, the middle, ring, and pinky fingers stay almost stationary, and ρ3, ρ4 and ρ5 for "Raw", "W" and "HW" do not deviate significantly from ρ̃r. Hence, only ρ1 and ρ2 are highlighted in the left column of the figure, together with the corresponding errors in the right column. The response shows very large fluctuations when the goniometric glove is not compensated, and this undesired behavior is significantly suppressed by Compensators W and HW.
The experiments with Gesture 5 and Gesture 6 differ slightly from the previous four, as they are designed to analyze the robustness of the proposed strategy when rapid hand movement is involved. For the Gesture 5 experiment, the hand starts with all fingers vertically stretched; the thumb then bends towards the palm, followed by the rest after approximately 0.8 s. The transition proceeds with the pointer through pinky fingers stretching back, closing, and stretching again within approximately 2 s. The experiment ends when the thumb is released to its initial position.
As can be observed from Figure 21, with Compensator HW the mismatch between the glove's and the bionic hand's goniometry is drastically reduced compared to the raw sensor data or Compensator W. This is also clearly seen in Figure 22, where the resulting error from the Compensator HW implementation does not deviate far from zero.
For the last gesture, Gesture 6, the hand starts with all fingers closed except the pointer, which is stretched away from the palm, and the middle finger, which is slightly bent towards it. The pointer and middle fingers then exchange positions, followed by all fingers closing. The pointer through pinky fingers stretch back and close again within 1 s, and the transition ends when all fingers are released.
The responses are recorded in Figure 23, and a similar outcome can still be seen from this last experiment where the implementation of Compensator HW provides the least mismatch between the goniometric glove and the bionic hand. The corresponding error response is shown in Figure 24.
Some images taken from the camera during the performance evaluations are shown in Figure 25. Each of them illustrates the final position of each finger for each gesture (i.e., grab, number two, call, and okay signs).
As the closed-loop bionic hand system is susceptible to unknown disturbances, the experiments for Gestures 1 through 6 are each repeated three times to provide a better evaluation, and the performance is quantified in terms of the integral of absolute error, E (°·s), as follows:
$$E_i = \int_0^{t_f} |e_i(t)|\, dt, \qquad e_i(t) = \rho_{ri}(t) - \rho_i(t)$$
where tf denotes the final time of execution. The total error over all fingers, which is calculated as
$$E_T = \sum_{i=1}^{5} E_i,$$
for all trials and gestures is recorded in Table 1 when there is no compensation at all, and in Table 2 when Compensators W and HW are applied. Table 1 shows average errors between 515°·s and 1347°·s for all gestures, far larger than those in Table 2. Notably, Compensator HW yields average errors of less than 102°·s, while the average errors with Compensator W vary between 186°·s and 370°·s. This signifies that Compensator HW provides a major improvement over Compensator W in terms of the temporal mismatch between the goniometric glove's and the constrained bionic hand's movements.
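The integral-of-absolute-error metric used above can be approximated from the sampled goniometry with a trapezoidal rule. The sketch below is our own implementation; the 10 Hz sample rate follows the text, and the sample data in the usage are illustrative.

```python
# Sketch of the performance metric: the integral of absolute error per
# finger (trapezoidal rule over the sampled error) and the total over
# all five fingers.

def iae(rho_ref, rho, dt: float = 0.1) -> float:
    """E_i: integral of |rho_ref(t) - rho(t)| dt via the trapezoidal rule."""
    e = [abs(r - m) for r, m in zip(rho_ref, rho)]
    return dt * (sum(e) - 0.5 * (e[0] + e[-1]))

def total_error(refs, meas, dt: float = 0.1) -> float:
    """E_T: the per-finger errors summed over the five fingers."""
    return sum(iae(r, m, dt) for r, m in zip(refs, meas))
```

Feeding the logged ρ̃r and ρ̃ trajectories through `total_error` reproduces the kind of per-trial figures tabulated in Tables 1 and 2.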

5. Discussions and Conclusions

This paper has introduced a new control-centric approach to model the characteristics of flex sensors on a goniometric glove, which is designed to capture the user's hand gestures for wireless control of a constrained bionic hand. The main technique employs an inverse dynamic model strategy along with black-box identification for the compensator design, which aims to provide an approximately linear mapping between the raw sensor output and the dynamic finger goniometry. To smoothly recover the goniometry on the bionic hand's side during wireless transmission, the compensator is restructured into a Hammerstein–Wiener model, which consists of a linear dynamic system and two static nonlinearities. The linear system is constructed by simplifying the dynamic model from the inverse dynamic design technique, while the static nonlinearities are introduced based on the constraints of the bionic hand and to account for the uncertain behavior of the sensors as well as the unmodeled dynamics from the black-box identification method. A series of real-time experiments involving several hand gestures was conducted to analyze the performance of the proposed method. The experimental results over several trials for each gesture show that a great improvement is obtained via the Hammerstein–Wiener compensator, whose resulting average errors are significantly smaller than those of the other two methods considered. This indicates that the proposed strategy can markedly improve the dynamic goniometry of the glove, and thus provides smooth human–robot collaboration with the bionic hand.
While the experimental results demonstrate high accuracy for the proposed method, this work only considers the one-degree-of-freedom movement at the MCP joint. For future work, the framework will be extended to cover the full 3D motion of the goniometric glove to further enhance the bionic hand control system. This, however, may require modifications to the bionic hand's structure to allow more gestures from the glove to be recovered. The proposed method can also be combined with other techniques, such as artificial neural networks, to find the correlations between the palm and the fingers, as well as among the fingers themselves.

Author Contributions

Data curation, S.A.A.S.M.A.; Formal analysis, N.S.A.; Funding acquisition, N.S.A.; Investigation, S.A.A.S.M.A.; Methodology, S.A.A.S.M.A. and N.S.A.; Project administration, N.S.A.; Supervision, N.S.A.; Validation, N.S.A. and P.G.; Writing—original draft, N.S.A.; Writing—review & editing, P.G.

Funding

Universiti Sains Malaysia, Research University Individual (RUI) Grant (1001/PELECT/8014055).

Acknowledgments

The authors would like to thank Universiti Sains Malaysia for the financial support under RUI Grant (1001/PELECT/8014055).

Conflicts of Interest

The authors declare no conflict of interest.

Notations and Abbreviations

The following notations and abbreviations are used in this manuscript:
i: the subscript i = 1, 2, 3, 4, 5 on a symbol indicates the signal associated with the thumb, pointer, middle, ring, and pinky fingers, respectively
β_i: input signal to the bionic hand's system
MCP: metacarpophalangeal
DIP: distal interphalangeal
PIP: proximal interphalangeal
θ_i^D: angle measured at the DIP joint of the bionic hand
θ_i^P: angle measured at the PIP joint of the bionic hand
θ_i: angle measured at the MCP joint of the bionic hand (without constraint)
θ_L, θ_U: lower and upper bounds of θ_i
ψ_i: the constraint imposed on θ_i
ρ_i: angle measured at the MCP joint of the bionic hand (with constraint)
ρ_ri: angle measured at the MCP joint of the goniometric glove
e_i: error or mismatch between ρ_i and ρ_ri
α_i: raw sensor value
ϕ_wi: static nonlinearity after the compensator's dynamic model
ϕ_hi: static nonlinearity before the compensator's dynamic model
σ_wi: output of the compensator's dynamic model
σ_hi: input of the Hammerstein–Wiener compensator's dynamic model
d_in: unknown input disturbance within the bionic hand system
P: dynamic model of the Wiener compensator
P_hw: dynamic model of the Hammerstein–Wiener compensator
μC_1: microcontroller for the goniometric glove
μC_2: microcontroller for the bionic hand
E_i: integral of absolute error
t_f: final time of execution
E_T: total error from each finger

Figure 1. A 2.2″ unidirectional flex sensor.
Figure 2. Illustration of the relation between the flex sensor bending angle, θ, and the resistance, R_1.
Figure 3. A preliminary analysis to investigate the correlation between the sensor output voltage and the bending angle when the sensor is tied on a cloth glove.
Figure 4. Comparisons between real and theoretical sensor output voltages with respect to the bending angle.
Figure 5. Bionic hand (left); closed-loop control structure of the bionic hand (right).
Figure 6. Illustration of the goniometry of the pointer finger and its reference line. The figure shows θ_2 = 90°.
Figure 7. Illustration of the goniometry of the thumb and its reference line. The figure shows θ_1 = 135°.
Figure 8. Positions of flex sensors with respect to the MCP and PIP joints (left); flex sensors attached to the goniometric glove (right).
Figure 9. Goniometric glove response with respect to a grab-release-grab movement. The finger goniometry and the raw sensor outputs are represented by ρ_ri (blue line) and α_i (orange line), respectively.
Figure 10. Bionic hand with a Wiener compensator.
Figure 11. Overall control system structure with the Hammerstein–Wiener compensator.
Figure 12. Hand gestures considered for the experiments.
Figure 13. Schematic diagram of the experimental setup. The goniometric glove with the flex sensors is represented by "Device Under Test (DUT)", and cameras are used together with image processing in MATLAB for performance evaluation.
Figure 14. Gesture 1: The mismatch between the glove's and bionic hand's goniometry is significantly reduced by using Compensators W and HW. Compensator HW is seen to provide a better response as compared to Compensator W, particularly before t = 4 s and after t = 6 s.
Figure 15. Gesture 1: The corresponding error from the response in Figure 14. The error due to the response from Compensator HW is clearly much lower at all time instances as compared to the others.
Figure 16. Gesture 2: The mismatch between the glove's and bionic hand's goniometry is significantly reduced by using Compensator HW, which also provides a better response as compared to Compensator W, particularly before t = 4 s.
Figure 17. Gesture 2: The corresponding error from the response in Figure 16. The error due to the response from Compensator HW is significantly lower at all time instances as compared to the others.
Figure 18. Gesture 3: The mismatch between the glove's and bionic hand's goniometry is minimal when Compensator HW is applied as compared to the others.
Figure 19. Gesture 3: The corresponding error from the response in Figure 18. The error due to the response from Compensator HW is the least at all time instances as compared to the others.
Figure 20. Gesture 4: The mismatch between the glove's and bionic hand's goniometry is significantly reduced when both Compensators W and HW are applied.
Figure 21. Gesture 5: The mismatch between the glove's and bionic hand's goniometry is minimal when Compensator HW is applied as compared to the others.
Figure 22. Gesture 5: The corresponding error from the response in Figure 21. The error due to the response from Compensator HW is the least at all time instances as compared to the others.
Figure 23. Gesture 6: The mismatch between the glove's and bionic hand's goniometry is minimal when Compensator HW is applied as compared to the others.
Figure 24. Gesture 6: The corresponding error from the response in Figure 23. The error due to the response from Compensator HW is the least at all time instances as compared to the others.
Figure 25. Bionic hand gestures during the image processing in MATLAB from the six experiments.
Table 1. Total error, E_T (°s), for each gesture and its average value when there is no compensator on the goniometric glove.

Gesture      1         2         3         4         5        6
Trial 1   1036.21    701.5    2014.2    1201.2    1479     418.5
Trial 2    545.71    654.14   1023.1     721.78   1080.25  525.3
Trial 3    461.2    1512.23   1001.2     657.12    700.23  602.3
Average    681.04    956.0    1346.2     860.0    1086.5   515.4
Table 2. Total error, E_T (°s), for each gesture and its average value when the Wiener and Hammerstein–Wiener compensators are applied on the goniometric glove.

Wiener
Gesture      1        2        3        4        5       6
Trial 1    354.2    412.21   401.28   70.254  299.5   315.9
Trial 2    144.25   152.25   101.25   98.321  441.2   401
Trial 3     60.214  101.27   124.27  452.12   300.2   389.3
Average    186.25   221.91   208.93  206.90   347     368.7

Hammerstein–Wiener
Gesture      1        2        3        4        5       6
Trial 1    136.7     97.75   254.3    76.76    39.47   48.4
Trial 2     49.8     39.1     39.34   37.03   108.3    81.3
Trial 3      6.131    6.137   12.08    3.283   85.3   104.9
Average     64.21    47.66   101.90   39.02    77.69   78.2

Share and Cite

Syed Mubarak Ali, S.A.A.; Ahmad, N.S.; Goh, P. Flex Sensor Compensator via Hammerstein–Wiener Modeling Approach for Improved Dynamic Goniometry and Constrained Control of a Bionic Hand. Sensors 2019, 19, 3896. https://doi.org/10.3390/s19183896