Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network

Abstract: This study proposes a data-driven control method for extra robotic fingers that assist a user in bimanual object manipulation tasks normally requiring two hands. The robotic system comprises two main parts, i.e., a robotic thumb (RT) and a robotic finger (RF). The RT is attached next to the user's thumb, while the RF is located next to the user's little finger. The grasp postures of the RT and RF are driven by the bending-angle inputs of flex sensors attached to the thumb and other fingers of the user. A modified glove sensor is developed by attaching three flex sensors to the thumb, index, and middle fingers of a wearer. Various hand gestures are then mapped using a neural network. The inputs of the robotic system are the bending angles of the thumb and index finger, read by the flex sensors, and the outputs are the commanded servo angles for the RT and RF. The third flex sensor, attached to the middle finger, is used to hold the extra robotic fingers' posture. Two force-sensitive resistors (FSRs) are attached to the RT and RF to provide haptic feedback when the robot is worn to take and grasp a fragile object, such as an egg. The trained neural network is embedded into the wearable extra robotic fingers to control the robotic motion and assist the human fingers in bimanual object manipulation tasks. The developed extra fingers are tested for their capacity to assist the human fingers in 10 different bimanual tasks, such as holding a large object, lifting and operating an eight-inch tablet, and lifting a bottle while opening its cap at the same time.


Introduction
Robotic arms are commonly used in manufacturing and service operations [1,2]. Due to growing demand, current research increasingly focuses on robots that can be worn. The most widely known of these are prostheses and exoskeletons. A prosthesis replaces a limb lost to an accident or a birth defect. Many researchers around the world have developed myoelectric prosthetic hands to replace or add upper limbs [3][4][5][6][7][8][9][10][11], and ankle/foot prostheses for the lower limb [12][13][14]. Several researchers have attempted to make prosthetic hands more affordable by utilizing 3D-printing technologies [3][4][5] and tendon mechanisms [6]. Researchers in Italy strive to replicate the human hand as closely as possible, developing a prosthetic hand with 16 degrees of freedom and 40 sensors [7]. One method for simplifying the classification of electromyography (EMG) signals, using novel 3D electromagnetic positioning sensors, was proposed in [8]. Lightweight prosthetic hands were designed by considering the trade-off between dexterity and cost [9,11]. Support Vector Machine (SVM) classification with a high-level Finite State Machine (FSM) was used to produce accurate and robust control in [10]. These kinds of wearable robots have become established, commercialized products that are widely available on the market today [15][16][17][18][19]. The bebionic hand from Ottobock has 14 different grip patterns and hand positions [15]; it uses lead-screw motors as the main actuators and linkages to couple the joints. The Michelangelo hand from Ottobock has seven grip patterns and position modes [16]; it uses a cam design in all fingers to couple the joints.
The Vincent hand uses four motors on the individual fingers and two motors on the thumb, supporting 14 grasp patterns that can be selected with a single trigger control [18]. Affordable, open-source myoelectric hands have been developed by Open Bionics [17] and exiii HACKberry [19]. These hands are based on 3D-printing technology, which makes them lightweight; Open Bionics employs linear actuators, while exiii HACKberry utilizes servo motors as the main actuators. The second most commonly used wearable robot is the exoskeleton. This type of robot is worn to provide mechanical support for limb function diminished by stroke, accident, or other diseases. Exoskeleton robots are worn to support mechanical force and to assist human joints such as the elbow [20][21][22], finger/hand [23][24][25], and ankle/foot [26,27]. This type of robot has been developed using both hard- and soft-robot technologies. Many studies show that, owing to its material characteristics, a soft robot offers greater benefit and convenience than a hard robot when used as an exoskeleton providing mechanical support and force. Exoskeletons are also widely used as stroke rehabilitation devices [28,29].
The third type of wearable robot is the supernumerary robotic limb (SRL), a robot that adds extra robotic limbs to provide the user with mechanical assistance. Unlike an exoskeleton, this kind of robot moves independently of the user's skeletal limbs. The robotic limbs that can be added are arms, fingers, and legs. Developed supernumerary robotic legs can provide balance assistance or sitting/standing assistance [30,31]. Supernumerary robotic arms are worn to provide mechanical support and assist a wearer in performing a task [32][33][34]. The final category of added robotic limbs is supernumerary/extra robotic fingers.
Supernumerary robotic fingers (SRFs) are additional/extra robotic fingers that move in a way that mimics the movements of a human finger. SRF development pursues objectives ranging from industrial applications to assistive medical devices. Most of these robots are designed and developed to provide physical assistance when the wearer performs bimanual object manipulation, a task that can usually be done with two normal, healthy hands but is challenging to perform with only one. The most challenging issue for this type of robot is coordinating the grasp posture control between the wearer's fingers and the robotic fingers.
Researchers at the Massachusetts Institute of Technology (MIT) have successfully created an SRF for bimanual object manipulation [35][36][37][38]. This SRF consists of two additional robotic fingers placed next to the user's thumb and little finger; partial least squares (PLS) regression is applied for its grasp posture control. The proposed SRFs can be worn to provide physical assistance in bimanual object grasping manipulation. Some researchers have built an extra robotic thumb [39,40], while others have developed extra robotic fingers, or sixth fingers, as presented in [41][42][43][44][45]. The developed extra fingers successfully enhance manipulation dexterity and enlarge the wearer's workspace, and these robots are also able to assist chronic stroke patients.
In this study, a new type of wearable robot is developed for providing mechanical assistance in bimanual object manipulation tasks. A data-driven control method using neural network regression is applied to control the motion of the wearable extra robotic fingers more intuitively and efficiently. Flex sensors are applied to read the bending motion of the user's thumb, index, and middle fingers. These signals are fed to the trained neural network regression to estimate the commanded angles for the servo motors of the extra robotic fingers. The prototype is also equipped with force-sensitive resistors (FSRs), attached to the fingertips of the extra robotic fingers, to incorporate a haptic feedback system. The developed extra robotic fingers are worn on the user's healthy hand to provide assistance in bimanual manipulation tasks.
The remaining sections of this study are organized as follows: The design and forward kinematics of the extra robotic fingers are presented in Section 2. The data-driven method utilizing neural network regression for extra robotic posture control is summarized in Section 3. The results of the bimanual manipulation tasks are given in Section 4. The summary of conclusions is written in Section 5.

Extra Two Robotic Fingers
This section comprises two subsections. (1) The first subsection discusses the extra robotic fingers' (ERF) design, sensors, microcontrollers, actuators, and other electronic components. (2) The second subsection presents the forward kinematics of the RT and RF fingertip motions. This kinematics is utilized to estimate the RT and RF tip trajectories when they are given predefined input angles. Finally, the forward kinematics is computed using the Denavit-Hartenberg (DH) parameters to estimate the fingertip trajectories of the RT and RF.

Extra Robotic Fingers Design and Prototyping
In developing the extra robotic fingers for providing mechanical assistance in bimanual object manipulation, the 3D model is adjusted to the dimensions of the required components and the size of the user's hand. The 3D design is carried out in the SolidWorks CAD environment, and the proposed design is shown in Figure 1. The proposed wearable extra robotic fingers consist of a robotic thumb (RT), attached near the thumb of the user's hand, and a robotic finger (RF), placed next to the index finger of the user's hand. Both the RT and RF are designed with three degrees of freedom (DOF), and the extra fingers are driven by six servo motors. The initial length of the extra fingers, for both the RT and RF, along the longitudinal axis is 216 mm.
The RT's mechanism is designed by mimicking human thumb motion: it moves like a human thumb, performing circumduction, abduction, and flexion. The motion of the RF is designed by imitating the other four human fingers. The intended motions for the RT are one abduction and two flexions. The fingertips of the extra robotic fingers are designed to resemble the shape of a human finger and are manufactured using a 3D printer. Servo brackets are used to separate the servo motors from one another. The initial design of this study was inspired by the research of Faye Wu and H. Harry Asada at MIT [35,36], with sensor and actuator components adjusted according to the needs of this study. The wearable extra robotic fingers use analog servo motors, of the double-shaft type, as the main actuators. In this study, an 850 mAh two-cell (7.4 V) battery and a 500 mAh two-cell (7.4 V) battery are utilized: the first powers the six servo motors, while the second provides electrical power to the microcontroller. The microcontroller used as the computing center of the extra robotic fingers is the Arduino Mega 2560. It has 54 digital I/O pins, 14 of which can be utilized as PWM outputs, as well as 16 analog input pins, four UARTs, a 16 MHz oscillator, a USB connection, and an ICSP header. According to these specifications, the Arduino Mega is well suited as the computing center for the neural network regression that moves the extra robotic fingers. Two FSRs are attached to the fingertips of the RT and RF, as shown in Figure 2.
The total mass of the wearable extra robotic fingers, excluding the two batteries and the Arduino microcontroller, is 650 g. This mass is light enough for a user to wear the device on the right hand. The two batteries are placed in the wearer's pocket to decrease the mass carried by the hand. To reduce the torque needed by the robot, a metal bracket is applied to join the two servo motors. The Arduino microcontroller is placed below the wrist of the user's hand. A Tower Pro MG995 servo motor is selected as the main actuator of the robot. It can generate torque of up to 9.4 kg·cm at 4.8 V, which is sufficient for the proposed wearable extra robotic fingers. In bimanual object manipulation tasks, the proposed extra robotic fingers must work together with the user's fingers. Three flex sensors, each 2.2 inches long, are selected to estimate the bending angles of the user's fingers. This kind of sensor has been successfully applied in a teleoperated robotic hand [46]. A flex sensor detects curvature: a voltage divider circuit converts the change in the sensor's resistance into a voltage change when the sensor is bent. The bending readings from the flex sensors attached to the user's fingers are applied as the input signals for driving the six servo motors. A 22 kΩ resistor is selected for the flex sensor circuit, for data acquisition of the voltage change generated across the flex sensor. The arrangement and attachment of the flex sensors can be seen in Figure 3. A glove from iGlove is selected because it can be worn while operating the touchscreen of a smartphone or tablet computer. The thumb, index, and middle fingers are used as the grasp posture inputs for the wearable extra robotic fingers.
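The voltage-divider readout described above can be sketched as follows. This is a minimal illustration, assuming a 5 V Arduino supply, a 10-bit ADC, and the flex sensor forming the upper leg of the divider with the 22 kΩ fixed resistor; the actual circuit topology of the prototype may differ.

```python
# Hypothetical sketch of the flex-sensor voltage-divider readout.
# VCC, ADC_MAX, and the divider topology are assumptions; the 22 kOhm
# fixed resistor matches the value given in the text.

VCC = 5.0          # assumed Arduino supply voltage (V)
R_FIXED = 22_000   # fixed divider resistor (ohms), as in the text
ADC_MAX = 1023     # 10-bit ADC on the Arduino Mega

def adc_to_voltage(adc):
    """Convert a raw 10-bit ADC reading to the divider output voltage."""
    return VCC * adc / ADC_MAX

def divider_to_resistance(v_out):
    """Recover the flex-sensor resistance from the divider voltage,
    assuming the sensor is the upper leg:
    v_out = VCC * R_FIXED / (R_flex + R_FIXED)."""
    return R_FIXED * (VCC - v_out) / v_out
```

As the sensor bends and its resistance rises, `v_out` falls, so the recovered resistance tracks the bending angle of the finger.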

Extra Fingers' Forward Kinematics
Forward kinematics is carried out to predict the fingertip trajectory for both the RT and RF. The fingertip trajectory can be obtained by providing the angle data input for the six servo motors. This trajectory can be used to determine the working space of the extra robotic fingers. DH parameters are extracted from the proposed wearable extra fingers for the RT and RF. Table 1 summarizes the DH parameters for the developed extra robotic fingers. The acquired parameters in this table are used as the parameters for computation in the DH transformation matrix, as expressed in Equation (1).
By computing the transformation matrix for both the RT and RF, the three-dimensional space of the fingertip position can be generated. The fingertip position of the RT is expressed by Equations (2) to (4), while the fingertip trajectory for the RF is written in Equations (5) to (7).
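As a concrete illustration of this forward kinematics, the standard DH link transform (the form of Equation (1)) and its chaining into a fingertip position can be sketched in Python. The DH parameter rows passed to `fingertip_position` are placeholders, not the actual Table 1 entries.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform, i.e. the usual form of
    the DH matrix referenced as Equation (1) in the text."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def fingertip_position(dh_rows):
    """Chain the per-joint DH transforms (one row per joint, as in Table 1)
    and return the fingertip (x, y, z) position."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        T = T @ dh_transform(theta, d, a, alpha)
    return T[:3, 3]
```

Sweeping the joint angles over the servo command range and collecting the resulting positions traces out the fingertip trajectories of the RT and RF in three-dimensional space.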

Grasp Posture Control
In this section, the data-driven method based on a neural network is presented and discussed. Neural network regression is applied to the grasp posture control of the extra two robotic fingers. In a previous study [47], the extra robotic fingers were controlled using a linear constant-coefficient matrix, without the position hold control derived from the FSR sensors. The matrix multiplied the input signals from the flex sensors to produce the commanded angles driving the six servo motors. The robot was worn on the healthy right hand to provide mechanical assistance in bimanual object grasping manipulations, such as: (a) grasping and holding a bottle while opening the bottle cap; (b) grasping and lifting a volleyball; and (c) lifting and holding a glass while stirring with a spoon. The experimental results showed that control of the extra robotic finger movements commanded by the user's thumb and index finger was not intuitive: the user needed to move the index and middle fingers carefully and slowly when holding an object with the robot fingertips touching its surface. Researchers at MIT have developed grasp posture control for SRFs using PLS regression [35][36][37]. By applying the PLS regression method, the developed SRFs can assist with object grasping in bimanual tasks such as grasping a basketball, holding and operating a tablet computer, and taking and grasping bottled water.
In this study, the grasping posture of the developed extra two robotic fingers is controlled using neural network regression. A feed-forward neural network structure is selected in the data-driven method for controlling the motion of the RT and RF. Considering the memory and speed of the Arduino Mega microcontroller, five neurons are selected for the hidden layer; this number is kept as low as possible to reduce the computational burden on the microcontroller. The Levenberg-Marquardt backpropagation method is chosen as the training algorithm, with a maximum of 1000 epochs. A linear transfer function is used in both the hidden layer and the output layer. The selected neural network regression parameters are summarized in Table 2. In training, 0.001 is selected as the performance goal for the error. The input data set is divided randomly into three subsets: training, validation, and testing, with ratios of 70%, 15%, and 15%, respectively. Mean squared error (MSE) and sum squared error (SSE) are used as performance functions during training; their formulas are expressed in Equations (8) and (9), respectively. These error measures are used to identify the best neural network regression for the developed extra robotic fingers: the smaller the MSE and SSE, the better the regression model. The R-value, computed using Equation (10), is used to select the better-performing model between MSE and SSE, where y_i is the actual value from the data set and y'_i is the estimated value.
N is the number of data points. R quantifies the degree of fit between the actual and estimated values; it is mapped to the range 0 to 1, where an R of 0 indicates the worst regression performance and an R of 1 the best. Before the actual values enter the neural network regression, all data are normalized to between −1 and 1, as expressed in Equation (11). The inverse of Equation (11) is used to denormalize the neural network regression output into the estimated value.
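The error measures and data scaling described above can be sketched as follows. This assumes Equation (11) is the usual min-max mapping to [−1, 1] and Equation (10) is the correlation coefficient between actual and estimated values; the paper's exact formulas may differ in detail.

```python
import numpy as np

def normalize(x, x_min, x_max):
    """Map raw values to [-1, 1] (assumed form of Equation (11))."""
    return 2.0 * (np.asarray(x, float) - x_min) / (x_max - x_min) - 1.0

def denormalize(xn, x_min, x_max):
    """Inverse of normalize(), applied to the network outputs."""
    return (np.asarray(xn, float) + 1.0) * (x_max - x_min) / 2.0 + x_min

def mse(y, y_hat):
    """Mean squared error (Equation (8))."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.mean((y - y_hat) ** 2))

def sse(y, y_hat):
    """Sum squared error (Equation (9))."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.sum((y - y_hat) ** 2))

def r_value(y, y_hat):
    """Fit between actual and estimated values (assumed form of
    Equation (10): the correlation coefficient)."""
    return float(np.corrcoef(np.asarray(y, float), np.asarray(y_hat, float))[0, 1])
```

A perfect regression gives an MSE and SSE of 0 and an R of 1, matching the selection criteria described in the text.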
Data acquisition (DAQ) of the input and output data is conducted by moving the developed wearable extra robotic fingers. Input data are obtained from the flex sensor readings when the user's thumb (x1) and index finger (x2) are bent. The output data are the joint angle commands given to the three servo motors [y1, y2, y3] on the RT and the three servo motors [y4, y5, y6] on the RF. The input-output data are collected from the smallest to the maximum bending angle of the thumb and index finger. Thirty data points are obtained for each of x1, x2, y1, y2, y3, y4, y5, and y6; the input data form a 30 × 2 matrix, while the output data form a 30 × 6 matrix. Five input data points are obtained by grasping objects such as a volleyball, an aluminum mug, a bottle, and two jars of different sizes, while the other 25 data points are taken as approximately linear data points, as shown in Figure 4a for the RT and Figure 4b for the RF movements. Although the two flex sensors attached to the thumb and index finger have the same model and part number, and supposedly the same resistance, the analog-to-digital converter (ADC) value of the flex sensor labeled x1 was higher than that of the flex sensor labeled x2 when both the thumb and index finger performed full flexion. Normally, the maximum ADC value is achieved when the finger is not bending, i.e., in full extension. To make the control more intuitive, the ADC value obtained from the flex sensor is inverted, so that the maximum ADC value is achieved when the finger is in full flexion. In the bending flexion tests of the thumb and index finger, the maximum values of the inverted and processed ADC are 140 for x1 and 110 for x2.
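The ADC inversion described above can be sketched as a simple offset subtraction. The full-extension reference reading is a calibration constant not given in the text, so the value used here is purely illustrative.

```python
def invert_adc(raw_adc, adc_full_extension):
    """Invert a flex-sensor ADC reading so that full flexion yields the
    maximum value. adc_full_extension is the reading taken with the finger
    straight (a hypothetical calibration constant; the paper does not give
    the raw offsets behind the processed maxima of 140 for x1 and 110 for x2)."""
    return adc_full_extension - raw_adc
```

With this inversion, a straight finger maps to 0 and deeper flexion maps to larger values, which is what makes the mapping to servo commands intuitive.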
The input-output data depicted in Figure 4 are mapped using neural network regression for controlling the six servo motors on the RT and RF, with the parameters summarized in Table 2. Two neural network regression models are trained, using MSE and SSE as the error performance functions. The MSE and SSE results during training are presented in Figure 5a,b, respectively. The figures reveal the correlation between the epochs and the resulting errors for training, validation, and testing. In the MSE training graph, the best validation starts converging at epoch 4, with an MSE of 0.3819; in the SSE graph, the best validation also occurs at epoch 4, with an SSE of 6.9248. The R-value is utilized to determine which of the MSE and SSE models performs better. The resulting R-values for both models are summarized in Table 3; they are almost identical for training, validation, and testing. The neural network regression model trained with MSE is selected for the embedded control of the developed wearable robotic fingers. The neural network regression model, developed in MATLAB, is generated as a Simulink block diagram, depicted in Figure 6. The acquired model is embedded into the microcontroller using the "Simulink Support Package for Arduino Hardware" toolbox. To reduce the data memory that must be processed by the Arduino microcontroller, all data types are converted from double to single precision; this conversion reduces the data memory by up to 50%. Five neurons and six neurons are generated in the hidden layer and output layer, respectively, a number that does not burden the computation process in the Arduino microcontroller.
The values of [x1 x2]^T are normalized using the formula expressed in Equation (11). Before the neural network model outputs the estimated values of [y1 y2 y3 y4 y5 y6]^T, the outputs are processed using the inverse normalization of Equation (11).
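The embedded forward pass described above (normalize, 5-neuron linear hidden layer, 6-neuron linear output layer, denormalize) can be sketched as follows. The weight matrices and the input/output ranges are placeholders: the actual values would come from the Levenberg-Marquardt training in MATLAB and are not reproduced here.

```python
import numpy as np

def nn_forward(x, W1, b1, W2, b2, in_min, in_max, out_min, out_max):
    """Sketch of the embedded feed-forward pass for the extra robotic
    fingers: normalize the two flex inputs [x1, x2] to [-1, 1], apply a
    5-neuron hidden layer and a 6-neuron output layer (both with linear
    transfer functions, as in the text), then denormalize to the six
    servo-angle commands [y1..y6].
    Shapes: W1 (5x2), b1 (5,), W2 (6x5), b2 (6,) -- placeholder weights."""
    xn = 2.0 * (np.asarray(x, float) - in_min) / (in_max - in_min) - 1.0
    h = W1 @ xn + b1          # linear hidden layer
    yn = W2 @ h + b2          # linear output layer
    return (yn + 1.0) * (out_max - out_min) / 2.0 + out_min
```

Because both layers are linear, the whole network collapses to an affine map, which is part of why five hidden neurons suffice on the Arduino Mega.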

Position Hold Control
The two FSRs are calibrated before they are applied in the haptic feedback system. These sensors measure the contact force between the extra robotic fingertips and the object being gripped. The calibration and the resulting polynomial regression functions of the two FSRs attached to the RT and RF are shown in Figure 9. The FSR signal is read as an ADC value by an ADC pin of the Arduino microcontroller and calibrated into a mass unit. Position hold control is developed to prevent the servo motors on the RT and RF from moving while gripping an object. Continuous movement of the servo motors while the robotic fingers grasp an object can damage the negative-feedback operational amplifier circuit, and could also break a fragile object, such as an egg or a grape. The position hold control combines the third flex sensor (x3), worn on the middle finger, with the two FSRs attached to the tips of the RT and RF, and is activated using conventional logic control, as summarized in Table 4. The selected threshold for the third flex sensor (x3) is an ADC value of 115, meaning that the position hold control is activated when the middle finger bends to maximal flexion. When the RT and RF touch and grasp an object, their motion can be paused by bending the middle finger, leaving the thumb and index finger free to move without moving the RT and RF. The position hold control is also activated when either the RT or RF exceeds the force threshold of 9.3 N.
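The activation logic described above can be sketched as follows. Table 4's exact truth table is not reproduced in the text, so combining the two conditions with a logical OR is an assumption; the thresholds match the values given (an ADC value of 115 for x3, and 9.3 N for the fingertip forces).

```python
FLEX_HOLD_THRESHOLD = 115   # ADC threshold for the middle-finger flex sensor (x3)
FORCE_THRESHOLD_N = 9.3     # FSR fingertip-force threshold from the text

def position_hold_active(x3_adc, force_rt_n, force_rf_n):
    """Conventional logic control for position hold (assumed OR of the two
    conditions in Table 4): hold the servo angles when the middle finger is
    in maximal flexion, or when either fingertip force exceeds 9.3 N."""
    middle_finger_bent = x3_adc >= FLEX_HOLD_THRESHOLD
    force_exceeded = (force_rt_n > FORCE_THRESHOLD_N
                      or force_rf_n > FORCE_THRESHOLD_N)
    return middle_finger_bent or force_exceeded
```

While `position_hold_active` returns true, the controller would simply keep commanding the last servo angles instead of the neural network outputs.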

Bimanual Task Experiment and Discussion
The developed extra robotic fingers are tested by performing object manipulations in 10 bimanual tasks, including grasping a fragile object, such as an egg. Movement tests of the RT and RF are performed by providing input commands through flexion of the user's thumb and index finger. The acquired flex sensor inputs [x1 x2]^T and the predicted angle commands to the six servo motors [y1 y2 y3 y4 y5 y6]^T are shown in Figure 10. The movement test is carried out by bending the thumb and index finger downwards from initial full extension (A), and then unbending them back up to full extension (C). The input-output signals are collected for 60 s. The maximum signal of the first flex sensor (x1) is higher than that of the second flex sensor (x2) when the thumb and index finger are in full flexion (B). The neural network outputs for controlling the motion of the six servo motors are depicted in Figure 10. In this trajectory simulation, the wearable extra robotic fingers are worn on the user's right hand. Based on the acquired commanded-angle signals [y1 y2 y3 y4 y5 y6]^T, the fingertip trajectories for the RT and RF are calculated using the forward kinematics of Equations (2) to (7), and the calculated trajectories are verified using SimMechanics First Generation. The fingertip trajectories in three-dimensional space for the RT and RF are presented in Figure 11. Based on the figure, the maximum vertical movements for the RT and RF are about 70 mm and 80 mm, respectively. The user's wrist can be bent upward to increase the workspace of the wearable extra robotic fingers, especially in vertical motion; by bending the wrist upward, a larger object can be grasped, such as a volleyball, a bottle of water, or a jar. When the RT and RF move to grasp an object, their lengths along the longitudinal axis (X-axis) decrease, as shown in Figure 11, while along the Z-axis both fingertips always move toward the object to be grasped.
This fingertip movement in a longitudinal motion will enable the user's fingers to collaborate with the RT and RF more easily and intuitively. Based on the generated fingertip trajectory in three-dimensional space, the bending upward of the user's wrist will be used as a grasping strategy for larger objects in bimanual tasks.
In this test, the developed wearable extra robotic fingers are worn on a healthy, normal human hand. The robot is tested for providing mechanical assistance in bimanual tasks that are typically done with two healthy hands in the activities of daily living (ADL); the assigned tasks are challenging to perform with only one healthy hand. The 10 assigned bimanual manipulation tasks are summarized in Table 5. In general, the coordinated control of the RT and RF is driven by bending the user's thumb and index finger downwards. When the fingertips of the RT and RF reach and touch the object, position hold control is activated by bending the middle finger downward. The position hold control enables the user's thumb and index finger to move freely to perform another object manipulation, such as opening a bottle cap while the extra fingers grasp and hold the bottle, unplugging the AC power plug while the extra fingers hold the extension cord reel, opening a jar lid while the extra fingers hold the jar, stirring water and sugar while the extra fingers hold and lift an aluminum mug, and tightening a bolt while the extra fingers hold an electronic device such as a multimeter. The experimental test with the predefined bimanual object manipulations in Table 5 is repeated eight times. Five of the bimanual manipulation tasks (involving the volleyball, aluminum mug, bottle, and two jars of different sizes) use objects included in the training data; the other tasks were not trained. Three failure types are identified when the extra robotic fingers and the user's hand perform the assigned bimanual object manipulation tasks, as summarized in Table 6, which also shows the success rates of the bimanual tasks carried out with the right hand and the extra fingers.
The highest success rate is achieved when the robot takes and lifts the dipper and bucket simultaneously, because this task is easy to perform. The bimanual manipulations of the bottle of water and the egg have the lowest success rate, at 50%: the bottle and the egg have a smaller size (contact area) than the other objects, making them harder to reach and grasp. In the lifting-and-opening-a-bottle-cap task, a slip occurs once the human fingers have opened and removed the bottle cap, causing the bottle to loosen from the grip of the extra fingers.
Based on the experimental work, the extra robotic fingers can lift an object with a mass of up to 1.4 kg when the robotic fingertips lift the object from its lower surface. They cannot lift an object heavier than 400 g when the fingertips grip the object from the left and right sides: a slip occurs and the object falls from the grip, even when the extra fingers are covered with a latex glove to increase friction. To improve cooperative grasping of heavier objects, while the extra fingers touch and grip the object, the human fingers are used to grasp and lift it from the upper side. These extra fingers are thus not suitable for bimanual manipulation of large objects with a mass above 1.5 kg, because this can cause fatigue in the user's hand; recall that the mass of the robotic fingers worn by the user is 650 g, excluding the two LiPo batteries and the microcontroller. A haptic feedback test is carried out by holding and lifting an egg using the two wearable extra robotic fingers, i.e., the RT and RF. The picture sequence of the egg-grasping test can be seen in Figure 12. According to the results of the haptic feedback testing, lifting and holding the egg with the wearable extra robotic fingers was carried out successfully: the egg did not crack or break, thanks to the haptic feedback system driven by the FSR signals. The haptic feedback system works well because the grasping force on the egg does not exceed the specified threshold of 9.3 N; when the force exceeds 9.3 N, the position hold control is activated, preventing any further increase in the holding force.

Conclusions
In this study, wearable extra robotic fingers have been successfully developed to assist with bimanual tasks that are commonly performed by two normal, healthy hands. A data-driven method based on neural network regression is utilized for controlling the coordination between the user's fingers and the two extra fingers, i.e., the RT and RF. The movements of the RT and RF are commanded by the bending-angle inputs of the user's thumb and index finger, which are read by flex sensors attached to a modified glove that is worn by the user. Various hand gestures are mapped using neural network regression. A third flex sensor is attached to the middle finger to provide the position hold command for the extra robotic fingers. For the haptic feedback system, two FSRs are attached to the RT and RF.
The trained neural network regression is embedded into the Arduino Mega to control the robotic finger motion and assist the human fingers in object manipulation, especially in bimanual tasks. The developed extra fingers are tested on a user with a normal, healthy hand, assisting the human fingers in 10 varied bimanual tasks. Based on the experimental results, the proposed extra robotic fingers can successfully provide mechanical assistance in all 10 tasks. The haptic system test results show that the wearable extra robotic fingers can grasp a fragile object, such as an egg, without breaking it. Given these promising results for bimanual task assistance, the wearable extra fingers are a potential assistive device for hemiparetic patients, who have weakness on one entire side of the body. The robot can be attached to the remaining healthy hand of a user who has lost hand function, as in hemiplegia or hemiparesis, enabling people with diminished hand function to perform object manipulation without assistance from others.
For healthy users, the developed extra robotic fingers can provide mechanical assistance for object manipulation in bimanual tasks that are hard to perform with only one hand. The robot could also make a user's work more productive and efficient. Data-driven posture control based on the neural network provides more intuitive and dexterous control of extra robotic fingers working collaboratively with the user's hands.