Article

Human-Touch-Inspired Material Recognition for Robotic Tactile Sensing

by Yu Xie, Chuhao Chen, Dezhi Wu, Wenming Xi and Houde Liu

1 School of Aerospace Engineering, Xiamen University, Xiamen 361102, China
2 Graduate School at Shenzhen, Tsinghua University, Shenzhen 518055, China
3 Shenzhen Research Institute of Xiamen University, Shenzhen 518000, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(12), 2537; https://doi.org/10.3390/app9122537
Submission received: 29 April 2019 / Accepted: 12 June 2019 / Published: 21 June 2019
(This article belongs to the Special Issue Human Friendly Robotics)

Abstract

This paper proposes a novel material recognition method for robotic tactile sensing. The method is composed of two steps. First, a human-touch-inspired short-duration (1 s) slide action is conducted by the robot to obtain the tactile data. Then, the tactile data are processed with a machine learning algorithm, in which 11 bioinspired features are designed to imitate the mechanical stimuli received by the four main types of tactile receptors in the skin. A material database consisting of 144,000 tactile images is used to train seven classifiers, and the most accurate classifier is selected to recognize 12 household objects according to their properties and materials. In property recognition, the materials are classified into four categories according to their compliance and texture, and the best accuracy reaches 96% in 36 ms. In material recognition, the specific materials are identified, and the best accuracy reaches 90% in 37 ms. The results verify the effectiveness of the proposed method.

1. Introduction

As robots are required to cooperate with humans in unstructured environments, developing intelligent tactile sensing for robots is urgently needed [1,2]. For instance, surgical robots are required to ablate diseased tissue while avoiding harm to the human body [3,4]. A prosthetic device with tactile sensing feedback improves the success rate of grasping tasks [5]. Additionally, tactile sensing is particularly important since many tasks require robots to recognize unintentional collisions or to make intentional physical contact with objects or humans [6]. Material recognition is an important capability for robotic tactile sensing, since it gives robots not only the sense of touch but also a perception of compliance, roughness, and texture, which improves environmental awareness when interacting with objects or humans. Specifically, the roughness and texture sensed from a material can help robots adjust their control strategies during grasping, manipulation, and other tasks, while the awareness of compliance can keep humans or objects away from potentially destructive effects [7]. Hence, incorporating material recognition into robotic tactile sensing is important.
In order to realize material recognition for robots, the dynamic tactile sensing that humans employ to explore and sense materials is a good reference. By perceiving high spatial and temporal frequencies during motion, dynamic tactile sensing is able to sense fine surface features and monitor contact conditions, and thereby recognize materials [8,9,10].
In recent years, large-scale, high-density integrated tactile sensors have been well developed to realize dynamic tactile sensing for robots [10,11]. Compared with force or torque sensors that record a single value on a fingertip or joint, these tactile sensors can be integrated on robots over a large area while monitoring the pressure distribution during interaction, which enables robots to localize objects, detect slippage, and recognize objects’ sizes and surface properties [12,13,14,15]. Since tactile sensors can provide abundant contact data, previous studies have obtained various kinds of dynamic tactile data relevant to vibration, pressure distribution, and thermal properties during pressing, sliding, scratching, and grasping, and then analyzed the data to identify the material by its compliance, texture, and other properties [15,16,17,18]. However, it is still challenging for robots to recognize interactive objects quickly and accurately, which is significant for robotic tactile sensing.
Mimicking the manner of dynamic touch and the tactile perception of the human hand, this paper presents a novel and effective material recognition method using a high-density tactile sensor. The method is composed of two steps. The first is a short-duration slide action conducted by the robot to obtain the tactile data of the material. The second is the processing of the tactile data with machine learning, which includes feature extraction and classifier training. In this method, a one-second human-touch-inspired slide action was employed as the exploratory procedure (EP), which ensures the rapidity of the method. For robots, implementing the slide action is easier than grasping and other actions; for safety, the slide action is less harmful than grasp or press actions. Next, three material-related time sequences were calculated to abstract the tactile data. Then, 11 bioinspired features were extracted from the sequences based on statistics and the Wavelet transformation to mimic human tactile perception. Since the features extracted by statistical methods characterize the consistency and complexity of a sequence, this information imitates the signals of the Ruffini corpuscles and Merkel cells of human touch. In addition, the features extracted by the Wavelet transformation provide information on vibration at different frequencies, which imitates the signals of the Pacinian and Meissner’s corpuscles. The final feature vector was generated by principal component analysis (PCA). Finally, the feature vector was analyzed by seven trained classifiers to classify the sample according to its properties or material. The contribution of this work is the development of a novel two-step material recognition method for robotic tactile sensing. The novelty of the presented method is due to two factors. First, the human-touch-inspired short-duration slide exploratory procedure needs only 1 s, which makes the material recognition more efficient and is also less harmful than grasp or press actions. Second, bioinspired tactile data processing improves the distinguishability of the features and makes the recognition more accurate.

2. Related Work

Material properties, such as texture, compliance, density, and thermal properties, are especially useful in material recognition, and many studies have been conducted on this topic [19,20]. For example, Kerr et al. used surface texture and thermal properties as key characteristics for material recognition, and the best classification accuracy reached 86.19% across 14 materials. Their method is computationally efficient (training a fold takes 0.55 s), and the accuracy is acceptable, although the press and slide actions performed to collect the tactile data are time consuming, together taking more than 20 s [16]. If the interactive target is a human and the interactive force is large enough to cause harm, such a large time overhead will exacerbate the harm. Bhattacharjee et al. used a tactile sensing forearm to acquire time sequences of contact force, contact area, and contact motion; the sequences can be used to determine mobility and compliance and, furthermore, to classify objects. Sensing mobility is significant for autonomous manipulation, although it can generate large interactive forces that would harm humans, and the 72% accuracy needs to be improved [21]. Based on deformability and texture, Khasnobish et al. extracted the features of different material surfaces by analyzing tactile images obtained during dynamic interaction between the items and the gripper. The accuracy of their classifier is 78% across four biomembranes [17]. Deformability and texture are two promising characteristics for material recognition. However, the features they used for the classifier are statistical features of a single tactile image, which cannot effectively reflect the texture. In this study, features related to compliance and texture were extracted from tactile data based on statistics and the Wavelet transformation. Then, time-domain and frequency-domain features are both used as the input of the classifier to discriminate materials.
In order to perceive material properties efficiently, human dynamic tactile sensing usually employs various exploratory procedures (EPs) or hand movements to explore the target. Taking hand movements as an example, texture, hardness, weight, and global shape are perceived by lateral motion, pressure, unsupported holding, enclosure, or contour following [22]. In existing works, Kaboli et al. performed a series of exploratory procedures for material recognition on a Shadow hand and a humanoid robot to verify the performance and robustness of their tactile descriptors. Lateral, medial, circular, and combined-motion slide actions were used to collect tactile data. Almost 100% accuracy was realized for a large set of materials, although each exploration took more than one second (2 s and 10 s for medial and circular slide actions, respectively) [23]. Differing from instructed movements, Strese et al. let the device explore the surface arbitrarily, with no specific velocity, force, or movement pattern, to ensure the intraclass variance of the material database. However, free exploration influenced the accuracy of the classification, and the best accuracy was around 74% for 69 materials [19]. In order to achieve active object recognition, Tanaka et al. sequentially performed informative actions to collect tactile data; new exploratory trajectories were generated from a taught trajectory and the feedback tactile data to automatically collect tactile data for different objects [24]. Considering accuracy and efficiency, our approach selects a simple lateral slide action as the EP. Here, a tactile sensor slides across the material surface for one second to save time, and a specific interactive force and slide speed are used to improve the accuracy.
After collecting tactile data, human dynamic tactile sensing is also a good inspiration for feature extraction. In human skin, various mechanoreceptors sense stimuli with different frequencies and characteristics during exploration. For example, Pacinian corpuscles detect high-frequency vibration, which depends on the texture of the contacted surface, while Ruffini corpuscles sense sustained downward pressure, whose distribution indicates the contact state and is related to the material compliance [25]. In order to mimic the human SA-I, FA-I, and FA-II channels, Romano et al. processed measurements from a pressure sensor array and an accelerometer. They summed the readings of the sensor array and the corresponding high-frequency components as the signals of the SA-I and FA-I channels, and took the magnitude of the high-pass-filtered 3-D acceleration vector of the accelerometer as the signal of the FA-II channel [26,27]. Simply summing the readings worked well in their study, although this approach does not utilize the ability of a high-density tactile sensor to sense the contact condition, which is significant for compliance perception. Instead of pressure data, Hughes et al. collected data using an individual microphone. The data are processed by signal processing methods, such as the Fast Fourier Transformation, to mimic the tasks attributed to Pacinian corpuscles [14]. However, the density of their sensor network is low (only 10 sensing nodes in a 2.8 square meter area), far below the density of Pacinian corpuscles and other mechanoreceptors in the human hand [28,29], which results in an accuracy of 71% across 15 textures. Since dynamic tactile sensing is important for humans to identify materials [9,30], this paper abstracts compliance- and texture-related time sequences from the data, as compliance and texture are highly recognizable by dynamic tactile sensing. Then, features that mimic the signals generated by the mechanoreceptors in human skin are extracted from the sequences based on the Wavelet transformation and statistics.
The rest of the paper is organized as follows. Section 3 describes our method in detail, including the experimental setup, feature extraction, and classification. Section 4 presents the experimental results to demonstrate the performance of our method. Section 5 discusses the method against related works, and Section 6 concludes the paper.

3. Methodology

3.1. Material Recognition Method

The method proposed in this paper consists of two steps. Firstly, a short-duration slide action is conducted by the robot, in which the high-density, large-scale tactile sensor collects tactile images during a one-second slide. Secondly, the tactile data are processed with a machine learning algorithm, which includes feature extraction and classifier training. In the feature extraction process, three material-related time sequences are calculated to abstract the tactile images. Then, 11 bioinspired tactile features are extracted from the sequences based on statistics and the Wavelet transformation. Finally, the feature vectors are used to train seven classifiers, and the most accurate classifier is selected for property and material recognition. The processes of our method are shown in Figure 1.

3.2. Experimental Setup

Tactile Sensor: A tactile sensor (Pressure Mapping Sensor 5101, Tekscan, South Boston, MA, USA) is employed to acquire the tactile data, as shown in Figure 2a. The sensor consists of two thin, flexible polyester sheets with electrically conductive electrodes deposited in varying patterns. The inner surface of one sheet forms a row pattern, while the inner surface of the other employs a column pattern; the intersection of these rows and columns creates a sensing element, and a force-sensitive material is placed between the two mating sheets. The spacing between the rows and columns is 2.5 mm, and the total thickness is 0.1 mm. Since the sensor is flexible, large-scale, and ultra-thin, it can be overlaid on the robot, making material recognition more achievable. The sensor consists of 44 × 44 sensing elements; this high density makes it capable of monitoring the pressure distribution of the contact surface and generating tactile images relevant to the compliance and texture of the material, which enriches the tactile data and allows robots to collect enough data in a short-duration exploration. Pressures are converted into tactile image intensities, which range from 0 to 255. Similar to human skin, a full range of information, including sustained downward pressure, slippage, object position, object shape, and vibration, can be detected by processing the tactile data.
Slide Action: Mimicking the dynamic tactile sensing that humans employ to explore an unknown material [30], a short-duration slide action was designed to collect tactile data from the test materials in 1 s. For robots, implementing the slide action is easier than grasping and other actions; for safety, the slide action is less harmful than the press action. Additionally, the vibration caused by the slide action is useful for material recognition. Hence, the slide action was adopted as the exploratory procedure in this paper, and the 1 s sliding time improves the efficiency of the method. In the slide action, the sensor was dragged across the surface of the material under a constant force. In our experiments, a linear motion platform (M-403, Physik Instrumente, Karlsruhe, Germany) was used to drag the sensor at a speed of 2.5 mm/s, while a standard 200 g weight was placed on the sensor to provide the constant interactive force. The dragging speed is slower than the sliding speed used by humans, as the slow slide can prevent robots from harming humans or objects, and the 200 g load keeps the interactive force fluctuating within a range that will not hurt objects or humans. Moreover, the exploration area on the material surface differs between slide actions, which increases the robustness of the tactile data. Figure 2b illustrates the slide action.
Test Materials: To make the test materials representative, 12 materials with different textures and compliances were selected: foamed silicone rubber (FSR), sponge, cotton, polyvinyl chloride (PVC), silicone rubber (SR), rubber, sandpaper (80 grit), wood, iron (unpolished), ceramics, copper, and acrylic (shown in Figure 2c). These materials were divided into four categories, soft-rough, soft-smooth, rigid-rough, and rigid-smooth, which cover the properties of most objects. The individual materials and the categories to which they belong are shown in Table 1.

3.3. Data Collecting and Dimensionality Reduction

When human skin slides along an unknown material, the four mechanoreceptors cooperate with each other to sense the stimuli dynamically and generate the corresponding tactile signals [25]. The stimuli they sense are shown in Table 2. To collect tactile data at different frequencies, as skin does, the data were recorded at 100 Hz, which mainly covers the frequency range of the stimuli perceived by the mechanoreceptors. At this rate, the sensor slid across the material for 1 s at a speed of 2.5 mm/s, recording 100 tactile images per slide action. However, the high-density sensor produces 193,600-dimensional data in each exploration. To process the data efficiently, three characteristics were designed to represent each image, and their time sequences were calculated to reduce the amount of data. The first sequence Sa consists of the areas of the 100 tactile images, the second sequence Sv consists of the variances of the 100 tactile images, and the third sequence Ss consists of the sums of the forces in the 100 tactile images. Numerically, the tactile image area is the number of force sensing elements that sensed a force, the tactile image variance is the variance of the pixels, and the sum of forces is the sum of the image intensities. These three sequences were used as the sample data, since they are relevant to the compliance and texture properties of the material. According to contact mechanics [31], the approximate information on the contact area (Sa), pressure distribution (Sv), and force value (Ss) is relevant to the compliance of the material, while the detail information in the sequence Ss is relevant to the texture properties [32]. The three sequences of a random sample are displayed in Figure 3, together with the average sequences of all samples. As demonstrated in Figure 3, all materials behaved differently in the slide action. The sequences of the random sample present the different consistencies of the materials; for example, the Ss of PVC changed very quickly and with a large magnitude, while the Ss of FSR was almost unchanged. The average sequences show that the materials are also statistically discriminable. Therefore, extracting features that characterize the change and average of the sequences is an efficient approach to improving the performance of our method; in this study, the means of the three sequences and the changes of Ss were the focus of the feature extraction. Moreover, in order to make the method independent of the interactive force, the three sequences were standardized by removing the mean and scaling to unit variance. In addition, 120 sets of sample data were collected for each material, which means 120 × 12 × 100 = 144,000 tactile images were used to build the database.
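As a concrete illustration, the following minimal sketch (in Python, matching the environment described in Section 4) reduces one exploration to the three sequences. The function and variable names are hypothetical rather than the authors' code, and the standardization step follows the description above.

```python
import numpy as np

def tactile_sequences(frames):
    """Reduce one exploration, a (100, 44, 44) stack of tactile images
    with intensities in [0, 255], to the sequences Sa, Sv, and Ss."""
    frames = np.asarray(frames, dtype=np.float64)
    Sa = (frames > 0).sum(axis=(1, 2))  # contact area: elements sensing any force
    Sv = frames.var(axis=(1, 2))        # pixel variance: pressure distribution
    Ss = frames.sum(axis=(1, 2))        # intensity sum: total contact force

    # Standardize each sequence (zero mean, unit variance) so the result
    # is independent of the absolute interactive force.
    standardize = lambda s: (s - s.mean()) / s.std()
    return standardize(Sa), standardize(Sv), standardize(Ss)
```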

3.4. Feature Extraction Mimicking Human Tactile Perception

The human-like slide action was carried out to perceive three time sequences that are relevant to the compliance and texture properties of the material. However, the dimensionality of the sequences is 300, which is still large for material recognition. Inspired by the dynamic tactile sensing of humans, 11 features were extracted from the three sequences based on statistics and the Wavelet transformation. Features based on statistics characterize the consistency and complexity of a sequence, while features based on the Wavelet transformation can be used to analyze non-stationary signals. Since the textures of the materials are irregular and non-uniform, the Wavelet transform is more suitable here than the Fourier transform [33]. The first three features are the means of the three sequences. As these features are relevant to the sustained pressure and texture, they are analogous to the stimuli sensed by the Ruffini corpuscles and Merkel cells. The next five features are the L1 norms of the cA4, cD4, cD3, cD2, and cD1 coefficients, which are generated by a level-4 discrete Wavelet transformation (DWT) of the sequence Ss with db3 as the wavelet. The cD4, cD3, cD2, and cD1 coefficients provide the detail information of the vibration in different frequency bands during the slide action, which is similar to the signals of the Pacinian and Meissner’s corpuscles, while the cA4 coefficient provides the approximate information of the vibration, which is relevant to the signal of the Merkel cells. The ninth feature is the absolute energy of the sequence Ss, which is the sum of its squared values:
$E = \sum_{i=1}^{n} x_{i}^{2}$ (1)
where x is the sequence. The absolute energy is also similar to the signal of the Merkel cells. The tenth feature is the complexity-invariant distance (CID) of the sequence Ss, as the CID is an estimate of time-sequence complexity [34]: the more complicated the sequence, the more peak-valley values it contains. Hence, the CID provides the frequency information of the vibration, which is similar to the signal of the Pacinian corpuscle. It gives the value of:
$\mathrm{CID} = \sqrt{\sum_{i=0}^{n-2} \left( x_{i} - x_{i+1} \right)^{2}}$ (2)
The last feature is the standard deviation of Ss, which relates to the material texture sensed by the Merkel cells. The 11 features constitute the feature vector of the sample, which is used for material recognition; the features, their meanings, and the corresponding tactile stimuli are listed in detail in Table 3.
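The following sketch assembles this 11-dimensional feature vector from the three sequences. It assumes the PyWavelets package (pywt) for the level-4 db3 DWT, which the paper does not name explicitly, and implements the CID as in Equation (2); the function name is hypothetical.

```python
import numpy as np
import pywt  # PyWavelets; an assumed implementation of the db3 DWT

def feature_vector(Sa, Sv, Ss):
    """Build the 11 bioinspired features of Section 3.4 from the three
    standardized sequences (each of length 100)."""
    # Features 1-3: means of the three sequences (Ruffini/Merkel analogs)
    f = [Sa.mean(), Sv.mean(), Ss.mean()]
    # Features 4-8: L1 norms of the level-4 db3 DWT coefficients of Ss
    cA4, cD4, cD3, cD2, cD1 = pywt.wavedec(Ss, 'db3', level=4)
    f += [np.abs(c).sum() for c in (cA4, cD4, cD3, cD2, cD1)]
    # Feature 9: absolute energy, Equation (1)
    f.append(np.sum(Ss ** 2))
    # Feature 10: complexity-invariant distance estimate, Equation (2)
    f.append(np.sqrt(np.sum(np.diff(Ss) ** 2)))
    # Feature 11: standard deviation of Ss
    f.append(Ss.std())
    return np.array(f)
```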

3.5. Classifier Configuration

After the data collection and feature extraction, the dimensionality of each sample dropped from 193,600 to 11. Using the feature vectors, seven classifiers were selected to test the accuracies on two different recognition problems. The first is property recognition, in which the materials are classified into four categories based on compliance and texture: soft-rough, soft-smooth, rigid-rough, and rigid-smooth. The second is material recognition, in which the specific materials are identified. The seven classifiers are Linear Support Vector Machine (Linear SVM), Radial Basis Function Support Vector Machine (RBF-SVM), k-Nearest Neighbor (kNN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), Random Forest (RF), and Multi-Layer Perceptron (MLP). These classifiers also serve to validate the effectiveness and robustness of the tactile data and feature extraction, and the most accurate one can be selected for material recognition. The descriptions and parameters of the classifiers are listed as follows; a configuration sketch is given after the MLP description.
Linear Support Vector Machine (Linear SVM) and Radial Basis Function Support Vector Machine (RBF-SVM) are supervised classifiers based on SVM theory, which maximizes the margin between two linearly separable classes by means of a suitable hyperplane that separates a set of inputs. The standard SVM is commonly used for binary classification, although it has been extended to multi-class classification. Linear SVM uses the linear kernel function as the kernel that maps the data from the original space to the higher-dimensional space the SVM requires, while RBF-SVM adopts the radial basis function as the kernel. The linear kernel function is given by Equation (3), and the radial basis function by Equation (4).
$\kappa(x_{i}, x_{j}) = x_{i}^{T} x_{j}$ (3)
$\kappa(x_{i}, x_{j}) = \exp\left( -\frac{\| x_{i} - x_{j} \|^{2}}{2 \sigma^{2}} \right)$ (4)
For Linear SVM, the L2 function was chosen as the penalty, since L2-SVM is differentiable and imposes a bigger (quadratic vs. linear) loss on points that violate the margin [35]. In addition, the squared hinge loss was used, because it is differentiable and suitable for maximum-margin classification [36]. Beyond these settings, Linear SVM and RBF-SVM share the same key parameter C, which was set to 1 to avoid unwanted over-fitting.
The k-Nearest Neighbor (kNN) method is a supervised classifier based on lazy learning. The kNN method is non-parametric, as it simply assigns an object to the class that is most common among its k nearest neighbors, following a majority vote. In this study, the Euclidean distances among the samples were calculated, and all k nearest neighbors were weighted equally. Furthermore, the key parameter k was evaluated over the range from 1 to 15 to find the maximum accuracy: the most effective value was k = 5 for property recognition and k = 10 for material recognition.
Linear Discriminant Analysis (LDA) is a classical linear learning method that is capable of binary or multi-class classification. LDA maps all objects to a decision boundary, and by optimizing the coefficients of the boundary, it clusters objects belonging to the same class compactly while segregating different classes [37]. LDA therefore aims to maximize the generalized Rayleigh quotient J of the within-class scatter matrix Sw and the between-class scatter matrix Sb, which can be expressed as Equation (5):
$J = \frac{w^{T} S_{b} w}{w^{T} S_{w} w}$ (5)
where w is the coefficient vector of the decision boundary. In this study, w was calculated by singular value decomposition.
Naive Bayes (NB) is a supervised learning method based on Bayes’ theorem. Like the methods above, Naive Bayes can perform binary or multi-class classification. Unlike the other classifiers, all Naive Bayes classifiers apply the “naive” assumption of conditional independence between every pair of features. Under this assumption, the relationship between the class variable c and the feature vector x simplifies to Equation (6), and the class variable c is predicted by Equation (7). In this study, Gaussian Naive Bayes was used to recognize the materials.
$P(c \mid x) = \frac{P(c)}{P(x)} \prod_{i=1}^{d} P(x_{i} \mid c)$ (6)
$\hat{c} = \arg\max_{c} P(c) \prod_{i=1}^{d} P(x_{i} \mid c)$ (7)
Random Forest (RF) is an ensemble classifier based on randomized decision trees. Compared with bagging, RF adds an additional layer of randomness; moreover, compared with a standard decision tree, each split in an RF is the best split among a random subset of the features [38]. This randomness improves the generalization of the ensemble. In this study, ten trees were built in the RF to classify samples into their groups.
Multi-Layer Perceptron (MLP) is an artificial neural network that learns from the dataset by back-propagation. An MLP is composed of an input layer, hidden layer(s), and an output layer, and these layers consist of several neurons whose weights are trained to map sets of input data onto a set of appropriate outputs [39]. Here, a range of different structures were tested to determine the most suitable number of layers and the corresponding neurons. Empirically, a 4-layer MLP with 100, 50, 25, and 8 neurons in sequence was designed.
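The sketch below collects the seven classifiers with the parameters stated above. scikit-learn is an assumption (the paper only specifies a Python 3.5 environment), the dictionary is hypothetical, the MLP layer sizes are read here as four hidden layers, and unstated parameters are left at library defaults.

```python
from sklearn.svm import LinearSVC, SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

classifiers = {
    # L2 penalty, squared hinge loss, C = 1
    'Linear SVM': LinearSVC(penalty='l2', loss='squared_hinge', C=1.0),
    'RBF SVM': SVC(kernel='rbf', C=1.0),
    # Euclidean distance, uniform weights; k = 5 for property
    # recognition (k = 10 for material recognition)
    'kNN': KNeighborsClassifier(n_neighbors=5, weights='uniform'),
    # w solved by singular value decomposition
    'LDA': LinearDiscriminantAnalysis(solver='svd'),
    'NB': GaussianNB(),
    'RF': RandomForestClassifier(n_estimators=10),  # ten trees
    'MLP': MLPClassifier(hidden_layer_sizes=(100, 50, 25, 8)),
}
```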

4. Results

In order to verify the validity of our method, 5-fold cross-validation was used: the collected training data were split into 5 folds randomly, and during each evaluation, 4 of them were used for training and one for testing [1]. Furthermore, principal component analysis (PCA) was applied before classification to select the dimension with the best accuracy. Besides accuracy, computational efficiency is another important factor that influences the application of our method; to test it, this paper also measured the running time from data collection to classifier training (as shown in Figure 1). All computations were performed in a Python 3.5 environment on a computer with an Intel Core i3 @ 3.30 GHz processor, 4 GB of RAM, and a 7200 RPM mechanical hard disk.
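The following sketch expresses this evaluation protocol: for each candidate number of principal components (1 to 11, cf. Figure 4), a PCA-plus-classifier pipeline is scored by 5-fold cross-validation, here with Linear SVM as in Section 4.1. scikit-learn is again an assumption, and X and y are hypothetical names for the feature matrix and labels.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def best_pca_dimension(X, y, max_dim=11):
    """Traverse the number of principal components with Linear SVM
    under 5-fold cross-validation; return the best dimension."""
    scores = []
    for n in range(1, max_dim + 1):
        pipe = make_pipeline(PCA(n_components=n), LinearSVC(C=1.0))
        scores.append(cross_val_score(pipe, X, y, cv=5).mean())
    return int(np.argmax(scores)) + 1, scores
```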

4.1. Property Recognition

Linear SVM was used to traverse the number of principal components from 1 to 11 to find the best dimension. The results shown in Figure 4 illustrate that the features are redundant for property recognition, as the accuracy remained the same once the number reached 8. Therefore, 8 principal components were used for property recognition. The time overheads, accuracies, and other indexes of the classifiers are provided in Table 4.
From Table 4, it can be noted that the accuracy is at least 90% for most classifiers, with kNN providing the best accuracy (95.83%). Figure 5 is the confusion matrix obtained from the kNN property recognition; the columns of the confusion matrix indicate the predictions, while the rows indicate the real classification. The confusion matrix shows that most errors occur in rigid-rough material recognition. In our experiment, rigid-rough materials were easily mistaken for soft-smooth materials. This is because there is strong adhesion in the contact between soft-smooth materials and the tactile sensor, and this adhesion results in vibration with a low frequency and large amplitude during the slide action, just as rigid-rough materials produce [40]. In addition, Table 4 shows that our method is time-saving. The run time of each process was measured, and the results are provided in Table 5. Analyzing Table 5, it is observed that reading the tactile data from the hard disk (Data Reading) takes up most of the run time, and converting the data to time sequences (Dimensionality Reduction) is the second most time-consuming process. Altogether, pre-processing the tactile data took up more than 99% of the total time, because our method must process the huge amount of data produced by the tactile sensor: each sample comprises 100 tactile images of size 44 × 44. However, this overhead is still cheap compared with the duration of the slide action, which means our method can finish the classification during the next slide action.

4.2. Material Recognition

Consistent with property recognition, Linear SVM was used to determine the principal components. As shown in Figure 4, the accuracy increased with the number of components, so all 11 principal components were selected to form the feature vector. The accuracies and time overheads of the classifiers are provided in Table 4.
From Table 4, it can be seen that the accuracies of six of the seven classifiers are not below 80%, with MLP providing the best accuracy (90.28%). The performance of these classifiers proves the effectiveness and robustness of the tactile data and feature extraction. The confusion matrix in Figure 6 was plotted based on the MLP classifier, and the matrix shows that most misclassifications occur between materials with the same or similar properties. For example, PVC was confused with rubber, as both are soft and smooth; similarly, acrylic was mistakenly classified as wood, since both are hard. The time overheads of material and property recognition in each process are largely the same, except for classification: material recognition took much more time in classification, although the overhead was still small relative to the data processing.
This paper used a slide action to explore the material. However, it is difficult to keep the interactive force constant during the slide. Although the sequences extracted from the raw tactile data were standardized to make them independent of the interactive force, the robustness of our method still needs to be validated. Here, 100 g, 150 g, 250 g, and 300 g weights were used as the loads, and 30 sets of sample data were recollected for each material to test the robustness. The new data were allocated into 4 databases, D100, D150, D250, and D300, and the trained kNN and MLP classifiers were used for property classification and material classification, respectively. The results in Table 6 show that our method is robust enough to recognize materials under different interactive forces.
In order to test the real-world applicability of the method, the trained MLP classifier was used to identify materials that are the same as or similar to the materials used in this paper but come from different objects: wood, sponge, cotton, iron, sandpaper (150 grit), and ceramics. For each material, 30 data samples were collected. The recognition results are shown in Table 7. Although the recognition of these other materials is not as accurate as before, it is still acceptable.

5. Discussion

In recent years, many researchers have devoted themselves to material recognition. Here, a few relevant works are summarized; the tactile data, EP, time of EP (ToEP), number of features (NoF), number of materials (NoM), and accuracy of these works are shown in Table 8. Analyzing Table 8, it is worth noting that our method spends less time on the exploratory procedure than most works, which illustrates its efficiency. An efficient method enables robots to recognize materials quickly, which greatly improves robotic tactile sensing and extends robotic applications. Since the previous works recognized materials of different numbers and properties, an accuracy comparison between those works and our method is not statistically meaningful. However, the materials our method recognized have properties that cover most objects, and while some of them, such as sandpaper and sponge, are quite different, others, such as PVC and rubber, are quite similar. Hence, the 90% accuracy of the method is sufficient to improve grasping, manipulation, and other interaction tasks for robots.

6. Conclusions

This paper presented an efficient material recognition method for robotic tactile sensing. In our method, a one-second slide action was performed on 12 materials with different compliances and textures using a tactile sensor with a high-density sensing array. Then, three sequences were extracted from the raw tactile data to reduce the amount of data. Based on the sequences, 11 bioinspired features were calculated as the inputs of the classifiers. In our study, seven classifiers were used for classification; kNN showed the best accuracy in property recognition (96%), while MLP presented the best accuracy in material recognition (90%). The short-duration exploratory procedure makes the method efficient, while the bioinspired data processing makes it accurate. In future work, sensors with multimodal sensing will be applied to improve the accuracy and robustness of material recognition.

Author Contributions

C.C. and W.X. proposed the concept of the paper and performed the investigation. Y.X. and D.W. designed the experiments. C.C. performed the experiment and the coding. W.X. and H.L. prepared the original draft. D.W. and Y.X. reviewed and edited the draft. Y.X. was in charge of supervision and project administration. All authors have read and approved the final manuscript.

Funding

We gratefully acknowledge the financial support from the National Natural Science Foundation of China (No.61803221), Natural Science Foundation of Guangdong Province (No.2017A030313352), and Science and Technology on Space Intelligent Control Laboratory for National Defense (KJGZDSYS-2018-07).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tsarouchi, P.; Makris, S.; Chryssolouris, G. Human–robot interaction review and challenges on task planning and programming. Int. J. Comput. Integr. Manuf. 2016, 29, 916–931. [Google Scholar] [CrossRef]
  2. Li, M.; Deng, J.; Zha, F.; Qiu, S.; Wang, X.; Chen, F. Towards Online Estimation of Human Joint Muscular Torque with a Lower Limb Exoskeleton Robot. Appl. Sci. 2018, 8. [Google Scholar] [CrossRef]
  3. Hu, D.; Gong, Y.; Hannaford, B.; Seibel, E.J. Semi-autonomous simulated brain tumor ablation with RAVENII Surgical Robot using behavior tree. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 3868–3875. [Google Scholar]
  4. Kehoe, B.; Kahn, G.; Mahler, J.; Kim, J.; Lee, A.; Lee, A.; Nakagawa, K.; Patil, S.; Boyd, W.D.; Abbeel, P.; et al. Autonomous multilateral debridement with the Raven surgical robot. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 1432–1439. [Google Scholar]
  5. Chortos, A.; Liu, J.; Bao, Z. Pursuing prosthetic electronic skin. Nat. Mater. 2016, 15, 937. [Google Scholar] [CrossRef] [PubMed]
  6. Cirillo, A.; Ficuciello, F.; Natale, C.; Pirozzi, S.; Villani, L. A Conformable Force/Tactile Skin for Physical Human–Robot Interaction. IEEE Robot. Autom. Lett. 2016, 1, 41–48. [Google Scholar] [CrossRef]
  7. Luo, S.; Bimbo, J.; Dahiya, R.; Liu, H. Robotic tactile perception of object properties: A review. Mechatronics 2017, 48, 54–67. [Google Scholar] [CrossRef] [Green Version]
  8. Howe, R.D.; Cutkosky, M.R. Dynamic tactile sensing: Perception of fine surface features with stress rate sensing. IEEE Trans. Robot. Autom. 1993, 9, 140–151. [Google Scholar] [CrossRef]
  9. Drimus, A.; Petersen, M.B.; Bilberg, A. Object texture recognition by dynamic tactile sensing using active exploration. In Proceedings of the IEEE Ro-Man: The IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 9–13 September 2012; pp. 277–283. [Google Scholar]
  10. Cutkosky, M.R.; Ulmen, J. Dynamic Tactile Sensing. In The Human Hand as An Inspiration for Robot Hand Development; Balasubramanian, R., Santos, V.J., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 389–403. [Google Scholar]
  11. Hammock, M.L.; Chortos, A.; Tee, B.C.; Tok, J.B.; Bao, Z. 25th anniversary article: The evolution of electronic skin (e-skin): A brief history, design considerations, and recent progress. Adv. Mater. 2013, 25, 5997. [Google Scholar] [CrossRef]
  12. Kappassov, Z.; Corrales, J.-A.; Perdereau, V. Tactile sensing in dexterous robot hands—Review. Robot. Auton. Syst. 2015, 74, 195–220. [Google Scholar] [CrossRef]
  13. Heyneman, B.; Cutkosky, M.R. Slip classification for dynamic tactile array sensors. Int. J. Robot. Res. 2016, 35, 404–421. [Google Scholar] [CrossRef]
  14. Hughes, D.; Correll, N. Texture recognition and localization in amorphous robotic skin. Bioinspir. Biomim. 2015, 10, 055002. [Google Scholar] [CrossRef]
  15. Spiers, A.J.; Liarokapis, M.V.; Calli, B.; Dollar, A.M. Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors. IEEE Trans. Haptics 2016, 9, 207–220. [Google Scholar] [CrossRef] [PubMed]
  16. Kerr, E.; Mcginnity, T.M.; Coleman, S. Material Recognition using Tactile Sensing. Expert Syst. Appl. 2018, 94, 94–111. [Google Scholar] [CrossRef]
  17. Khasnobish, A.; Pal, M.; Tibarewala, D.N.; Konar, A.; Pal, K. Texture- and deformability-based surface recognition by tactile image analysis. Med. Biol. Eng. Comput. 2016, 54, 1269–1283. [Google Scholar] [CrossRef] [PubMed]
  18. Sinapov, J.; Sukhoy, V.; Sahai, R.; Stoytchev, A. Vibrotactile Recognition and Categorization of Surfaces by a Humanoid Robot. IEEE Trans. Robot. 2011, 27, 488–497. [Google Scholar] [CrossRef]
  19. Strese, M.; Schuwerk, C.; Iepure, A.; Steinbach, E. Multimodal Feature-Based Surface Material Classification. IEEE Trans. Haptics 2017, 10, 226–239. [Google Scholar] [CrossRef]
  20. Kerr, E.; Mcginnity, T.M.; Coleman, S. Material classification based on thermal and surface texture properties evaluated against human performance. In Proceedings of the International Conference on Control Automation Robotics & Vision, Singapore, 10–12 December 2014; pp. 444–449. [Google Scholar]
  21. Bhattacharjee, T.; Rehg, J.M.; Kemp, C.C. Haptic classification and recognition of objects using a tactile sensing forearm. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Portugal, 7–12 October 2012; pp. 4090–4097. [Google Scholar]
  22. Lederman, S.J.; Klatzky, R.L. Extracting object properties through haptic exploration. Acta Psychol. 1993, 84, 29–40. [Google Scholar] [CrossRef]
  23. Kaboli, M.; Cheng, G. Robust Tactile Descriptors for Discriminating Objects From Textural Properties via Artificial Robotic Skin. IEEE Trans. Robot. 2018, 34, 985–1003. [Google Scholar] [CrossRef]
  24. Tanaka, D.; Matsubara, T.; Ichien, K.; Sugimoto, K. Object manifold learning with action features for active tactile object recognition. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 608–614. [Google Scholar]
  25. Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile Sensing—From Humans to Humanoids. IEEE Trans. Robot. 2010, 26, 1–20. [Google Scholar] [CrossRef]
  26. Romano, J.M.; Hsiao, K.; Niemeyer, G.; Chitta, S.; Kuchenbecker, K.J. Human-Inspired Robotic Grasp Control with Tactile Sensing. IEEE Trans. Robot. 2011, 27, 1067–1079. [Google Scholar] [CrossRef]
  27. Stark, B.; Carlstedt, T.; Hallin, R.G.; Risling, M. Distribution of human Pacinian corpuscles in the hand. A cadaver study. J. Hand Surg. 1998, 23, 370–372. [Google Scholar] [CrossRef]
  28. Johansson, R.S.; Vallbo, A.B. Tactile sensibility in the human hand: Relative and absolute densities of four types of mechanoreceptive units in glabrous skin. J. Physiol. 1979, 286, 283–300. [Google Scholar] [CrossRef] [PubMed]
  29. Balasubramanian, R.; Santos, V.J. The Human Hand as An Inspiration for Robot Hand Development; Springer Publishing Company, Incorporated: Saxonburg, PA, USA, 2014. [Google Scholar]
  30. Lederman, S.J.; Klatzky, R.L. Hand movements: A window into haptic object recognition. Cogn. Psychol. 1987, 19, 342–368. [Google Scholar] [CrossRef]
  31. Popov, V.L. Contact Mechanics and Friction; Springer: Berlin/Heidelberg, Germany, 2010; pp. 219–250. [Google Scholar]
  32. Kaboli, M.; Armando, D.L.R.T.; Walker, R.; Cheng, G. In-hand object recognition via texture properties with robotic hands, artificial skin, and novel tactile descriptors. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Seoul, Korea, 3–5 November 2015; pp. 1155–1160. [Google Scholar]
  33. Chathuranga, D.S.; Wang, Z.; Ho, V.A.; Mitani, A. A biomimetic soft fingertip applicable to haptic feedback systems for texture identification. In Proceedings of the IEEE International Symposium on Haptic Audio Visual Environments and Games, Istanbul, Turkey, 26–27 October 2013; pp. 29–33. [Google Scholar]
  34. Batista, G.E.A.P.A.; Keogh, E.J.; Tataw, O.M.; Souza, V.M.A.D. CID: An efficient complexity-invariant distance for time series. Data Min. Knowl. Discov. 2014, 28, 634–669. [Google Scholar] [CrossRef]
  35. Tang, Y. Deep Learning using Linear Support Vector Machines. arXiv 2013, arXiv:1306.0239. [Google Scholar]
  36. Lee, C.P.; Lin, C.J. A study on L2-loss (squared hinge-loss) multiclass SVM. Neural Comput. 2013, 25, 1302–1323. [Google Scholar] [CrossRef] [PubMed]
  37. Kak, A.C.; Martínez, A.M. PCA versus LDA. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 228–233. [Google Scholar]
  38. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
  39. Bishop, C.M. Neural Networks for Pattern Recognition; Oxford University Press: Oxford, UK, 1995; pp. 1235–1242. [Google Scholar]
  40. Shull, K.R. Contact mechanics and the adhesion of soft solids. Mater. Sci. Eng. R 2002, 36, 1–45. [Google Scholar] [CrossRef]
  41. Jamali, N.; Sammut, C. Majority Voting: Material Classification by Tactile Sensing Using Surface Texture. IEEE Trans. Robot. 2011, 27, 508–521. [Google Scholar] [CrossRef]
  42. Ho, V.A.; Araki, T.; Makikawa, M.; Hirai, S. Experimental investigation of surface identification ability of a low-profile fabric tactile sensor. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Portugal, 7–12 October 2012; pp. 4497–4504. [Google Scholar]
  43. Baishya, S.S.; Bäuml, B. Robust material classification with a tactile skin using deep learning. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 8–15. [Google Scholar]
  44. Gómez Eguíluz, A.; Rañó, I.; Coleman, S.A.; McGinnity, T.M. Multimodal Material identification through recursive tactile sensing. Robot. Auton. Syst. 2018, 106, 130–139. [Google Scholar] [CrossRef] [Green Version]
  45. Rasouli, M.; Chen, Y.; Basu, A.; Kukreja, S.L.; Thakor, N.V. An Extreme Learning Machine-Based Neuromorphic Tactile Sensing System for Texture Recognition. IEEE Trans. Biomed. Circuits Syst. 2018, 12, 313–325. [Google Scholar] [CrossRef] [PubMed]
Figure 1. The processes of the material recognition method.
Figure 2. The tactile data acquisition of the test materials. (a) The tactile sensor; (b) the sensor was slid across the material surface by the motion platform; (c) the test materials: (1) foamed silicone rubber (FSR), (2) sponge, (3) cotton, (4) polyvinyl chloride (PVC), (5) silicone rubber (SR), (6) rubber, (7) sandpaper (80 grit), (8) wood, (9) iron (unpolished), (10) ceramics, (11) copper, and (12) acrylic.
Figure 3. The three sequences of 12 materials. (a) The sequences of a random sample; (b) the average sequences of all samples.
Figure 4. The accuracy changes as the number of principal components increases.
Figure 5. Confusion matrix of property recognition based on the kNN classifier. The columns indicate the predictions and rows indicate the real classification.
Figure 6. Confusion matrix of material recognition based on the MLP classifier. The columns indicate the predictions and rows indicate the real classification.
Table 1. The test materials and the categories to which they belong.

Soft-Rough | Soft-Smooth | Rigid-Rough | Rigid-Smooth
FSR | PVC | Sandpaper | Ceramics
Sponge | SR | Wood | Copper
Cotton | Rubber | Iron | Acrylic
Table 2. The stimuli that the four mechanoreceptors sensed [18].

Classification Basis | Pacinian Corpuscle | Ruffini Corpuscle | Merkel Cells | Meissner’s Corpuscle
Stimuli Frequency (Hz) | 40–500+ | 100–500+ | 0.4–3 | 3–40
Stimuli Type | High Frequency Vibration | Sustained Downward Pressure; Skin Slip; Tangential Force | Spatial Deformation; Sustained Pressure; Texture | Low Frequency Vibration
Table 3. The 11 features, their meanings, and the corresponding tactile stimuli sensed by the mechanoreceptors.

Features | Meaning | Tactile Stimulus
Mean of Sa | Real contact area | Sustained pressure and texture
Mean of Sv | Pressure distribution | Texture
Mean of Ss | Contact pressure | Sustained pressure
Absolute Energy of Ss | Contact force and amplitude of vibration | Sustained pressure and low frequency vibration
Standard Deviation of Ss | Consistency of contact force | Texture
cA4 of Ss | Approximate information of vibration | Sustained pressure
cD4 of Ss | Detail information of vibration | Low frequency vibration
cD3 of Ss | Detail information of vibration | Medium-low frequency vibration
cD2 of Ss | Detail information of vibration | Medium-high frequency vibration
cD1 of Ss | Detail information of vibration | High frequency vibration
CID of Ss | Approximate frequency of vibration | High frequency vibration
Table 4. The accuracy, precision, recall, F1-score, and time overhead of the seven classifiers.

Recognition | Performance Index | Linear SVM | RBF SVM | kNN | LDA | NB | RF | MLP
Property Recognition | Accuracy (%) | 92.01 | 94.10 | 95.83 | 90.97 | 77.43 | 90.63 | 94.79
 | Precision (%) | 92.25 | 94.25 | 95.75 | 91.75 | 79.50 | 90.75 | 95.00
 | Recall (%) | 92.25 | 94.00 | 95.75 | 91.00 | 77.50 | 91.00 | 94.75
 | F1-Score (%) | 92.25 | 94.00 | 95.75 | 91.25 | 74.50 | 90.75 | 94.75
 | Time (ms) | 35.68 | 35.07 | 36.03 | 35.97 | 35.99 | 37.02 | 35.44
Material Recognition | Accuracy (%) | 84.03 | 85.42 | 85.76 | 80.56 | 73.61 | 80.21 | 90.28
 | Precision (%) | 85.25 | 86.58 | 86.42 | 82.83 | 76.08 | 80.75 | 90.58
 | Recall (%) | 84.08 | 85.58 | 86.17 | 81.25 | 74.41 | 80.83 | 90.25
 | F1-Score (%) | 84.17 | 85.25 | 86.08 | 80.67 | 73.58 | 80.50 | 90.17
 | Time (ms) | 36.76 | 35.92 | 36.38 | 36.74 | 35.11 | 36.67 | 36.78
Table 5. The run times of the processes of the recognition method.

Recognition | Process | Linear SVM | RBF SVM | kNN | LDA | NB | RF | MLP
Property Recognition | Data Reading (ms) | 19.60 | 19.86 | 19.78 | 19.75 | 20.58 | 19.53 | 20.13
 | Dimensionality Reduction (ms) | 11.49 | 10.80 | 11.58 | 11.51 | 11.03 | 12.63 | 10.75
 | Feature Extraction (ms) | 4.56 | 4.33 | 4.64 | 4.66 | 4.35 | 4.83 | 4.53
 | Classification (μs) | 0.69 | 22.92 | 17.36 | 0.69 | 1.39 | 2.78 | 3.47
Material Recognition | Data Reading (ms) | 19.85 | 19.94 | 19.69 | 19.84 | 19.98 | 21.04 | 20.58
 | Dimensionality Reduction (ms) | 12.01 | 11.45 | 11.76 | 11.83 | 10.74 | 11.09 | 11.34
 | Feature Extraction (ms) | 4.88 | 4.56 | 4.85 | 5.04 | 4.23 | 4.36 | 4.83
 | Classification (μs) | 1.39 | 48.62 | 211.81 | 1.39 | 2.08 | 118.06 | 3.47
Table 6. Recognition results under different interactive forces.

Recognition | D100 | D150 | D250 | D300
Property Recognition | 93.24% | 94.85% | 95.77% | 94.16%
Material Recognition | 91.02% | 86.47% | 87.53% | 86.14%
Table 7. Recognition results for the same or similar materials from different objects.

Recognition | Wood | Sponge | Cotton | Iron | Sandpaper | Ceramics
Property Recognition | 86.67% | 93.33% | 80.00% | 83.33% | 96.66% | 80.00%
Material Recognition | 70.00% | 83.33% | 70.00% | 73.33% | 90.00% | 66.67%
Table 8. The summary of relevant works.

Year | EP | Tactile Data | ToEP (s) | NoF | NoM | Accuracy
2011 [41] | Prescribed contact; slide and lift | Lateral stretch sensed by strain gauges; vibration sensed by PVDFs | N/A | 6 | 9 | 95%
2012 [42] | Prescribed slide with acceleration | Details calculated from outputs of the tactile sensor by DWT | N/A | 10 | 3 | 90%
2012 [21] | Prescribed push | Sequences of maximum force, contact area, and contact motion during contact | 1.2 | 20 | 18 | 72%
2015 [14] | Rubbing the texture on the surface of the amorphous robot skin near a sensor by hand | Spectrum of the signal produced by individual microphones | 5 | 128 | 15 | 71%
2016 [43] | Prescribed grasp with the robot’s right hand (thumb and index finger); slide; release | Spatial-temporal force signal of tactile skin | 5+ | 24,000 | 6 | 97%
2016 [17] | Free exploration by twenty healthy subjects | Intensities and geometric property of tactile image | 0.1 | 10 | 4 | 78%
2017 [19] | Capture a surface image; impact and move arbitrarily | Sound, image, friction force, and acceleration features | N/A | 66 | 69 | 74%
2018 [44] | Contact until temperature was stable; prescribed circular slide | Vibration signal; absolute temperature; thermal flux | N/A | 9 | 34 | 98%
2018 [16] | Prescribed press and slide | Static temperature; thermal flow rate; static vibration; vibration strength | 20+ | 18 | 14 | 86%
2018 [45] | Prescribed contact, slide, and holding | Biomimetic tactile signals based on the values of force sensitive units | 12+ | 90 | 10 | 92%
