Review

Tactile-Sensing Technologies: Trends, Challenges and Outlook in Agri-Food Manipulation

1 School of Computer Science, University of Lincoln, Lincoln LN6 7TS, UK
2 Lincoln Institute for Agri-Food Technology, University of Lincoln, Lincoln LN6 7TS, UK
* Author to whom correspondence should be addressed.
Current address: Brayford Way, Brayford Pool, Lincoln LN6 7TS, UK.
Sensors 2023, 23(17), 7362; https://doi.org/10.3390/s23177362
Submission received: 13 June 2023 / Revised: 1 August 2023 / Accepted: 15 August 2023 / Published: 23 August 2023
(This article belongs to the Special Issue Artificial Intelligence and Sensor Technologies in Agri-Food)

Abstract:
Tactile sensing plays a pivotal role in achieving precise physical manipulation tasks and extracting vital physical features. This comprehensive review presents an in-depth overview of the growing research on tactile-sensing technologies, encompassing state-of-the-art techniques, future prospects, and current limitations, with a focus on tactile hardware, algorithmic complexities, and the distinct features offered by each sensor. Special emphasis is placed on agri-food manipulation and the tactile-sensing technologies relevant to it. We highlight key areas in agri-food manipulation, including robotic harvesting, food item manipulation, and feature evaluation, such as fruit ripeness assessment, along with the emerging field of kitchen robotics. Through this interdisciplinary exploration, we aim to inspire researchers, engineers, and practitioners to harness the power of tactile sensing for transformative advancements in agri-food robotics. By providing a comprehensive understanding of the current landscape and future prospects, this review serves as a valuable resource for driving progress in the field of tactile sensing and its application in agri-food systems.

1. Introduction

Climate change, shifting migration patterns, urban population growth, ageing populations, and overall population expansion pose significant challenges to the global food chain [1]. The agriculture and food (agri-food) sector can address these challenges through the adoption of robotics and automation technologies. Although the agricultural sector has been slower to integrate automation compared to other industries, there has been a recent surge in agri-food-related robotics and automation research. The inherently complex and uncontrolled nature of the agri-food industry demands innovative solutions for handling uncertainties. Tactile sensation is one such promising approach.
Tactile sensation gives robotics the ability to measure physical interactions and, as such, is developing widespread use in robotics systems [2]. The development of affordable tactile sensors and fabrication techniques has spurred a wide range of research endeavours in tactile sensation within the agri-food sector. From gently harvesting delicate foods without causing damage to assessing the ripeness of soft fruits and sorting produce, tactile sensation equips robots with a novel modality to tackle some of the most intricate physical interaction challenges in the industry.
Tactile sensing in the agri-food sector plays a crucial role in the harvesting, manipulation, and feature extraction of food items. Initial research primarily aimed at utilising tactile sensing for determining key features, such as hardness, in order to estimate food ripeness [3,4,5,6,7,8,9,10,11,12]. This enables the classification of food items, particularly in high-occlusion scenarios, where visual cues may be obstructed by a robot’s end effector, surrounding environment, or packaging [13,14,15,16,17].
Harvesting and grasping food items often involve dealing with visual occlusion and handling delicate, easily bruised items that require a gentle touch [18,19,20,21,22,23,24,25,26,27]. As agricultural robotics progresses towards more complex and realistic environments, these systems must be capable of interacting with their surroundings, such as manipulating foliage during soft fruit harvesting [28,29].
Tactile sensing is also experiencing increased interest in food preparation tasks, such as robotic cooking, where the same capabilities applied to harvesting and handling are similarly utilised. Tactile sensing allows measurement without requiring a line of sight, enabling sensing in problems like cutting food items and pouring items from within packaging [30,31,32,33,34,35].
Although tactile sensation is gaining popularity in physical robot interaction tasks, there are still considerable shortcomings in current approaches. State-of-the-art tactile sensors cannot be spatially scaled up for sensing over large areas due to crosstalk between sensing points, wiring space requirements, and the latency that results from a large number of sensing points [36,37]. While tactile sensation in humans involves an active perception system, where exploratory actions are proactively generated for tactile exploration [38], tactile-based robot controllers remain limited to fully reactive systems. Current multi-modal sensing systems and data fusion techniques for combining tactile data with other sensing modalities, such as vision, still fall far short of human-level multi-modal perception. Furthermore, although tactile sensors have been used in robotic agri-food applications such as robotic harvesting [21,22], fruit ripeness estimation [4,8], and food item classification [13], meeting food item damage and bruising standards [39,40,41] or food hygiene standards [42] has not yet been formalised in these research problem statements.
The integration of tactile sensation into the agri-food sector provides benefits in farming yield, product quality control, and food preparation. However, there are significant sectors within the agri-food industry that have not yet utilised tactile sensation. This review paper aims to offer a thorough understanding of tactile sensation and its applications in robotics within the agri-food industry. To achieve this, we first present a general introduction to contemporary tactile sensor hardware and associated algorithms. We then delve into an in-depth analysis of the state-of-the-art techniques and applications of tactile sensation in the agri-food sector. Subsequently, we provide a comprehensive assessment of the limitations of current tactile sensors and their applications, as well as identifying potential future research directions within the agri-food industry. While some topics may overlap with previous surveys, our goal is to equip the reader with a holistic comprehension of this emerging and critical area of research as there is currently no existing review addressing tactile sensation in the agri-food domain.
In this paper, we make the following concrete contributions to the field of tactile sensation research:
  • Context of tactile sensation research: We provide a comprehensive overview of tactile sensation technology in Section 2 and core algorithms in Section 3 that are used in robotics and automation research. This discussion offers a solid foundation and understanding of the current state-of-the-art technologies in the field.
  • Tactile sensation in agri-food: Our review paper focuses on the application of tactile sensation research specifically in the agri-food domain, which has not been covered in other tactile sensation review papers. In Section 4, we present a concise and comprehensive examination of the current research on tactile sensation applied to various aspects of agri-food, highlighting its significance and potential impact.
  • Systematic Review of Shortcomings and Challenges: We contribute a systematic review of the shortcomings and use case challenges associated with the developed tactile sensor technologies for agri-food use cases. This critical assessment, presented in Section 5, addresses an aspect that has been largely disregarded in other review papers on tactile sensors [43,44,45,46,47,48,49].
We outline the structure of this review in Figure 1. By providing a contextual overview, examining the agri-food applications, and addressing the challenges, our paper aims to contribute to the advancement of tactile sensation research and its practical implementation in the agri-food industry.

2. Tactile-Sensing Technologies

This section briefly overviews tactile-sensing technologies, with an emphasis on tactile sensors reported in the past five years. It covers the various transduction methods these sensors employ, the tactile features they extract, and some recent advancements in this domain.

2.1. Transduction Methods

Tactile-sensing technologies use various transduction methods to convert physical interactions into useful tactile features. These methods include (a) measuring electric variables (such as resistance, capacitance, impedance, etc.) of the sensing element; (b) analysing the sensing element's deformation using images (camera based), sound (acoustic methods), fluid pressure, etc.; and (c) combining multiple transduction methods. The literature provides detailed reports on how these methods are used to derive tactile information [50,51]. However, to aid readability of the subsequent sections, these transduction methods are briefly outlined below.

2.1.1. Resistive and Piezoresistive

Both of these transduction methods quantify physical interaction by measuring changes in the resistance of the sensing element (see Figure 2a). In the resistive type, the change in contact resistance between two conductive elements is transduced as a measure of the applied load [52]. In the piezoresistive type, piezoresistive materials serve as the sensing element, and their resistance varies with the applied load. The resistive method offers high sensitivity at lower loads and a good dynamic response, but it suffers from drift and hysteresis [52]. Piezoresistive methods offer high spatial resolution and are less susceptible to noise; their downsides include lower repeatability, hysteresis, and high power consumption [51].
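As a concrete illustration, the readout chain for a resistive or piezoresistive element is often just a voltage divider plus a calibration curve. The sketch below assumes hypothetical divider values and a linear calibration; real sensors require an empirically fitted curve plus drift and hysteresis compensation, as noted above.

```python
# Sketch: estimating force from a piezoresistive element read through a
# voltage divider. All constants (supply voltage, fixed resistor, rest
# resistance r0, gain k) are illustrative assumptions, not taken from any
# specific sensor in this review.

V_SUPPLY = 3.3      # divider supply voltage (V)
R_FIXED = 10_000.0  # fixed divider resistor (ohms)

def adc_to_resistance(v_out: float) -> float:
    """Sensor resistance inferred from the divider output voltage."""
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def resistance_to_force(r_sensor: float, r0: float = 30_000.0,
                        k: float = 5e-5) -> float:
    """Assumed linear calibration: force grows as resistance drops below
    the unloaded value r0. Clamped at zero for the unloaded case."""
    return max(0.0, k * (r0 - r_sensor))
```

Under these assumed constants, a mid-scale reading of 1.65 V corresponds to a sensor resistance equal to the fixed resistor (10 kΩ), which the linear calibration maps to 1 N.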

2.1.2. Capacitive

Capacitive transduction methods measure changes in the capacitance of the sensing element as a measure of the applied load. The sensing element consists of two conductor plates with a dielectric material between them (see Figure 2b). The capacitive method provides high sensitivity and good spatial resolution with an extensive dynamic range, and its performance is not affected by temperature variations. However, due to noise susceptibility and fringing capacitance, capacitive methods require complex filtering circuitry during sensor construction [47,51,52].
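The capacitive principle can be summarised with the standard parallel-plate model (an idealisation that ignores the fringing effects noted above). For plates of area A separated by a dielectric of thickness d:

```latex
C = \frac{\varepsilon_0 \varepsilon_r A}{d},
\qquad
\frac{\Delta C}{C_0} \approx \frac{\Delta d}{d_0 - \Delta d},
```

so a normal load that compresses the dielectric by Δd from its rest thickness d₀ produces a measurable fractional increase in capacitance.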

2.1.3. Magnetic and Hall Effect

The magnetic transduction method utilises the changes in magnetic flux or field intensity that occur when the sensing element is subjected to an external force [47]. These changes in magnetic properties are measured by a Hall-effect pickup, which generates a voltage corresponding to the applied load (see Figure 2c). The magnetic method offers good linearity, low hysteresis, and good repeatability in its measurements [53].
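For reference, the output of an ideal thin Hall element follows the standard Hall relation, with I the bias current, B the magnetic flux density normal to the element, n the charge-carrier density, q the carrier charge, and t the element thickness:

```latex
V_H = \frac{I\, B}{n\, q\, t}
```

As the magnet embedded in the deformable sensing element moves under load, B at the pickup changes, and the Hall voltage V_H changes with it.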

2.1.4. Piezoelectric

This transduction method uses piezoelectric material as the sensing element (see Figure 2d). Piezoelectric materials develop a voltage in response to an applied mechanical load, and this voltage is used as a measure of the load. Piezoelectric sensing offers a high-frequency response, high sensitivity, and a high dynamic sensing range. However, it has low spatial resolution and cannot measure static loads due to the internal resistance of the sensing element [51].
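To first order, the direct piezoelectric effect for a normal load F and the charge decay that limits static measurement can be written as (with d₃₃ the piezoelectric charge coefficient, C the element capacitance, and R the internal/leakage resistance):

```latex
Q = d_{33}\, F,
\qquad
V(t) = \frac{d_{33}\, F}{C}\; e^{-t/RC},
```

so a step load produces a voltage that decays with time constant RC rather than a sustained reading, which is why only dynamic loads yield a persistent signal.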

2.1.5. Electrical Impedance Tomography

Electrical impedance tomography (EIT) is an imaging technique based on the variation in the electrical impedance distribution on the surface of a deformable object when subjected to an external load [54,55]. Such sensors have two sets of electrodes arranged around a conductive sensing area of the sensor. One set injects an electric current, while the other measures the potential distributions, which vary as a force acts on the sensing area (see Figure 2e).
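The electrode bookkeeping behind one EIT frame can be sketched as follows. This assumes the common adjacent ("neighbouring") drive pattern and a hypothetical 16-electrode ring; individual sensors vary in electrode count and pattern.

```python
# Sketch of the adjacent drive pattern used by many EIT tactile sensors:
# current is injected across each adjacent electrode pair while voltages
# are read from every other adjacent pair not involved in the injection.
# The electrode count is an assumption for illustration.

def adjacent_pattern(n_electrodes: int = 16):
    """Return (drive_pair, measure_pair) tuples for one EIT frame."""
    pattern = []
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        for m in range(n_electrodes):
            meas = (m, (m + 1) % n_electrodes)
            # skip pairs that share an electrode with the drive pair
            if set(meas) & set(drive):
                continue
            pattern.append((drive, meas))
    return pattern
```

For a 16-electrode ring this yields 16 × 13 = 208 voltage measurements per frame, which an image-reconstruction algorithm then inverts to recover the impedance (and hence contact) distribution over the sensing area.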

2.1.6. Camera/Vision Based

In the camera-based method, images of the deforming sensing surface captured by a camera are used to extract tactile features (see Figure 2f). Usually, the deforming (soft) sensing surface has markers or pins arranged on its inner surface, whose displacements are recorded by the camera [32,56,57,58,59,60,61]. In other sensors, the camera instead captures the imprints of the external object on the sensing skin [62,63,64,65,66,67,68]. Compared to the other transduction methods discussed above, the camera-based method requires a larger form factor, as the sensor must house a camera with its illumination provisions and place the camera away from the sensing surface to obtain a sufficient view of the surface. To reduce the overall form factor, attempts have been made to use multiple cameras so that the gap between the sensing surface and the cameras can be reduced without compromising the view of the surface [69].
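The marker-tracking principle reduces to a few steps: detect marker positions at rest and under load, match them, and summarise the displacement field into a tactile feature. The sketch below uses synthetic marker coordinates and a crude mean-shear summary; a real sensor would first extract marker centroids from the camera image with blob detection.

```python
# Minimal sketch of the marker-displacement idea behind camera-based
# tactile sensors: markers on the inner skin are matched to their rest
# positions, and the displacement field is summarised into a tactile
# feature (here, a mean shear vector). Marker coordinates are synthetic.

import numpy as np

def displacement_field(rest: np.ndarray, deformed: np.ndarray) -> np.ndarray:
    """Match each deformed marker to its nearest rest marker and return
    per-marker displacement vectors (N x 2)."""
    # pairwise distances between deformed and rest markers
    d = np.linalg.norm(deformed[:, None, :] - rest[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    return deformed - rest[nearest]

def mean_shear(rest: np.ndarray, deformed: np.ndarray) -> np.ndarray:
    """Average in-plane displacement, a crude proxy for shear load."""
    return displacement_field(rest, deformed).mean(axis=0)
```

Richer features (normal force from marker spread, torque from rotation of the field) follow from the same displacement field with different summaries.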

2.1.7. Optic Fiber Based

Optic fibres have also been utilised in constructing tactile sensors. Essentially, such a tactile sensor has a light source (usually LEDs), optic fibre(s), a sensing surface, and light output detectors (e.g., a camera or CCD) as primary components. The optic fibre carries light from the source to the sensing surface and returns the output light to the detector. Light property variations caused by external interaction with the sensing surface are generally quantified via light intensity modulation, fibre Bragg gratings (FBGs), or interferometry to produce the tactile measurement [47]. This method offers high spatial resolution, sensitivity, and repeatability in its measurements, and sensor performance is not affected by electromagnetic interference [70].
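For the FBG variant specifically, the reflected Bragg wavelength and its first-order strain sensitivity are given by the standard relations (λ_B the Bragg wavelength, n_eff the effective refractive index of the fibre core, Λ the grating period, p_e the effective photo-elastic coefficient, ε the axial strain):

```latex
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda,
\qquad
\frac{\Delta \lambda_B}{\lambda_B} \approx (1 - p_e)\,\varepsilon,
```

so contact-induced strain on the fibre shifts the reflected wavelength, which the detector reads out as the tactile signal.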

2.1.8. Acoustics

In the acoustic method, variations in acoustic wave propagation through a deformable sensing body are mapped to tactile readings [71,72,73,74,75] (see Figure 2g). Wave propagation alters when the sensing body deforms under an external load. This method primarily uses ultrasonic waves for sensor operation.

2.1.9. Fluid Based

An external stimulus can alter the pressure of a fluid enclosed in a flexible housing; the measured pressure variation can therefore be related to the external stimulus (see Figure 2h). Such a pneumatic transduction method can generate readings with good linearity, repeatability, and low hysteresis [76].

2.1.10. Triboelectric

This transduction method employs triboelectric nanogenerators (TENGs) to convert mechanical load into a corresponding electric signal without the need for external operational power [77]. Sensors based on this method are therefore gaining importance, and their self-powering nature makes them ideal candidates for artificial prosthetics.

2.1.11. Combination of Various Methods

Certain attempts have been reported that combine various transduction methods, including piezoresistive and Hall effect [78], triboelectric–piezoelectric–pyroelectric [79], and EIT and acoustics [80]. Such combinations offer the advantage of measuring additional tactile features and overcoming the shortcomings of any single transduction method used alone. For example, combining the triboelectric, piezoelectric, and pyroelectric methods enables a tactile sensor to measure contact pressure and contact temperature while operating without external power [79]. Likewise, combining acoustics with EIT enables a sensor to measure contact vibrations, which is impossible with EIT alone [80].
The type of transduction method used in a sensor can define the kind of tactile features it can generate. So, in the next section, various tactile features that these transduction methods can extract are discussed.
Figure 2. Examples of transduction methods: (a) Piezoresistive [81]. (b) Capacitive [82]. (c) Magnetic [53]. (d) Piezoelectric [83]. (e) Electrical impedance tomography (EIT) [55]. (f) Camera-based [84]. (g) Acoustics [71]. (h) Fluid based [76].

2.2. Tactile Features

Tactile sensing plays a pivotal role in determining the success of numerous robotic manipulation tasks that involve direct physical interactions with the environment [51,85,86]. Tactile sensors help derive essential information for robot control from these interactions using the transduction methods outlined above. In general, this tactile information is collectively known as “tactile features”. These include contact force components, contact location, contact deformation and its derived features (such as surface characteristics, the shape of the contacting object, and the pose/orientation of contact), and other features such as temperature. The details of these features are presented below:

2.2.1. Contact Force

Force-sensing tactile sensors can quantify various forms of interaction force, such as pressure [55,79,80,81,87,88], normal force [53,71,77,78,89,90,91], shear force [53,78,87,90], tangential (frictional) force [89], angular force (torque) [53,78,87], and vibrations (oscillating forces) [76,80]. Measuring contact force is vital for manipulation tasks like grasp force control and slip detection [92].
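As a toy example of why force sensing matters for slip detection, incipient slip typically appears as high-frequency oscillation in the tangential force channel, so even a rolling-variance threshold can serve as a crude trigger for increasing grip force. The window length and threshold below are illustrative assumptions, not values from the cited work.

```python
# Illustrative slip-detection heuristic of the kind enabled by
# force-sensing tactile arrays: incipient slip shows up as high-frequency
# oscillation in the shear/tangential force signal, so a rolling variance
# over a short window can flag it. Thresholds are invented for the sketch.

import numpy as np

def slip_detected(shear: np.ndarray, window: int = 8,
                  var_threshold: float = 0.01) -> bool:
    """True if the variance of the most recent `window` shear samples
    exceeds the threshold."""
    if len(shear) < window:
        return False
    return float(np.var(shear[-window:])) > var_threshold
```

In a grasp controller, a True result would typically trigger a small increment of the commanded grip force rather than a jump to maximum force, preserving delicate items.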

2.2.2. Contact Location

This is the spatial description of external contact on the sensory surface, represented as coordinate values or an area relative to the sensor surface. Sensors using transduction methods such as EIT [55], magnetic [90], acoustic [91], camera-based [58,63,64], and several combinations (EIT–acoustics [80], piezoresistive–Hall effect [78], triboelectric–piezoelectric–pyroelectric [79]) are reported to generate contact location information.

2.2.3. Contact Deformation

Whenever an external object comes into contact with the deformable sensing surface, the surface takes the shape of the contacting object. This deformation can be used to derive key features such as contact object shape, surface texture, and contact pose and orientation. These features help execute tasks like object sorting, mechanical assembly, in-hand manipulation, and slip prevention. Since contact deformation is the primary cue for deriving these sub-features, such sensors require a highly deformable sensing surface. Usually, camera-based transduction methods are used to capture high-definition images of the corresponding deformation of the inner sensing surface, from which the sensor can reconstruct the shape of the contacting object. DenseTact and DIGIT are typical examples of camera-based sensors that reconstruct contact shape from sensing surface deformation: DenseTact reconstructs shape with an absolute mean error of 0.28 mm [62], and DIGIT is sensitive enough to capture sub-millimetre structures [68]. Besides camera-based methods, acoustic and EIT-based transduction methods have also been used to extract deformation information for contact shape recognition. The acoustic sensor in Ref. [71] can differentiate three contact shapes (triangular, square, and circular) at contact forces ranging from 0.2 N to 1 N, and a biomimetic elastomeric skin uses the EIT method to recognise the shape of contacting objects [80]. Camera-based sensors can also recognise the surface texture (e.g., fingerprint patterns and roughness) of contacting objects from deformation images: Ref. [56] classified textures with a maximum accuracy of 83%, and the NeuroTac sensor classifies surface textures (grid sizes of 2.5–5 mm) with 92.8% accuracy [60].
Tactile sensors like Soft-bubble [67] and OmniTact [66] use sensing surface deformation images to estimate the contact pose and sensor orientation, where contact pose refers to the pose of the contacting object relative to the sensor, and vice versa for sensor orientation. The Soft-bubble sensor [67] can estimate the contact pose in 0.5 s, and the OmniTact sensor [66] can measure sensor orientation with a maximum median absolute error of 1.986°.

2.2.4. Other Features

Apart from these features, tactile sensors can quantify the temperature of the contacting object by integrating pyroelectric sensing elements into the sensor. One such sensor measures temperature with a sensitivity of 0.11 V/°C [79]. Measuring the temperature inside actuators (e.g., pneumatic fingers) is also sometimes of interest; for this, acoustic methods have been employed, sensing temperature with a mean accuracy of 4.5 °C [93].
Table 1 maps the tactile features generated by the various transduction methods.

2.3. Advancements in Tactile Sensing

Recently, tactile-sensing technology has received updates that improve its practicality in various robot manipulation tasks. Some relevant advancements are summarised here.

2.3.1. Low-Cost Tactile-Sensing Techniques

In most robotic applications, either off-the-shelf or custom-made tactile sensors are attached to the robot when tactile sensing is needed, either to the robot body or to end effectors, depending on the need. In some instances, attaching tactile sensors can affect operation; for example, attaching a tactile sensor to a pneumatic finger can alter the finger's flexibility. Research has therefore been conducted into sensorising end effectors without retrofitting tactile sensors directly, and the acoustic method has recently been tried in this regard. Zoller et al. [94,95] sensorised a pneumatic finger by implanting a speaker and a microphone: the speaker continuously emits a reference sound signal picked up by the microphone, and whenever the finger interacts with the external environment during operation, the modulation of this reference signal is altered. The altered modulation is used to characterise tactile features such as contact force, contact location, the material of the touching object, finger inflation, and temperature [93].
Similarly, the acoustic method has been adopted to sensorise flexible material skin by making continuous passages between a speaker and a microphone through the skin [91] (see Figure 3). Any deformation of the skin impacts the cross section of these passages and affects the reference signal modulation; it has been demonstrated that such skin can detect static normal forces and their contact locations. These approaches use minimal hardware and eliminate the need for intricate embedded electric circuits or other complex manufacturing techniques, thereby reducing the overall cost of the sensor.
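The principle behind these acoustic approaches can be illustrated with a toy example: a known carrier tone is emitted through the body, and contact changes the amplitude (and, in real systems, the richer spectral content) of what the microphone receives. The sample rate, carrier frequency, and attenuation factor below are arbitrary assumptions for the sketch.

```python
# Toy illustration of the acoustic transduction principle: a reference
# tone travels through the finger/skin, and contact alters the received
# carrier amplitude. Contact is simulated here as simple attenuation;
# real systems learn richer spectral features. All constants are assumed.

import numpy as np

FS, F0 = 8000, 440  # sample rate (Hz) and carrier frequency (Hz)

def carrier_amplitude(signal: np.ndarray) -> float:
    """Amplitude of the carrier bin from an FFT of the received signal."""
    spec = np.abs(np.fft.rfft(signal)) / (len(signal) / 2)
    bin_hz = FS / len(signal)
    return float(spec[round(F0 / bin_hz)])

t = np.arange(FS) / FS               # one second of signal
reference = np.sin(2 * np.pi * F0 * t)
touched = 0.6 * reference            # contact attenuates the carrier
```

A drop in the measured carrier amplitude relative to the untouched baseline signals contact; mapping the size and spectral shape of the change to force, location, or material is then a learning problem.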

2.3.2. Self-Powered Tactile Sensors

Tactile sensors are also advancing in terms of operational power. The vast majority of reported tactile sensors require external power for their operation, but a small subset can now derive power from external interactions themselves. This is achieved by combining transduction elements capable of generating electric charge, such as triboelectric, piezoelectric, and pyroelectric elements [79]. These self-powered tactile sensors hold promising potential for sustained use in applications such as measuring human-specific tactile features in wearable robotic prosthetics [77,79].

2.3.3. Anti-Microbial Feature of Tactile Sensors

Tactile sensors are finding application in human healthcare, including prostheses and robotic surgery. In such cases, the sensors make direct contact with the human body and hence should exhibit anti-microbial characteristics for safe usage. Tactile sensors have recently been adapted to exhibit anti-bacterial features [79,96,97,98,99,100], made possible by using materials with anti-microbial properties in the sensor construction, such as fabric triboelectric nanogenerators (FTENGs) [100], zinc oxide–polytetrafluoroethylene (ZnO–PTFE) [99], and silver nanoparticles (AgNPs) [97].
Table 1. Mapping of transduction methods and tactile features extracted.
| Tactile Feature | Pr/Re | C | Cr | O | Ma | Pe | EIT | Ac | Tr | F | Com |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Normal Force | [101] | [82,89,102,103,104] | [32,56,57,84] | [70,105,106] | [53,90] | - | - | [71,91] | [77] | [76] | [78] |
| Shear Force | - | [82,87,104] | [84] | - | [53,90] | - | - | - | - | - | [78] |
| Tangential Force | - | [89,103] | - | - | - | - | - | - | - | - | - |
| Torque | - | [82,87] | [107] | - | [53] | - | - | - | - | - | [78] |
| Pressure | [81,88,108,109] | [87] | - | - | - | [110] | [55] | - | [111,112] | - | [79,80] |
| Vibration | - | - | - | - | - | - | - | - | - | [76] | [80] |
| Contact Location | [101] | - | [58,63,64,84] | [70,105] | [90] | - | [55] | [75,91,93] | - | - | [78,80] |
| Deformation/Object Shape/Geometry | [101] | - | [59,62,65,68,107,113] | [17,114] | - | - | - | [71,72,74] | - | [115] | [80] |
| Surface Texture | - | - | [60,63,116] | [117] | - | - | - | - | [77] | [76] | - |
| Pose/Orientation | - | - | [66,67] | - | - | - | - | - | - | - | - |
| Temperature | - | - | - | - | - | - | - | [93] | - | - | [79] |
Pr/Re: Piezoresistive/resistive, C: Capacitive, Cr: Camera based, O: Optic fiber, Ma: Magnetic, Pe: Piezoelectric, EIT: Electrical impedance tomography, Ac: Acoustics, Tr: Triboelectric, F: Fluid based, Com: Combinations.

3. Tactile Sensors in Robotics and Automation

Tactile sensing in agri-food robotics has gained attention as part of the broader field of advanced agricultural technologies. This section provides an overview of the current state of the art in tactile sensing for agri-food robotics, discussing the key principles and recent advancements for different agri-food problems. Key challenges and potential future directions are discussed in Section 5 and Section 6, respectively.
Although “Tactile sensing […] has hardly received attention in the agricultural robotics community” (Kootstra et al. [118]), the integration of tactile sensation has become more popular as robotics research drives towards solving real-world agri-food problems. Research has typically focused on automated harvesting, where tactile sensation can extract features such as ripeness and location in scenarios where visual sensation typically fails. The following subsections are split into specific agri-food application domains.

3.1. Food Item Feature Extraction

Tactile sensation enables the extraction of features that cannot be obtained by visual techniques. Dong et al. [3] showed that objects with similar appearance and shape, e.g., ripe and unripe fruits, “cannot be discriminated accurately with only visual information” and that tactile sensation can be a more effective modality. Tactile sensation is an emerging research area in the domain of food item feature extraction, with a focus on developing robust, non-destructive methods for recognizing the hardness, ripeness, and firmness of various fruits and vegetables.
Zhang et al. [4] focused on recognising the hardness of fruits and vegetables (apple, kiwi, orange, and tomato) using tactile array information from the WTS0406-38 magnetic tactile sensor. They proposed PCA-KNN and PCA-SVM classification models, with the latter showing significantly better performance. In a similar vein, Blanes et al. [5] developed a pneumatic robot gripper, shown in Figure 4a, for sorting eggplants by firmness, which demonstrated high sorting accuracy and adaptability. These two studies share a common goal of enhancing the grasping capabilities of robotic manipulators while ensuring product safety. For a bin-sorting application, Ramirez-Amaro et al. [6] used a heuristic-based system to sort soft fruits by ripeness, using learning from demonstration with the group's bespoke tactile omni-directional mobile manipulator (TOMM; Dean-Leon et al. [7]), whose tactile skin measures torque across the whole body of the manipulator.
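To make the PCA-based pipeline of Zhang et al. [4] concrete, the sketch below implements a minimal PCA-KNN classifier in NumPy on synthetic tactile-array frames (two mock "hardness" classes). The array size, component count, and data are assumptions for illustration; the original work used real WTS0406-38 readings and also compared a PCA-SVM variant.

```python
# Minimal NumPy sketch of the PCA-KNN idea: flattened tactile-array
# frames are projected onto principal components and classified by
# k-nearest neighbour. All data below is synthetic.

import numpy as np

def pca_fit(X: np.ndarray, k: int):
    """Return (mean, top-k principal axes) of the data matrix X."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:k]

def pca_project(X: np.ndarray, mu: np.ndarray, axes: np.ndarray):
    return (X - mu) @ axes.T

def knn_predict(train_z, train_y, test_z, k: int = 3):
    """k-nearest-neighbour labels in PCA space."""
    preds = []
    for z in test_z:
        idx = np.argsort(np.linalg.norm(train_z - z, axis=1))[:k]
        preds.append(np.bincount(train_y[idx]).argmax())
    return np.array(preds)

rng = np.random.default_rng(0)
soft = rng.normal(0.2, 0.05, (40, 24))   # low, diffuse taxel pressures
hard = rng.normal(0.8, 0.05, (40, 24))   # high, concentrated pressures
X = np.vstack([soft, hard])
y = np.array([0] * 40 + [1] * 40)

mu, axes = pca_fit(X, k=5)
Z = pca_project(X, mu, axes)
```

Projecting the 24-taxel frames onto a handful of principal components before nearest-neighbour matching keeps the classifier cheap while discarding taxel-level noise.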
Another approach, proposed by Scimeca et al. [8], involves a custom-made gripper with a capacitive tactile sensor array for non-destructive ripeness estimation in mangoes. This method achieved 88% accuracy in ripeness classification, showing potential as an alternative to traditional penetrometers. Ribeiro et al. [9] also explored non-invasive fruit quality control, but with a soft tactile sensor that detected small forces and analysed surfaces. This approach demonstrated high accuracy for apples (96%) and strawberries (83%), indicating the potential of soft tactile sensors in automated fruit quality control systems. A capacitive sensor was also used by Maharshi et al. [10], in which flexible nano-needle-patterned polydimethylsiloxane (PDMS) serves as the dielectric layer to detect fruit ripeness.
Cortés et al. [11] introduced a novel robotic gripper, shown in Figure 4b, that combines mechanical and optical properties for non-destructive mango ripeness assessment. This innovative approach demonstrated the potential for improving post-harvest processes by assessing mango quality during pick-and-place operations using data fusion from multiple sensors. Similarly, Blanes et al. [12] presented pneumatic grippers with accelerometers attached to their fingers for assessing the firmness of irregular products, like eggplants and mangoes, showing potential for industrial pick-and-place processes.
The majority of research in tactile food item feature extraction focuses on creating custom end effectors tailored to specific tasks. Researchers employ various sensing technologies, including capacitive, pneumatic, and magnetic systems, often combined with accelerometers and visual sensors. Due to the specialized nature of agri-food research, there is a lack of generalized tools or equipment, resulting in each gripper being designed for a particular task. Although the algorithms used in these studies may be considered generalizable, the hardware is not.
However, Ribeiro et al. [9] and Zhang et al. [4] developed systems capable of non-invasive analysis on multiple different fruits, indicating a potential shift towards more versatile solutions. Encouraging the development of more generalized harvesting end effectors could benefit the sector by producing research that is more widely applicable, allowing researchers to integrate these findings into their specific problem-solving efforts.

3.2. Food Item Grasping

The task of grasping and moving food items is the most-studied application domain of tactile sensation in agri-food. Researchers have proposed various methods and designs to tackle the challenges of food item grasping, addressing aspects, such as obstacle interference, damage reduction, and real-time force control.
In recent years, there has been a significant focus on soft robotic grippers for the delicate handling of agricultural products. Cook et al. [18] proposed a 3D-printed tri-gripper with embedded tactile sensors made of thermoplastic polyurethane (TPU), demonstrating a sample application in a fruit pick-and-drop task. Similarly, Liu and Adelson [19] introduced a sensorized soft robotic finger, the GelSight Fin Ray, which passively adapts to the objects it grasps while performing tactile reconstruction and object orientation estimation. This design allows for applications, such as wine glass reorientation and placement in a kitchen task; testing was also shown on soft fruit, highlighting use in agri-food applications as well.
Both Hohimer et al. [20] and Zhou et al. [21] investigated soft robotic actuators with embedded tactile-sensing arrays for agricultural applications. Hohimer et al. [20] explored multi-material fused filament fabrication to print flexible TPU-MWCNT composites with built-in tactile-sensing capabilities for apple grasping, while Zhou et al. [21] presented a tactile-enabled robotic grasping method that combines deep learning, tactile sensing, and soft robotics to handle obstacle interference in crop harvesting environments. The researchers designed a fin-ray gripper with an embedded tactile sensor and multiple degrees of freedom, shown in Figure 5a,b, which can adjust its state based on tactile feedback. A robust perception algorithm and deep learning network were developed to classify grasping status using stress distribution data from the gripper’s fingers, and the method was applied to apple harvesting.
Zhou et al. [22] addressed the challenges of robotic harvesting in unstructured horticultural environments containing obstacles such as branches, twigs, trellis wires, and sprinkler lines, where robotic grasping can lead to high fruit damage rates (6.3% to 30%). The proposed method, which builds on Zhou et al. [21], integrates the fin-ray fingers with embedded tactile-sensing arrays and customised perception algorithms to enhance the robot’s ability to detect and handle branch interference during harvesting, thereby reducing potential mechanical fruit damage, as shown in Figure 5a,b. Experimental validations demonstrate an overall 83.3–87.0% grasping status detection success rate and a promising interference handling method.
A grasp adaptation controller can adapt its grasp pose to maintain hold of an object; this is especially tricky when dealing with delicate objects, as the typical policy of a large grip force is not acceptable. Yamaguchi and Atkeson [23] used a vision-based tactile sensor that extracts object information, such as distance, location, pose, size, shape, and texture. These features were integrated into a grasp adaptation controller to pick up 30 different and delicate food items, including vegetables, fruit, eggs and mushrooms.
Tactile sensing has also been combined with other sensing modalities to improve robotic fruit picking (Dischinger et al. [24]). While previous research had focused on visual feedback for closed-loop end-effector placement, this study incorporated tactile, visual, and force feedback for efficient and damage-free fruit removal from trees. Dischinger et al. [24] presented the design of a custom end effector with multiple in-hand sensors, including tactile sensors on the fingertips. The end effector was tested on a Honeycrisp apple tree in outdoor picking trials, demonstrating the ability to detect fruit slip, separate fruit from the tree, and release fruit from the hand using multi-modal sensing.
Slip detection for fruit grasping and manipulation was also explored in Zhou et al. [25]. Using the tactile sensor presented in Zhou et al. [21], the system uses an LSTM (long short-term memory unit)-based recurrent neural network to process tactile sensation into a slip detection signal for each of the four fingers, and a closed-loop controller to maintain a delicate grasp of apples under leaf interference.
Tian et al. [26] developed a sensitive slipping sensor with a piezoresistor to control the gripping force of agricultural robots handling fruits and vegetables. By using an adaptive neuro-fuzzy inference system, the researchers were also able to effectively extract the slipping signal and control the gripping force when grasping tomatoes and apples. By analysing the grasp force and slip signals, the system is able to adjust the grip force to reduce bruising from robot manipulation.
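The closed-loop principle behind such slip-driven grip controllers can be sketched in a few lines; this is a deliberately simplified proportional stand-in for the adaptive neuro-fuzzy system of Tian et al. [26], and the threshold, gain, and force cap below are illustrative values, not taken from the cited work:

```python
def adjust_grip_force(current_force, slip_signal, *,
                      slip_threshold=0.1, gain=0.5, max_force=10.0):
    """Increase grip force in proportion to the detected slip; hold otherwise.

    slip_signal: magnitude of the extracted slip indicator (arbitrary units).
    Returns the new grip force command, capped to avoid bruising the fruit.
    """
    if slip_signal > slip_threshold:
        current_force = min(current_force + gain * slip_signal, max_force)
    return current_force

# Simulated handling episode: slip grows, the controller ramps the grip
# force up just enough, then holds once the slip signal subsides.
force = 2.0
for slip in [0.0, 0.05, 0.3, 0.6, 0.2, 0.0]:
    force = adjust_grip_force(force, slip)
```

The force cap plays the role of the bruise-avoidance constraint: the controller tightens the grasp only as much as the slip evidence demands.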
Misimi et al. [27] presented a robust learning policy based on learning from demonstration (LfD) for the robotic grasping of compliant food objects. The approach combines visual (RGB-D) images and tactile data to estimate the gripper pose, finger configuration, and forces necessary for effective robotic handling. The proposed LfD learning policy automatically discards inconsistent demonstrations from human teachers and estimates the intended policy. This method is validated for fragile and compliant food objects (tested on lettuce) with complex 3D shapes.
Handling delicate food items like crisps without causing damage was explored in Ishikawa et al. [119]. The system attempted to anticipate fractures in food items during robotic manipulation. The LSTM-based learning system uses a two-fingered end effector equipped with tactile sensing that determines the physical properties of a given food item. A predictive control algorithm is then applied to maximise grip force without fracturing or damaging the food item.
In conclusion, the inherent benefits of tactile sensation can be exploited in grasping delicate food items. Soft robotic grippers with embedded tactile sensors have emerged as a promising approach to delicately handle agricultural products, while the integration of tactile sensing with other modalities has further enhanced efficiency and damage reduction. Research addressing obstacle interference, damage reduction, and real-time force control has enabled the development of intelligent robotic systems that can adapt to various agri-food applications.

3.3. Food Item Identification

Fruit identification is a key challenge for some food items during harvesting and crop monitoring. Knowing whether the item grasped is the intended target is essential for efficient, timely, and safe robotic grasping. Vision-based approaches are often not appropriate due to high occlusion levels during harvesting, both from the surrounding environment, such as foliage and other crops, and from the robot’s end effector itself. Tactile sensation can operate effectively even under heavy occlusion, both as an independent sensing modality and in collaboration with visual sensation.
Fruit identification in agricultural robotics can be achieved using adaptive robotic grippers, tactile sensing, and machine learning algorithms. Zhang et al. [13] developed a bespoke adaptive gripper, shown in Figure 6a,b, with force and bending sensors to measure contact force distribution and finger deformation during grasping. A random forest classifier demonstrated the highest accuracy (98%) in identifying five fruit types. The proposed method can provide a reference for controlling grasping force and planning robotic motion during the plucking, picking, and harvesting of fruits and vegetables.
Drimus et al. [120] developed an 8 × 8 tactile sensor for grasping and classifying a variety of soft and deformable food items using a dynamic time warping and a nearest neighbourhood classifier.
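A minimal sketch of the dynamic-time-warping plus nearest-neighbour pipeline of Drimus et al. [120] might look as follows; for brevity it operates on 1-D pressure traces rather than full 8 × 8 taxel arrays, and the training sequences are invented toy data:

```python
def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping on 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def nearest_neighbour(query, labelled_sequences):
    """Return the label of the DTW-closest training sequence."""
    return min(labelled_sequences,
               key=lambda item: dtw_distance(query, item[0]))[1]

# Toy squeeze profiles: a soft item deforms gradually under pressure,
# a firm item resists with a sharp pressure ramp.
train = [([0.0, 0.2, 0.4, 0.5, 0.5], "soft"),
         ([0.0, 0.6, 1.0, 1.0, 1.0], "firm")]
print(nearest_neighbour([0.0, 0.1, 0.3, 0.5, 0.6], train))  # → soft
```

DTW tolerates the variable grasp speeds typical of deformable-object squeezing, which is why it pairs well with a simple nearest-neighbour rule here.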
A hybrid tactile sensor combining a triboelectric active sensing unit with an electromagnetic inductance transducer, developed in Li et al. [14], identified fruits from eight categories with an accuracy of 98.75%. Further, using data from 200 object-gripping trials, they trained a convolutional neural network to process the tactile data. The system could then identify four kinds of fruit wrapped in paper bags, plastic bags, or foam, or left unwrapped, with a recognition accuracy of 95.93%.
Riffo et al. [15] took a similar approach, using a pressure sensor and machine learning to categorise soft fruits. Li and Zhu [16] identified objects with a new multi-sensory robot hand. Integrating the tactile features of (i) contact pressure, (ii) local ambient temperature, (iii) thermal conductivity, and (iv) object temperature, together with a neural network classification system, enabled them to distinguish between soft fruits and other items.
Using a VGG neural network on post-processed tactile signals from grasping, Lyu et al. [17] were able to classify soft fruit and vegetables using fibre Bragg grating (FBG) tactile sensors attached to the inside fingers of a three-fingered fin-ray end effector.

3.4. Selective Harvesting Motion Planning and Control

Reaching a target fruit for selective robotic harvesting can be a complex task, often requiring more than simple obstacle avoidance. Selective harvesting in greenhouses or orchards often requires robots to cope with dense clutter, non-repetitive tasks, and diverse environmental conditions. Traditional industrial robot systems and vision-based sensors are not well suited for these requirements, as they treat obstacles as rigid bodies and cannot differentiate between soft and hard objects. The ability to manipulate and interact with surrounding obstacles can enable selective harvesting robotics to access more food items when harvesting.
Schuetz et al. [28] proposed a simple and efficient tactile sensor module for a 9-DOF (degree of freedom) multipurpose agricultural manipulator, together with two approaches to reactive inverse kinematics planning. By incorporating tactile sensing, the manipulator could perform fine manipulation tasks, explore unknown regions, and respond to the impact of its actions on the surrounding environment’s structure. The work presents initial real-world experiments evaluating the performance of the tactile sensor module and the reactive inverse kinematics planning algorithms in suitable agricultural scenarios.
In more recent research, the ability to push aside occluding unripe strawberries was explored in Nazari et al. [29] (Figure 7). Occlusion from a robot’s end effector makes physical interaction tasks, like pushing, a significant challenge. Tactile sensation can be used to improve pushing performance and a robot’s physical interaction perception (Mandil and Ghalamzan-E [121]). Nazari et al. [29] showed that pushing aside occluding strawberries can be performed with tactile sensation alone, enabling harvesting in these complex scenarios.

3.5. Food Preparation and Kitchen Robotics

The use of tactile sensation for pouring food items from deformable containers and packaging was introduced in Tsuchiya et al. [30]. Using a dual-arm system, coffee beans, rice, flour, and breakfast cereal are all poured using tactile sensation (OMD20-SE-40N 3D force sensors, On-Robot Ltd., integrated into three fingers of the end effector), which is used to minimise the grasping force, and hence the deformation of the containers, thereby reducing pour uncertainty.
During the cooking of beef, Wang et al. [31] used pressure-reading tactile sensors to measure tenderness. By probing the beef during cooking, the system agreed 95% of the time with the established procedure, which is high-cost and time-consuming.
Cutting food items in a household robotics setting was explored in Yamaguchi and Atkeson [32]. By combining visual and optical-based tactile sensations, the control system uses force sensations inferred through the knife to avoid slipping and to feel when the knife has cut through the item. Zhang et al. [33] further explored tactile sensation use in food item cutting. The proposed method measures vibration and uses force–torque sensing to control and adapt the cutting motion and to monitor for contact events. The control method was capable of cutting through a variety of different food items, and further, the tactile system could be used to classify food items and make ripeness estimations.
Tactile sensation has been used to identify foreign objects in food items (Shimonomura et al. [34]). By rolling a tactile sensor over the item, hardness measures were mapped to the image space, where foreign objects could be identified. Recent advancements towards a generalised dataset and data collection method for kitchen robotics were explored and developed by DelPreto et al. [35]. The method uses human demonstrations with a variety of sensing features for the exploitation of multi-modal kitchen robotics, including tactile sensation from the human demonstrations. This dataset will help integrate tactile sensation into kitchen robotics.

3.6. Summary

In conclusion, tactile sensing in agri-food robotics has emerged as a valuable area of research, with applications in food item feature extraction, grasping, identification, selective harvesting motion planning and control, and food preparation. As the integration of tactile sensing becomes more popular, researchers are focusing on developing robust, non-destructive, and adaptable methods for addressing a wide range of agri-food challenges. In Section 5, we will present the major shortcomings and challenges of using tactile sensors in real-world applications, based on the reviewed research works in Section 2, Section 3 and Section 4.

4. Applications of Tactile Sensors in Agri-Food

In this section, we will explore the diverse range of general applications of tactile-sensing technologies, focusing on their essential role in enhancing the capabilities of robotic systems and various other contexts. Tactile sensing has shown promise in numerous applications, including slip control, texture recognition, robot pushing, 3D shape reconstruction, etc. Understanding these general applications is vital, as they provide the foundation upon which more specific agri-food applications can be built. We will present an overview of each application, highlighting the importance of tactile sensing in facilitating better control, interaction, and overall performance in diverse scenarios.

4.1. Force Control

Force control is a crucial aspect of robotic systems, as it enables robots to modulate the applied forces during various tasks, such as grasping, manipulation, and assembly. Tactile sensing plays a significant role in force control, as some tactile sensors can provide real-time information about the contact forces and pressure distribution between the robot’s end effector and the interacting object [45]. This information allows robots to maintain stable contact with objects while handling them.
Tactile sensors, such as capacitive, piezoresistive, or optical sensors, are commonly employed in force control applications [86]. They can be integrated into the end effector or gripper, providing valuable feedback about the forces exerted on the object during manipulation [122]. This feedback can then be used to adjust the applied forces dynamically, ensuring safe and efficient object handling. Further, the integration of tactile sensors into multi-fingered robotic hands enables more dexterous force control of objects [45,123,124].
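As an illustration of how such feedback closes the loop, a proportional grip-force regulator driven by a tactile force estimate can be sketched as follows; the taxel area, gain, and toy plant model are invented for the example and are not from any cited system:

```python
def total_normal_force(taxel_pressures, taxel_area=1e-4):
    """Sum a taxel pressure array (Pa) into a contact force estimate (N):
    F = sum(p_i) * A, with A the sensing area of one taxel (m^2)."""
    return sum(taxel_pressures) * taxel_area

def force_control_step(command, measured_force, target_force, kp=0.2):
    """One proportional update of the gripper closure command."""
    return command + kp * (target_force - measured_force)

# Toy plant: achieved contact force is 0.9 N per unit of closure command.
# In a real system `measured` would come from total_normal_force() applied
# to the live taxel readings instead of this linear stand-in.
command, target = 0.0, 4.0
for _ in range(60):
    measured = 0.9 * command
    command = force_control_step(command, measured, target)
```

After a few dozen iterations the measured force settles at the target, which is the behaviour the cited force-control schemes achieve with richer sensing and dynamics.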

4.2. Robotic Grasping

Grasping is an essential component of robotic manipulation, as it involves determining the optimal way to grasp and manipulate objects using a robotic hand or end effector. Tactile sensing plays a crucial role in grasping, as it provides valuable information about the object’s properties, such as shape, size, and surface texture, enabling more effective and stable grasping strategies [125,126].
Tactile sensors can be employed in various stages of robotic grasping. During the pre-grasp phase, contact analysis and force optimisation can be applied to increase the likelihood of a successful grasp [127]. Once contact has been made, these sensors can provide real-time feedback about the contact points and force distribution, allowing the robotic system to dynamically adjust its grasping strategy based on the acquired information [128,129]. Re-grasping, the ability to adapt the grasp pose on the object after the initial grasp, is also an essential aspect aided by tactile sensation. By re-grasping based on learned object dynamics acquired during previous grasp attempts, systems can adjust grasp plans to produce successful tactile grasps [130].
Recent advancements in machine learning techniques, such as deep learning and reinforcement learning, have been utilised in grasp planning to improve grasp quality and stability [131,132]. By incorporating tactile-sensing data, these algorithms can learn more robust and adaptive grasping strategies that can take into account complex geometries and unknown properties like softness and provide more generalised grasp approaches [123]. Multi-modal approaches, like visuo-tactile (a vision and tactile multi-modal approach) methods, have enabled precision grasp planning [133] in the multi-fingered robotic hand, showing the benefits of integrating tactile sensation with visual information for more complex grasp planning.

4.3. Slip Detection

Slip detection is a critical aspect of robotic manipulation, as it enables robots to identify and respond to object slippage during grasping and handling tasks. Tactile sensing plays a significant role in slip detection, as it provides valuable information about contact forces, pressure distribution, and object surface properties, which are essential for detecting the onset of slippage and maintaining stable grasps [134,135].
Various types of tactile sensors, such as capacitive, piezoresistive, and optical sensors, have been employed for slip detection in robotic systems [135,136,137,138]. These sensors can detect changes in contact forces or pressure distribution patterns, which may indicate the occurrence of slippage; by monitoring these changes, robots can dynamically adjust their grasping force and strategy to prevent further slippage or to re-establish a stable grasp. Classical methods, like support vector machines and random forests, are still widely used [139,140]. Specifying a threshold on the rate of shear forces is a common approach to slip classification [138,141]. Other techniques rely on analysing micro-vibrations in the incipient slip phase, for instance via spectral analysis [142,143]. Multi-modal approaches have also been proposed that use proximity [144] or visual sensing data [145] to improve slip detection.
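The shear-force-rate threshold idea mentioned above can be sketched in a few lines; the 5 N/s threshold and 100 Hz sample rate below are illustrative assumptions, not values from the cited works:

```python
def detect_slip(shear_forces, dt, rate_threshold=5.0):
    """Flag the intervals where the shear-force rate exceeds a threshold.

    shear_forces: tangential force samples in newtons; dt: sample period (s).
    Returns one boolean per interval between consecutive samples.
    """
    flags = []
    for prev, curr in zip(shear_forces, shear_forces[1:]):
        rate = abs(curr - prev) / dt
        flags.append(rate > rate_threshold)
    return flags

# 100 Hz samples: the sudden tangential jump from 0.31 N to 0.55 N is a
# 24 N/s rate and is flagged as incipient slip; the slow drifts are not.
print(detect_slip([0.30, 0.31, 0.55, 0.56], dt=0.01))  # → [False, True, False]
```

In practice the flag would be low-pass filtered or debounced before driving the grip controller, so that a single noisy sample does not trigger a force increase.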
Machine learning techniques have been increasingly applied to slip detection, leveraging the rich information provided by tactile sensors [146]. Furthermore, deep learning approaches have shown promise in enhancing slip detection capabilities, particularly when handling objects with complex or unknown properties. Recurrent neural networks, graph neural networks and LSTM (long short-term memory units)-based recurrent neural networks have shown success in this field [123,146,147].
More recent works have attempted to improve robustness, safety, and trajectory optimisation through slip prediction [148]. Tactile sensations are predicted into the future during object manipulation [149]; this tactile prediction can then be classified into a predictive slip signal, with which the robot can adapt its path optimally to avoid future slip [150].

4.4. Texture Recognition

Tactile sensing can be used to recognise different textures on an object’s surface. This information can be used by a robot to perform tasks such as sorting objects based on their texture.
Texture recognition is an important capability for robotic systems, as it allows robots to identify and distinguish between different surface properties of objects [151]. Tactile sensing plays a significant role in texture recognition, as it provides rich information about the surface features and properties of objects, such as roughness, hardness, and friction [47].
Various types of tactile sensors, including capacitive, piezoresistive, optical sensors and even tactile features, like surface temperature and vibration, have been employed for texture recognition in robotic systems [60,152,153,154]. These sensors can measure the physical interactions between the robot’s end effector and the object’s surface, allowing the robotic system to extract essential features for texture classification [155].
Although classical methods can be used for texture recognition [60,151,156], machine learning techniques, particularly deep learning, have shown significant promise in enhancing texture recognition capabilities using tactile data [157]. Convolutional neural networks (CNNs) have been used to process and analyse tactile images, enabling accurate and efficient classification of different textures [158,159,160]. More recently, spiking neural networks have also been applied to achieve high-frequency, efficient texture classification [161]. As with other tactile sensation problems, visuo-tactile multi-modal approaches can provide extra information for deep learning models to use when performing texture classification [160,162].
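As a toy illustration of the classical, feature-based route (not the CNN approaches cited above), two standard descriptors of a tactile vibration trace can be computed as follows; the traces themselves are invented stand-ins for real sliding-contact data:

```python
import math

def texture_features(vibration):
    """Two classical descriptors of a tactile vibration trace:
    RMS amplitude (a proxy for roughness energy) and zero-crossing rate
    (a proxy for dominant spatial frequency at a constant slide speed)."""
    rms = math.sqrt(sum(v * v for v in vibration) / len(vibration))
    crossings = sum(1 for a, b in zip(vibration, vibration[1:]) if a * b < 0)
    return rms, crossings / (len(vibration) - 1)

# Hypothetical traces: a coarse texture excites large, slow oscillations,
# while a fine texture excites small, fast ones.
coarse = [math.sin(0.2 * i) for i in range(200)]
fine = [0.1 * math.sin(2.0 * i) for i in range(200)]
rms_c, zcr_c = texture_features(coarse)
rms_f, zcr_f = texture_features(fine)
```

A simple classifier in this style would threshold or cluster such feature pairs; the deep learning methods cited above learn richer features directly from tactile images.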

4.5. Compliance Control

Compliance control is an essential aspect of robotic manipulation, as it allows robots to adapt to the physical properties and geometries of objects, ensuring stable and secure grasping, and minimising damage. Tactile sensing plays a critical role in compliance control, as it provides information about the changing contact forces, pressure distribution, and object surface properties, enabling robots to adjust their grasping and manipulation strategies accordingly [163,164]. Although visual information can be applied to this task, occlusion from the robot’s end effector often makes it impractical to rely on visual information for compliance control.
Various control strategies have been developed to achieve compliance in robotic systems, such as impedance control, force control, and hybrid force/position control [165,166]. These strategies rely on tactile-sensing data to modulate the robot’s stiffness or damping properties, enabling it to adapt to the object’s physical characteristics and maintain stable contact during manipulation [167].
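The impedance-control strategy, for instance, can be written as a virtual spring-damper law; the sketch below uses illustrative stiffness and damping values to show how tactile evidence of a delicate contact might be turned into a softer response:

```python
def impedance_force(x_des, x, v_des, v, stiffness=100.0, damping=10.0):
    """Virtual spring-damper (impedance) law: f = K(x_d - x) + D(v_d - v).

    Positions in metres, velocities in m/s; K in N/m, D in N·s/m.
    Lowering `stiffness` makes the end effector behave more compliantly.
    """
    return stiffness * (x_des - x) + damping * (v_des - v)

# Stiff contact: a 1 cm position error commands 1 N of restoring force.
stiff = impedance_force(0.01, 0.0, 0.0, 0.0)
# If tactile sensing flags a soft object, the controller can halve the
# virtual stiffness so the same error commands only half the force.
compliant = impedance_force(0.01, 0.0, 0.0, 0.0, stiffness=50.0)
```

Force control and hybrid force/position control follow the same pattern of modulating gains from sensed contact state, with the regulated variable changed accordingly.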

4.6. Object Recognition

Tactile sensors have played a crucial role in advancing object recognition capabilities, which refer to the identification and classification of objects based on their physical properties [45]. This is particularly important in unstructured environments, where visual information may be insufficient or unreliable. Even with unoccluded visual information, however, tactile features like local surface texture [156] have been shown to improve performance in object recognition tasks [86].
Exploiting multi-modal tactile sensors to provide more features improves tactile object recognition [168]. Spectral analysis from sliding a tactile sensor over the surface of the object can also provide the required features for object recognition [169]. Using image-processing techniques applied to contact pattern recognition can also be used to recognise objects [170]. These tactile images can also be applied to the classification of deformable objects [120]. Combining visual and tactile information together also leads to better object recognition [43,160].
Similar to other techniques, tactile-based object recognition has been improved more recently through the use of multi-fingered robotic hands, capable of providing more tactile information about an object than a standard pincer-based robotic end effector [171]. Further, the application of deep learning methods provides more generalised and robust object recognition performance, using simple linear layers [171] and LSTM-based recurrent neural networks [172].

4.7. Three-Dimensional Shape Reconstruction

Three-dimensional shape reconstruction enables robotic systems to create accurate representations of objects’ geometries, which can be used to facilitate data collection in complex environments, in-hand object localisation and classification [62]. Tactile sensing contributes significantly to 3D shape reconstruction by providing fine-grained information about the local geometry of the object [173]. Tactile features can be combined with visual features to generate more accurate 3D shape reconstructions [44].

4.8. Haptic Feedback

Tactile sensing can be used to provide haptic feedback to a robot operator, allowing them to feel what the robot is touching or grasping. This capability enhances the operator’s ability to control the robot’s actions, especially for tasks that require fine manipulation, force control, or exploration in unstructured environments [174,175].
Haptic feedback can be achieved by mapping the tactile-sensing data and proprioceptive data (like joint force feedback) from the robot’s end effector to a haptic device, such as a force–feedback joystick or a wearable haptic glove. This enables the operator to perceive contact forces, pressure distribution, surface properties, and vibrations as if they were directly interacting with the object [176,177,178].
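A minimal sketch of such a mapping is given below, collapsing a 2-D taxel pressure grid into a 3-DOF force cue for a generic haptic device; the grid size, scaling factor, and axis conventions are invented for illustration:

```python
def haptic_feedback(taxel_grid, scale=0.05):
    """Collapse a 2-D taxel pressure grid into a 3-DOF force cue:
    the pressure centroid offsets the x/y axes (where the contact is on
    the pad) and the total pressure drives the z axis (push-back)."""
    rows, cols = len(taxel_grid), len(taxel_grid[0])
    total = float(sum(sum(row) for row in taxel_grid))
    if total == 0.0:
        return (0.0, 0.0, 0.0)  # no contact, no feedback
    cx = sum(p * j for row in taxel_grid for j, p in enumerate(row)) / total
    cy = sum(p * i for i, row in enumerate(taxel_grid) for p in row) / total
    # Centre the centroid on the pad and scale into the device force range.
    return (scale * (cx - (cols - 1) / 2),
            scale * (cy - (rows - 1) / 2),
            scale * total)

# A press on the bottom-right taxel of a 2 x 2 pad cues the operator
# towards that corner while pushing back with the total contact force.
print(haptic_feedback([[0.0, 0.0], [0.0, 4.0]]))
```

Real systems render far richer cues (vibration, per-finger forces from proprioceptive data), but the core step is the same reduction of high-dimensional tactile data to the device's few actuated degrees of freedom.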

4.9. Object Pushing

The manipulation of objects through pushing is an essential feature of physical robot interactions. Tactile sensing plays a significant role in object pushing, as it provides valuable information about the contact forces, pressure distribution, surface properties, and frictional interactions between the robot’s end effector and the object.
One of the key challenges in object pushing is to maintain stable contact between the robot’s end effector and the object while applying the desired force. Tactile sensing can be used to monitor the contact state and detect any slipping or unintended contact changes, enabling the robot to adapt its pushing strategy accordingly [29].
Additionally, tactile data can be employed to estimate the frictional properties between the object and the environment, which can be used to predict the object’s motion and plan more effective pushing actions [179]. Using a combination of vision and tactile sensation was shown to produce more accurate object location predictions over extended time horizons [121]. Machine learning techniques have shown promise in enhancing object-pushing capabilities using tactile data. LSTM-based recurrent neural network algorithms have been employed to process and analyse tactile images, enabling the robot to infer object motion, using model predictive control and deep functional predictive control to optimise robot trajectory [29,179].

4.10. Summary

We summarise the different key applications of tactile sensors in robotics and automation in Table 2. In this section, we show that there is a wide variety of uses for tactile sensation, ranging from physical robot interaction tasks like grasping and pushing, to feature extraction tasks like texture recognition. Although the provided list and citations are not exhaustive, we aim in this section and the previous one to provide the reader with enough context to understand the following sections on the use of tactile sensation in agri-food and the current issues and shortcomings with tactile-sensing technologies and algorithms that should be addressed.

5. Complexities Associated with Tactile Sensors

Despite recent advances in tactile sensor technology and tactile information-processing techniques, the domain still faces many challenges [118,180]. The shortcomings can be categorised into algorithmic and practical challenges. The algorithmic shortcomings relate to the processing techniques for tactile information, which are typically used in three major domains, namely feature extraction, tactile-based robot controllers, and sensory information fusion. The practical complexities include the challenges of calibrating tactile sensors, the high dimensionality of tactile information due to the large number of sensing points, and the shortcomings related to the hardware integration of tactile sensors into robotic systems. Figure 8 shows a block diagram of the challenges currently facing artificial tactile-sensing technology. Knowing the current shortcomings of tactile technology can help steer future research in the right direction for agri-food applications [118,181,182,183,184]. We will review the shortcomings of the recent research articles presented, covering both the algorithmic and practical challenges.

5.1. Tactile Feature Extraction

The tactile information acquired from the physical interaction with an object can include a range of different features describing the objects’ (i) geometry, such as size [171,185], edges [186,187], curvatures [188], shape [189,190], or texture [191,192,193,194]; (ii) dynamics, such as inertia [195], stiffness [196,197,198], friction [199,200], deformation [201], or contained fluid [202]; and (iii) properties related to the robot controllers, such as grasping/picking pose [129,203], slippage [134,204], contact point [205], or in-hand object pose [206]. We will focus on the tactile features, which are related to the agri-food domain and present the challenges and shortcomings of the reviewed approaches.

5.1.1. Geometric Features

Based on the variation in size, geometry, and deformability of fruit and food items, each geometrical feature extraction problem imposes specific hardware and algorithmic requirements. Zhang et al. [13] combined the force and bending sensor data of a three-finger soft gripper as input to multiple machine learning-based classification models for fruit classification. The random forest (RF) classifier achieved the highest classification scores, compared to k-nearest neighbour (KNN), support vector classification (SVC), and naive Bayes (NB), in classifying a fruit set consisting of apple, orange, peach, pear, and tomato. While this fruit set includes various texture types, it contains no large variation in size or geometry; the same tactile sensor and method might therefore not be applicable to detecting a fruit with non-spherical geometry, such as a cucumber, or with a larger size, such as a melon. Patel et al. [207] utilised the ResNet50 convolutional neural network (CNN) on tactile images from the Digger Finger sensor to detect different contact geometry classes (circle, hex, square, and triangle) dipped in a granular medium (i.e., rice). While the vision-based tactile sensor and CNN model perform well for objects with distinguishable edges, they might not work equally well for geometry classification of fruits with smoother geometries. Ribeiro et al. [9] applied a moving average and a finite impulse response (FIR) high-pass filter to the cilium-based tactile sensor data to obtain input features for the apple and strawberry smoothness, stiffness, and texture recognition task. The random forest classifier reached 96% and 83% accuracy for apple and strawberry ripeness estimation, respectively. The miniature design of the sensor makes it challenging to use in agricultural fields outside laboratory environments. Abderrahmane et al. [208] proposed a CNN model for object recognition capable of generating synthetic tactile features using upconvolutional layers in the model. The BioTac sensor data, alongside their first principal components, are used as the input features. Tactile sensor arrays on a soft multi-finger gripper in Zhou et al. [22] help detect branch interference in apple harvesting using a customised CNN model; the large spatial distribution of the sensors allows successful inference of the branch location on the fingers. Current approaches are usually limited to specific object classes or tactile sensors. Given the broad spectrum of geometries of food and fruit items, the problem of geometrical feature extraction is better addressed by tactile sensors integrated on multi-finger robot grippers, as in [13,22], which allow easier tactile exploration, data collection, and in-field implementation. Feature detection algorithms that can be applied to different tactile data representations (e.g., pressure distribution, tactile image, and force vector), such as machine learning-based models, enable generalisation to various tactile sensors.

5.1.2. Dynamic Features

Dynamic feature extraction usually demands physical interaction with higher ranges of contact forces compared to geometrical feature extraction. As such, the standard definition of food item damage and bruise [39,40,41] must be passed by the proposed methods. Zhang et al. [4] utilised tactile data for the hardness recognition of a set of fruit and vegetable items. The input features to the KNN and SVM (support vector machine) classifiers are the lower-dimensional vectors achieved by applying principal component analysis (PCA) on raw tactile data for classifying the hardness of apple, kiwi, tangerine, and tomato. Tactile measurements are acquired by squeezing the fruits with a parallel gripper equipped with pressure-based tactile sensors, which can damage softer fruits such as berries type and result in fruit bruises. Scimeca et al. [8] used capacitive tactile sensors on a parallel gripper for estimating the ripeness of mango fruit. The sensor calibration model [209] maps the pressure values to force, and a spring model is used for estimating the stiffness of the fruit. The assumption about the spring model is dependent on the contact geometry and can have variation due to slight changes in the contact state. As such, it needs multiple palpation trials and averaging the estimated stiffness as the final prediction. Although the approach achieved equal classification performance compared to the destructive penetrometer measurements, the heuristic contact model is specific to the fruit type. Blanes et al. [5,12] used six accelerometers’ data attached on the fingers of a pneumatic gripper to classify the firmness of eggplant and mango. The slope and area under the curve of the acceleration signals in the post-grasp phase are used for firmness classification. 
Although this method can be used in the quasi-static post-grasp phase, where the accelerations are due only to gripping actions, it struggles to differentiate finger grip motion acceleration from the acceleration of the robot hand in the moving phase of a task. Cortés et al. [11] used a similar approach for mango ripeness estimation with a pneumatic gripper equipped with accelerometers and spectrometers. Partial least squares regression was applied to both acceleration and spectral data for ripeness prediction. Spiers et al. [210] used a random forest classifier for object class, pose, and stiffness estimation using single-point pressure sensors on the phalanges of the two-finger TakkTile [211] gripper. The object set is small, and generalisation to novel objects is not explored.
Although tactile-based palpation for dynamic feature extraction can match the performance of destructive approaches for some fruit items [8,212], it still lacks generalisation to different food and fruit types. Reaching human-level dynamic feature extraction requires tactile-enabled soft grippers for non-destructive physical interaction, efficient fusion with visual and proprioception data [213], and processing algorithms with minimal assumptions about the problem features for better generalisation.
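
The multi-trial spring-model palpation used in [8] can be sketched in a few lines; the linear-spring contact assumption and the function names are ours, for illustration only:

```python
import numpy as np

def estimate_stiffness(forces, displacements):
    """Least-squares slope of force vs. displacement under a
    linear-spring contact assumption (F = k * x)."""
    forces = np.asarray(forces, dtype=float)
    displacements = np.asarray(displacements, dtype=float)
    # Slope through the origin: k = (x . F) / (x . x)
    return float(displacements @ forces / (displacements @ displacements))

def palpate_and_average(trials):
    """Average the per-trial stiffness estimates, since a single
    palpation is sensitive to small changes in contact geometry."""
    return float(np.mean([estimate_stiffness(f, x) for f, x in trials]))
```

In practice, riper (softer) fruit yields a lower averaged stiffness, which can then be thresholded or fed to a classifier.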

5.1.3. Controller Features

Tactile-based robot controllers generate the control actions required for task success based on specific tactile features. For instance, a fruit-picking robot controller can adjust its grasp pose after initial contact according to a tactile-based pose estimate that is more accurate than the initial visual-based one [130,214]. The latency of the control feature extraction process, consisting of (i) data sampling and (ii) a computation phase, is more critical than in geometric or dynamic feature extraction, since it determines the reaction time of the robot controller. Feature detection methods with high computational complexity [215] can increase the system's latency and reduce the success rate of the controller. The latency requirement of the feature extraction method is determined by the frequency of the control task.
Hohimer et al. [20] used the R-C time delay characteristic of a thermoplastic polyurethane (TPU)-based tactile sensor circuit for detecting contact on a pneumatic soft gripper. Liu and Adelson [19] performed object orientation estimation with the GelSight Fin Ray sensor. Live orientation estimation is performed by applying Poisson reconstruction to a differential tactile image, calculated by subtracting a reference image (HSV threshold on the untouched-mode image) from each sensor frame. The precision of the estimation can degrade in dynamic manipulation tasks with faster in-hand object motions. The object set contains artificial fruits, including orange, apple, and strawberry. Zhou et al. [22] used a CNN model for classifying the grasp status in the apple-harvesting task with a four-finger tactile-enabled soft gripper. The grasp status can be one of the following classes: good grasp, null grasp, branch interference, and finger obstructed. The CNN model feeds each finger's tactile data to separate sub-networks and concatenates the latent features before the output layer. A moving-variance method is also used to localise the branch interference under the assumption that branch touch triggers faster pressure variation than apple touch. The grasp status detection and branch interference localisation models reach 87.0% and 83.3% accuracy, respectively. A heuristic classifier is used in [24] on the wrist F/T sensor and the encoder data from the mounted finger joints for grasp success estimation in the apple-harvesting problem. Kim et al. [216] used the voltage of a strain gauge-based tactile sensor and motor encoder data for detecting object slip in a pincer gripper. The sensor is not calibrated to measure force or deformation. Wi et al. [201] used contact force and location estimated from tactile and visual feedback for object deformation estimation. Signed distance functions (SDFs) represent the object deformation in a double-encoder feedforward neural network. The visual feedback, however, may not be available in cluttered scenes.
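
The moving-variance heuristic of Zhou et al. [22] for branch-interference detection rests on the observation that a stiff branch produces a faster pressure change than the fruit surface. A rough sketch, with the window size and threshold as our illustrative choices:

```python
import numpy as np

def moving_variance(signal, window=5):
    """Variance of the pressure signal over a sliding window."""
    s = np.asarray(signal, dtype=float)
    return np.array([s[i:i + window].var() for i in range(len(s) - window + 1)])

def branch_touch_detected(signal, window=5, threshold=0.5):
    """Flag a contact as branch-like if any windowed variance exceeds
    the threshold: a branch touch triggers faster pressure variation
    than the gradual pressure ramp of an apple touch."""
    return bool((moving_variance(signal, window) > threshold).any())
```

A slowly ramping pressure trace then stays below the threshold, while a step-like branch contact trips it.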
The robustness of the feature extraction methods has not been tested in various in-field conditions, where large variances in temperature, humidity, and lighting, together with other uncertainties, can significantly influence the tactile measurements. Electronic tactile sensors (e.g., capacitive [217] and piezoresistive [218]) have higher sampling rates and are more suitable for real-time robot control but are sometimes limited to measuring force in only one direction [219]. Visual-based tactile sensors provide rich information about the contact state for robot control [66,179,220,221] but suffer from low sampling frequency.

5.2. Robot Controller

Tactile sensory feedback plays a substantial role in motor control in primates [86,122]. Although visual-based controllers have recently advanced tremendously in robotic manipulation [222], tactile-based controllers lag behind both state-of-the-art visual systems and human-level performance. The main challenges are extracting crisp features from high-dimensional tactile data, the complexity of the dynamics of physical contact, and hardware shortcomings, such as the lack of a spatially distributed sensing system that is efficiently integrated with the actuation system, similar to biological sensorimotor loops. We review the tactile-based controllers that have been used in agri-robotic applications or have the potential to be effectively integrated into an agri-robotic system, and discuss the corresponding shortcomings.

5.2.1. Grasp Control

The force-closure grip control method is a common approach in robotic grasping and manipulation [223,224,225]. It aims to regulate the contact force to a set value by directly changing either the fingers' positions or the finger joints' torques. This approach is difficult to apply to delicate or deformable objects, as the grasp width and tactile force change with object deformation. He et al. [226] used a force-closure approach on a gripper with soft finger pads containing air cavities to avoid damaging delicate objects. Wen et al. [227] introduced high-precision grip force control of delicate objects, where the grip force is regulated based on tactile feedback and the electromyography (EMG) signals of a human teleoperator's hand. These approaches require an object model to adjust the grip force or width. Yin et al. [228] leveraged deep reinforcement learning for in-hand object rotation control based on tactile feedback in a data-driven approach. The tactile data come from sixteen contact sensors on the palm and phalanges of the Allegro hand. The deep RL method takes tactile and robot proprioception data as input and generates future joint torques to achieve a desired object rotation. The reward function is defined for the rotation task, and a global reward function for general in-hand manipulation can be intractable to design. Slip-avoidance controllers regulate the grip force to prevent future slip incidents [138,139,140,229,230]. Achieving human-level dexterity requires an efficient integration of tactile sensors and actuation systems in an anthropomorphic gripper. Active touch in humans [231] generates exploratory actions to obtain suitable tactile feedback for further adjustment of the grip force and pose. Nonetheless, customised grip control for handling certain types of food and fruit items can be more cost-efficient than designing an expensive human-level hand for universal manipulation applications.
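
A minimal sketch of the force-closure idea described above: a proportional loop narrows the grasp until the measured contact force reaches a setpoint. The linear-spring object model and all gains are illustrative assumptions of ours, not a method from the cited works:

```python
def regulate_grip(read_force, step_width, target=2.0, kp=1e-4, tol=0.05, max_iters=200):
    """Proportional grip-force regulation: adjust the finger width by an
    amount proportional to the force error until the contact force is
    within `tol` newtons of the target setpoint."""
    for _ in range(max_iters):
        error = target - read_force()
        if abs(error) <= tol:
            return True
        step_width(-kp * error)  # negative: close the fingers to raise the force
    return False

class SpringObject:
    """Toy deformable object behaving as a linear spring (illustrative only)."""
    def __init__(self, stiffness=500.0, rest_width=0.08):
        self.k, self.rest, self.width = stiffness, rest_width, rest_width
    def force(self):
        return max(0.0, self.k * (self.rest - self.width))
    def move(self, dw):
        self.width += dw
```

As the section notes, such a loop presumes an object model (here the spring stiffness) to pick sensible gains; deformable produce with unknown stiffness is exactly where this assumption breaks down.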

5.2.2. Motion Planning

Proprioceptive sensory feedback helps primates not only to adjust their grip force but also to optimise the motion of different body parts, such as the hand and the arm [232]. The bulk of the robotics literature exploits tactile feedback primarily in grip control and disregards hand and arm motion planning based on tactile feedback. Schuetz et al. [28] proposed a tactile-feedback-based obstacle avoidance motion planning framework for an agricultural robotic manipulator. Using feedback linearisation and gradient-based optimisation with an objective function consisting of collision avoidance and joint-limit terms, the approach avoids obstacles in the task space. The proposed approach handles collisions with a single link of the manipulator; detecting and preventing collisions on multiple links can make the control problem intractable. An online robot trajectory optimisation approach was proposed in [150] for object slip avoidance. The objective function consists of (i) the distance from a desired pre-planned reference trajectory (e.g., minimum time or minimum jerk), and (ii) the future slip likelihood over a horizon. The optimisation allows the motion-planning pipeline to maintain the reference trajectory behaviour and avoid object slip simultaneously. Nazari et al. [29] introduced tactile deep functional predictive control for strawberry-pushing tasks. The proposed data-driven controller adjusts the robot's Cartesian velocity to prevent losing contact with the strawberry stem during the pushing task, based on sensory feedback from a camera-based tactile finger. Off-the-shelf motion planning libraries [233,234] do not integrate tactile feedback into the problem formulation, as the approaches in [28,29,150] do for more robust closed-loop motion planning.
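
The two-term objective of the slip-avoidance trajectory optimisation in [150] can be sketched as follows; the quadratic slip-likelihood surrogate, finite-difference gradients, and step size are our illustrative simplifications:

```python
import numpy as np

def cost(q, q_ref, slip_risk, weight):
    """J(q) = ||q - q_ref||^2 + weight * sum_t slip_risk(q_t):
    reference-tracking term plus a slip-likelihood penalty."""
    q_ref = np.asarray(q_ref, dtype=float)
    return float(((q - q_ref) ** 2).sum() + weight * sum(slip_risk(qt) for qt in q))

def optimise_trajectory(q_ref, slip_risk, weight=1.0, lr=0.1, iters=500):
    """Minimise J over the waypoints by finite-difference gradient descent."""
    q = np.array(q_ref, dtype=float)
    eps = 1e-6
    for _ in range(iters):
        grad = np.zeros_like(q)
        for i in range(len(q)):
            dq = np.zeros_like(q)
            dq[i] = eps
            grad[i] = (cost(q + dq, q_ref, slip_risk, weight)
                       - cost(q - dq, q_ref, slip_risk, weight)) / (2 * eps)
        q -= lr * grad
    return q
```

With a slip penalty pulling waypoints towards a "safe" value, the optimum is a trade-off between the reference and the penalty minimiser, which is exactly the behaviour described for [150].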

5.2.3. Learning from Demonstration

Learning from demonstration (LfD) is a data-driven control approach in which the robot controller is learned from expert demonstrations [235]. This approach is especially preferred in scenarios where the control problem cannot be formulated analytically. Misimi et al. [27] proposed an LfD method for the compliant grasping of deformable food objects, which can automatically discard inconsistent demonstrations. The method combines visual and tactile feedback for robust autonomous grasping of food items. A breast cancer examination robot controller was studied in [236] via a novel LfD technique using deep probabilistic movement primitives. The human demonstration data consist of reach-to-palpate and palpation trajectories, and the model can generalise to unseen breast poses. Tactile-based LfD has high potential to improve robotic harvesting systems using demonstrations from expert human pickers. Recent research [237] recognised human fruit-pickers' activities during avocado harvesting. DelPreto et al. [35] collected a multi-modal dataset of human activities in a kitchen environment, including data from tactile sensors on a human subject's hand. Such datasets and studies can support the automation of applications such as harvesting and kitchen robotic systems via LfD approaches. The main challenge for tactile-based LfD methods is the level of generalisation to novel tasks and test conditions.

5.3. Sensor Fusion

Combining the sense of touch with other sensing modalities, such as vision and audition, gives humans a robust and intelligent multi-modal perception system. Robotic systems, however, fall far behind in finding an efficient universal fusion approach for multi-modal sensing. Recent advances in deep learning have provided techniques for effective sensory data fusion [238,239]. Dong et al. [3] proposed robotic visual-tactile perception learning based on an auto-encoder neural network consisting of a modality-specific knowledge library and a modality-invariant sparse constraint to learn both intra-modality and cross-modality knowledge. Each sub-network of the model requires retraining for every new task, which makes the model time- and cost-expensive to use across different multi-modal perception problems. Misimi et al. [27] generated the initial grasp pose from visual data and used tactile feedback for the final adjustment of the grip pose and force after touching the objects. Wi et al. [201] combined point-cloud data and tactile feedback in an object-conditioned feedforward neural network for estimating the deformation of the object in hand. Luo et al. [162] combined visual and tactile data in a CNN model for texture recognition. Calandra et al. [240] used a similar visuo-tactile fusion approach with a CNN for grasp stability prediction. Mandil and Ghalamzan-E [121] proposed a multi-modal video prediction model combining tactile and visual data in novel action-conditioned recurrent neural networks. A strong, as yet under-explored potential of visual-tactile fusion lies in fruit ripeness estimation, where the visual data capture colour features of ripeness and the tactile data measure the stiffness of the fruit.
Proximity sensor data have also been combined with tactile data in applications such as surface crack detection [241] and safety control in human–robot interaction [242], which can be transferred to the agri-robotic domain for safer human–robot interaction.
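
A minimal late-fusion sketch in the spirit of the visuo-tactile models above: per-modality feature vectors are concatenated and passed to a linear classifier fitted in closed form. The synthetic features and the ridge-regression head are our assumptions; the cited works use learned CNN features and deeper heads:

```python
import numpy as np

def fuse(visual_feats, tactile_feats):
    """Late fusion: concatenate per-sample visual and tactile feature vectors."""
    return np.concatenate([visual_feats, tactile_feats], axis=1)

def fit_linear_classifier(X, y, ridge=1e-3):
    """Closed-form ridge regression on {-1, +1} labels; sign(Xw + b) predicts."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.linalg.solve(Xb.T @ Xb + ridge * np.eye(Xb.shape[1]), Xb.T @ y)
    return lambda Xnew: np.sign(np.hstack([Xnew, np.ones((len(Xnew), 1))]) @ w)
```

For ripeness estimation, for example, a colour feature (visual) and a stiffness feature (tactile) would occupy different columns of the fused vector, letting the classifier weight whichever modality is informative.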

5.4. Sensor Calibration

Calibrating tactile sensors, i.e., mapping raw sensory readings such as resistance, capacitance, voltage, current, length, light intensity, magnetic field, image, or vibration to force values, can be very challenging depending on the raw data type and dimensionality. The common approach is to fix either the tactile sensor or a load cell (force sensor) and apply precise, constant forces to the fixed component with the other, recording both the tactile sensor's raw reading and the load cell's force values. After a data collection phase of applying forces of various magnitudes at various locations, a regression model is trained for sensor calibration. Depending on the complexity of the regression task, the regression model's variance and bias can change. Wang et al. [243] used moving least squares (MLS) for tri-axial force calibration from magnetic field values. The calibration test setup is shown in Figure 9a. Yuan et al. [173] used a CNN model to map GelSight's marker pattern motion to force values. The entropy of the marker displacement field is used for slip calibration. Khamis et al. [244] used a high-frame-rate camera to measure the taxels' deformation, which is then mapped to normal and shear force values and also used for slip calibration (see Figure 9c). Scimeca et al. [8] calibrated a capacitive tactile sensor on a parallel gripper by pinching a metal cuboid with a force sensor attached to one of its faces. In the calibration procedure, one finger is fixed while the second finger moves linearly at constant speed, pressing the force sensor until the maximum displacement of the taxels reaches a threshold. This method is limited to normal force calibration and cannot be applied to shear forces. Furthermore, the logarithmic function used for mapping pressure to force is specific to the capacitive tactile sensor and cannot be used for other sensors with different hardware.
A similar setup was created in [245] for calibrating a photoelastic haptic finger mounted on the Franka Emika robot hand; the calibration setup is shown in Figure 9b. Bio-inspired cilium-based tactile sensors can detect very small-scale touch features [9]; however, due to the miniature size of the hairlike sensor structure, calibrating the sensor to measure force values is challenging. Figure 9 shows the tactile sensor calibration setups for the reviewed research items. Knowledge transfer for sensor calibration could save the time and cost of repeating the calibration procedure and physical-interaction data collection for every new tactile sensor.
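
The generic calibration step described above, regressing co-recorded load-cell forces onto raw sensor output, can be sketched with an ordinary polynomial least-squares fit; the polynomial form is an illustrative choice of ours (e.g., [8] uses a sensor-specific logarithmic mapping instead):

```python
import numpy as np

def fit_calibration(raw_readings, forces, degree=3):
    """Fit a polynomial mapping raw sensor output (e.g., capacitance or
    resistance) to the co-recorded load-cell force, and return the
    calibrated raw -> force function."""
    coeffs = np.polyfit(np.asarray(raw_readings, dtype=float),
                        np.asarray(forces, dtype=float), degree)
    return lambda raw: np.polyval(coeffs, raw)
```

The degree of the fit trades bias against variance, mirroring the remark above that the regression model's variance and bias change with the complexity of the task.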

5.5. The Curse of Dimensionality

The spatial distribution of tactile sensing over a large area and the presence of multiple sensing points (taxels) make the dimensionality of tactile information very high. High-dimensional sensory data are challenging to deal with from the perspectives of both control and feature extraction. As such, dimensionality reduction by deep neural networks is widely used for tactile data processing in various tasks, such as slip classification [247], object pose estimation [248,249,250], and grasp stability prediction [147,251]. Zhang et al. [4] applied PCA to tactile data to obtain compact features for the KNN and SVM classifiers in fruit hardness recognition. Dong et al. [3] used the VGG-16 model on both visual and tactile images to produce vector inputs to a multi-modal auto-encoder in a fabric classification task. Funabashi et al. [252] compared the performance of DNN (deep neural network), CNN (convolutional neural network), and RNN (recurrent neural network) models for object recognition using sixteen uSkin tactile sensor arrays integrated on the Allegro hand. Abderrahmane et al. [208] combined the BioTac tactile data with the first four principal components obtained by PCA for object recognition. The major shortcoming of the existing approaches is that data compression is performed for specific tasks and features, whereas in the human somatosensory system, tactile data are encoded universally for all tactile features [253,254].
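
The PCA compression used in [4,208] reduces each high-dimensional tactile frame to its first few principal components before classification. A self-contained sketch (the component count and data are illustrative):

```python
import numpy as np

def pca_compress(X, n_components=4):
    """Project mean-centred tactile vectors onto their first principal
    components via SVD, returning the compact features and the basis."""
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = vt[:n_components]  # rows are the leading principal directions
    return (X - mean) @ basis.T, basis
```

The compact features can then feed a KNN or SVM classifier, as in [4]; reconstruction from the basis shows how little information is lost when the taxel readings are strongly correlated.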

5.6. Hardware Integration and Scalability

Each fingertip of the human hand contains 3000 tactile mechanoreceptors, which are, on average, 0.4 mm apart [255]. Reaching the same measurement and spatial resolution with artificial tactile sensors is very challenging with current hardware technology. As such, integrating tactile sensors into the small areas of robot fingers and grippers is a challenging task. Crosstalk between sensing points [37], the space required for wiring [36], and the size of individual sensing elements are the main bottlenecks to increasing the spatial resolution of tactile sensors. A desired feature for integrating the sense of touch into robotic systems is a tactile-enabled hand with efficient integration of tactile sensing and actuation on both the hardware and software sides. In this section, we review research proposing tactile sensors integrated within a robotic system, including robotic grippers, soft grippers, and the universal attachment of tactile sensors over a robot body.

5.6.1. Rigid Links Grippers

Abdeetedal and Kermani [256] proposed an underactuated two-degrees-of-freedom gripper with load cells integrated within each finger's phalanges and a potentiometer on the finger joints. Finite element analysis is used to optimise the location of the load cells within the phalanges, and contact point estimation is implemented using the force and torque feedback. The grasp controller has two gripping options, namely, precision and power grasps. Zhang et al. [13] used flexible silicone sheets for the fingers of a multi-finger gripper with integrated force and bending sensors. The effectiveness of the flexible fingers is tested in grasping different types of soft and hard fruits. Cook et al. [18] used nine capacitive tactile sensors alongside temperature detectors in the phalanges of a three-finger gripper. The paper demonstrated a pick-and-place task for various fruits; however, the pick-and-place pipeline relies entirely on visual feedback and does not use the tactile data. Ntagios et al. [257] used soft capacitive pressure sensors on the distal phalanges of a five-finger gripper. Each pressure sensor measures a single value for the whole contact surface, providing no pressure distribution over the contact area and hence low spatial resolution. Piezo-capacitive tactile sensors with a conformal microstructure were proposed in [258] and integrated into a multi-finger robot hand for Braille and roughness detection. The response time is 25 ms, which may not be sufficient for dynamic manipulation tasks, and fabricating an array of sensors with this technology could further increase the response time, limiting the expansion of the sensor's spatial size towards that of biological skin. Dischinger et al. [24] embedded 2 × 2 pressure sensor arrays below the finger pads of a three-finger gripper for apple harvesting. The tactile-enabled gripper was tested in the field for branch interference detection.
The IMU and finger encoder data, from the finger pads and joints, respectively, were preferred over the tactile data for grasp model estimation, as branch interference made the tactile data unreliable for detecting a successful grasp or object slip. No bruise tests were conducted in the real-world apple-harvesting trials. Blanes et al. [12] used accelerometer data on a pneumatic robotic gripper for fruit firmness estimation. Funabashi et al. [252] integrated sixteen uSkin tactile sensors on the Allegro hand's palm, phalanges, and finger pads. A gripper with well-integrated sensing of (i) tactile data on the fingers and palm, (ii) joint angles, and (iii) joint torques of each finger joint is currently missing in the literature.

5.6.2. Soft Grippers

Soft grippers can be better suited than hard grippers to agri-food manipulation owing to their adaptive soft shape and lower actuation forces [259]. Tactile-enabled soft grippers can enable bruise-free fruit harvesting with sufficiently good tactile perception. An adaptive compliant gripper was proposed by Liu and Adelson [19] for compliant grasp control using the GelSight Fin Ray sensor. A silicone gel pad is attached to a printed deformable finger, and internal illumination helps the camera at the finger base measure the displacement of the patterns on the pad. Tactile images are used for in-hand object orientation estimation. Flexible thermoplastic polyurethane (TPU) material has inspired the fabrication of a range of tactile sensors based on capacitive and piezoresistive principles [20]. The sensor is integrated into a pneumatic soft gripper for apple harvesting; contact force localisation remains challenging for TPU-based tactile sensors. Zhou et al. [21,22] integrated twenty-four piezoresistive tactile sensors (RX-M0404S) in a four-finger soft gripper for apple harvesting. Each finger contains six tactile arrays embedded under a thin silicone skin, and each tactile sensor has a 4 × 4 taxel configuration. The tactile-enabled gripper is integrated into a UR5 robot mounted on a mobile platform, and the system has been extensively tested in the field for apple harvesting. He et al. [226] introduced a gripper with soft fingertips containing an internal cavity to measure the applied pressure. The spatial resolution of this pressure-sensing method would be extremely low if the sensors were scaled up over larger areas.

5.6.3. Tactile Skin

Although the primary focus of tactile sensor integration in robotic systems has been on robot manipulators' hands, developing an electronic skin over non-hand areas has been partially investigated [260]. Schuetz et al. [28] embedded a tactile sensor in one of the middle links of an agricultural robotic manipulator. The tactile sensor has two rigid frames connected to each other by four force sensors. The force values are used to compute the applied force and torque on the robot link, which are utilised by the obstacle avoidance controller. The contact location cannot be localised on the arm, and a middle point on the link is assumed as the default contact point. Patel et al. [207] introduced the Digger Finger, a GelSight-type visual tactile sensor restructured in a cylindrical architecture for inspecting hard objects in granular environments; the test tasks include inspecting metal objects in rice. Zhang et al. [261] introduced a resistive-based large tactile skin used to cover the links of the UR5 robot. The tactile skin has a divided texture, where each part measures the normal pressure. The developed tactile skins are usually incapable of measuring shear forces and have low spatial resolution.

6. Future Trends and Conclusions

In this review, we explore the use of tactile sensing in agri-food. We provide an overview of tactile-sensing hardware and its general uses in robotics and sensing, and discuss in depth the use of tactile sensing in agri-food and the current shortcomings of tactile-sensing technologies. Tactile sensing has been employed in three primary areas of agri-food research. First is the design of robotic harvesting systems that use tactile sensors to harvest foods delicately and assess the ripeness and quality of fruits and vegetables based on factors such as firmness, texture, and other attributes. Second is the incorporation of tactile sensing in automated packaging and handling systems, enabling the gentle and accurate manipulation of delicate produce to reduce damage and food waste. Third, tactile sensing is starting to emerge in kitchen robotics, such as in food-pouring techniques and cutting processes; however, the research in this area is still in its early stages, offering ample opportunities for further exploration and development.
Tactile sensing has the potential to contribute significantly to a broader array of agri-food applications. One such application includes the monitoring and maintenance of livestock health by facilitating the early detection of injuries or diseases through the examination of animal coats, skin conditions, and overall body condition. Furthermore, the application of tactile sensing, although primarily employed in food quality evaluation, can also be leveraged for monitoring plant health and detecting diseases. This is attributed to the sensor’s capability to perceive the physical properties of plants, which may act as indicators of their overall health. Additionally, extending this approach towards pest control and identification might further broaden its scope. The use of tactile sensors could contribute additional features required for detecting and identifying a wide array of pests related to agri-food products. This might be possible by interpreting physical interactions with the crops. For instance, detecting alterations in the plant’s physical structure due to pest infestations or diseases could be achieved through tactile features and not through remote-sensing units, thereby offering a promising direction for future research. In the long run, ongoing research in these areas will lay the groundwork for advanced applications of tactile sensing in agri-food, promoting sustainable agriculture and improved food security.
The hardware development of artificial tactile sensors still faces scalability challenges in providing a large number of sensitive, high-spatial-resolution taxels over large areas. The recently proposed self-powered tactile skin partially addresses the scalability problem, but it still lacks high dynamic range and has low resolution for large contact forces. Improving the dynamic range of self-powered tactile sensors could introduce the next generation of artificial skin suitable for various applications, including the agri-food domain. There is no standard definition of wear-and-tear testing of tactile sensors to measure their endurance in long-term real-world applications such as harvesting in agricultural fields. Future research can benchmark wear testing of tactile sensors for easier endurance comparison across sensors in the robotics community. Most of the proposed tactile-sensing technologies have complicated calibration and integration procedures, which can limit their applications to research environments. As such, future work can explore modular designs with easier integration with off-the-shelf hardware systems, and a unified tactile feature extraction approach that simplifies sensor calibration. Commercially available tactile sensors are currently prohibitively expensive, and future large-scale commercialisation requires significant optimisation of the production process.
In conclusion, this review delves into the potential of tactile sensation in the agri-food sector, highlighting its primary applications in robotic harvesting systems, automated handling, and emerging uses in kitchen robotics. Tactile-sensing technology also holds promise in livestock health monitoring and plant health assessment, which could further revolutionize sustainable agriculture and enhance food security. However, challenges remain in scaling up the hardware, improving dynamic ranges, and increasing resolution for large contact forces. The development of self-powered tactile skin has enabled progress in addressing some of these issues, but further advancements are required. To facilitate the adoption of tactile-sensing technology, future research should focus on establishing standardized wear-and-tear testing, creating modular designs for seamless integration, and simplifying calibration procedures. Lastly, making commercially available tactile sensors more affordable through optimized production processes will be essential for widespread implementation in the agri-food industry.

Author Contributions

Conceptualization, W.M. and A.G.-E.; methodology, W.M.; investigation, W.M., K.N. and V.R.; resources, W.M.; writing—original draft preparation, W.M., K.N. and V.R.; writing—review and editing, W.M.; supervision, A.G.-E.; project administration, W.M.; funding acquisition, A.G.-E. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the UKRI’s Engineering and Physical Sciences Research Council (EPSRC) [Grant reference: EP/S023917/1].

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Duckett, T.; Pearson, S.; Blackmore, S.; Grieve, B.; Chen, W.H.; Cielniak, G.; Cleaversmith, J.; Dai, J.; Davis, S.; Fox, C.; et al. Agricultural robotics: The future of robotic agriculture. arXiv 2018, arXiv:1806.06762.
  2. Zou, L.; Ge, C.; Wang, Z.J.; Cretu, E.; Li, X. Novel tactile sensor technology and smart tactile sensing systems: A review. Sensors 2017, 17, 2653.
  3. Dong, J.; Cong, Y.; Sun, G.; Zhang, T. Lifelong robotic visual-tactile perception learning. Pattern Recognit. 2022, 121, 108176.
  4. Zhang, Z.; Zhou, J.; Yan, Z.; Wang, K.; Mao, J.; Jiang, Z. Hardness recognition of fruits and vegetables based on tactile array information of manipulator. Comput. Electron. Agric. 2021, 181, 105959.
  5. Blanes, C.; Ortiz, C.; Mellado, M.; Beltrán, P. Assessment of eggplant firmness with accelerometers on a pneumatic robot gripper. Comput. Electron. Agric. 2015, 113, 44–50.
  6. Ramirez-Amaro, K.; Dean-Leon, E.; Bergner, F.; Cheng, G. A semantic-based method for teaching industrial robots new tasks. KI-Künstliche Intelligenz 2019, 33, 117–122.
  7. Dean-Leon, E.; Pierce, B.; Bergner, F.; Mittendorfer, P.; Ramirez-Amaro, K.; Burger, W.; Cheng, G. TOMM: Tactile omnidirectional mobile manipulator. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 2441–2447.
  8. Scimeca, L.; Maiolino, P.; Cardin-Catalan, D.; del Pobil, A.P.; Morales, A.; Iida, F. Non-destructive robotic assessment of mango ripeness via multi-point soft haptics. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 1821–1826.
  9. Ribeiro, P.; Cardoso, S.; Bernardino, A.; Jamone, L. Fruit quality control by surface analysis using a bio-inspired soft tactile sensor. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 8875–8881.
  10. Maharshi, V.; Sharma, S.; Prajesh, R.; Das, S.; Agarwal, A.; Mitra, B. A Novel Sensor for Fruit Ripeness Estimation Using Lithography Free Approach. IEEE Sens. J. 2022, 22, 22192–22199.
  11. Cortés, V.; Blanes, C.; Blasco, J.; Ortiz, C.; Aleixos, N.; Mellado, M.; Cubero, S.; Talens, P. Integration of simultaneous tactile sensing and visible and near-infrared reflectance spectroscopy in a robot gripper for mango quality assessment. Biosyst. Eng. 2017, 162, 112–123.
  12. Blanes, C.; Mellado, M.; Beltrán, P. Tactile sensing with accelerometers in prehensile grippers for robots. Mechatronics 2016, 33, 1–12.
  13. Zhang, J.; Lai, S.; Yu, H.; Wang, E.; Wang, X.; Zhu, Z. Fruit Classification Utilizing a Robotic Gripper with Integrated Sensors and Adaptive Grasping. Math. Probl. Eng. 2021, 2021, 7157763.
  14. Li, N.; Yin, Z.; Zhang, W.; Xing, C.; Peng, T.; Meng, B.; Yang, J.; Peng, Z. A triboelectric-inductive hybrid tactile sensor for highly accurate object recognition. Nano Energy 2022, 96, 107063.
  15. Riffo, V.; Pieringer, C.; Flores, S.; Carrasco, C. Object recognition using tactile sensing in a robotic gripper. Insight-Non-Destr. Test. Cond. Monit. 2022, 64, 383–392.
  16. Li, G.; Zhu, R. A multisensory tactile system for robotic hands to recognize objects. Adv. Mater. Technol. 2019, 4, 1900602.
  17. Lyu, C.; Xiao, Y.; Deng, Y.; Chang, X.; Yang, B.; Tian, J.; Jin, J. Tactile recognition technology based on Multi-channel fiber optical sensing system. Measurement 2023, 216, 112906.
  18. Cook, J.N.; Sabarwal, A.; Clewer, H.; Navaraj, W. Tactile sensor array laden 3D-printed soft robotic gripper. In Proceedings of the 2020 IEEE SENSORS, Las Vegas, NV, USA, 25–29 October 2020; pp. 1–4.
  19. Liu, S.Q.; Adelson, E.H. Gelsight fin ray: Incorporating tactile sensing into a soft compliant robotic gripper. In Proceedings of the 2022 IEEE 5th International Conference on Soft Robotics (RoboSoft), Edinburgh, UK, 4–8 April 2022; pp. 925–931.
  20. Hohimer, C.J.; Petrossian, G.; Ameli, A.; Mo, C.; Pötschke, P. 3D printed conductive thermoplastic polyurethane/carbon nanotube composites for capacitive and piezoresistive sensing in soft pneumatic actuators. Addit. Manuf. 2020, 34, 101281. [Google Scholar] [CrossRef]
  21. Zhou, H.; Wang, X.; Kang, H.; Chen, C. A Tactile-enabled Grasping Method for Robotic Fruit Harvesting. arXiv 2021, arXiv:2110.09051. [Google Scholar]
  22. Zhou, H.; Kang, H.; Wang, X.; Au, W.; Wang, M.Y.; Chen, C. Branch interference sensing and handling by tactile enabled robotic apple harvesting. Agronomy 2023, 13, 503. [Google Scholar] [CrossRef]
  23. Yamaguchi, A.; Atkeson, C.G. Tactile behaviors with the vision-based tactile sensor FingerVision. Int. J. Hum. Robot. 2019, 16, 1940002. [Google Scholar] [CrossRef]
  24. Dischinger, L.M.; Cravetz, M.; Dawes, J.; Votzke, C.; VanAtter, C.; Johnston, M.L.; Grimm, C.M.; Davidson, J.R. Towards intelligent fruit picking with in-hand sensing. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 3285–3291. [Google Scholar]
  25. Zhou, H.; Xiao, J.; Kang, H.; Wang, X.; Au, W.; Chen, C. Learning-based slip detection for robotic fruit grasping and manipulation under leaf interference. Sensors 2022, 22, 5483. [Google Scholar] [CrossRef]
  26. Tian, G.; Zhou, J.; Gu, B. Slipping detection and control in gripping fruits and vegetables for agricultural robot. Int. J. Agric. Biol. Eng. 2018, 11, 45–51. [Google Scholar] [CrossRef]
  27. Misimi, E.; Olofsson, A.; Eilertsen, A.; Øye, E.R.; Mathiassen, J.R. Robotic handling of compliant food objects by robust learning from demonstration. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 6972–6979. [Google Scholar]
  28. Schuetz, C.; Pfaff, J.; Sygulla, F.; Rixen, D.; Ulbrich, H. Motion planning for redundant manipulators in uncertain environments based on tactile feedback. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–3 October 2015; pp. 6387–6394. [Google Scholar]
  29. Nazari, K.; Gandolfi, G.; Talebpour, Z.; Rajendran, V.; Rocco, P.; Ghalamzan E, A. Deep Functional Predictive Control for Strawberry Cluster Manipulation using Tactile Prediction. arXiv 2023, arXiv:2303.05393. [Google Scholar]
  30. Tsuchiya, Y.; Kiyokawa, T.; Ricardez, G.A.G.; Takamatsu, J.; Ogasawara, T. Pouring from deformable containers using dual-arm manipulation and tactile sensing. In Proceedings of the 2019 Third IEEE International Conference on Robotic Computing (IRC), Naples, Italy, 25–27 February 2019; pp. 357–362. [Google Scholar]
  31. Wang, X.d.; Sun, Y.h.; Wang, Y.; Hu, T.j.; Chen, M.h.; He, B. Artificial tactile sense technique for predicting beef tenderness based on FS pressure sensor. J. Bionic Eng. 2009, 6, 196–201. [Google Scholar] [CrossRef]
  32. Yamaguchi, A.; Atkeson, C.G. Combining finger vision and optical tactile sensing: Reducing and handling errors while cutting vegetables. In Proceedings of the 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 15–17 November 2016; pp. 1045–1051. [Google Scholar]
  33. Zhang, K.; Sharma, M.; Veloso, M.; Kroemer, O. Leveraging multimodal haptic sensory data for robust cutting. In Proceedings of the 2019 IEEE-RAS 19th International Conference on Humanoid Robots (Humanoids), Toronto, ON, Canada, 15–17 October 2019; pp. 409–416. [Google Scholar]
  34. Shimonomura, K.; Chang, T.; Murata, T. Detection of Foreign Bodies in Soft Foods Employing Tactile Image Sensor. Front. Robot. AI 2021, 8, 774080. [Google Scholar] [CrossRef] [PubMed]
  35. DelPreto, J.; Liu, C.; Luo, Y.; Foshey, M.; Li, Y.; Torralba, A.; Matusik, W.; Rus, D. ActionSense: A multimodal dataset and recording framework for human activities using wearable sensors in a kitchen environment. Adv. Neural Inf. Process. Syst. 2022, 35, 13800–13813. [Google Scholar]
  36. Baldini, G.; Albini, A.; Maiolino, P.; Cannata, G. An Atlas for the Inkjet Printing of Large-Area Tactile Sensors. Sensors 2022, 22, 2332. [Google Scholar] [CrossRef]
  37. Lin, W.; Wang, B.; Peng, G.; Shan, Y.; Hu, H.; Yang, Z. Skin-inspired piezoelectric tactile sensor array with crosstalk-free row+ column electrodes for spatiotemporally distinguishing diverse stimuli. Adv. Sci. 2021, 8, 2002817. [Google Scholar] [CrossRef]
  38. Mattens, F. The sense of touch: From tactility to tactual probing. Australas. J. Philos. 2017, 95, 688–701. [Google Scholar] [CrossRef]
  39. Imami, D.; Valentinov, V.; Skreli, E. Food safety and value chain coordination in the context of a transition economy: The role of agricultural cooperatives. Int. J. Commons 2021, 15, 21–34. [Google Scholar] [CrossRef]
  40. Mostafidi, M.; Sanjabi, M.R.; Shirkhan, F.; Zahedi, M.T. A review of recent trends in the development of the microbial safety of fruits and vegetables. Trends Food Sci. Technol. 2020, 103, 321–332. [Google Scholar] [CrossRef]
  41. Žuntar, I.; Petric, Z.; Bursać Kovačević, D.; Putnik, P. Safety of probiotics: Functional fruit beverages and nutraceuticals. Foods 2020, 9, 947. [Google Scholar] [CrossRef]
  42. Fleetwood, J.; Rahman, S.; Holland, D.; Millson, D.; Thomson, L.; Poppy, G. As clean as they look? Food hygiene inspection scores, microbiological contamination, and foodborne illness. Food Control 2019, 96, 76–86. [Google Scholar] [CrossRef]
  43. Tabrik, S.; Behroozi, M.; Schlaffke, L.; Heba, S.; Lenz, M.; Lissek, S.; Güntürkün, O.; Dinse, H.R.; Tegenthoff, M. Visual and tactile sensory systems share common features in object recognition. eNeuro 2021, 8. [Google Scholar] [CrossRef] [PubMed]
  44. Smith, E.; Calandra, R.; Romero, A.; Gkioxari, G.; Meger, D.; Malik, J.; Drozdzal, M. 3d shape reconstruction from vision and touch. Adv. Neural Inf. Process. Syst. 2020, 33, 14193–14206. [Google Scholar]
  45. Kappassov, Z.; Corrales, J.A.; Perdereau, V. Tactile sensing in dexterous robot hands. Robot. Auton. Syst. 2015, 74, 195–220. [Google Scholar] [CrossRef]
  46. Hu, Z.; Lin, L.; Lin, W.; Xu, Y.; Xia, X.; Peng, Z.; Sun, Z.; Wang, Z. Machine Learning for Tactile Perception: Advancements, Challenges, and Opportunities. Adv. Intell. Syst. 2023, 2200371. [Google Scholar] [CrossRef]
  47. Chi, C.; Sun, X.; Xue, N.; Li, T.; Liu, C. Recent progress in technologies for tactile sensors. Sensors 2018, 18, 948. [Google Scholar] [CrossRef]
  48. Zhu, Y.; Liu, Y.; Sun, Y.; Zhang, Y.; Ding, G. Recent advances in resistive sensor technology for tactile perception: A review. IEEE Sens. J. 2022, 22, 15635–15649. [Google Scholar] [CrossRef]
  49. Peng, Y.; Yang, N.; Xu, Q.; Dai, Y.; Wang, Z. Recent advances in flexible tactile sensors for intelligent systems. Sensors 2021, 21, 5392. [Google Scholar] [CrossRef]
  50. Wei, Y.; Xu, Q. An overview of micro-force sensing techniques. Sens. Actuators A Phys. 2015, 234, 359–374. [Google Scholar] [CrossRef]
  51. Tiwana, M.I.; Redmond, S.J.; Lovell, N.H. A review of tactile sensing technologies with applications in biomedical engineering. Sens. Actuators A Phys. 2012, 179, 17–31. [Google Scholar] [CrossRef]
  52. Wang, C.; Dong, L.; Peng, D.; Pan, C. Tactile sensors for advanced intelligent systems. Adv. Intell. Syst. 2019, 1, 1900090. [Google Scholar] [CrossRef]
  53. Rehan, M.; Saleem, M.M.; Tiwana, M.I.; Shakoor, R.I.; Cheung, R. A Soft Multi-Axis High Force Range Magnetic Tactile Sensor for Force Feedback in Robotic Surgical Systems. Sensors 2022, 22, 3500. [Google Scholar] [CrossRef] [PubMed]
  54. Soleimani, M.; Friedrich, M. E-skin using fringing field electrical impedance tomography with an ionic liquid domain. Sensors 2022, 22, 5040. [Google Scholar] [CrossRef] [PubMed]
  55. Wu, H.; Zheng, B.; Wang, H.; Ye, J. New Flexible Tactile Sensor Based on Electrical Impedance Tomography. Micromachines 2022, 13, 185. [Google Scholar] [CrossRef] [PubMed]
  56. Fang, B.; Sun, F.; Yang, C.; Xue, H.; Chen, W.; Zhang, C.; Guo, D.; Liu, H. A dual-modal vision-based tactile sensor for robotic hand grasping. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 4740–4745. [Google Scholar]
  57. Zhang, T.; Cong, Y.; Li, X.; Peng, Y. Robot tactile sensing: Vision based tactile sensor for force perception. In Proceedings of the 2018 IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Tianjin, China, 19 July–23 July 2018; pp. 1360–1365. [Google Scholar]
  58. Ward-Cherrier, B.; Pestell, N.; Cramphorn, L.; Winstone, B.; Giannaccini, M.E.; Rossiter, J.; Lepora, N.F. The tactip family: Soft optical tactile sensors with 3d-printed biomimetic morphologies. Soft Robot. 2018, 5, 216–227. [Google Scholar] [CrossRef]
  59. Lin, X.; Wiertlewski, M. Sensing the frictional state of a robotic skin via subtractive color mixing. IEEE Robot. Autom. Lett. 2019, 4, 2386–2392. [Google Scholar] [CrossRef]
  60. Ward-Cherrier, B.; Pestell, N.; Lepora, N.F. Neurotac: A neuromorphic optical tactile sensor applied to texture recognition. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 2654–2660. [Google Scholar]
  61. Sferrazza, C.; D’Andrea, R. Design, motivation and evaluation of a full-resolution optical tactile sensor. Sensors 2019, 19, 928. [Google Scholar] [CrossRef]
  62. Do, W.K.; Kennedy, M. DenseTact: Optical tactile sensor for dense shape reconstruction. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 6188–6194. [Google Scholar]
  63. Li, R.; Platt, R.; Yuan, W.; ten Pas, A.; Roscup, N.; Srinivasan, M.A.; Adelson, E. Localization and manipulation of small parts using gelsight tactile sensing. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 3988–3993. [Google Scholar]
  64. Gomes, D.F.; Luo, S. GelTip tactile sensor for dexterous manipulation in clutter. In Tactile Sensing, Skill Learning, and Robotic Dexterous Manipulation; Elsevier: Amsterdam, The Netherlands, 2022; pp. 3–21. [Google Scholar]
  65. Romero, B.; Veiga, F.; Adelson, E. Soft, round, high resolution tactile fingertip sensors for dexterous robotic manipulation. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 4796–4802. [Google Scholar]
  66. Padmanabha, A.; Ebert, F.; Tian, S.; Calandra, R.; Finn, C.; Levine, S. Omnitact: A multi-directional high-resolution touch sensor. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 618–624. [Google Scholar]
  67. Alspach, A.; Hashimoto, K.; Kuppuswamy, N.; Tedrake, R. Soft-bubble: A highly compliant dense geometry tactile sensor for robot manipulation. In Proceedings of the 2019 2nd IEEE International Conference on Soft Robotics (RoboSoft), Seoul, Republic of Korea, 14–18 April 2019; pp. 597–604. [Google Scholar] [CrossRef]
  68. Lambeta, M.; Chou, P.W.; Tian, S.; Yang, B.; Maloon, B.; Most, V.R.; Stroud, D.; Santos, R.; Byagowi, A.; Kammerer, G.; et al. Digit: A novel design for a low-cost compact high-resolution tactile sensor with application to in-hand manipulation. IEEE Robot. Autom. Lett. 2020, 5, 3838–3845. [Google Scholar] [CrossRef]
  69. Trueeb, C.; Sferrazza, C.; D’Andrea, R. Towards vision-based robotic skins: A data-driven, multi-camera tactile sensor. In Proceedings of the 2020 3rd IEEE International Conference on Soft Robotics (RoboSoft), New Haven, CT, USA, 15 May–15 July 2020; pp. 333–338. [Google Scholar]
  70. Kappassov, Z.; Baimukashev, D.; Kuanyshuly, Z.; Massalin, Y.; Urazbayev, A.; Varol, H.A. Color-Coded Fiber-Optic Tactile Sensor for an Elastomeric Robot Skin. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 2146–2152. [Google Scholar] [CrossRef]
  71. Chuang, C.H.; Weng, H.K.; Chen, J.W.; Shaikh, M.O. Ultrasonic tactile sensor integrated with TFT array for force feedback and shape recognition. Sens. Actuators A Phys. 2018, 271, 348–355. [Google Scholar] [CrossRef]
  72. Shinoda, H.; Ando, S. A tactile sensor with 5-D deformation sensing element. In Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MI, USA, 22–28 April 1996; Volume 1, pp. 7–12. [Google Scholar]
  73. Ando, S.; Shinoda, H. Ultrasonic emission tactile sensing. IEEE Control Syst. Mag. 1995, 15, 61–69. [Google Scholar]
  74. Ando, S.; Shinoda, H.; Yonenaga, A.; Terao, J. Ultrasonic six-axis deformation sensing. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2001, 48, 1031–1045. [Google Scholar] [CrossRef] [PubMed]
  75. Shinoda, H.; Ando, S. Ultrasonic emission tactile sensor for contact localization and characterization. In Proceedings of the 1994 IEEE International Conference on Robotics and Automation, San Diego, CA, USA, 8–13 May 1994; pp. 2536–2543. [Google Scholar]
  76. Gong, D.; He, R.; Yu, J.; Zuo, G. A pneumatic tactile sensor for co-operative robots. Sensors 2017, 17, 2592. [Google Scholar] [CrossRef] [PubMed]
  77. Yao, G.; Xu, L.; Cheng, X.; Li, Y.; Huang, X.; Guo, W.; Liu, S.; Wang, Z.L.; Wu, H. Bioinspired triboelectric nanogenerators as self-powered electronic skin for robotic tactile sensing. Adv. Funct. Mater. 2020, 30, 1907312. [Google Scholar] [CrossRef]
  78. Lu, Z.; Gao, X.; Yu, H. GTac: A biomimetic tactile sensor with skin-like heterogeneous force feedback for robots. IEEE Sens. J. 2022, 22, 14491–14500. [Google Scholar] [CrossRef]
  79. Ma, M.; Zhang, Z.; Zhao, Z.; Liao, Q.; Kang, Z.; Gao, F.; Zhao, X.; Zhang, Y. Self-powered flexible antibacterial tactile sensor based on triboelectric-piezoelectric-pyroelectric multi-effect coupling mechanism. Nano Energy 2019, 66, 104105. [Google Scholar] [CrossRef]
  80. Park, K.; Yuk, H.; Yang, M.; Cho, J.; Lee, H.; Kim, J. A biomimetic elastomeric robot skin using electrical impedance and acoustic tomography for tactile sensing. Sci. Robot. 2022, 7, eabm7187. [Google Scholar] [CrossRef] [PubMed]
  81. Chang, K.; Guo, M.; Pu, L.; Dong, J.; Li, L.; Ma, P.; Huang, Y.; Liu, T. Wearable nanofibrous tactile sensors with fast response and wireless communication. Chem. Eng. J. 2023, 451, 138578. [Google Scholar] [CrossRef]
  82. Ham, J.; Huh, T.M.; Kim, J.; Kim, J.O.; Park, S.; Cutkosky, M.R.; Bao, Z. Porous Dielectric Elastomer Based Flexible Multiaxial Tactile Sensor for Dexterous Robotic or Prosthetic Hands. Adv. Mater. Technol. 2023, 8, 2200903. [Google Scholar] [CrossRef]
  83. Yu, P.; Liu, W.; Gu, C.; Cheng, X.; Fu, X. Flexible piezoelectric tactile sensor array for dynamic three-axis force measurement. Sensors 2016, 16, 819. [Google Scholar] [CrossRef]
  84. Andrussow, I.; Sun, H.; Kuchenbecker, K.J.; Martius, G. Minsight: A Fingertip-Sized Vision-Based Tactile Sensor for Robotic Manipulation. Adv. Intell. Syst. 2023, 2300042. [Google Scholar] [CrossRef]
  85. Yousef, H.; Boukallel, M.; Althoefer, K. Tactile sensing for dexterous in-hand manipulation in robotics—A review. Sens. Actuators A Phys. 2011, 167, 171–187. [Google Scholar] [CrossRef]
  86. Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile sensing—From humans to humanoids. IEEE Trans. Robot. 2009, 26, 1–20. [Google Scholar] [CrossRef]
  87. Zheng, H.; Jin, Y.; Wang, H.; Zhao, P. DotView: A Low-Cost Compact Tactile Sensor for Pressure, Shear, and Torsion Estimation. IEEE Robot. Autom. Lett. 2023, 8, 880–887. [Google Scholar] [CrossRef]
  88. Sygulla, F.; Ellensohn, F.; Hildebrandt, A.C.; Wahrmann, D.; Rixen, D. A flexible and low-cost tactile sensor for robotic applications. In Proceedings of the 2017 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Singapore, 29 May–3 June 2017; pp. 58–63. [Google Scholar]
  89. Yao, T.; Guo, X.; Li, C.; Qi, H.; Lin, H.; Liu, L.; Dai, Y.; Qu, L.; Huang, Z.; Liu, P.; et al. Highly sensitive capacitive flexible 3D-force tactile sensors for robotic grasping and manipulation. J. Phys. D Appl. Phys. 2020, 53, 445109. [Google Scholar] [CrossRef]
  90. Yan, Y.; Hu, Z.; Yang, Z.; Yuan, W.; Song, C.; Pan, J.; Shen, Y. Soft magnetic skin for super-resolution tactile sensing with force self-decoupling. Sci. Robot. 2021, 6, eabc8801. [Google Scholar] [CrossRef]
  91. Vishnu, R.S.; Mandil, W.; Parsons, S.; Ghalamzan E, A. Acoustic Soft Tactile Skin (AST Skin). arXiv 2023, arXiv:2303.17355. [Google Scholar]
  92. Stachowsky, M.; Hummel, T.; Moussa, M.; Abdullah, H.A. A slip detection and correction strategy for precision robot grasping. IEEE/ASME Trans. Mechatron. 2016, 21, 2214–2226. [Google Scholar] [CrossRef]
  93. Wall, V.; Zöller, G.; Brock, O. Passive and Active Acoustic Sensing for Soft Pneumatic Actuators. arXiv 2022, arXiv:2208.10299. [Google Scholar] [CrossRef]
  94. Zöller, G.; Wall, V.; Brock, O. Acoustic sensing for soft pneumatic actuators. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 6986–6991. [Google Scholar]
  95. Zöller, G.; Wall, V.; Brock, O. Active acoustic contact sensing for soft pneumatic actuators. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 7966–7972. [Google Scholar]
  96. Zhu, M.; Wang, Y.; Lou, M.; Yu, J.; Li, Z.; Ding, B. Bioinspired transparent and antibacterial electronic skin for sensitive tactile sensing. Nano Energy 2021, 81, 105669. [Google Scholar] [CrossRef]
  97. Wang, S.; Xiang, J.; Sun, Y.; Wang, H.; Du, X.; Cheng, X.; Du, Z.; Wang, H. Skin-inspired nanofibrillated cellulose-reinforced hydrogels with high mechanical strength, long-term antibacterial, and self-recovery ability for wearable strain/pressure sensors. Carbohydr. Polym. 2021, 261, 117894. [Google Scholar] [CrossRef]
  98. Cui, X.; Chen, J.; Wu, W.; Liu, Y.; Li, H.; Xu, Z.; Zhu, Y. Flexible and breathable all-nanofiber iontronic pressure sensors with ultraviolet shielding and antibacterial performances for wearable electronics. Nano Energy 2022, 95, 107022. [Google Scholar] [CrossRef]
  99. Ippili, S.; Jella, V.; Lee, J.M.; Jung, J.S.; Lee, D.H.; Yang, T.Y.; Yoon, S.G. ZnO–PTFE-based antimicrobial, anti-reflective display coatings and high-sensitivity touch sensors. J. Mater. Chem. A 2022, 10, 22067–22079. [Google Scholar] [CrossRef]
  100. Tian, X.; Hua, T. Antibacterial, scalable manufacturing, skin-attachable, and eco-friendly fabric triboelectric nanogenerators for self-powered sensing. ACS Sustain. Chem. Eng. 2021, 9, 13356–13366. [Google Scholar] [CrossRef]
  101. Si, Z.; Yu, T.C.; Morozov, K.; McCann, J.; Yuan, W. RobotSweater: Scalable, Generalizable, and Customizable Machine-Knitted Tactile Skins for Robots. arXiv 2023, arXiv:2303.02858. [Google Scholar]
  102. Maslyczyk, A.; Roberge, J.P.; Duchaine, V.; Loan Le, T.H. A highly sensitive multimodal capacitive tactile sensor. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 407–412. [Google Scholar]
  103. Xu, D.; Hu, B.; Zheng, G.; Wang, J.; Li, C.; Zhao, Y.; Yan, Z.; Jiao, Z.; Wu, Y.; Wang, M.; et al. Sandwich-like flexible tactile sensor based on bioinspired honeycomb dielectric layer for three-axis force detection and robotic application. J. Mater. Sci. Mater. Electron. 2023, 34, 942. [Google Scholar] [CrossRef]
  104. Arshad, A.; Saleem, M.M.; Tiwana, M.I.; ur Rahman, H.; Iqbal, S.; Cheung, R. A high sensitivity and multi-axis fringing electric field based capacitive tactile force sensor for robot assisted surgery. Sens. Actuators A Phys. 2023, 354, 114272. [Google Scholar] [CrossRef]
  105. Fujiwara, E.; de Oliveira Rosa, L. Agar-based soft tactile transducer with embedded optical fiber specklegram sensor. Results Opt. 2023, 10, 100345. [Google Scholar] [CrossRef]
  106. Xie, H.; Jiang, A.; Seneviratne, L.; Althoefer, K. Pixel-based optical fiber tactile force sensor for robot manipulation. In Proceedings of the SENSORS, 2012 IEEE, Daegu, Republic of Korea, 22–25 October 2012; pp. 1–4. [Google Scholar] [CrossRef]
  107. Althoefer, K.; Ling, Y.; Li, W.; Qian, X.; Lee, W.W.; Qi, P. A Miniaturised Camera-based Multi-Modal Tactile Sensor. arXiv 2023, arXiv:2303.03093. [Google Scholar]
  108. Kang, Z.; Li, X.; Zhao, X.; Wang, X.; Shen, J.; Wei, H.; Zhu, X. Piezo-Resistive Flexible Pressure Sensor by Blade-Coating Graphene–Silver Nanosheet–Polymer Nanocomposite. Nanomaterials 2023, 13, 4. [Google Scholar] [CrossRef] [PubMed]
  109. Ohashi, H.; Yasuda, T.; Kawasetsu, T.; Hosoda, K. Soft Tactile Sensors Having Two Channels With Different Slopes for Contact Position and Pressure Estimation. IEEE Sens. Lett. 2023, 7, 2000704. [Google Scholar] [CrossRef]
  110. Sappati, K.K.; Bhadra, S. Flexible piezoelectric 0–3 PZT-PDMS thin film for tactile sensing. IEEE Sens. J. 2020, 20, 4610–4617. [Google Scholar] [CrossRef]
  111. Lu, D.; Liu, T.; Meng, X.; Luo, B.; Yuan, J.; Liu, Y.; Zhang, S.; Cai, C.; Gao, C.; Wang, J.; et al. Wearable triboelectric visual sensors for tactile perception. Adv. Mater. 2023, 35, 2209117. [Google Scholar] [CrossRef] [PubMed]
  112. Chang, K.B.; Parashar, P.; Shen, L.C.; Chen, A.R.; Huang, Y.T.; Pal, A.; Lim, K.C.; Wei, P.H.; Kao, F.C.; Hu, J.J.; et al. A triboelectric nanogenerator-based tactile sensor array system for monitoring pressure distribution inside prosthetic limb. Nano Energy 2023, 111, 108397. [Google Scholar] [CrossRef]
  113. Hu, J.; Cui, S.; Wang, S.; Zhang, C.; Wang, R.; Chen, L.; Li, Y. GelStereo Palm: A Novel Curved Visuotactile Sensor for 3D Geometry Sensing. IEEE Trans. Ind. Inf. 2023; early access. [Google Scholar] [CrossRef]
  114. Sepehri, A.; Helisaz, H.; Chiao, M. A fiber Bragg grating tactile sensor for soft material characterization based on quasi linear viscoelastic analysis. Sens. Actuators A Phys. 2023, 349, 114079. [Google Scholar] [CrossRef]
  115. Jenkinson, G.P.; Conn, A.T.; Tzemanaki, A. ESPRESS. 0: Eustachian Tube-Inspired Tactile Sensor Exploiting Pneumatics for Range Extension and SenSitivity Tuning. Sensors 2023, 23, 567. [Google Scholar] [CrossRef]
  116. Cao, G.; Jiang, J.; Lu, C.; Gomes, D.F.; Luo, S. TouchRoller: A Rolling Optical Tactile Sensor for Rapid Assessment of Textures for Large Surface Areas. Sensors 2023, 23, 2661. [Google Scholar] [CrossRef]
  117. Peyre, K.; Tourlonias, M.; Bueno, M.A.; Spano, F.; Rossi, R.M. Tactile perception of textile surfaces from an artificial finger instrumented by a polymeric optical fibre. Tribol. Int. 2019, 130, 155–169. [Google Scholar] [CrossRef]
  118. Kootstra, G.; Wang, X.; Blok, P.M.; Hemming, J.; Van Henten, E. Selective harvesting robotics: Current research, trends, and future directions. Curr. Robot. Rep. 2021, 2, 95–104. [Google Scholar] [CrossRef]
  119. Ishikawa, R.; Hamaya, M.; Von Drigalski, F.; Tanaka, K.; Hashimoto, A. Learning by Breaking: Food Fracture Anticipation for Robotic Food Manipulation. IEEE Access 2022, 10, 99321–99329. [Google Scholar] [CrossRef]
  120. Drimus, A.; Kootstra, G.; Bilberg, A.; Kragic, D. Design of a flexible tactile sensor for classification of rigid and deformable objects. Robot. Autom. Syst. 2014, 62, 3–15. [Google Scholar] [CrossRef]
  121. Mandil, W.; Ghalamzan-E, A. Combining Vision and Tactile Sensation for Video Prediction. arXiv 2023, arXiv:2304.11193. [Google Scholar]
  122. Johansson, R.S.; Flanagan, J.R. Coding and use of tactile signals from the fingertips in object manipulation tasks. Nat. Revi. Neurosci. 2009, 10, 345–359. [Google Scholar] [CrossRef] [PubMed]
  123. Deng, Z.; Jonetzko, Y.; Zhang, L.; Zhang, J. Grasping force control of multi-fingered robotic hands through tactile sensing for object stabilization. Sensors 2020, 20, 1050. [Google Scholar] [CrossRef] [PubMed]
  124. Jara, C.A.; Pomares, J.; Candelas, F.A.; Torres, F. Control framework for dexterous manipulation using dynamic visual servoing and tactile sensors’ feedback. Sensors 2014, 14, 1787–1804. [Google Scholar] [CrossRef]
  125. Bicchi, A.; Kumar, V. Robotic grasping and contact: A review. In Proceedings of the 2000 ICRA, Millennium Conference, IEEE International Conference on Robotics and Automation, Symposia Proceedings (Cat. No. 00CH37065), San Francisco, CA, USA, 24–28 April 2000; Volume 1, pp. 348–353. [Google Scholar]
  126. Bekiroglu, Y.; Laaksonen, J.; Jorgensen, J.A.; Kyrki, V.; Kragic, D. Assessing grasp stability based on learning and haptic data. IEEE Trans. Robot. 2011, 27, 616–629. [Google Scholar] [CrossRef]
  127. Lynch, P.; Cullinan, M.F.; McGinn, C. Adaptive grasping of moving objects through tactile sensing. Sensors 2021, 21, 8339. [Google Scholar] [CrossRef]
  128. Kroemer, O.; Daniel, C.; Neumann, G.; Van Hoof, H.; Peters, J. Towards learning hierarchical skills for multi-phase manipulation tasks. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Washington, DC, USA, 26–30 May 2015; pp. 1503–1510. [Google Scholar]
  129. Kolamuri, R.; Si, Z.; Zhang, Y.; Agarwal, A.; Yuan, W. Improving grasp stability with rotation measurement from tactile sensing. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 6809–6816. [Google Scholar]
  130. Hogan, F.R.; Bauza, M.; Canal, O.; Donlon, E.; Rodriguez, A. Tactile regrasp: Grasp adjustments via simulated tactile transformations. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 2963–2970. [Google Scholar]
  131. Mahler, J.; Liang, J.; Niyaz, S.; Laskey, M.; Doan, R.; Liu, X.; Ojea, J.A.; Goldberg, K. Dex-net 2.0: Deep learning to plan robust grasps with synthetic point clouds and analytic grasp metrics. arXiv 2017, arXiv:1703.09312. [Google Scholar]
  132. Kalashnikov, D.; Irpan, A.; Pastor, P.; Ibarz, J.; Herzog, A.; Jang, E.; Quillen, D.; Holly, E.; Kalakrishnan, M.; Vanhoucke, V.; et al. Qt-opt: Scalable deep reinforcement learning for vision-based robotic manipulation. arXiv 2018, arXiv:1806.10293. [Google Scholar]
  133. Matak, M.; Hermans, T. Planning Visual-Tactile Precision Grasps via Complementary Use of Vision and Touch. IEEE Robot. Autom. Lett. 2022, 8, 768–775. [Google Scholar] [CrossRef]
  134. Romeo, R.A.; Zollo, L. Methods and sensors for slip detection in robotics: A survey. IEEE Access 2020, 8, 73027–73050. [Google Scholar] [CrossRef]
  135. Chen, W.; Khamis, H.; Birznieks, I.; Lepora, N.F.; Redmond, S.J. Tactile sensors for friction estimation and incipient slip detection—Toward dexterous robotic manipulation: A review. IEEE Sens. J. 2018, 18, 9049–9064. [Google Scholar] [CrossRef]
  136. Yang, H.; Hu, X.; Cao, L.; Sun, F. A new slip-detection method based on pairwise high frequency components of capacitive sensor signals. In Proceedings of the 2015 5th International Conference on Information Science and Technology (ICIST), Kopaonik, Serbia, 8–11 March 2015; pp. 56–61. [Google Scholar]
  137. Romeo, R.A.; Oddo, C.M.; Carrozza, M.C.; Guglielmelli, E.; Zollo, L. Slippage detection with piezoresistive tactile sensors. Sensors 2017, 17, 1844. [Google Scholar] [CrossRef] [PubMed]
  138. Su, Z.; Hausman, K.; Chebotar, Y.; Molchanov, A.; Loeb, G.E.; Sukhatme, G.S.; Schaal, S. Force estimation and slip detection/classification for grip control using a biomimetic tactile sensor. In Proceedings of the 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), Seoul, Republic of Korea, 3–5 November 2015; pp. 297–303. [Google Scholar]
  139. Veiga, F.; Van Hoof, H.; Peters, J.; Hermans, T. Stabilizing novel objects by learning to predict tactile slip. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–3 October 2015; pp. 5065–5072. [Google Scholar]
  140. James, J.W.; Pestell, N.; Lepora, N.F. Slip detection with a biomimetic tactile sensor. IEEE Robot. Autom. Lett. 2018, 3, 3340–3346. [Google Scholar] [CrossRef]
  141. Kaboli, M.; Yao, K.; Cheng, G. Tactile-based manipulation of deformable objects with dynamic center of mass. In Proceedings of the 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 15–17 November 2016; pp. 752–757. [Google Scholar]
  142. Van Wyk, K.; Falco, J. Calibration and analysis of tactile sensors as slip detectors. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 2744–2751. [Google Scholar]
  143. Romano, J.M.; Hsiao, K.; Niemeyer, G.; Chitta, S.; Kuchenbecker, K.J. Human-inspired robotic grasp control with tactile sensing. IEEE Trans. Robot. 2011, 27, 1067–1079. [Google Scholar] [CrossRef]
  144. Hasegawa, H.; Mizoguchi, Y.; Tadakuma, K.; Ming, A.; Ishikawa, M.; Shimojo, M. Development of intelligent robot hand using proximity, contact and slip sensing. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AL, USA, 3–7 May 2010; pp. 777–784. [Google Scholar]
  145. Li, J.; Dong, S.; Adelson, E. Slip detection with combined tactile and visual information. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 7772–7777. [Google Scholar]
  146. Zhang, Y.; Kan, Z.; Tse, Y.A.; Yang, Y.; Wang, M.Y. FingerVision tactile sensor design and slip detection using convolutional LSTM network. arXiv 2018, arXiv:1810.02653. [Google Scholar]
  147. Garcia-Garcia, A.; Zapata-Impata, B.S.; Orts-Escolano, S.; Gil, P.; Garcia-Rodriguez, J. TactileGCN: A graph convolutional network for predicting grasp stability with tactile sensors. In Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14–19 July 2019; pp. 1–8. [Google Scholar]
  148. Mandil, W.; Nazari, K.; Ghalamzan E, A. Action conditioned tactile prediction: A case study on slip prediction. arXiv 2022, arXiv:2205.09430. [Google Scholar]
  149. Nazari, K.; Mandil, W.; Hanheide, M.; Esfahani, A.G. Tactile dynamic behaviour prediction based on robot action. In Proceedings of Towards Autonomous Robotic Systems: 22nd Annual Conference, TAROS 2021, Lincoln, UK, 8–10 September 2021; Springer: Berlin/Heidelberg, Germany, 2021; pp. 284–293. [Google Scholar]
  150. Nazari, K.; Mandil, W.; Esfahani, A.M.G. Proactive slip control by learned slip model and trajectory adaptation. In Proceedings of the Conference on Robot Learning, PMLR, Auckland, New Zealand, 8 March 2023; pp. 751–761. [Google Scholar]
  151. Mayol-Cuevas, W.W.; Juarez-Guerrero, J.; Munoz-Gutierrez, S. A first approach to tactile texture recognition. In Proceedings of the SMC’98 Conference Proceedings, 1998 IEEE International Conference on Systems, Man, and Cybernetics (Cat. No. 98CH36218), San Diego, CA, USA, 14 October 1998; Volume 5, pp. 4246–4250. [Google Scholar]
  152. Muhammad, H.; Recchiuto, C.; Oddo, C.M.; Beccai, L.; Anthony, C.; Adams, M.; Carrozza, M.C.; Ward, M. A capacitive tactile sensor array for surface texture discrimination. Microelectron. Eng. 2011, 88, 1811–1813. [Google Scholar] [CrossRef]
  153. Drimus, A.; Petersen, M.B.; Bilberg, A. Object texture recognition by dynamic tactile sensing using active exploration. In Proceedings of the 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 9–13 September 2012; pp. 277–283. [Google Scholar]
  154. Chun, S.; Kim, J.S.; Yoo, Y.; Choi, Y.; Jung, S.J.; Jang, D.; Lee, G.; Song, K.I.; Nam, K.S.; Youn, I.; et al. An artificial neural tactile sensing system. Nat. Electron. 2021, 4, 429–438. [Google Scholar] [CrossRef]
  155. Jamali, N.; Sammut, C. Material classification by tactile sensing using surface textures. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 2336–2341. [Google Scholar]
  156. Li, R.; Adelson, E.H. Sensing and recognizing surface textures using a gelsight sensor. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 1241–1247. [Google Scholar]
  157. Song, Z.; Yin, J.; Wang, Z.; Lu, C.; Yang, Z.; Zhao, Z.; Lin, Z.; Wang, J.; Wu, C.; Cheng, J.; et al. A flexible triboelectric tactile sensor for simultaneous material and texture recognition. Nano Energy 2022, 93, 106798. [Google Scholar] [CrossRef]
  158. Luo, S.; Bimbo, J.; Dahiya, R.; Liu, H. Robotic tactile perception of object properties: A review. Mechatronics 2017, 48, 54–67. [Google Scholar] [CrossRef]
  159. Tsuji, S.; Kohama, T. Using a convolutional neural network to construct a pen-type tactile sensor system for roughness recognition. Sens. Actuators A Phys. 2019, 291, 7–12. [Google Scholar] [CrossRef]
  160. Gao, Y.; Hendricks, L.A.; Kuchenbecker, K.J.; Darrell, T. Deep learning for tactile understanding from visual and haptic data. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 536–543. [Google Scholar]
  161. Taunyazov, T.; Chua, Y.; Gao, R.; Soh, H.; Wu, Y. Fast texture classification using tactile neural coding and spiking neural network. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 9890–9895. [Google Scholar]
  162. Luo, S.; Yuan, W.; Adelson, E.; Cohn, A.G.; Fuentes, R. Vitac: Feature sharing between vision and tactile sensing for cloth texture recognition. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 2722–2727. [Google Scholar]
  163. Howe, R.D. Tactile sensing and control of robotic manipulation. Adv. Robot. 1993, 8, 245–261. [Google Scholar] [CrossRef]
  164. Su, Z.; Fishel, J.A.; Yamamoto, T.; Loeb, G.E. Use of tactile feedback to control exploratory movements to characterize object compliance. Front. Neurorobot. 2012, 6, 7. [Google Scholar] [CrossRef] [PubMed]
  165. Dean-Leon, E.; Bergner, F.; Ramirez-Amaro, K.; Cheng, G. From multi-modal tactile signals to a compliant control. In Proceedings of the 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 15–17 November 2016; pp. 892–898. [Google Scholar]
  166. Dean-Leon, E.; Guadarrama-Olvera, J.R.; Bergner, F.; Cheng, G. Whole-body active compliance control for humanoid robots with robot skin. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 5404–5410. [Google Scholar]
  167. Calandra, R.; Ivaldi, S.; Deisenroth, M.P.; Peters, J. Learning torque control in presence of contacts using tactile sensing from robot skin. In Proceedings of the 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), Seoul, Republic of Korea, 3–5 November 2015; pp. 690–695. [Google Scholar]
  168. Xu, D.; Loeb, G.E.; Fishel, J.A. Tactile identification of objects using Bayesian exploration. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 3056–3061. [Google Scholar]
  169. Goger, D.; Gorges, N.; Worn, H. Tactile sensing for an anthropomorphic robotic hand: Hardware and signal processing. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 895–901. [Google Scholar]
  170. Pezzementi, Z.; Plaku, E.; Reyda, C.; Hager, G.D. Tactile-object recognition from appearance information. IEEE Trans. Robot. 2011, 27, 473–487. [Google Scholar] [CrossRef]
  171. Li, G.; Liu, S.; Wang, L.; Zhu, R. Skin-inspired quadruple tactile sensors integrated on a robot hand enable object recognition. Sci. Robot. 2020, 5, eabc8134. [Google Scholar] [CrossRef]
  172. Pastor, F.; García-González, J.; Gandarias, J.M.; Medina, D.; Closas, P.; García-Cerezo, A.J.; Gómez-de Gabriel, J.M. Bayesian and neural inference on lstm-based object recognition from tactile and kinesthetic information. IEEE Robot. Autom. Lett. 2020, 6, 231–238. [Google Scholar] [CrossRef]
  173. Yuan, W.; Dong, S.; Adelson, E.H. Gelsight: High-resolution robot tactile sensors for estimating geometry and force. Sensors 2017, 17, 2762. [Google Scholar] [CrossRef]
  174. Okamura, A.M.; Richard, C.; Cutkosky, M.R. Feeling is believing: Using a force-feedback joystick to teach dynamic systems. J. Eng. Educ. 2002, 91, 345–349. [Google Scholar] [CrossRef]
  175. Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE Trans. Haptics 2017, 10, 580–600. [Google Scholar] [CrossRef]
  176. Lee, M.H.; Nicholls, H.R. Review Article Tactile sensing for mechatronics—A state of the art survey. Mechatronics 1999, 9, 1–31. [Google Scholar] [CrossRef]
  177. Okamura, A.M. Haptic feedback in robot-assisted minimally invasive surgery. Curr. Opin. Urol. 2009, 19, 102. [Google Scholar] [CrossRef]
  178. Sun, Z.; Zhu, M.; Shan, X.; Lee, C. Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions. Nat. Commun. 2022, 13, 5224. [Google Scholar] [CrossRef] [PubMed]
  179. Tian, S.; Ebert, F.; Jayaraman, D.; Mudigonda, M.; Finn, C.; Calandra, R.; Levine, S. Manipulation by feel: Touch-based control with deep predictive models. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 818–824. [Google Scholar]
  180. Vouloutsi, V.; Cominelli, L.; Dogar, M.; Lepora, N.; Zito, C.; Martinez-Hernandez, U. Towards Living Machines: Current and future trends of tactile sensing, grasping, and social robotics. Bioinspir. Biomim. 2023, 18, 025002. [Google Scholar] [CrossRef]
  181. Siegrist, M.; Hartmann, C. Consumer acceptance of novel food technologies. Nat. Food 2020, 1, 343–350. [Google Scholar] [CrossRef] [PubMed]
  182. Lezoche, M.; Hernandez, J.E.; Díaz, M.d.M.E.A.; Panetto, H.; Kacprzyk, J. Agri-food 4.0: A survey of the supply chains and technologies for the future agriculture. Comput. Ind. 2020, 117, 103187. [Google Scholar] [CrossRef]
  183. Carmela Annosi, M.; Brunetta, F.; Capo, F.; Heideveld, L. Digitalization in the agri-food industry: The relationship between technology and sustainable development. Manag. Decis. 2020, 58, 1737–1757. [Google Scholar] [CrossRef]
  184. Miranda, J.; Ponce, P.; Molina, A.; Wright, P. Sensing, smart and sustainable technologies for Agri-Food 4.0. Comput. Ind. 2019, 108, 21–36. [Google Scholar] [CrossRef]
  185. Gandarias, J.M.; Garcia-Cerezo, A.J.; Gomez-de Gabriel, J.M. CNN-based methods for object recognition with high-resolution tactile sensors. IEEE Sens. J. 2019, 19, 6872–6882. [Google Scholar] [CrossRef]
  186. Platkiewicz, J.; Lipson, H.; Hayward, V. Haptic edge detection through shear. Sci. Rep. 2016, 6, 23551. [Google Scholar] [CrossRef]
  187. Parvizi-Fard, A.; Amiri, M.; Kumar, D.; Iskarous, M.M.; Thakor, N.V. A functional spiking neuronal network for tactile sensing pathway to process edge orientation. Sci. Rep. 2021, 11, 1320. [Google Scholar] [CrossRef]
  188. Yuan, X.; Zou, J.; Sun, L.; Liu, H.; Jin, G. Soft tactile sensor and curvature sensor for caterpillar-like soft robot’s adaptive motion. In Proceedings of the 2019 International Conference on Robotics, Intelligent Control and Artificial Intelligence, Long Beach, CA, USA, 9–15 June 2019; pp. 690–695. [Google Scholar]
  189. Luo, S.; Mou, W.; Althoefer, K.; Liu, H. Novel Tactile-SIFT descriptor for object shape recognition. IEEE Sens. J. 2015, 15, 5001–5009. [Google Scholar] [CrossRef]
  190. Amirkhani, G.; Goodridge, A.; Esfandiari, M.; Phalen, H.; Ma, J.H.; Iordachita, I.; Armand, M. Design and Fabrication of a Fiber Bragg Grating Shape Sensor for Shape Reconstruction of a Continuum Manipulator. IEEE Sens. J. 2023, 23, 12915–12929. [Google Scholar] [CrossRef]
  191. Sotgiu, E.; Aguiam, D.E.; Calaza, C.; Rodrigues, J.; Fernandes, J.; Pires, B.; Moreira, E.E.; Alves, F.; Fonseca, H.; Dias, R.; et al. Surface texture detection with a new sub-mm resolution flexible tactile capacitive sensor array for multimodal artificial finger. J. Microelectromech. Syst. 2020, 29, 629–636. [Google Scholar] [CrossRef]
  192. Pang, Y.; Xu, X.; Chen, S.; Fang, Y.; Shi, X.; Deng, Y.; Wang, Z.L.; Cao, C. Skin-inspired textile-based tactile sensors enable multifunctional sensing of wearables and soft robots. Nano Energy 2022, 96, 107137. [Google Scholar] [CrossRef]
  193. Abd, M.A.; Paul, R.; Aravelli, A.; Bai, O.; Lagos, L.; Lin, M.; Engeberg, E.D. Hierarchical tactile sensation integration from prosthetic fingertips enables multi-texture surface recognition. Sensors 2021, 21, 4324. [Google Scholar] [CrossRef]
  194. Liu, W.; Zhang, G.; Zhan, B.; Hu, L.; Liu, T. Fine Texture Detection Based on a Solid–Liquid Composite Flexible Tactile Sensor Array. Micromachines 2022, 13, 440. [Google Scholar] [CrossRef]
  195. Choi, D.; Jang, S.; Kim, J.S.; Kim, H.J.; Kim, D.H.; Kwon, J.Y. A highly sensitive tactile sensor using a pyramid-plug structure for detecting pressure, shear force, and torsion. Adv. Mater. Technol. 2019, 4, 1800284. [Google Scholar] [CrossRef]
  196. Weng, L.; Xie, G.; Zhang, B.; Huang, W.; Wang, B.; Deng, Z. Magnetostrictive tactile sensor array for force and stiffness detection. J. Magn. Magn. Mater. 2020, 513, 167068. [Google Scholar] [CrossRef]
  197. Zhang, Y.; Ju, F.; Wei, X.; Wang, D.; Wang, Y. A piezoelectric tactile sensor for tissue stiffness detection with arbitrary contact angle. Sensors 2020, 20, 6607. [Google Scholar] [CrossRef]
  198. Christopher, C.T.; Fath Elbab, A.M.; Osueke, C.O.; Ikua, B.W.; Sila, D.N.; Fouly, A. A piezoresistive dual-tip stiffness tactile sensor for mango ripeness assessment. Cogent Eng. 2022, 9, 2030098. [Google Scholar] [CrossRef]
  199. Li, Y.; Cao, Z.; Li, T.; Sun, F.; Bai, Y.; Lu, Q.; Wang, S.; Yang, X.; Hao, M.; Lan, N.; et al. Highly selective biomimetic flexible tactile sensor for neuroprosthetics. Research 2020, 2020. [Google Scholar] [CrossRef]
  200. Li, Y.; Zhao, M.; Yan, Y.; He, L.; Wang, Y.; Xiong, Z.; Wang, S.; Bai, Y.; Sun, F.; Lu, Q.; et al. Multifunctional biomimetic tactile system via a stick-slip sensing strategy for human–machine interactions. npj Flex. Electron. 2022, 6, 46. [Google Scholar] [CrossRef]
  201. Wi, Y.; Florence, P.; Zeng, A.; Fazeli, N. Virdo: Visio-tactile implicit representations of deformable objects. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 3583–3590. [Google Scholar]
  202. Huang, H.J.; Guo, X.; Yuan, W. Understanding dynamic tactile sensing for liquid property estimation. arXiv 2022, arXiv:2205.08771. [Google Scholar]
  203. Zhao, D.; Sun, F.; Wang, Z.; Zhou, Q. A novel accurate positioning method for object pose estimation in robotic manipulation based on vision and tactile sensors. Int. J. Adv. Manuf. Technol. 2021, 116, 2999–3010. [Google Scholar] [CrossRef]
  204. Sui, R.; Zhang, L.; Li, T.; Jiang, Y. Incipient slip detection method with vision-based tactile sensor based on distribution force and deformation. IEEE Sens. J. 2021, 21, 25973–25985. [Google Scholar] [CrossRef]
  205. Gomes, D.F.; Lin, Z.; Luo, S. GelTip: A finger-shaped optical tactile sensor for robotic manipulation. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 9903–9909. [Google Scholar]
  206. von Drigalski, F.; Hayashi, K.; Huang, Y.; Yonetani, R.; Hamaya, M.; Tanaka, K.; Ijiri, Y. Precise multi-modal in-hand pose estimation using low-precision sensors for robotic assembly. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China, 30 May–5 June 2021; pp. 968–974. [Google Scholar]
  207. Patel, R.; Ouyang, R.; Romero, B.; Adelson, E. Digger finger: Gelsight tactile sensor for object identification inside granular media. In Proceedings of the Experimental Robotics: The 17th International Symposium; Springer: Berlin/Heidelberg, Germany, 2021; pp. 105–115. [Google Scholar]
  208. Abderrahmane, Z.; Ganesh, G.; Crosnier, A.; Cherubini, A. A deep learning framework for tactile recognition of known as well as novel objects. IEEE Trans. Ind. Inf. 2019, 16, 423–432. [Google Scholar] [CrossRef]
  209. Schmitz, A.; Maiolino, P.; Maggiali, M.; Natale, L.; Cannata, G.; Metta, G. Methods and technologies for the implementation of large-scale robot tactile sensors. IEEE Trans. Robot. 2011, 27, 389–400. [Google Scholar] [CrossRef]
  210. Spiers, A.J.; Liarokapis, M.V.; Calli, B.; Dollar, A.M. Single-grasp object classification and feature extraction with simple robot hands and tactile sensors. IEEE Trans. Haptics 2016, 9, 207–220. [Google Scholar]
  211. Tenzer, Y.; Jentoft, L.P.; Howe, R.D. The feel of MEMS barometers: Inexpensive and easily customized tactile array sensors. IEEE Robot. Autom. Mag. 2014, 21, 89–95. [Google Scholar] [CrossRef]
  212. Chen, Y.; Lin, J.; Du, X.; Fang, B.; Sun, F.; Li, S. Non-destructive Fruit Firmness Evaluation Using Vision-Based Tactile Information. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 2303–2309. [Google Scholar]
  213. Wan, C.; Cai, P.; Guo, X.; Wang, M.; Matsuhisa, N.; Yang, L.; Lv, Z.; Luo, Y.; Loh, X.J.; Chen, X. An artificial sensory neuron with visual-haptic fusion. Nat. Commun. 2020, 11, 4602. [Google Scholar]
  214. Dang, H.; Allen, P.K. Stable grasping under pose uncertainty using tactile feedback. Auton. Robot. 2014, 36, 309–330. [Google Scholar]
  215. Papadimitriou, C.H. Computational complexity. In Encyclopedia of Computer Science; Wiley: Hoboken, NJ, USA, 2003; pp. 260–265. [Google Scholar]
  216. Kim, M.; Yang, J.; Kim, D.; Yun, D. Soft tactile sensor to detect the slip of a Robotic hand. Measurement 2022, 200, 111615. [Google Scholar] [CrossRef]
  217. Fu, X.; Zhang, J.; Xiao, J.; Kang, Y.; Yu, L.; Jiang, C.; Pan, Y.; Dong, H.; Gao, S.; Wang, Y. A high-resolution, ultrabroad-range and sensitive capacitive tactile sensor based on a CNT/PDMS composite for robotic hands. Nanoscale 2021, 13, 18780–18788. [Google Scholar] [PubMed]
  218. Cho, M.Y.; Lee, J.W.; Park, C.; Lee, B.D.; Kyeong, J.S.; Park, E.J.; Lee, K.Y.; Sohn, K.S. Large-Area Piezoresistive Tactile Sensor Developed by Training a Super-Simple Single-Layer Carbon Nanotube-Dispersed Polydimethylsiloxane Pad. Adv. Intell. Syst. 2022, 4, 2100123. [Google Scholar] [CrossRef]
  219. Lee, D.H.; Chuang, C.H.; Shaikh, M.O.; Dai, Y.S.; Wang, S.Y.; Wen, Z.H.; Yen, C.K.; Liao, C.F.; Pan, C.T. Flexible piezoresistive tactile sensor based on polymeric nanocomposites with grid-type microstructure. Micromachines 2021, 12, 452. [Google Scholar]
  220. Wang, S.; Lambeta, M.; Chou, P.W.; Calandra, R. Tacto: A fast, flexible, and open-source simulator for high-resolution vision-based tactile sensors. IEEE Robot. Autom. Lett. 2022, 7, 3930–3937. [Google Scholar] [CrossRef]
  221. Zhang, Y.; Kan, Z.; Yang, Y.; Tse, Y.A.; Wang, M.Y. Effective estimation of contact force and torque for vision-based tactile sensors with helmholtz–hodge decomposition. IEEE Robot. Autom. Lett. 2019, 4, 4094–4101. [Google Scholar] [CrossRef]
  222. Wang, A.; Kurutach, T.; Liu, K.; Abbeel, P.; Tamar, A. Learning robotic manipulation through visual planning and acting. arXiv 2019, arXiv:1905.04411. [Google Scholar]
  223. Nguyen, V.D. Constructing force-closure grasps. Int. J. Robot. Res. 1988, 7, 3–16. [Google Scholar]
  224. Han, L.; Li, Z.; Trinkle, J.C.; Qin, Z.; Jiang, S. The planning and control of robot dextrous manipulation. In Proceedings of the 2000 ICRA, Millennium Conference, IEEE International Conference on Robotics and Automation, Symposia Proceedings (Cat. No. 00CH37065), San Francisco, CA, USA, 24–28 April 2000; Volume 1, pp. 263–269. [Google Scholar]
  225. Liu, Y.; Jiang, D.; Tao, B.; Qi, J.; Jiang, G.; Yun, J.; Huang, L.; Tong, X.; Chen, B.; Li, G. Grasping posture of humanoid manipulator based on target shape analysis and force closure. Alex. Eng. J. 2022, 61, 3959–3969. [Google Scholar]
  226. He, L.; Lu, Q.; Abad, S.A.; Rojas, N.; Nanayakkara, T. Soft fingertips with tactile sensing and active deformation for robust grasping of delicate objects. IEEE Robot. Autom. Lett. 2020, 5, 2714–2721. [Google Scholar] [CrossRef]
  227. Wen, R.; Yuan, K.; Wang, Q.; Heng, S.; Li, Z. Force-guided high-precision grasping control of fragile and deformable objects using semg-based force prediction. IEEE Robot. Autom. Lett. 2020, 5, 2762–2769. [Google Scholar] [CrossRef]
  228. Yin, Z.H.; Huang, B.; Qin, Y.; Chen, Q.; Wang, X. Rotating without Seeing: Towards In-hand Dexterity through Touch. arXiv 2023, arXiv:2303.10880. [Google Scholar]
  229. Khamis, H.; Xia, B.; Redmond, S.J. Real-time friction estimation for grip force control. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China, 30 May–5 June 2021; pp. 1608–1614. [Google Scholar]
  230. Zhang, Y.; Yuan, W.; Kan, Z.; Wang, M.Y. Towards learning to detect and predict contact events on vision-based tactile sensors. In Proceedings of the Conference on Robot Learning, PMLR, Cambridge, MA, USA, 16–18 November 2020; pp. 1395–1404. [Google Scholar]
  231. Prescott, T.J.; Diamond, M.E.; Wing, A.M. Active touch sensing. Philos. Trans. R. Soc. B Biol. Sci. 2011, 366, 2989–2995. [Google Scholar] [CrossRef] [PubMed]
  232. Proske, U.; Gandevia, S.C. The kinaesthetic senses. J. Physiol. 2009, 587, 4139–4146. [Google Scholar] [CrossRef]
  233. Görner, M.; Haschke, R.; Ritter, H.; Zhang, J. Moveit! task constructor for task-level motion planning. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 190–196. [Google Scholar]
  234. Liu, S.; Liu, P. Benchmarking and optimization of robot motion planning with motion planning pipeline. Int. J. Adv. Manuf. Technol. 2022, 118, 949–961. [Google Scholar] [CrossRef]
  235. Ravichandar, H.; Polydoros, A.S.; Chernova, S.; Billard, A. Recent advances in robot learning from demonstration. Annu. Rev. Control Robot. Auton. Syst. 2020, 3, 297–330. [Google Scholar] [CrossRef]
  236. Sanni, O.; Bonvicini, G.; Khan, M.A.; López-Custodio, P.C.; Nazari, K.; Ghalamzan E., A.M. Deep movement primitives: Toward breast cancer examination robot. In Proceedings of the AAAI Conference on Artificial Intelligence, Online, 22 February–1 March 2022; Volume 36, pp. 12126–12134. [Google Scholar]
  237. Dabrowski, J.J.; Rahman, A. Fruit Picker Activity Recognition with Wearable Sensors and Machine Learning. arXiv 2023, arXiv:2304.10068. [Google Scholar]
  238. Ngiam, J.; Khosla, A.; Kim, M.; Nam, J.; Lee, H.; Ng, A.Y. Multimodal deep learning. In Proceedings of the 28th International Conference on Machine Learning (ICML-11), Washington, DC, USA, 28 June–2 July 2011; pp. 689–696. [Google Scholar]
  239. Joshi, G.; Walambe, R.; Kotecha, K. A review on explainability in multimodal deep neural nets. IEEE Access 2021, 9, 59800–59821. [Google Scholar] [CrossRef]
  240. Calandra, R.; Owens, A.; Jayaraman, D.; Lin, J.; Yuan, W.; Malik, J.; Adelson, E.H.; Levine, S. More than a feeling: Learning to grasp and regrasp using vision and touch. IEEE Robot. Autom. Lett. 2018, 3, 3300–3307. [Google Scholar] [CrossRef]
  241. Palermo, F.; Konstantinova, J.; Althoefer, K.; Poslad, S.; Farkhatdinov, I. Implementing tactile and proximity sensing for crack detection. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 632–637. [Google Scholar]
  242. Liang, J.; Wu, J.; Huang, H.; Xu, W.; Li, B.; Xi, F. Soft sensitive skin for safety control of a nursing robot using proximity and tactile sensors. IEEE Sens. J. 2019, 20, 3822–3830. [Google Scholar] [CrossRef]
  243. Wang, H.; De Boer, G.; Kow, J.; Alazmani, A.; Ghajari, M.; Hewson, R.; Culmer, P. Design methodology for magnetic field-based soft tri-axis tactile sensors. Sensors 2016, 16, 1356. [Google Scholar] [CrossRef] [PubMed]
  244. Khamis, H.; Xia, B.; Redmond, S.J. A novel optical 3D force and displacement sensor–Towards instrumenting the PapillArray tactile sensor. Sens. Actuators A Phys. 2019, 291, 174–187. [Google Scholar] [CrossRef]
  245. Mukashev, D.; Zhuzbay, N.; Koshkinbayeva, A.; Orazbayev, B.; Kappassov, Z. PhotoElasticFinger: Robot Tactile Fingertip Based on Photoelastic Effect. Sensors 2022, 22, 6807. [Google Scholar] [CrossRef]
  246. Costanzo, M.; De Maria, G.; Natale, C.; Pirozzi, S. Design and calibration of a force/tactile sensor for dexterous manipulation. Sensors 2019, 19, 966. [Google Scholar] [CrossRef] [PubMed]
  247. Zapata-Impata, B.S.; Gil, P.; Torres, F. Learning spatio temporal tactile features with a ConvLSTM for the direction of slip detection. Sensors 2019, 19, 523. [Google Scholar] [CrossRef]
  248. Bimbo, J.; Luo, S.; Althoefer, K.; Liu, H. In-hand object pose estimation using covariance-based tactile to geometry matching. IEEE Robot. Autom. Lett. 2016, 1, 570–577. [Google Scholar] [CrossRef]
  249. Lancaster, P.; Yang, B.; Smith, J.R. Improved object pose estimation via deep pre-touch sensing. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 2448–2455. [Google Scholar]
  250. Villalonga, M.B.; Rodriguez, A.; Lim, B.; Valls, E.; Sechopoulos, T. Tactile object pose estimation from the first touch with geometric contact rendering. In Proceedings of the Conference on Robot Learning, PMLR, London, UK, 8–11 November 2021; pp. 1015–1029. [Google Scholar]
  251. Li, T.; Sun, X.; Shu, X.; Wang, C.; Wang, Y.; Chen, G.; Xue, N. Robot grasping system and grasp stability prediction based on flexible tactile sensor array. Machines 2021, 9, 119. [Google Scholar] [CrossRef]
  252. Funabashi, S.; Morikuni, S.; Geier, A.; Schmitz, A.; Ogasa, S.; Torno, T.P.; Somlor, S.; Sugano, S. Object recognition through active sensing using a multi-fingered robot hand with 3d tactile sensors. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 2589–2595. [Google Scholar]
  253. Wang, L.; Ma, L.; Yang, J.; Wu, J. Human somatosensory processing and artificial somatosensation. Cyborg Bionic Syst. 2021, 2021, 9843259. [Google Scholar] [CrossRef]
  254. Langdon, A.J.; Boonstra, T.W.; Breakspear, M. Multi-frequency phase locking in human somatosensory cortex. Prog. Biophys. Mol. Biol. 2011, 105, 58–66. [Google Scholar] [CrossRef]
  255. Härtner, J.; Strauss, S.; Pfannmöller, J.; Lotze, M. Tactile acuity of fingertips and hand representation size in human Area 3b and Area 1 of the primary somatosensory cortex. NeuroImage 2021, 232, 117912. [Google Scholar] [CrossRef] [PubMed]
  256. Abdeetedal, M.; Kermani, M.R. Grasp and stress analysis of an underactuated finger for proprioceptive tactile sensing. IEEE/ASME Trans. Mechatron. 2018, 23, 1619–1629. [Google Scholar] [CrossRef]
  257. Ntagios, M.; Nassar, H.; Pullanchiyodan, A.; Navaraj, W.T.; Dahiya, R. Robotic hands with intrinsic tactile sensing via 3D printed soft pressure sensors. Adv. Intell. Syst. 2020, 2, 1900080. [Google Scholar] [CrossRef]
  258. Luo, S.; Zhou, X.; Tang, X.; Li, J.; Wei, D.; Tai, G.; Chen, Z.; Liao, T.; Fu, J.; Wei, D.; et al. Microconformal electrode-dielectric integration for flexible ultrasensitive robotic tactile sensing. Nano Energy 2021, 80, 105580. [Google Scholar] [CrossRef]
  259. Shintake, J.; Cacucciolo, V.; Floreano, D.; Shea, H. Soft robotic grippers. Adv. Mater. 2018, 30, 1707035. [Google Scholar] [CrossRef]
  260. Dahiya, R.; Akinwande, D.; Chang, J.S. Flexible electronic skin: From humanoids to humans [scanning the issue]. Proc. IEEE 2019, 107, 2011–2015. [Google Scholar] [CrossRef]
  261. Zhang, Y.; Lin, Z.; Huang, X.; You, X.; Ye, J.; Wu, H. A Large-Area, Stretchable, Textile-Based Tactile Sensor. Adv. Mater. Technol. 2020, 5, 1901060. [Google Scholar] [CrossRef]
Figure 1. A summary of the work and contributions in this review. We introduce tactile-sensing technologies and their general applications and algorithms in Section 2 and Section 3. This context is then used to examine in depth the current use of tactile sensing in the agri-food sector in Section 4. Finally, we provide a detailed analysis of the shortcomings of current tactile-sensing technology.
Figure 3. Example of low-cost tactile sensing: Sensorizing a flexible soft skin by constructing internal acoustic channels.
Figure 4. Examples of tactile-sensing systems used to extract features from food items. (a) A pneumatic robot gripper for sorting eggplants by firmness, Blanes et al. [5]; (b) mango ripeness assessment through a visuo-tactile system, Cortés et al. [11].
Figure 5. A novel multi-fingered, soft harvesting end effector for apples. (a,b) The end effector has the novel ability to remove a grasping finger in situations of physical obstruction, Zhou et al. [22].
Figure 6. Soft tactile-sensing end effector that measures contact force distribution and finger deformation during grasping for food item identification, Zhang et al. [13]. (a) Schematic of the tactile system; (b) the real tactile system grasping a soft fruit for identification.
Figure 7. Physical manipulation of harvesting environments using tactile sensing: strawberry cluster manipulation using a vision-based tactile sensor.
Figure 8. Block diagram of the challenges of using tactile sensors.
Figure 9. Examples of tactile sensor calibration setups. (a) Force calibration of a magnetic field-based tri-axial tactile sensor [243]; (b) force calibration of a photoelastic tactile sensor [245]; (c) force and slip calibration of an optical tactile sensor (PapillArray) [244]; and (d) force calibration of an opto-electronic tactile sensor [246].
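The calibration setups shown in Figure 9 all reduce to the same numerical step: collecting paired samples of raw sensor output and ground-truth force from a reference load cell, then fitting a mapping between them. A minimal sketch of that step, assuming a simple polynomial model and synthetic illustrative data (the values below are hypothetical, not taken from any cited sensor):

```python
import numpy as np

# Illustrative ground-truth forces (N) from a reference load cell
# and the corresponding raw tactile readings (arbitrary units).
force_n = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
raw = np.array([2.0, 131.0, 255.0, 370.0, 489.0, 602.0, 718.0])

# Fit a 2nd-order polynomial mapping raw reading -> force (least squares).
coeffs = np.polyfit(raw, force_n, deg=2)
calibrate = np.poly1d(coeffs)

# Apply the calibration to a new raw reading.
estimated = float(calibrate(320.0))

# The maximum fit residual is one simple measure of calibration quality.
residual = float(np.max(np.abs(calibrate(raw) - force_n)))
print(f"estimated force: {estimated:.2f} N, max residual: {residual:.3f} N")
```

In practice the model form (linear, polynomial, or a learned regressor) and the loading protocol (normal force only vs. combined normal/shear, as in panels (a)–(d)) depend on the sensor's transduction principle.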
Table 2. Summarising the most important uses of tactile sensors in robotics and automation.
Robot Task Type | Tactile Sensor Type | Cited Research
Robot Control Tasks | Force Control | [45,86,122,123,124]
 | Robotic Grasping | [123,125,126,127,128,129,130,131,132,133]
 | Slip Detection | [123,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150]
 | Object Pushing | [29,121,179]
 | Compliance Control | [163,164,165,166,167]
 | Haptic Feedback | [174,175,176,177,178]
Feature Extraction Tasks | Texture Recognition | [47,60,151,152,153,155,156,157,158,159,160,161,162]
 | Object Recognition | [43,45,86,120,156,160,168,169,170,171,172]
 | 3D Shape Reconstruction | [44,62,173]
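As a concrete illustration of the Slip Detection row above: many of the force-based approaches cited there ultimately compare the measured tangential-to-normal force ratio against an estimated friction coefficient (the Coulomb friction cone). A minimal sketch under that assumption — the friction coefficient, safety margin, and readings below are hypothetical values for illustration only:

```python
def slip_imminent(f_tangential: float, f_normal: float,
                  mu: float = 0.5, margin: float = 0.9) -> bool:
    """Flag incipient slip when the tangential load approaches the
    friction cone boundary |Ft| = mu * Fn (Coulomb friction model).

    mu: estimated static friction coefficient (hypothetical value).
    margin: fraction of the cone at which the flag is raised early,
    giving the controller time to react before gross slip occurs.
    """
    if f_normal <= 0.0:          # no contact force -> treat as slipping
        return True
    return abs(f_tangential) >= margin * mu * f_normal

# A grip controller would increase normal force whenever the flag fires.
readings = [(0.10, 1.0), (0.30, 1.0), (0.48, 1.0)]  # (Ft, Fn) in newtons
flags = [slip_imminent(ft, fn) for ft, fn in readings]
print(flags)  # prints [False, False, True]
```

The learned methods in the same row (e.g., the ConvLSTM and graph-network approaches [146,147]) replace this hand-set threshold with a classifier trained on labelled slip episodes, but the input-output contract — tactile signal in, slip flag out — is the same.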
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Mandil, W.; Rajendran, V.; Nazari, K.; Ghalamzan-Esfahani, A. Tactile-Sensing Technologies: Trends, Challenges and Outlook in Agri-Food Manipulation. Sensors 2023, 23, 7362. https://doi.org/10.3390/s23177362
