Review

Touch, Texture and Haptic Feedback: A Review on How We Feel the World around Us

by Aaron Raymond See *, Jose Antonio G. Choco and Kohila Chandramohan
Department of Electrical Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(9), 4686; https://doi.org/10.3390/app12094686
Submission received: 22 February 2022 / Revised: 4 May 2022 / Accepted: 5 May 2022 / Published: 6 May 2022

Abstract

Touch is one of the most important aspects of human life. Nearly all interactions, when broken down, involve touch in one form or another. Recent advances in technology, particularly in the field of virtual reality, have led to increasing interest in haptics research. However, accurately capturing touch remains one of the most difficult engineering challenges currently being faced. Recent advances in technology, such as microcontrollers that allow the creation of smaller sensors and feedback devices, may provide the solution. Beyond capturing and measuring touch, replicating touch is another unique challenge due to the complexity and sensitivity of the human skin. The development of flexible, soft wearable devices, however, has allowed for the creation of feedback systems that conform to the human form factor with minimal loss of accuracy, thus presenting possible solutions and opportunities. In this review, the researchers therefore aim to showcase the technologies currently being used in haptic feedback, together with their strengths and limitations.

1. Introduction

Touch is one of the most important senses in the human body and serves as the primary means by which people interact with their environment. It is rooted deeply in people’s everyday lives; from something as mundane as turning on the lights to driving a car to work, nearly every interaction, when broken down, involves touch in one form or another. Because of this, much attention and research have been focused on capturing and conveying that feeling of touch.
Haptics is described as the means by which information is conveyed through touch [1]. Currently, many devices use haptics to convey information, particularly through vibrations that signal when new information is being presented [2]. Beyond that, numerous other studies have been carried out in various fields to convey information through tactile stimulation, with examples ranging from educational learning [3,4] and vehicular navigation [5,6] to military training [7] and physical therapy [8]. These methods, however, are rudimentary at best and are only capable of providing surface-level stimulation such as force and shape, especially when compared to the rich amount of detail that our skin is capable of discerning. Texture, for example, is a very subtle but powerful tool for conveying information. Different qualities such as roughness or smoothness convey different information, which makes texture simulation an interesting but difficult topic.
Recent advances in technology have greatly increased interest in haptic research, especially with regard to virtual reality. Virtual reality environments provide an experience that normally cannot be achieved through regular means by allowing users to effectively interact with and sense objects through a combination of auditory, visual, and tactile feedback systems that immerse the user in the simulated environment [9]. Currently, virtual reality is primarily used as a form of entertainment, either through games or interactive media, with companies such as Facebook, Google, Samsung, HTC and many others investing millions in its research and development [10,11,12]. Aside from entertainment, research on and application of virtual reality systems have become more widespread in a variety of different fields due to the wide availability of development tools such as Unity3D and Vuforia [3,13,14]. Fields that have seen significant growth include educational learning [3,4], navigation [5,6], military training [15], architectural design [16], surgery [17,18], and physical therapy [8,19,20]. Beyond virtual reality systems, augmented reality systems have also seen increased interest. Unlike virtual reality systems, in which the user interacts solely with the virtual environment, augmented reality systems have the user interacting with both a virtual environment and the real world through the superimposition of virtual information onto real-world physical objects, which the user interacts with as they would any other object [9]. More recently, augmented reality systems have also seen multiple applications in various disciplines such as teaching [21,22], surgery [23,24], and navigation [25].
The expanding interest in haptics brings with it a range of challenges that will require different technologies to overcome. The objective of this review is to present the different technologies currently being used in the field of haptics, along with their strengths and weaknesses.

2. The Human Skin and Its Functions in Touch

The human skin contains two main receptor types: slow-adapting mechanoreceptors and fast-adapting mechanoreceptors. These mechanoreceptors capture several pieces of information pertaining to the human sense of touch, such as pressure and vibration, both of which play an important role in haptics. Pressure, for example, plays an important role in force sensing and shape discrimination [26,27], and vibration plays an especially important role in a number of different scenarios such as roughness discrimination and pressure sensing in the human body [28,29,30], slip detection [31], and shape perception [32,33,34]. Slow-adapting mechanoreceptors, namely the slow-adapting type 1 and type 2 units associated with Merkel cells and Ruffini endings, mainly react to continuously applied stimuli such as pressure and skin stretch, whereas fast-adapting mechanoreceptors, namely the fast-adapting type 1 and type 2 units associated with Meissner and Pacinian corpuscles, respond only briefly to stimuli such as vibrations before discharging completely [35], and were long believed to play only a supporting role to the slow-adapting mechanoreceptors. Recent studies, however, have shown that fast-adapting mechanoreceptors play a more crucial role in texture sensing than was previously believed [36,37]. A study in 2013 showed that this division only holds up to a certain vibrational frequency: slow-adapting mechanoreceptors showed little response at frequencies above roughly 10 Hz, whereas fast-adapting mechanoreceptors continued to exhibit strong responses throughout the tested range [36].
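As a toy illustration of the frequency split described above, the following minimal Python sketch labels which receptor classes would be expected to respond strongly at a given vibration frequency. The single 10 Hz boundary is a simplification taken from the discussion above, and the labels are purely illustrative, not a physiological model.

# Illustrative sketch only: maps a vibration frequency to the mechanoreceptor
# classes expected to respond strongly, using the ~10 Hz boundary discussed
# above. The threshold is a simplified assumption, not a physiological constant.

def dominant_receptors(frequency_hz: float) -> list[str]:
    """Return receptor classes assumed to respond strongly at this frequency."""
    responders = ["FA-I/FA-II (Meissner, Pacinian)"]  # fast-adapting respond across the range
    if frequency_hz < 10.0:
        # Slow-adapting units (Merkel, Ruffini) mainly encode sustained pressure
        # and skin stretch; their vibration response falls off above ~10 Hz.
        responders.append("SA-I/SA-II (Merkel, Ruffini)")
    return responders

if __name__ == "__main__":
    for f in (2, 50, 250):
        print(f"{f} Hz -> {dominant_receptors(f)}")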
Beyond pressure and vibration, other factors also make up the whole of tactile stimulation. A person’s sense and perception of touch can be described as multidimensional, in that it takes into account various properties such as the roughness, hardness, stickiness and other dimensions of an object that cannot easily be represented or described by any single measure [29,37,38,39].

3. Sensors

The development of tactile sensors capable of sensing touch is one of the most difficult engineering challenges currently being tackled. While sensing loosely defined dimensions such as stickiness, wetness, and other similar properties that would allow us to fully encapsulate touch is still beyond what most sensors are capable of accurately measuring, the primary factors that determine texture, such as force and vibrations, are not. With regard to texture, an ideal tactile sensor array should have a number of key defining features, namely the ability to detect pressure and frequency over a very wide range, high sensitivity, and high durability [40]. The sensor array should also be very flexible, so as to easily conform to different types of form factors, and lightweight, so as not to hinder the user’s movements, without sacrificing accuracy or sensitivity [41]. Currently, there is a wide variety of sensor types being used for sensing touch, which can be classified based on the sensing modality being used, as shown in Table 1.

3.1. Strain Gauge

Strain gauges are sensors used to measure strain. They function by measuring the change in resistance that occurs when they are stretched or compressed. Strain gauges are known for their versatility and accuracy and can be attached to many different surfaces due to being small and light [66]. Their low cost allows for easy implementation in various tests and studies [42,43,67,68,69]; however, humidity must be considered in prolonged usage as it can change the nominal resistance of the sensor. Payo et al., for example, made use of strain gauges in their study of microvibrations and their effects on slip detection. They used a strain gauge encapsulated within an aluminum shell to detect vibrations during grasping, citing observable vibrational frequencies between 130 Hz and 300 Hz [44]. While strain gauges are versatile and can be attached to multiple surfaces, making them easy to integrate into a variety of different systems’ end effectors, they suffer from a significant amount of noise, which can reduce accuracy when measuring finer textures.
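To make the resistance-to-strain relationship concrete, the short Python sketch below converts a measured gauge resistance into strain using the standard relation ΔR/R = GF·ε. The nominal resistance and gauge factor are illustrative assumptions, not values taken from the cited studies.

# Minimal sketch of how a strain gauge reading is interpreted, assuming a
# hypothetical nominal resistance and gauge factor; values are illustrative.

NOMINAL_RESISTANCE_OHM = 350.0   # assumed unstrained gauge resistance
GAUGE_FACTOR = 2.0               # typical metallic-foil gauge factor (assumption)

def strain_from_resistance(measured_ohm: float) -> float:
    """Strain from the relative resistance change: delta_R / R = GF * strain."""
    delta_r = measured_ohm - NOMINAL_RESISTANCE_OHM
    return (delta_r / NOMINAL_RESISTANCE_OHM) / GAUGE_FACTOR

if __name__ == "__main__":
    # A 0.35 ohm increase on a 350 ohm gauge corresponds to 500 microstrain here.
    print(f"{strain_from_resistance(350.35) * 1e6:.0f} microstrain")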

3.2. Accelerometer

Accelerometers are among the most widely used sensors for vibration analysis. They function by capturing movements in varying directions as acceleration [46]. There are generally three types of accelerometers: piezoelectric, piezoresistive, and capacitive, with capacitive accelerometers being the most common. Most accelerometer-based research involves attaching the accelerometer to either a handheld probe or a robotic end-effector and sliding it across a number of different surface textures to determine surface roughness from the acceleration data [45,70,71]. MEMS accelerometers rely on integrating their signals to recover velocity or displacement, which can cause noise to accumulate and integration drift during prolonged usage [72].
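As a rough illustration of that drift, the sketch below double-integrates a zero-valued true acceleration that has been corrupted by a small constant bias and white noise; the resulting position error grows with time. The sample rate, bias, and noise level are arbitrary assumptions for demonstration only.

# Illustrative sketch of why integrating MEMS accelerometer data drifts:
# a small constant bias plus noise, integrated twice, produces a position
# error that grows with time. Numbers are arbitrary assumptions.
import random

DT = 0.001          # 1 kHz sample rate (assumption)
BIAS = 0.02         # constant accelerometer bias in m/s^2 (assumption)
NOISE_STD = 0.05    # white-noise standard deviation in m/s^2 (assumption)

def drift_after(seconds: float) -> float:
    """Double-integrate zero true acceleration corrupted by bias and noise."""
    velocity, position = 0.0, 0.0
    for _ in range(int(seconds / DT)):
        measured = BIAS + random.gauss(0.0, NOISE_STD)  # true acceleration is zero
        velocity += measured * DT
        position += velocity * DT
    return position

if __name__ == "__main__":
    for t in (1, 5, 10):
        print(f"after {t:2d} s: position drift ~ {drift_after(t):.3f} m")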

3.3. Piezoresistive Sensor

Piezoresistive tactile sensors monitor changes in pressure using two films separated by a thin sheet that detects minute changes in resistance. As force is applied to the sensor, the thin sheet is strained, which changes the resistance of the piezoresistive element on its surface. Piezoresistive sensors have been used in a variety of different studies due to their small profile and high performance, demonstrating texture discrimination by producing distinct signals for different texture types [47,48,49,50]. Other studies have also shown improved accuracy ranges by layering piezoresistive films on top of each other or by inducing specific shapes and patterns onto the film [51,52,53,54]. However, piezoresistive sensors generally have high power consumption.

3.4. Piezoelectric Sensor

Piezoelectric tactile sensors use materials or films that exhibit the piezoelectric effect, which allows a charge to build up when a mechanical force or pressure is applied. As force is applied to the sensor, an electric charge is generated which can be measured as a voltage proportional to the pressure. The advantages of piezoelectric sensors are their high frequency response and high sensitivity to dynamic touch. Because piezoelectric materials have very large internal resistances, they respond primarily to dynamic touch, making them uniquely similar to the fast-adapting mechanoreceptors in the skin; this has garnered significant research interest in neuromorphic tactile sensors, i.e., sensors that mimic biological functions [55,56,57,58,59,73]. However, as stated previously, the large internal resistance of piezoelectric materials prevents their use in static touch situations, or wherever a continuous pressure or force must be measured, limiting them to dynamic touch scenarios; they also see significant drop-offs in accuracy at lower frequency ranges.
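The sketch below illustrates, under assumed component values, why a piezoelectric element only reports dynamic touch: the charge generated by a force leaks away through the element's large but finite resistance, so the output for a held force decays toward zero. The piezoelectric constant, capacitance, and resistance are illustrative assumptions, not properties of any cited sensor.

# Sketch of the decaying response of a piezoelectric element to a held force.
# Component values are illustrative assumptions.
import math

D33 = 30e-12      # assumed piezoelectric charge constant, C/N
CAP = 1e-9        # assumed element capacitance, F
RES = 1e9         # assumed leakage/input resistance, ohms
TAU = RES * CAP   # discharge time constant (1 s here)

def output_voltage(force_n: float, held_for_s: float) -> float:
    """Voltage after a step force has been held for a given time."""
    v0 = D33 * force_n / CAP          # initial voltage from generated charge Q = d33 * F
    return v0 * math.exp(-held_for_s / TAU)

if __name__ == "__main__":
    for t in (0.0, 1.0, 5.0):
        print(f"1 N held {t:3.1f} s -> {output_voltage(1.0, t) * 1000:.2f} mV")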

3.5. Optical Sensor

Optical sensors make use of cameras to distinguish the surface characteristics of a textured surface. An event-based camera constantly streams data and detects any changes within the targeted surface, which is then used to determine the amount of force being applied onto the area. Recently, research has been carried out on a neuromorphic optical tactile sensor [60]. Neurotac is an optical sensor with a soft skin-like material that has various pins attached to it; whenever the material makes contact with a surface, the pins embedded in the skin move according to the amount and direction of the applied force, which is captured by an event-based camera. The minute pixel changes from each event are pooled together to form a taxel event, from which the data are then gathered in a manner similar to spike trains, mimicking how the body processes stimuli [61].
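A minimal sketch of the taxel-pooling idea is given below: pixel-level events from the event camera are binned into a coarse grid of "taxels", and the per-taxel event counts form a spike-train-like signal. The event format, sensor resolution, and grid size are assumptions for illustration, not the cited sensor's actual configuration.

# Minimal sketch of pooling event-camera pixel events into taxel counts.
from collections import Counter

TAXEL_GRID = (4, 4)       # assumed taxel layout
SENSOR_RES = (64, 64)     # assumed pixel resolution of the event camera

def pool_events(events: list[tuple[int, int]]) -> Counter:
    """Map (x, y) pixel events to taxel indices and count events per taxel."""
    px_per_taxel_x = SENSOR_RES[0] // TAXEL_GRID[0]
    px_per_taxel_y = SENSOR_RES[1] // TAXEL_GRID[1]
    counts = Counter()
    for x, y in events:
        taxel = (x // px_per_taxel_x, y // px_per_taxel_y)
        counts[taxel] += 1
    return counts

if __name__ == "__main__":
    # Fake events clustered near the top-left corner, as if a pin moved there.
    fake_events = [(3, 5), (4, 6), (5, 5), (40, 41), (3, 4)]
    print(pool_events(fake_events))  # taxel (0, 0) accumulates most events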

3.6. Multimodal Sensor

Multimodal tactile sensors can be classified as any device that utilizes different sensor types to improve accuracy or to compensate for deficiencies in certain sensors [74]. The Biotac sensor utilizes a number of different sensors to detect various information related to texture, such as pressure, vibration, and temperature, through a combination of thermoresistors, piezoresistive components, and pressure sensors [62,63]. Liang et al. describe a multimodal tactile device that seeks to overcome the piezoelectric limitation of only being usable in dynamic touch situations through the use of a photodetector to distinguish surface roughness [64]. Similarly, Ke et al. also used a piezoelectric sensor for dynamic touch situations and complemented it with a force sensor for static touch situations, citing good overall results for force measurement, contact point recognition, and texture recognition [65].
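The following hedged sketch illustrates the complementary-sensor idea described above: a static force channel covers sustained contact while a piezoelectric channel covers dynamic events, and a simple rule combines the two. The thresholds and the decision rule are illustrative assumptions, not the cited devices' actual designs.

# Hedged sketch of combining a static force channel with a dynamic
# (piezoelectric) vibration channel. Thresholds are assumptions.

def classify_contact(static_force_n: float, dynamic_rms_v: float) -> str:
    """Combine a static force reading with a piezoelectric vibration reading."""
    touching = static_force_n > 0.05       # assumed contact threshold (N)
    sliding = dynamic_rms_v > 0.01         # assumed vibration threshold (V RMS)
    if touching and sliding:
        return "sliding contact (texture information available)"
    if touching:
        return "static contact (force information only)"
    if sliding:
        return "transient event (tap or release)"
    return "no contact"

if __name__ == "__main__":
    print(classify_contact(0.3, 0.002))   # static grasp
    print(classify_contact(0.3, 0.05))    # finger sliding over a texture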

4. Haptic Feedback

Haptic feedback systems are generally classified into two main types based on the kind of feedback that they provide: cutaneous feedback and kinesthetic feedback. Figure 1 shows several different cutaneous and kinesthetic feedback devices currently being developed. Kinesthetic and cutaneous feedback devices each play different roles in haptics, and understanding their strengths and limitations will allow for the development of new technologies.
Kinesthesia or proprioception is defined as a person’s perception of body movement [79]. Mechanoreceptors within the joints and muscles send signals to the brain that allow a person to effectively judge movement and motion. Proprioception is an important aspect of daily life as it effectively allows a person to “judge limb movements and positions, force, heaviness, stiffness, and viscosity” [80]. Kinesthesia is important in that it is one of the body’s main tools for judging the amount of force being exerted on any given object; however, this can be difficult when applied to virtual reality simulations as the lack of a proper and tangible object can adversely affect a person’s perception of weight [81].
The second type of haptic feedback system is the cutaneous feedback system. Cutaneous feedback refers to all the stimuli received by the mechanoreceptors located in the skin and is generally what is meant when discussing the sense of touch. Touch is responsible for the detection of pressure and vibration stimuli and is most concentrated at the fingertips [82]. The vibrations caused when gripping or holding an object determine the amount of force required when grasping to avoid slippage [83]. Beyond the real-world implications of cutaneous feedback, studies have shown the various effects of incorporating tactile feedback in virtual reality simulations. It was found that the implementation of cutaneous feedback can greatly improve the weight perception of users even in the absence of kinesthetic feedback [81,84]. Girard et al. reported in a study of their haptic shear system, HapTip, that the presence of cutaneous feedback can significantly improve pattern recognition and the perception of weight distribution of virtual objects [85]. Another study has also shown that cutaneous feedback significantly improved the user’s force estimation when holding objects [86]. Other studies have also shown significant improvements when conducting various tasks within the virtual environment such as writing, puzzle solving, hand-based movement exercises, and the distinguishing of curved surfaces [75,87,88,89]. Compared with kinesthetic feedback, however, achieving accurate and believable cutaneous feedback is a significantly more difficult challenge.
An ideal haptic device should incorporate both kinesthetic and tactile feedback within itself to better simulate how human skin interacts with objects, with the device capable of locking the hands and fingers when holding or gripping a virtual object to simulate kinesthesis and an actuator to provide haptic feedback either through indentation, vibration, or other similar means, as shown in Figure 2.

4.1. Kinesthetic Feedback

Most kinesthetic feedback devices rely on actuators, generally attached to either the hands or the arms, that detect movement and apply an appropriate feedback force to that area to either permit or impede movement [78,90,91,92,93,94,95,96] and to simulate various motion types such as the gripping and lifting of objects. Unlike cutaneous feedback, which has actuators capable of stimulating the mechanoreceptors directly, such as in electrotactile or temperature-based feedback, kinesthetic feedback systems mostly rely on force feedback, which generally requires the use of large actuators to apply the appropriate force to the user. Kinesthetic feedback devices can be categorized into two main types: grounded and exoskeleton-based devices. Grounded force feedback systems are feedback systems mounted onto a stationary platform, where the user is either attached to or holds an end effector that tracks the user’s position while modulating the amount of force being applied. Exoskeleton-based feedback systems use actuators that are mounted onto and/or attached to the user’s body, usually the arms or hands, and are ungrounded.
Different actuators have been used throughout the development history of kinesthetic feedback systems. Early kinesthetic feedback systems made use of hydraulic systems placed on exoskeleton devices attached to the arms [97,98]. These devices, however, were often bulky and difficult to use, preventing use outside of laboratory settings. After hydraulic actuators, pneumatic actuators eventually saw more use in kinesthetic feedback systems due to their lightness, which allowed for less bulky and more comfortable suits [99,100]; however, the requirement of a pressurized fluid source remains a major issue.
Advances in technology have allowed for the creation of smaller actuators leading to the creation of lighter and smaller kinesthetic feedback systems allowing for a better form factor when equipped on the hands. Both Dexmo and Maestro make use of a number of motors attached to movable end effectors and synthetic fiber, respectively, as their braking system [95,101]. Cable-based actuators have also seen rising popularity, largely due to being lightweight in form factor and the flexible nature of the cable allowing for better motor placements without obstructing hand movement. Cybergrasp utilizes a series of cables attached to motors that pull the strings to prevent finger movement [102]. ExoTen uses a twisted string actuator system for its kinesthetic feedback system, with the twisted string actuator design allowing for high load outputs while using significantly smaller DC motors [78]. LucidVR also uses a string actuator attached to a series of servo motors as its braking system but with a more modular design owing to it being open source [77].
Aside from more traditional feedback systems such as hydraulics, pneumatics, and motors, more specialized braking systems have also seen use and research. SenseGlove uses magnetic brakes instead of traditional motors to transmit force between the wires and the fingertips [103]. Dextres uses a novel electrostatic braking system by using two metal strips and taking advantage of the differences in electrostatic attraction between the two strips to create electrostatic friction [93]. Research has also been carried out regarding the use of soft materials in hand exoskeletons. Jadhav et al. made use of McKibben muscles as their primary actuator, which closely mimic human muscle and offer a high force-to-weight ratio [104]. Recently, research regarding the use of fiber jamming in robotics has also increased. Fiber jamming methods make use of the properties of certain materials to undergo changes from liquid-like behavior to solid-like behavior, thereby increasing their stiffness and load capacity [105]. The variable stiffness provided by fiber jamming presents numerous opportunities in haptics, where the goal is to make the overall device as lightweight and responsive as possible so as not to obstruct the user experience [106].

4.1.1. Grounded Force Feedback System

Grounded force feedback systems are some of the oldest forms of force feedback systems. These systems are commonly used in gathering data regarding position and contact forces on the user. The original Phantom by Massie et al. makes use of an end effector attached to mechanical links with three brushed DC motors to facilitate 3-DOF movement for its user [107]. By representing only a single point within the virtual space, interactions within the environment can easily be calculated. Sato et al. designed a pulley-based force feedback system using string-based actuation [108]. The researchers utilized the pulley-based system to impede the user’s movement based on the virtual material being used: hard, soft, movable, or fixed.
One common application of grounded feedback systems is their use as data-gathering devices for force-based applications. Unlike most wearable systems, stationary systems offer a more stable and controlled experience, which makes them ideal for gathering or testing data with regard to haptic applications such as textures. These systems allow much more space to be dedicated to sensors used in data gathering, as well as to platforms dedicated to testing haptic feedback. Van Der Linde et al. describe the HapticMaster [109], which is based on an admittance control scheme wherein the user applies a force onto the device and the device reacts with the appropriate amount of force or displacement to simulate a force reaction. Leuschke et al. developed a finger haptic display which utilizes a flat coil actuator to generate force when rendering virtual objects and was used to test the psychophysical reactions of haptic perception [110].
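A minimal sketch of the admittance-control idea behind devices such as the HapticMaster is shown below: the measured user force drives a virtual mass-damper-spring, and the resulting position is what the motors are commanded to track. The virtual parameters and control rate are illustrative assumptions, not values from the cited system.

# Minimal admittance-control sketch: measured force in, motion command out.
MASS = 2.0        # virtual mass, kg (assumption)
DAMPING = 10.0    # virtual damping, N*s/m (assumption)
STIFFNESS = 0.0   # set > 0 to render a spring or wall, N/m
DT = 0.001        # control period, s (assumption)

def admittance_step(force_n: float, pos: float, vel: float) -> tuple[float, float]:
    """One integration step: acceleration = (F - b*v - k*x) / m."""
    acc = (force_n - DAMPING * vel - STIFFNESS * pos) / MASS
    vel += acc * DT
    pos += vel * DT
    return pos, vel     # pos is the position command sent to the motors

if __name__ == "__main__":
    pos, vel = 0.0, 0.0
    for _ in range(1000):                 # user pushes with 5 N for one second
        pos, vel = admittance_step(5.0, pos, vel)
    print(f"commanded displacement after 1 s: {pos:.3f} m")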
Another application of grounded kinesthetic feedback devices is in the field of rehabilitation. While wearable exoskeleton-based systems have seen increasing usage, they are bulky and cumbersome to wear for certain patients, which makes grounded feedback systems more appealing than their wearable counterparts. Grounded systems can be particularly useful for those suffering from leg injuries, where a stationary platform helps support the patient while walking or standing up. Yoon et al. developed the virtual walk master, which utilizes pneumatic actuators that provide 15-DOF movement across the foot [111] and is meant to simulate different walking conditions and terrain environments. Similarly, Ding et al. developed the NUVABAT [112], an ankle-based balance trainer for rehabilitation. NUVABAT utilizes magnetorheological fluids (MRFs) to control the braking system and adjusts itself based on the walking strength of the patient.
Grounded feedback systems are also well suited to teleoperation-based systems. In such systems, the user is capable of manipulating objects at a distance by virtually mimicking the object’s orientation. These systems consist of a leader/follower arrangement. The follower system at the remote site is equipped with different sensors which allow it to sense and perceive the surrounding environment accurately and to relay that information to the leader side of the system. The leader system should then have enough information to reproduce the conditions at the remote site so the user feels physically present. In such systems, haptic feedback plays an important role in ensuring the user’s sense of realism or telepresence. Inertia, clearance, and friction need to be considered in these systems while still providing large stiffness and adaptable force feedback to make the user experience more natural. Tobergte et al. developed the sigma.7 [113], a robotic teleoperation system for minimally invasive robotic surgery. The system is based on the delta haptic device [114], which is made up of three double-bar parallelograms that allow the end effector to maintain a position parallel to the base plane. Each bar is connected via a torque actuator in the form of an electric motor attached to a cable, which allows for easily changeable stiffness modulation and a maximum force of around 20 N. In addition, Vuliez et al. developed the Delthaptic, which combines two delta robots to obtain 6-DOF movement [115]. Each delta robot is actuated by three grounded motors, with the upper joint being substituted by a rotational joint to supplement the movement range of the human hand.
Grounded force feedback systems offer their users high levels of movement and accuracy when it comes to force feedback. Their grounded nature allows for more complex mechanisms and designs that would be difficult to apply in wearable devices. However, their stationary nature limits their use to more specialized work environments.

4.1.2. Exoskeleton-Based Force Feedback System

Exoskeleton-based force feedback systems have seen increased interest over the past few years due to the increasing popularity of VR systems. Unlike grounded force feedback systems, wearable devices are not limited to a constrained area; they allow the user to move around freely while still experiencing haptic feedback over a more general range. Wearable devices aim to preserve the body’s natural degrees of freedom: the device must conform to the user without impairing their movement.
Like grounded force feedback systems, wearable force feedback systems utilize actuators, attached directly onto the body, to either permit or impede movement. Different types of actuators have been used and developed, including string-based systems and pump-based systems. The fundamental principle behind these devices remains largely the same as that of grounded force feedback systems, as they also aim to either permit or impede the movement of the body.
One of the most common actuator types used in wearable feedback devices is the string-based actuator. Cables are fixed around key areas of the body, such as the joints and fingertips, and are driven by motors that adjust the tension in the cable, causing it to tense up in response to an applied force. String-based actuators generally require simple design and construction, offer fast and reliable manipulation, and can easily be scaled up, making them popular for many DIY projects. Schielle et al. developed a Bowden cable actuator for use in force feedback exoskeletons for the arm [116]. Within a Bowden transmission, a cable is guided inside a sheath and connected on both ends to the motor and robotic joint in a pull–pull configuration. Hosseini et al. developed the ExoTen glove [78], which utilizes a twisted-string-based system. Two strings are attached to the ends of a rotative DC motor and a moving element; as the DC motor rotates, the length of the strings is shortened, which generates a linear motion of the moving element. Herbin et al. also made use of a Bowden cable-based actuator system in the design of their 7-DOF arm-based exoskeleton. The design of the device is based on three groups of joints corresponding to the main arm divisions: a set of actuators was placed on the shoulder joints, then the elbow joint for forearm movements, and finally the wrists for hand movement. The cable was then attached to the fixed actuator base and rotary joints to facilitate movement [117].
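To illustrate the twisted-string principle used in devices such as the ExoTen glove, the sketch below applies the standard twisted-string-actuator kinematic relation, in which twisting a string of length L and effective radius r by an angle θ shortens it by L − sqrt(L² − (θr)²). The string dimensions are illustrative assumptions rather than the glove's actual parameters.

# Worked sketch of twisted-string-actuator kinematics with assumed dimensions.
import math

STRING_LENGTH_M = 0.20    # untwisted string length (assumption)
STRING_RADIUS_M = 0.0005  # effective string radius (assumption)

def contraction(motor_turns: float) -> float:
    """Linear contraction for a given number of motor turns.

    Standard TSA kinematics: delta_L = L - sqrt(L^2 - (theta * r)^2),
    valid while theta * r < L (before the string is fully wound).
    """
    theta = 2.0 * math.pi * motor_turns
    return STRING_LENGTH_M - math.sqrt(STRING_LENGTH_M**2 - (theta * STRING_RADIUS_M)**2)

if __name__ == "__main__":
    for turns in (10, 30, 50):
        print(f"{turns:2d} turns -> {contraction(turns) * 1000:.1f} mm of pull")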
Pump-based actuators are also used in wearable devices. In these systems, a pump converts electrical energy into fluid or gas pressure, which is used to drive the system. Key challenges in these types of systems generally involve the size of the actuation system and its overall bulk, since thick lines and a pump are required for them to function. Lee et al. utilized a micro-hydraulic system to contract and relax artificial muscles for force feedback applications [98]. In this research, they made use of McKibben muscles, wherein an expandable tube filled with a working fluid is wrapped in a braided veil, allowing the volume and pressure within the tube to be controlled. Das et al. designed a force feedback glove using pneumatically powered artificial muscles [118]. The glove includes four sets of pneumatic artificial muscles, each designed for a specific range of movement such as flexion, extension, pronation, and supination, which mimic how human muscle works by contracting and relaxing based on the air pressure provided by the valves. Takahashi et al. developed a soft, pneumatically powered exoskeleton glove with 20-DOF movement [119]. In this research, two pairs of artificial muscles were attached to each finger to represent the flexor and extensor movements of the finger, allowing for multiple postures and a total of 20 DOF across all five fingers. This study also made use of McKibben muscles to more closely mimic human muscle movement.
While wearable force feedback systems offer a promising future when it comes to haptic feedback, they currently still suffer from several major issues, particularly involving their bulk: string-based systems require multiple lines and cables to be attached to multiple individual joints in the arm, and pump-based systems, aside from the thickness of their lines, also need their pump to be carried constantly alongside the rest of the force feedback system, which severely limits portability while adding to the overall bulk. However, newer research such as that performed on Dextres [93] showcases thinner and lighter actuator systems, which could further push the trend toward smaller and lighter force feedback systems, although challenges regarding the power consumption of those types of systems still need to be addressed.

4.2. Cutaneous Feedback

Unlike kinesthetic feedback systems, which generally require larger actuators due to their need to be attached to the hands, arms, or both, cutaneous feedback devices only need to be placed in key areas of the hand, usually where the mechanoreceptors are most densely concentrated, to achieve proper feedback [75,86]. This more concentrated character of cutaneous feedback allows significantly smaller devices to be made that apply significantly less force while still being believable [81,84]. This has resulted in a variety of different methods for replicating touch.

4.2.1. Vibrotactile Stimulation

The most common type of cutaneous feedback is vibrotactile stimulation. Vibrotactile stimulation is low-cost, low-power, and low-profile, allowing it to be attached at a variety of locations on the hand. Many electronic devices, including phones and computers, make use of vibrations for alerts, and most commercially available virtual reality systems, such as the Oculus Quest and HTC Vive, use vibrotactile stimulation as their primary form of cutaneous feedback [120]. Vibrotactile phantom stimulation is used to provide the illusion of slippage and weight on objects by way of a series of magnetic pins that vibrate in response to interactions in the virtual system [121,122]. Since human texture recognition is associated with microvibrations in the range of 1 Hz to around 500 Hz, there is a possibility of mimicking these microvibrations through small actuators to replicate finger movements and haptic feedback [123]. The previously mentioned Dextres, aside from generating kinesthetic feedback, also incorporates cutaneous feedback in the form of vibrations from a piezovibe, described as a series of resonating piezoelectric beams [93]. Other research has also combined visual and haptic feedback by rendering different textures through vibration and friction, highlighting its possible use in mimicking texture [124,125]. Vibrotactile stimulation, while simple and easy to implement, is, however, very limited in the types of stimulation that it can achieve, which is why it is generally paired with other feedback systems [90,93].
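The sketch below shows one simple vibrotactile texture-rendering rule consistent with the microvibration range mentioned above: the drive frequency follows the sliding speed divided by the texture's spatial period and is clamped to the roughly 1-500 Hz band. The mapping and constants are illustrative assumptions, not any cited device's algorithm.

# Hedged sketch of a speed-dependent vibrotactile drive signal.
import math

SPATIAL_PERIOD_M = 0.002   # assumed ridge spacing of the virtual texture (2 mm)
MIN_HZ, MAX_HZ = 1.0, 500.0

def drive_sample(sliding_speed_m_s: float, t: float, amplitude: float = 1.0) -> float:
    """One sample of the actuator drive signal at time t for a given sliding speed."""
    freq = sliding_speed_m_s / SPATIAL_PERIOD_M      # spatial frequency -> temporal frequency
    freq = max(MIN_HZ, min(MAX_HZ, freq))            # clamp to the perceivable band
    return amplitude * math.sin(2.0 * math.pi * freq * t)

if __name__ == "__main__":
    # Sliding at 10 cm/s over 2 mm ridges gives a 50 Hz vibration command.
    print([round(drive_sample(0.10, t / 1000.0), 2) for t in range(5)])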

4.2.2. Skin Indentation

Indentation is another form of cutaneous stimulation that is also commonly used. It relies on actuators that apply a normal force onto the skin by way of indentation and, depending on the method of application, can be used to convey a sense of weight and shape of an object. These devices are generally attached to the skin itself or mounted on gloves. As the fingers have the densest concentration of receptors [75], they are usually where most devices are attached, either directly onto the skin or through a glove. By varying the amount of force being applied, one can change the perceived force exerted by a virtual object. One of the most common ways of implementing indentation-based feedback is through the use of motors which drive a series of end effectors directly onto the skin at varying forces [86,89,123,126,127]. TacTiles apply indentation using pins attached to an electromagnetic actuator which can lock itself in place to simulate grasping [76]. While skin indentation is also a low-cost and effective means of replicating touch, it suffers from problems regarding bulk, as the actuators used to drive these systems need to be placed at the fingertips. In addition, it can inhibit a person’s movements which, while a non-factor in virtual reality environments, is a drawback in augmented reality systems where interaction with both virtual and real-world objects is required.
Hydraulically amplified self-healing electrostatic devices are a type of dielectric actuator that uses a liquid dielectric instead of elastomers, providing integrated hydraulic amplification [128]; applying a voltage displaces the fluid to simulate touch. Fluid actuators such as the hydraulically amplified taxel (HAXEL) enable dense and flexible cutaneous haptic feedback. By using segmented electrodes, the central bump can not only be pushed up but also shifted north/south and east/west to create rotation-like motion [129]. The VHB 4910 acrylic elastomer gave the highest performance in terms of strain and actuation pressure [130]. Fluid actuation can provide both tactile and kinesthetic feedback sensations, as demonstrated in research in which electrorheological (ER) fluids are used and the haptic sensation is controlled by the applied electric field [131]. The use of ER fluids also shows promise as a tool for kinesthetic force feedback systems.

4.2.3. Skin Stretch

Another form of cutaneous feedback is skin stretch, which activates both slow-adapting and fast-adapting mechanoreceptors to simulate motions such as gripping. The common way of inducing skin stretch is to use platforms that are attached to the base of the fingers and connected to motors. The rotation of the motors determines the type of force being exerted, which can be a normal force, a shear force, or a combination of both [85,89,132]. Since most of the device is concentrated on the back of the hand, additional sensors can also be attached to the movable platform for more accurate readings and output [88,127]. Other research has made use of fabric instead of a solid platform as the base of the device, which allows for a significantly smaller form factor; using motors to either pull or push the fabric creates either a normal force or a shear force [75,133]. Recently, electrostatic actuators have also seen use in cutaneous feedback, such as in the HAXEL, which uses electrostatic actuation to displace fluid, allowing it to change its stiffness [129].

4.2.4. Electrotactile Feedback

The use of electrical stimulation for discriminating different surface textures has also seen increased interest, specifically in the field of prosthetics. Most electrotactile stimulation involves microelectrodes implanted beneath the skin and directly onto the targeted nerve, providing direct stimulation through electrical pulses that mimic the body's own electrical signals for the type of sensation being simulated [134,135,136]. These methods give their users the ability to distinguish shape, stiffness, and texture to a high degree and show promising results; however, they come with significant drawbacks, such as requiring invasive surgery and issues with overall system stability and reliability in extended usage [137], which can severely hinder their use outside of controlled environments. One promising form of electrical stimulation, however, can potentially solve these issues: electrodes placed on the skin deliver a voltage that stimulates the nerve endings without invasive surgery [138]. This process is known as transcutaneous electrical nerve stimulation (TENS) and is used primarily for providing sensory information in a prosthesis, including tactile feedback [139].
Currently, the use of TENS as a means of tactile stimulation is limited primarily to prosthetic devices, as it is generally the most common way for amputees to experience sensory information. Vargas et al. used a 2 × 8 electrode array attached to the upper arm to evoke finger sensations by transforming fingertip forces into current amplitudes, allowing shapes and sizes to be distinguished [140]. Edoardo et al. used TENS to stimulate force sensations in the hands and fingertips, allowing patients to output force levels that would not be possible otherwise [141]. Other studies have also shown the use of electrotactile stimulation for improved grasping and proprioception [142,143,144].
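A hedged sketch of the force-to-amplitude idea attributed to Vargas et al. above is given below: a fingertip force is mapped linearly onto a stimulation current between a sensation threshold and a comfort ceiling. All current values and the linear mapping are illustrative assumptions, not the published calibration.

# Hedged sketch of mapping fingertip force to a TENS current amplitude.
THRESHOLD_MA = 1.5   # assumed minimum current that evokes sensation
CEILING_MA = 4.0     # assumed maximum comfortable current
MAX_FORCE_N = 10.0   # force that maps to the ceiling (assumption)

def current_for_force(force_n: float) -> float:
    """Map a fingertip force to a stimulation current amplitude in mA."""
    if force_n <= 0.0:
        return 0.0                            # no contact, no stimulation
    scale = min(force_n, MAX_FORCE_N) / MAX_FORCE_N
    return THRESHOLD_MA + scale * (CEILING_MA - THRESHOLD_MA)

if __name__ == "__main__":
    for f in (0.0, 2.0, 10.0):
        print(f"{f:4.1f} N -> {current_for_force(f):.2f} mA")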
While research regarding TENS is limited mostly to the field of prosthetics, there is also a possibility of its use in traditional haptics. Heidi et al. used a combination of electrotactile and vibrotactile feedback to improve hand movements [145] in both amputees and non-amputees. Pamela et al. used a microphone to pick up the sound caused by the friction of rubbing different texture types and transformed it into electrical stimulation [146]. The non-invasive approach of TENS systems may allow them to be integrated into other haptic feedback systems, such as kinesthetic feedback systems, which usually lack any form of cutaneous feedback; however, the high voltage required is a significant issue.

5. Conclusions

In conclusion, research regarding haptic feedback is still in its infancy. The complex nature of the human skin makes it very difficult to properly capture and convey haptic information. As VR and AR systems continue to grow in popularity, haptic feedback systems also need to improve so as to better match the immersion of VR and AR settings. Despite recent progress, power consumption and bulk remain major issues that need to be considered when developing newer technologies. Other parameters such as pain, numbness, and cold and hot sensations could also be explored, with research regarding the use of chemicals to induce haptic stimulation such as pain and numbness already seeing interest [147,148]. Furthermore, using multiple sensors to concurrently support cutaneous and kinesthetic feedback would be a logical next step, but bulk and stability are challenges that still need to be addressed.

Author Contributions

Conceptualization, A.R.S. and J.A.G.C.; methodology, A.R.S.; investigation, A.R.S. and J.A.G.C.; resources, A.R.S.; writing—original draft preparation, J.A.G.C. and K.C.; writing—review and editing, A.R.S.; supervision, A.R.S.; project administration, A.R.S.; funding acquisition, A.R.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Ministry of Science and Technology (MOST), Taiwan, under Grant MOST 109-2222-E-218-001–MY2 and Ministry of Education, Taiwan, under grant MOE 1300-108P097.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hannaford, B.; Okamura, A.M. Haptics. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1063–1084. [Google Scholar]
  2. Salisbury, K.; Conti, F.; Barbagli, F. Haptic rendering: Introductory concepts. IEEE Comput. Graph. Appl. 2004, 24, 24–32. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Li, C.; Tang, B. Research on the application of AR technology based on Unity3D in education. J. Phys. Conf. Ser. 2019, 1168, 032045. [Google Scholar] [CrossRef]
  4. Englund, C.; Olofsson, A.D.; Price, L. Teaching with technology in higher education: Understanding conceptual change and development in practice. High. Educ. Res. Dev. 2017, 36, 73–87. [Google Scholar] [CrossRef]
  5. Xie, B.; Liu, H.; Alghofaili, R.; Zhang, Y.; Jiang, Y.; Lobo, F.D.; Li, C.; Li, W.; Huang, H.; Akdere, M. A Review on Virtual Reality Skill Training Applications. Front. Virtual Real. 2021, 2, 49. [Google Scholar] [CrossRef]
  6. Cao, F.-H. A Ship Driving Teaching System Based on Multi-level Virtual Reality Technology. Int. J. Emerg. Technol. Learn. 2016, 11, 26–31. [Google Scholar] [CrossRef] [Green Version]
  7. Alexander, T.; Westhoven, M.; Conradi, J. Virtual environments for competency-oriented education and training. In Advances in Human Factors, Business Management, Training and Education; Springer: Berlin/Heidelberg, Germany, 2017; pp. 23–29. [Google Scholar]
  8. Reilly, C.A.; Greeley, A.B.; Jevsevar, D.S.; Gitajn, I.L. Virtual reality-based physical therapy for patients with lower extremity injuries: Feasibility and acceptability. OTA Int. 2021, 4, e132. [Google Scholar] [CrossRef] [PubMed]
  9. Cipresso, P.; Giglioli, I.A.C.; Raya, M.A.; Riva, G. The past, present, and future of virtual and augmented reality research: A network and cluster analysis of the literature. Front. Psychol. 2018, 9, 2086. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Castelvecchi, D. Low-cost headsets boost virtual reality’s lab appeal. Nat. News 2016, 533, 153. [Google Scholar] [CrossRef] [Green Version]
  11. Flavián, C.; Ibáñez-Sánchez, S.; Orús, C. The impact of virtual, augmented and mixed reality technologies on the customer experience. J. Bus. Res. 2019, 100, 547–560. [Google Scholar] [CrossRef]
  12. Ebert, C. Looking into the Future. IEEE Softw. 2015, 32, 92–97. [Google Scholar] [CrossRef] [Green Version]
  13. Wang, S.; Mao, Z.; Zeng, C.; Gong, H.; Li, S.; Chen, B. A new method of virtual reality based on Unity3D. In Proceedings of the 2010 18th International Conference on Geoinformatics, Beijing, China, 18–20 June 2010; pp. 1–5. [Google Scholar]
  14. Brookes, J.; Warburton, M.; Alghadier, M.; Mon-Williams, M.; Mushtaq, F. Studying human behavior with virtual reality: The Unity Experiment Framework. Behav. Res. Methods 2020, 52, 455–463. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Ajey, L. Virtual reality and its military utility. J. Ambient Intell. Humaniz. Comput. 2013, 4, 17–26. [Google Scholar]
  16. Song, H.; Chen, F.; Peng, Q.; Zhang, J.; Gu, P. Improvement of user experience using virtual reality in open-architecture product design. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 2018, 232, 2264–2275. [Google Scholar] [CrossRef]
  17. Lelevé, A.; McDaniel, T.; Rossa, C. Haptic training simulation. Front. Virtual Real. 2020, 1, 3. [Google Scholar] [CrossRef]
  18. Piromchai, P.; Avery, A.; Laopaiboon, M.; Kennedy, G.; O’Leary, S. Virtual reality training for improving the skills needed for performing surgery of the ear, nose or throat. Cochrane Database Syst. Rev. 2015, 9, CD010198. [Google Scholar] [CrossRef]
  19. Feng, H.; Li, C.; Liu, J.; Wang, L.; Ma, J.; Li, G.; Gan, L.; Shang, X.; Wu, Z. Virtual reality rehabilitation versus conventional physical therapy for improving balance and gait in parkinson’s disease patients: A randomized controlled trial. Med. Sci. Monit. Int. Med. J. Exp. Clin. Res. 2019, 25, 4186. [Google Scholar] [CrossRef]
  20. Kim, K.-J.; Heo, M. Comparison of virtual reality exercise versus conventional exercise on balance in patients with functional ankle instability: A randomized controlled trial. J. Back Musculoskelet. Rehabil. 2019, 32, 905–911. [Google Scholar] [CrossRef] [PubMed]
  21. Akçayır, M.; Akçayır, G. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educ. Res. Rev. 2017, 20, 1–11. [Google Scholar] [CrossRef]
  22. Khan, T.; Johnston, K.; Ophoff, J. The impact of an augmented reality application on learning motivation of students. Adv. Hum.-Comput. Interact. 2019, 2019, 7208494. [Google Scholar] [CrossRef] [Green Version]
  23. De Buck, S.; Maes, F.; Ector, J.; Bogaert, J.; Dymarkowski, S.; Heidbuchel, H.; Suetens, P. An augmented reality system for patient-specific guidance of cardiac catheter ablation procedures. IEEE Trans. Med. Imaging 2005, 24, 1512–1524. [Google Scholar] [CrossRef]
  24. Jiang, H.; Xu, S.; State, A.; Feng, F.; Fuchs, H.; Hong, M.; Rozenblit, J. Enhancing a laparoscopy training system with augmented reality visualization. In Proceedings of the 2019 Spring Simulation Conference (SpringSim), Tucson, AZ, USA, 29 April–2 May 2019; pp. 1–12. [Google Scholar]
  25. Satriadi, K.A.; Ens, B.; Cordeil, M.; Jenny, B.; Czauderna, T.; Willett, W. Augmented reality map navigation with freehand gestures. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 593–603. [Google Scholar]
  26. Lederman, S.J. Tactile roughness of grooved surfaces: The touching process and effects of macro-and microsurface structure. Percept. Psychophys. 1974, 16, 385–395. [Google Scholar] [CrossRef] [Green Version]
  27. Lederman, S.J.; Taylor, M.M. Fingertip force, surface geometry, and the perception of roughness by active touch. Percept. Psychophys. 1972, 12, 401–408. [Google Scholar] [CrossRef] [Green Version]
  28. Klatzky, R.L.; Lederman, S.J. Tactile roughness perception with a rigid link interposed between skin and surface. Percept. Psychophys. 1999, 61, 591–607. [Google Scholar] [CrossRef] [PubMed]
  29. Tiest, W.M.B. Tactual perception of material properties. Vis. Res. 2010, 50, 2775–2782. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. LaMotte, R.H.; Mountcastle, V.B. Capacities of humans and monkeys to discriminate vibratory stimuli of different frequency and amplitude: A correlation between neural events and psychological measurements. J. Neurophysiol. 1975, 38, 539–559. [Google Scholar] [CrossRef] [PubMed]
  31. Barrea, A.; Delhaye, B.P.; Lefèvre, P.; Thonnard, J.-L. Perception of partial slips under tangential loading of the fingertip. Sci. Rep. 2018, 8, 7032. [Google Scholar] [CrossRef] [Green Version]
  32. Darian-Smith, I.; Davidson, I.; Johnson, K.O. Peripheral neural representation of spatial dimensions of a textured surface moving across the monkey’s finger pad. J. Physiol. 1980, 309, 135–146. [Google Scholar] [CrossRef]
  33. Lamb, G.D. Tactile discrimination of textured surfaces: Peripheral neural coding in the monkey. J. Physiol. 1983, 338, 567–587. [Google Scholar] [CrossRef] [Green Version]
  34. Lamb, G.D. Tactile discrimination of textured surfaces: Psychophysical performance measurements in humans. J. Physiol. 1983, 338, 551–565. [Google Scholar] [CrossRef]
  35. Ergen, E.; Ulkar, B. Proprioception and Coordination; Elsevier Health Sciences: Amsterdam, The Netherlands, 2007; pp. 237–255. [Google Scholar]
  36. Weber, A.I.; Saal, H.P.; Lieber, J.D.; Cheng, J.-W.; Manfredi, L.R.; Dammann, J.F.; Bensmaia, S.J. Spatial and temporal codes mediate the tactile perception of natural textures. Proc. Natl. Acad. Sci. USA 2013, 110, 17107–17112. [Google Scholar] [CrossRef] [Green Version]
  37. Lieber, J.D.; Bensmaia, S.J. High-dimensional representation of texture in somatosensory cortex of primates. Proc. Natl. Acad. Sci. USA 2019, 116, 3268–3277. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  38. Bensmaia, S. Texture from touch. Scholarpedia 2009, 4, 7956. [Google Scholar] [CrossRef]
  39. Okamoto, S.; Nagano, H.; Yamada, Y. Psychophysical dimensions of tactile perception of textures. IEEE Trans. Haptics 2012, 6, 81–93. [Google Scholar] [CrossRef] [PubMed]
  40. Bartolozzi, C.; Natale, L.; Nori, F.; Metta, G. Robots with a sense of touch. Nat. Mater. 2016, 15, 921–925. [Google Scholar] [CrossRef]
  41. Chortos, A.; Liu, J.; Bao, Z. Pursuing prosthetic electronic skin. Nat. Mater. 2016, 15, 937–950. [Google Scholar] [CrossRef]
  42. Kim, K.; Lee, K.R.; Kim, W.H.; Park, K.-B.; Kim, T.-H.; Kim, J.-S.; Pak, J.J. Polymer-based flexible tactile sensor up to 32× 32 arrays integrated with interconnection terminals. Sens. Actuators A Phys. 2009, 156, 284–291. [Google Scholar] [CrossRef]
  43. Hu, Y.; Katragadda, R.B.; Tu, H.; Zheng, Q.; Li, Y.; Xu, Y. Bioinspired 3-D tactile sensor for minimally invasive surgery. J. Microelectromech. Syst. 2010, 19, 1400–1408. [Google Scholar] [CrossRef]
  44. Fernandez, R.; Payo, I.; Vazquez, A.S.; Becedas, J. Micro-vibration-based slip detection in tactile force sensors. Sensors 2014, 14, 709–730. [Google Scholar] [CrossRef]
  45. Chathuranga, K.; Hirai, S. A bio-mimetic fingertip that detects force and vibration modalities and its application to surface identification. In Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), Guangzhou, China, 11–14 December 2012; pp. 575–581. [Google Scholar]
  46. Romano, J.M.; Kuchenbecker, K.J. Creating realistic virtual textures from contact acceleration data. IEEE Trans. Haptics 2011, 5, 109–119. [Google Scholar] [CrossRef]
  47. Oddo, C.M.; Controzzi, M.; Beccai, L.; Cipriani, C.; Carrozza, M.C. Roughness encoding for discrimination of surfaces in artificial active-touch. IEEE Trans. Robot. 2011, 27, 522–533. [Google Scholar] [CrossRef]
  48. Wei, Y.; Chen, S.; Yuan, X.; Wang, P.; Liu, L. Multiscale wrinkled microstructures for piezoresistive fibers. Adv. Funct. Mater. 2016, 26, 5078–5085. [Google Scholar] [CrossRef]
  49. Rongala, U.B.; Mazzoni, A.; Oddo, C.M. Neuromorphic artificial touch for categorization of naturalistic textures. IEEE Trans. Neural Netw. Learn. Syst. 2015, 28, 819–829. [Google Scholar] [CrossRef] [PubMed]
  50. Wang, Y.; Chen, J.; Mei, D. Recognition of surface texture with wearable tactile sensor array: A pilot Study. Sens. Actuators A Phys. 2020, 307, 111972. [Google Scholar] [CrossRef]
  51. Nguyen, H.; Osborn, L.; Iskarous, M.; Shallal, C.; Hunt, C.; Betthauser, J.; Thakor, N. Dynamic texture decoding using a neuromorphic multilayer tactile sensor. In Proceedings of the 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), Cleveland, OH, USA, 17–19 October 2018; pp. 1–4. [Google Scholar]
  52. Cao, Y.; Li, T.; Gu, Y.; Luo, H.; Wang, S.; Zhang, T. Fingerprint-inspired flexible tactile sensor for accurately discerning surface texture. Small 2018, 14, 1703902. [Google Scholar] [CrossRef]
  53. Sankar, S.; Balamurugan, D.; Brown, A.; Ding, K.; Xu, X.; Low, J.H.; Yeow, C.H.; Thakor, N. Texture discrimination with a soft biomimetic finger using a flexible neuromorphic tactile sensor array that provides sensory feedback. Soft Robot. 2020, 8, 577–587. [Google Scholar] [CrossRef]
  54. Gupta, A.K.; Ghosh, R.; Swaminathan, A.N.; Deverakonda, B.; Ponraj, G.; Soares, A.B.; Thakor, N.V. A neuromorphic approach to tactile texture recognition. In Proceedings of the 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), Cleveland, OH, USA, 17–19 October 2018; pp. 1322–1328. [Google Scholar]
  55. Yi, Z.; Zhang, Y.; Peters, J. Bioinspired tactile sensor for surface roughness discrimination. Sens. Actuators A Phys. 2017, 255, 46–53. [Google Scholar] [CrossRef]
  56. Qin, L.; Zhang, Y. Roughness discrimination with bio-inspired tactile sensor manually sliding on polished surfaces. Sens. Actuators A Phys. 2018, 279, 433–441. [Google Scholar] [CrossRef]
  57. Yi, Z.; Zhang, Y. Bio-inspired tactile FA-I spiking generation under sinusoidal stimuli. J. Bionic Eng. 2016, 13, 612–621. [Google Scholar] [CrossRef]
  58. Birkoben, T.; Winterfeld, H.; Fichtner, S.; Petraru, A.; Kohlstedt, H. A spiking and adapting tactile sensor for neuromorphic applications. Sci. Rep. 2020, 10, 17260. [Google Scholar] [CrossRef]
  59. Shaikh, M.O.; Lin, C.-M.; Lee, D.-H.; Chiang, W.-F.; Chen, I.-H.; Chuang, C.-H. Portable pen-like device with miniaturized tactile sensor for quantitative tissue palpation in oral cancer screening. IEEE Sens. J. 2020, 20, 9610–9617. [Google Scholar] [CrossRef]
  60. Ward-Cherrier, B.; Pestell, N.; Cramphorn, L.; Winstone, B.; Giannaccini, M.E.; Rossiter, J.; Lepora, N.F. The tactip family: Soft optical tactile sensors with 3d-printed biomimetic morphologies. Soft Robot. 2018, 5, 216–227. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  61. Ward-Cherrier, B.; Pestell, N.; Lepora, N.F. Neurotac: A neuromorphic optical tactile sensor applied to texture recognition. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Virtual Event, USA, 31 May–31 August 2020; pp. 2654–2660. [Google Scholar]
  62. Fishel, J.A.; Santos, V.J.; Loeb, G.E. A robust micro-vibration sensor for biomimetic fingertips. In Proceedings of the 2008 2nd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, Scottsdale, AZ, USA, 19–22 October 2008; pp. 659–663. [Google Scholar]
  63. Fishel, J.A.; Loeb, G.E. Sensing tactile microvibrations with the BioTac—Comparison with human sensitivity. In Proceedings of the 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–28 June 2012; pp. 1122–1127. [Google Scholar]
  64. Liang, Q.; Yi, Z.; Hu, Q.; Zhang, Y. Low-cost sensor fusion technique for surface roughness discrimination with optical and piezoelectric sensors. IEEE Sens. J. 2017, 17, 7954–7960. [Google Scholar] [CrossRef]
  65. Ke, A.; Huang, J.; Chen, L.; Gao, Z.; Han, J.; Wang, C.; Zhou, J.; He, J. Fingertip tactile sensor with single sensing element based on FSR and PVDF. IEEE Sens. J. 2019, 19, 11100–11112. [Google Scholar] [CrossRef]
  66. Knud, G. NDE Handbook; Butterworth-Heinemann: Oxford, UK, 1989; pp. 295–301. [Google Scholar]
  67. Wiertlewski, M.; Lozada, J.; Hayward, V. The spatial spectrum of tangential skin displacement can encode tactual texture. IEEE Trans. Robot. 2011, 27, 461–472. [Google Scholar] [CrossRef] [Green Version]
  68. Nobuyama, L.; Kurashina, Y.; Kawauchi, K.; Matsui, K.; Takemura, K. Tactile estimation of molded plastic plates based on the estimated impulse responses of mechanoreceptive units. Sensors 2018, 18, 1588. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  69. Yin, J.; Aspinall, P.; Santos, V.J.; Posner, J.D. Measuring dynamic shear force and vibration with a bioinspired tactile sensor skin. IEEE Sens. J. 2018, 18, 3544–3553. [Google Scholar] [CrossRef]
  70. Lang, J.; Andrews, S. Measurement-based modeling of contact forces and textures for haptic rendering. IEEE Trans. Vis. Comput. Graph. 2010, 17, 380–391. [Google Scholar] [CrossRef] [Green Version]
  71. Culbertson, H.; Kuchenbecker, K.J. Ungrounded haptic augmented reality system for displaying roughness and friction. IEEE/ASME Trans. Mechatron. 2017, 22, 1839–1849. [Google Scholar] [CrossRef]
  72. Mohd-Yasin, F.; Nagel, D.J.; Korman, C.E. Noise in MEMS. Meas. Sci. Technol. 2009, 21, 012001. [Google Scholar] [CrossRef]
  73. Hu, H.; Han, Y.; Song, A.; Chen, S.; Wang, C.; Wang, Z. A finger-shaped tactile sensor for fabric surfaces evaluation by 2-dimensional active sliding touch. Sensors 2014, 14, 4899–4913. [Google Scholar] [CrossRef] [Green Version]
  74. Almassri, A.M.; Wan Hasan, W.; Ahmad, S.A.; Ishak, A.J.; Ghazali, A.; Talib, D.; Wada, C. Pressure sensor: State of the art, design, and application for robotic hand. J. Sens. 2015, 2015, 846487. [Google Scholar] [CrossRef] [Green Version]
  75. Pacchierotti, C.; Salvietti, G.; Hussain, I.; Meli, L.; Prattichizzo, D. The hRing: A wearable haptic device to avoid occlusions in hand tracking. In Proceedings of the 2016 IEEE Haptics Symposium (HAPTICS), Philadelphia, PA, USA, 8–11 April 2016; pp. 134–139. [Google Scholar]
  76. Vechev, V.; Zarate, J.; Lindlbauer, D.; Hinchet, R.; Shea, H.; Hilliges, O. Tactiles: Dual-mode low-power electromagnetic actuators for rendering continuous contact and spatial haptic patterns in VR. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 312–320. [Google Scholar]
  77. LucidVR, LucidGloves. Available online: https://github.com/LucidVR/lucidgloves (accessed on 10 October 2021).
  78. Hosseini, M.; Sengül, A.; Pane, Y.; De Schutter, J.; Bruyninck, H. Exoten-glove: A force-feedback haptic glove based on twisted string actuation system. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 320–327. [Google Scholar]
  79. Placzek, J.D.; Boyce, D.A. Orthopaedic Physical Therapy Secrets-E-Book; Elsevier Health Sciences: Amsterdam, The Netherlands, 2016. [Google Scholar]
  80. Taylor, J. Proprioception; Elsevier: Amsterdam, The Netherlands, 2009; pp. 1143–1149. [Google Scholar]
  81. Park, J.; Son, B.; Han, I.; Lee, W. Effect of cutaneous feedback on the perception of virtual object weight during manipulation. Sci. Rep. 2020, 10, 1357. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  82. Bloom, F.E.; Spitzer, N.C.; Gage, F.; Albright, T. Encyclopedia of Neuroscience; Academic Press: Cambridge, MA, USA, 2009; Volume 1. [Google Scholar]
  83. Westling, G.; Johansson, R.S. Responses in glabrous skin mechanoreceptors during precision grip in humans. Exp. Brain Res. 1987, 66, 128–140. [Google Scholar] [CrossRef] [PubMed]
  84. Quek, Z.F.; Schorr, S.B.; Nisky, I.; Provancher, W.R.; Okamura, A.M. Sensory substitution and augmentation using 3-degree-of-freedom skin deformation feedback. IEEE Trans. Haptics 2015, 8, 209–221. [Google Scholar] [CrossRef] [PubMed]
  85. Girard, A.; Marchal, M.; Gosselin, F.; Chabrier, A.; Louveau, F.; Lécuyer, A. Haptip: Displaying haptic shear forces at the fingertips for multi-finger interaction in virtual environments. Front. ICT 2016, 3, 6. [Google Scholar] [CrossRef] [Green Version]
  86. Choi, I.; Culbertson, H.; Miller, M.R.; Olwal, A.; Follmer, S. Grabity: A wearable haptic interface for simulating weight and grasping in virtual reality. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, Quebec City, QC, Canada, 22–25 October 2017; pp. 119–130. [Google Scholar]
  87. Maisto, M.; Pacchierotti, C.; Chinello, F.; Salvietti, G.; De Luca, A.; Prattichizzo, D. Evaluation of wearable haptic systems for the fingers in augmented reality applications. IEEE Trans. Haptics 2017, 10, 511–522. [Google Scholar] [CrossRef] [Green Version]
  88. Prattichizzo, D.; Chinello, F.; Pacchierotti, C.; Malvezzi, M. Towards wearability in fingertip haptics: A 3-dof wearable device for cutaneous force feedback. IEEE Trans. Haptics 2013, 6, 506–516. [Google Scholar] [CrossRef]
  89. Schorr, S.B.; Okamura, A.M. Three-dimensional skin deformation as force substitution: Wearable device design and performance during haptic exploration of virtual environments. IEEE Trans. Haptics 2017, 10, 418–430. [Google Scholar] [CrossRef]
  90. Cyberglove Systems LLC, Cyberforce. Available online: http://www.cyberglovesystems.com/cyberforce (accessed on 18 October 2021).
  91. Force Dimension, Omega.6. Available online: https://www.forcedimension.com/images/doc/specsheet_-_omega6.pdf (accessed on 10 October 2021).
  92. Phantom Omni—6 DOF Master Device. Available online: https://delfthapticslab.nl/device/phantom-omni/ (accessed on 10 October 2021).
  93. Hinchet, R.; Vechev, V.; Shea, H.; Hilliges, O. Dextres: Wearable haptic feedback for grasping in vr via a thin form-factor electrostatic brake. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, Berlin, Germany, 14–17 October 2018; pp. 901–912. [Google Scholar]
  94. Egawa, M.; Watanabe, T.; Nakamura, T. Development of a wearable haptic device with pneumatic artificial muscles and MR brake. In Proceedings of the 2015 IEEE Virtual Reality (VR), Arles, France, 23–27 March 2015; pp. 173–174. [Google Scholar]
  95. Maestro Glove. Available online: https://contact.ci/#maestro-product (accessed on 10 October 2021).
  96. Carpi, F.; Mannini, A.; De Rossi, D. Elastomeric contractile actuators for hand rehabilitation splints. In Proceedings of the Electroactive Polymer Actuators and Devices (EAPAD) 2008, San Diego, CA, USA, 10–13 March 2008; p. 692705. [Google Scholar]
  97. Mistry, M.; Mohajerian, P.; Schaal, S. An exoskeleton robot for human arm movement study. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 4071–4076. [Google Scholar]
  98. Ryu, D.; Moon, K.-W.; Nam, H.; Lee, Y.; Chun, C.; Kang, S.; Song, J.-B. Micro hydraulic system using slim artificial muscles for a wearable haptic glove. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; pp. 3028–3033. [Google Scholar]
  99. Polygerinos, P.; Wang, Z.; Galloway, K.C.; Wood, R.J.; Walsh, C.J. Soft robotic glove for combined assistance and at-home rehabilitation. Robot. Auton. Syst. 2015, 73, 135–143. [Google Scholar] [CrossRef] [Green Version]
  100. Wehner, M.; Quinlivan, B.; Aubin, P.M.; Martinez-Villalpando, E.; Baumann, M.; Stirling, L.; Holt, K.; Wood, R.; Walsh, C. A lightweight soft exosuit for gait assistance. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 3362–3369. [Google Scholar]
  101. Dexmo. Available online: https://www.dextarobotics.com/about (accessed on 10 October 2021).
  102. Cybergrasp. Available online: http://www.cyberglovesystems.com/cybergrasp (accessed on 18 October 2021).
  103. Senseglove. Available online: https://www.senseglove.com/about-us/ (accessed on 10 October 2021).
  104. Jadhav, S.; Kannanda, V.; Kang, B.; Tolley, M.T.; Schulze, J.P. Soft robotic glove for kinesthetic haptic feedback in virtual reality environments. Electron. Imaging 2017, 2017, 19–24. [Google Scholar] [CrossRef] [Green Version]
  105. Brancadoro, M.; Manti, M.; Tognarelli, S.; Cianchetti, M. Fiber jamming transition as a stiffening mechanism for soft robotics. Soft Robot. 2020, 7, 663–674. [Google Scholar] [CrossRef] [PubMed]
  106. Jadhav, S.; Majit, M.R.A.; Shih, B.; Schulze, J.P.; Tolley, M.T. Variable Stiffness Devices Using Fiber Jamming for Application in Soft Robotics and Wearable Haptics. Soft Robot. 2021, 9, 173–186. [Google Scholar] [CrossRef] [PubMed]
  107. Massie, T.H.; Salisbury, J.K. The phantom haptic interface: A device for probing virtual objects. In Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL, USA, 13–18 November 1994; pp. 295–300. [Google Scholar]
  108. Sato, M.; Hirata, Y.; Kawarada, H. Space interface device for artificial reality—SPIDAR. Syst. Comput. Jpn. 1992, 23, 44–54. [Google Scholar] [CrossRef]
  109. Van der Linde, R.Q.; Lammertse, P.; Frederiksen, E.; Ruiter, B. The HapticMaster, a new high-performance haptic interface. In Proceedings of the Eurohaptics, Madrid, Spain, 10–13 June 2002; pp. 1–5. [Google Scholar]
  110. Leuschke, R.; Kurihara, E.K.; Dosher, J.; Hannaford, B. High fidelity multi finger haptic display. In Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics Conference, Pisa, Italy, 18–20 March 2005; pp. 606–608. [Google Scholar]
  111. Yoon, J.; Ryu, J.; Burdea, G. Design and analysis of a novel virtual walking machine. In Proceedings of the 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, HAPTICS 2003, Los Angeles, CA, USA, 22–23 March 2003; pp. 374–381. [Google Scholar]
  112. Ding, Y.; Sivak, M.; Weinberg, B.; Mavroidis, C.; Holden, M.K. Nuvabat: Northeastern university virtual ankle and balance trainer. In Proceedings of the 2010 IEEE Haptics Symposium, Waltham, MA, USA, 25–26 March 2010; pp. 509–514. [Google Scholar]
  113. Tobergte, A.; Helmer, P.; Hagn, U.; Rouiller, P.; Thielmann, S.; Grange, S.; Albu-Schäffer, A.; Conti, F.; Hirzinger, G. The sigma. 7 haptic interface for MiroSurge: A new bi-manual surgical console. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 3023–3030. [Google Scholar]
  114. Grange, S.; Conti, F.; Rouiller, P.; Helmer, P.; Baur, C. The Delta Haptic Device; Ecole Polytechnique Fédérale de Lausanne: Lausanne, Switzerland, 2001. [Google Scholar]
  115. Vulliez, M.; Zeghloul, S.; Khatib, O. Design strategy and issues of the Delthaptic, a new 6-DOF parallel haptic device. Mech. Mach. Theory 2018, 128, 395–411. [Google Scholar] [CrossRef] [Green Version]
  116. Schiele, A.; Letier, P.; Van Der Linde, R.; Van Der Helm, F. Bowden cable actuator for force-feedback exoskeletons. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 3599–3604. [Google Scholar]
  117. Herbin, P.; Pajor, M. Human-robot cooperative control system based on serial elastic actuator bowden cable drive in ExoArm 7-DOF upper extremity exoskeleton. Mech. Mach. Theory 2021, 163, 104372. [Google Scholar] [CrossRef]
  118. Das, S.; Kishishita, Y.; Tsuji, T.; Lowell, C.; Ogawa, K.; Kurita, Y. ForceHand glove: A wearable force-feedback glove with pneumatic artificial muscles (PAMs). IEEE Robot. Autom. Lett. 2018, 3, 2416–2423. [Google Scholar] [CrossRef]
  119. Takahashi, N.; Takahashi, H.; Koike, H. Soft exoskeleton glove enabling force feedback for human-like finger posture control with 20 degrees of freedom. In Proceedings of the 2019 IEEE World Haptics Conference (WHC), Tokyo, Japan, 9–12 July 2019; pp. 217–222. [Google Scholar]
  120. Tanjung, K.; Nainggolan, F.; Siregar, B.; Panjaitan, S.; Fahmi, F. The use of virtual reality controllers and comparison between vive, leap motion and senso gloves applied in the anatomy learning system. J. Phys. Conf. Ser. 2020, 1542, 012026. [Google Scholar] [CrossRef]
  121. Ooka, T.; Fujita, K. Virtual object manipulation system with substitutive display of tangential force and slip by control of vibrotactile phantom sensation. In Proceedings of the 2010 IEEE Haptics Symposium, Waltham, MA, USA, 25–26 March 2010; pp. 215–218. [Google Scholar]
  122. Nakagawa, R.; Fujita, K. Wearable 3DOF Substitutive Force Display Device Based on Frictional Vibrotactile Phantom Sensation. In Haptic Interaction; Springer: Berlin/Heidelberg, Germany, 2015; pp. 157–159. [Google Scholar]
  123. Ji, X.; Liu, X.; Cacucciolo, V.; Civet, Y.; El Haitami, A.; Cantin, S.; Perriard, Y.; Shea, H. Untethered feel-through haptics using 18-µm thick dielectric elastomer actuators. Adv. Funct. Mater. 2020, 31, 2006639. [Google Scholar] [CrossRef]
  124. Ito, K.; Okamoto, S.; Yamada, Y.; Kajimoto, H. Tactile texture display with vibrotactile and electrostatic friction stimuli mixed at appropriate ratio presents better roughness textures. ACM Trans. Appl. Percept. (TAP) 2019, 16, 1–15. [Google Scholar] [CrossRef]
  125. Saga, S.; Kurogi, J. Sensing and Rendering Method of 2-Dimensional Haptic Texture. Sensors 2021, 21, 5523. [Google Scholar] [CrossRef]
  126. Solazzi, M.; Frisoli, A.; Bergamasco, M. Design of a novel finger haptic interface for contact and orientation display. In Proceedings of the 2010 IEEE Haptics Symposium, Waltham, MA, USA, 25–26 March 2010; pp. 129–132. [Google Scholar]
  127. Chinello, F.; Malvezzi, M.; Pacchierotti, C.; Prattichizzo, D. Design and development of a 3RRS wearable fingertip cutaneous device. In Proceedings of the 2015 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Busan, Korea, 7–11 July 2015; pp. 293–298. [Google Scholar]
  128. Kellaris, N.; Gopaluni Venkata, V.; Smith, G.M.; Mitchell, S.K.; Keplinger, C. Peano-HASEL actuators: Muscle-mimetic, electrohydraulic transducers that linearly contract on activation. Sci. Robot. 2018, 3, eaar3276. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  129. Leroy, E.; Hinchet, R.; Shea, H. Multimode hydraulically amplified electrostatic actuators for wearable haptics. Adv. Mater. 2020, 32, 2002564. [Google Scholar] [CrossRef] [PubMed]
  130. Pelrine, R.; Kornbluh, R.; Pei, Q.; Joseph, J. High-speed electrically actuated elastomers with strain greater than 100%. Science 2000, 287, 836–839. [Google Scholar] [CrossRef]
  131. Mazursky, A.; Koo, J.-H.; Yang, T.-H. Design, modeling, and evaluation of a slim haptic actuator based on electrorheological fluid. J. Intell. Mater. Syst. Struct. 2019, 30, 2521–2533. [Google Scholar] [CrossRef]
  132. Leonardis, D.; Solazzi, M.; Bortone, I.; Frisoli, A. A wearable fingertip haptic device with 3 DoF asymmetric 3-RSR kinematics. In Proceedings of the 2015 IEEE World Haptics Conference (WHC), Evanston, IL, USA, 22–26 June 2015; pp. 388–393. [Google Scholar]
  133. Minamizawa, K.; Fukamachi, S.; Kajimoto, H.; Kawakami, N.; Tachi, S. Gravity grabber: Wearable haptic display to present virtual mass sensation. In ACM SIGGRAPH 2007 Emerging Technologies; ACM SIGGRAPH: San Diego, CA, USA, 2007; p. 8-es. [Google Scholar]
  134. Tan, D.W.; Schiefer, M.A.; Keith, M.W.; Anderson, J.R.; Tyler, J.; Tyler, D.J. A neural interface provides long-term stable natural touch perception. Sci. Transl. Med. 2014, 6, 257ra138. [Google Scholar] [CrossRef] [Green Version]
  135. Ortiz-Catalan, M.; Håkansson, B.; Brånemark, R. An osseointegrated human-machine gateway for long-term sensory feedback and motor control of artificial limbs. Sci. Transl. Med. 2014, 6, 257re256. [Google Scholar] [CrossRef]
  136. Davis, T.S.; Wark, H.A.; Hutchinson, D.; Warren, D.J.; O’neill, K.; Scheinblum, T.; Clark, G.A.; Normann, R.A.; Greger, B. Restoring motor control and sensory feedback in people with upper extremity amputations using arrays of 96 microelectrodes implanted in the median and ulnar nerves. J. Neural Eng. 2016, 13, 036001. [Google Scholar] [CrossRef]
  137. Farina, D.; Aszmann, O. Bionic limbs: Clinical reality and academic promises. Sci. Transl. Med. 2014, 6, 257ps212. [Google Scholar] [CrossRef]
  138. Jones, I.; Johnson, M.I. Transcutaneous electrical nerve stimulation. Contin. Educ. Anaesth. Crit. Care Pain 2009, 9, 130–135. [Google Scholar] [CrossRef] [Green Version]
  139. Dupan, S.S.; McNeill, Z.; Brunton, E.; Nazarpour, K. Temporal modulation of transcutaneous electrical nerve stimulation influences sensory perception. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 3885–3888. [Google Scholar]
  140. Vargas, L.; Huang, H.; Zhu, Y.; Hu, X. Object shape and surface topology recognition using tactile feedback evoked through transcutaneous nerve stimulation. IEEE Trans. Haptics 2020, 13, 152–158. [Google Scholar] [CrossRef]
  141. D’anna, E.; Petrini, F.M.; Artoni, F.; Popovic, I.; Simanić, I.; Raspopovic, S.; Micera, S. A somatotopic bidirectional hand prosthesis with transcutaneous electrical nerve stimulation based sensory feedback. Sci. Rep. 2017, 7, 10930. [Google Scholar] [CrossRef]
  142. Schweisfurth, M.A.; Markovic, M.; Dosen, S.; Teich, F.; Graimann, B.; Farina, D. Electrotactile EMG feedback improves the control of prosthesis grasping force. J. Neural Eng. 2016, 13, 056010. [Google Scholar] [CrossRef]
  143. Garenfeld, M.A.; Mortensen, C.K.; Strbac, M.; Dideriksen, J.L.; Dosen, S. Amplitude versus spatially modulated electrotactile feedback for myoelectric control of two degrees of freedom. J. Neural Eng. 2020, 17, 046034. [Google Scholar] [CrossRef]
  144. Isaković, M.; Belić, M.; Štrbac, M.; Popović, I.; Došen, S.; Farina, D.; Keller, T. Electrotactile feedback improves performance and facilitates learning in the routine grasping task. Eur. J. Transl. Myol. 2016, 26, 6090. [Google Scholar] [CrossRef] [Green Version]
  145. Witteveen, H.J.; Droog, E.A.; Rietman, J.S.; Veltink, P.H. Vibro-and electrotactile user feedback on hand opening for myoelectric forearm prostheses. IEEE Trans. Biomed. Eng. 2012, 59, 2219–2226. [Google Scholar] [CrossRef]
  146. Svensson, P.; Antfolk, C.; Björkman, A.; Malešević, N. Electrotactile feedback for the discrimination of different surface textures using a microphone. Sensors 2021, 21, 3384. [Google Scholar] [CrossRef]
  147. Jiang, C.; Chen, Y.; Fan, M.; Wang, L.; Shen, L.; Li, N.; Sun, W.; Zhang, Y.; Tian, F.; Han, T. Douleur: Creating Pain Sensation with Chemical Stimulant to Enhance User Experience in Virtual Reality. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 1–26. [Google Scholar] [CrossRef]
  148. Lu, J.; Liu, Z.; Brooks, J.; Lopes, P. Chemical Haptics: Rendering Haptic Sensations via Topical Stimulants. In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology, Virtual Event, USA, 10–14 October 2021; pp. 239–257. [Google Scholar]
Figure 1. Examples of cutaneous and kinesthetic feedback devices: (a) the hRing, which applies skin stretch, reprinted with permission from [75]; (b) TacTiles, which uses electromagnetic brakes for applying pressure, reprinted with permission from [76]; (c) the LucidVR glove, which uses string-based actuators, retrieved from [77]; and (d) the ExoTen glove, which uses McKibben-based artificial muscles, reprinted with permission from [78].
Figure 2. The proposed workflow of a haptic feedback system: (a) tactile stimuli are received by the skin's mechanoreceptors; (b) a tactile sensor converts the same stimuli into data; and (c) a feedback device recreates the sensation for the user.
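To make the workflow in Figure 2 concrete, the following minimal Python sketch traces one pass through the sense-digitize-render loop. The TactileSample class, the read_sensor and render_feedback functions, and the numeric values are purely illustrative assumptions; a real system would substitute drivers for a specific tactile sensor and haptic actuator.

```python
# Minimal illustrative sketch of the sense -> digitize -> render loop of Figure 2.
# All names and values here are hypothetical placeholders, not an actual device API.

from dataclasses import dataclass


@dataclass
class TactileSample:
    pressure_kpa: float   # quasi-static pressure component (SA-type afferent range)
    vibration_hz: float   # dominant micro-vibration frequency (FA-type afferent range)


def read_sensor() -> TactileSample:
    """(b) Digitize the stimulus: stand-in for an ADC read from a tactile sensor."""
    return TactileSample(pressure_kpa=12.0, vibration_hz=180.0)


def render_feedback(sample: TactileSample) -> dict:
    """(c) Map the measured stimulus onto actuator commands (e.g., a vibrotactile driver)."""
    amplitude = min(sample.pressure_kpa / 50.0, 1.0)  # normalize to the actuator's range
    return {"drive_frequency_hz": sample.vibration_hz, "drive_amplitude": amplitude}


if __name__ == "__main__":
    sample = read_sensor()              # (a) -> (b): stimulus captured as data
    command = render_feedback(sample)   # (b) -> (c): data recreated as a sensation
    print(command)
```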
Table 1. Comparative study of various sensors used for touch sensing.
| Sensing Modality | Design | Advantages | Disadvantages |
| --- | --- | --- | --- |
| Strain gauge | [42,43,44] | Low-cost; versatile; good sensing range | Prone to errors from moisture; requires supplementary devices to amplify data; difficult to assemble |
| Accelerometer | [45,46] | Good accuracy; versatile; high precision | Noise |
| Piezoresistive | [47,48,49,50,51,52,53,54] | High accuracy; high spatial resolution; small and light | High power |
| Piezoelectric | [55,56,57,58,59] | High sensing range; high precision | Poor spatial resolution; limited to dynamic touch scenarios |
| Optical | [60,61] | High accuracy; high precision; good spatial resolution | Bulky |
| Multimodal | [62,63,64,65] | Compensates for other sensor limitations | High cost; difficult to manufacture |
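As a rough illustration of how Table 1 might be used in practice, the short Python sketch below restates the table as a lookup and filters out modalities whose listed drawbacks conflict with a given requirement. The TABLE_1 dictionary and the exclude helper are hypothetical names introduced only for this example, and the entries simply mirror the qualitative advantages and disadvantages above.

```python
# Hypothetical encoding of Table 1 as data, e.g., for shortlisting a sensing modality.
# The strings mirror the table entries; nothing here is an additional measurement.

TABLE_1 = {
    "Strain gauge":   {"refs": "[42-44]",
                       "pros": ["low cost", "versatile", "good sensing range"],
                       "cons": ["moisture errors", "needs amplification hardware", "difficult to assemble"]},
    "Accelerometer":  {"refs": "[45,46]",
                       "pros": ["good accuracy", "versatile", "high precision"],
                       "cons": ["noise"]},
    "Piezoresistive": {"refs": "[47-54]",
                       "pros": ["high accuracy", "high spatial resolution", "small and light"],
                       "cons": ["high power"]},
    "Piezoelectric":  {"refs": "[55-59]",
                       "pros": ["high sensing range", "high precision"],
                       "cons": ["poor spatial resolution", "dynamic touch only"]},
    "Optical":        {"refs": "[60,61]",
                       "pros": ["high accuracy", "high precision", "good spatial resolution"],
                       "cons": ["bulky"]},
    "Multimodal":     {"refs": "[62-65]",
                       "pros": ["compensates for other sensor limitations"],
                       "cons": ["high cost", "difficult to manufacture"]},
}


def exclude(drawback_keyword: str) -> list[str]:
    """List modalities whose disadvantages do not mention the given keyword."""
    return [name for name, row in TABLE_1.items()
            if not any(drawback_keyword in con for con in row["cons"])]


# Example: candidate modalities without a spatial-resolution drawback.
print(exclude("spatial"))
```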