Review

A Systematic Review of Commercial Smart Gloves: Current Status and Applications

by Manuel Caeiro-Rodríguez *, Iván Otero-González, Fernando A. Mikic-Fonte and Martín Llamas-Nistal

atlanTTic Research Center for Telecommunication Technologies, Universidade de Vigo, 36312 Vigo, Spain

* Author to whom correspondence should be addressed.
Sensors 2021, 21(8), 2667; https://doi.org/10.3390/s21082667
Submission received: 4 March 2021 / Revised: 2 April 2021 / Accepted: 8 April 2021 / Published: 10 April 2021
(This article belongs to the Special Issue Applications of Body Worn Sensors and Wearables)

Abstract

Smart gloves have been under development during the last 40 years to support human-computer interaction based on hand and finger movement. Despite many devoted efforts and multiple advances in related areas, these devices have not become mainstream yet. Nevertheless, during recent years, new devices with improved features have appeared and are also being used for research purposes. This paper provides a review of current commercial smart gloves focusing on three main capabilities: (i) hand and finger pose estimation and motion tracking, (ii) kinesthetic feedback, and (iii) tactile feedback. For the first capability, a detailed reference model of the hand and finger basic movements (known as degrees of freedom) is proposed. Based on the PRISMA guidelines for systematic reviews for the period 2015–2021, 24 commercial smart gloves have been identified, while many others have been discarded because they did not meet the inclusion criteria: currently active commercial and fully portable smart gloves providing some of the three main capabilities for the whole hand. The paper reviews the technologies involved and the main applications, and discusses the current state of development. Reference models to support end users and researchers in comparing and selecting the most appropriate devices are identified as a key need.

1. Introduction

Over recent years, virtual, augmented and mixed reality systems (also known as extended reality or XR) have evolved significantly, yielding richer immersive experiences. Current low-cost head mounted displays (HMDs), such as the Oculus Rift or HTC Vive, provide high-fidelity 3D graphically-rendered environments that enable users to immerse themselves in virtual experiences as never before. These solutions are especially focused on the visual and auditory senses. Nevertheless, for a more realistic experience, other senses should be considered, particularly haptic feedback based on kinesthetic and tactile interactions [1]. Research has already shown that users feel more immersed in XR if they can touch and receive sensations in the form of haptic interaction [2]. Similarly, interaction based on active movements contributes to the “sense of agency”, that is, the sense of having “global motor control, including the subjective experience of action, intention, control, motor selection and the conscious experience of will” [3].
To date, most commercial solutions use hand-based controllers with click buttons and inertial sensors for user interaction with XR devices. In many cases, vibration motors are also included to provide some kind of haptic feedback; for example, when a collision with an object or a structure (e.g., a wall) occurs, a vibration alert is provided [4]. There are also solutions that perform some kind of body tracking, enabling the representation of the user or a part of him/her (e.g., his/her hand) in the virtual scenario [5]. Nevertheless, these solutions are not perceived as natural [6], particularly because while users hold the provided controllers they cannot grab or touch objects in the virtual experience. Therefore, the development of truly immersive XR demands other kinds of devices that facilitate a more natural human interaction, particularly freeing the user’s hands, recognizing gestures, and offering haptic feedback that allows users to feel what is happening in the virtual experience as if it were real [7].
Of all the solutions considered, the concept of smart gloves is the most promising one to improve the immersive sensation and the degree of embodiment and presence in virtual/mixed reality [8,9,10]. Smart gloves are intended to enable users to touch and manipulate virtual objects in a more intuitive and direct way. They are also intended to provide sensory stimuli that can be perceived by the human hands, particularly kinesthetic and tactile feedback that simulates touching and manipulating objects. Non-functional requirements are also important: the glove device should be small, light, easy to carry and comfortable, and it should not impair the motion and actions of the wearer. In addition, it should be adjustable to the variety of sizes and forms of human hands and fingers. There is a general understanding that this kind of device would enable users to experience more realistic XR, support patients’ rehabilitation, remote teleoperation, virtual surgery and experimentation, implementation of work sites, playing videogames, etc.
The vision for this kind of more interactive and immersive glove technology is nothing new. The first proposal of a hand-based device was made more than 40 years ago, in 1978 [11]. In 1982, Zimmerman applied for a patent (USA Patent 4542291) for a flexible optic sensor worn in a glove to measure the flexion of the fingers [12]. Zimmerman worked with Lanier to include ultrasonic and magnetic technology to track the hand position and create the Power Glove and the Data Glove (US Patent 4988981) [13]. Since then, and throughout all these years, the pursuit of a device that facilitates hand-based interaction has been continuous, exploring different technologies and approaches. It is interesting to notice the different names used to refer to this kind of device (in alphabetical order): “cyber gloves”, “data gloves”, “force-feedback gloves”, “glove-based systems”, “haptic gloves”, “sensory gloves”, “smart gloves”, “virtual gloves”, or “VR gloves”. Generally, the name is used to highlight some main purpose or device capability. For example, “data gloves”, by far the most frequently used name, highlights the capability to capture data from glove sensors, mainly related to hand and finger pose estimation and motion tracking. Meanwhile, “haptic gloves” is used to name those devices capable of providing some kind of kinesthetic or tactile feedback, although they generally also involve some data capture capability. In this paper, we prefer the name “smart gloves” because, although it is only the second most used name after “data gloves”, it better encompasses the variety of purposes and capabilities.
Despite the many years devoted to the development of smart gloves, failures to satisfy the complex requirements have been continuous and this device has not become mainstream yet. In any case, the research focused on smart gloves has not declined and, during recent years, there has been growing interest, particularly in the commercial area. The great advances in related technologies, such as wearables and HMDs, have fueled the emergence of new initiatives. Nowadays, there is a good number of commercial smart gloves and, more interestingly, many pieces of research are being developed based on them. At this point, a main problem is analyzing the features of the different gloves in order to decide on the most appropriate one for a certain application. The goal of this paper is to offer a classification and analysis of existing commercial smart gloves, distinguishing among the different goals and providing a common basis for decision making.
The rest of the paper is organized as follows. The next section describes the hand anatomy and the possible movements that can be produced and captured by smart gloves. Then, Section 3 introduces the related work, focusing on other surveys and reviews performed on this topic over the years. Next, the method followed to carry out this review based on PRISMA is described. Section 5 introduces the 24 commercial smart gloves identified and Section 6 analyzes them considering three main capabilities: hand and finger pose estimation and motion tracking, kinesthetic feedback and tactile feedback. Section 7 reviews the main application areas of these gloves, based on the scientific literature and on the information provided by smart glove vendors. Finally, Section 8 provides a discussion about existing smart gloves and Section 9 presents the conclusions of the paper.

2. The Human Hand

The features of smart gloves are closely related to the anatomy and physiology of the human hand. The concept of degree of freedom (DoF), which refers to the different basic movements that can be performed with the hand and fingers, is particularly important [14,15,16]. More complex movements can be performed as combinations of basic ones.
Before considering the DoF, it is important to have a good knowledge of the human hand anatomic structure, see Figure 1. A hand is made up of five fingers. Each finger, except the thumb, has three bones (distal, intermediate, and proximal phalanges) and three joints: metacarpophalangeal (MCP), proximal interphalangeal (PIP), and distal interphalangeal (DIP). The thumb has two bones (distal and proximal phalanges) and two joints: metacarpophalangeal (MCP) and interphalangeal (IP). Nevertheless, the thumb has an additional mobile joint: the trapeziometacarpal (TM).
Regarding movement, the human hand can be modeled with 23 DoF [14]: four in each of the four fingers, four in the thumb and three in the wrist.
For each finger, except the thumb, the PIP and DIP joints can perform an extension/flexion (E/F) movement, while the MCP joint can perform E/F and adduction/abduction (A/A) movements, see Figure 2. In practice, depending on the subject, some movements at certain joints cannot be performed independently. For example, many people cannot perform DIP E/F without performing PIP E/F. Conversely, other movements, such as hyperextension and supraduction, can be produced by injuries, but they are not considered here.
The thumb has four DoF: the IP and MCP joints have an E/F movement and the TM has E/F and A/A movements. Thumb E/F and A/A movements are represented in Figure 3. Active A/A of the thumb MCP joint is limited and considered an accessory motion; therefore, we do not consider it. Notice that these movements are not rectilinear along each axis separately; rather, they are usually carried out jointly, resulting in complex rotational movements.
The wrist provides three more DoF to complete 23 DoF for the whole hand, see Figure 4:
  • E/F or pitch. Extension is the dorsal tilt movement where the hand approaches the back of the wrist. Flexion is the palmar tilt movement where the hand approaches the anterior aspect of the wrist.
  • A/A or yaw. Abduction is the radial or lateral deviation movement where the hand moves away from the midline of the body. Adduction is the ulnar or medial deviation movement where the hand approaches the midline of the body.
  • Pronation/supination (P/S) or roll. Pronation is the internal rotation movement from a neutral position, so that the hand rotates until the back of the hand is facing up (position to catch bread). Supination is the external rotation movement from a neutral position, so that the hand rotates until the palm of the hand is facing upwards (begging position).
The hand can perform other movements, such as palm bending, but these movements are less important, particularly regarding user interaction. Usually, smart gloves are not developed to detect them.
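To make this reference model concrete, the following minimal sketch (in Python, purely illustrative and not taken from any glove SDK) enumerates the 23 DoF described above; joint labels follow the terminology used in this section.

# Minimal sketch of the 23-DoF hand reference model described above.
# Joint and movement names follow the text; the structure itself is illustrative.

FINGERS = ["index", "middle", "ring", "pinky"]

def hand_dof_model():
    dof = []
    # Four DoF per non-thumb finger: MCP E/F, MCP A/A, PIP E/F, DIP E/F.
    for finger in FINGERS:
        dof += [
            (finger, "MCP", "extension/flexion"),
            (finger, "MCP", "adduction/abduction"),
            (finger, "PIP", "extension/flexion"),
            (finger, "DIP", "extension/flexion"),
        ]
    # Four DoF in the thumb: TM E/F, TM A/A, MCP E/F, IP E/F.
    dof += [
        ("thumb", "TM", "extension/flexion"),
        ("thumb", "TM", "adduction/abduction"),
        ("thumb", "MCP", "extension/flexion"),
        ("thumb", "IP", "extension/flexion"),
    ]
    # Three DoF in the wrist: E/F (pitch), A/A (yaw), P/S (roll).
    dof += [
        ("wrist", "-", "extension/flexion"),
        ("wrist", "-", "adduction/abduction"),
        ("wrist", "-", "pronation/supination"),
    ]
    return dof

assert len(hand_dof_model()) == 23  # 4 x 4 fingers + 4 thumb + 3 wrist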

3. Related Work

As introduced above, smart gloves have a history of more than 40 years. During this time, several surveys have been published in the scientific literature, most of them during the last few years, see Figure 5.
The first survey about glove-based input and electronic gloves was published as early as 1994 [17]. Hands were already considered the natural way of human interaction with the world, in contrast to the common way of interaction with computers constrained by “clumsy intermediary devices such as keyboards, mice and joysticks” [7]. The goal, at this time, was to collect data about the movement and pose of the hand and fingers, naming these devices “data gloves”. Some glove devices were already commercialized, mainly related to the needs of the video game industry: the Visual Programming Language (VPL) Data Glove (VPL Research, San Francisco, CA, USA), considered the first data glove, which appeared in 1987; the Exos Dexterous HandMaster (Dexta Robotics, Hong Kong, China); the Mattel Intellivision Power Glove (Mattel, Inc., El Segundo, CA, USA) as a low-cost version to be used as a control device for the Nintendo video game console in 1989; the CyberGlove from Virtual Technologies (Maumee, OH, USA); and W Industries’ (Houston, TX, USA) Space Glove. This first survey was focused on the hand-tracking features of the gloves, based on three technologies: optical, magnetic and acoustic. Most gloves, for example the VPL Data Glove, were based on the use of optical fibers along the fingers, attenuating the light they transmit when the finger flexion bends the fibers. In other cases, they used Hall-effect sensors as potentiometers at the finger joints that were also able to measure the bending. Despite the existence of these devices as commercial products, this survey concluded that the area of glove-based input was in its infancy. Features such as haptic feedback or wearability were not considered at all.
The next survey in the literature about smart gloves was published fourteen years later, in 2008 [11], reflecting slow progress in the technology. This survey used the name “glove-based systems”, described as “composed of an array of sensors, electronics for data acquisition/processing and power supply, and a support for the sensors that can be worn on the user’s hand”. Typical gloves at that time were described as “a cloth glove made of Lycra where sensors are sewn”. They had limitations in terms of portability, as they required wired physical connections, as well as limited haptic sensing and naturalness of movement. At that time, actuators and kinesthetic feedback were considered glove accessories and not an essential feature. Thirty different gloves are described in this survey, both commercially available products and prototypes, ranging from 1978 to 2008 and classified into three evolutionary stages:
  • Early research. Gloves equipped with a limited number of sensors, hard-wired and developed to serve specific applications, never commercialized.
  • Data glove-like systems. These shared three basic design concepts: (i) they measured finger joint bending using bend sensors; (ii) they used a cloth for supporting sensors; and (iii) they were usually meant to be general-purpose devices. Several commercialized products are referenced: the VPL Data Glove in 1987 by VPL Research, Inc.; the Power Glove by Mattel Intellivision; the Super Glove in 1995 by Nissho Electronics (Tokyo, Japan); and the P5 Glove in 2002 by Essential Reality, LLC (Shelby, OH, USA). Other devices mentioned were the Space Glove, CyberGlove (CyberGlove Systems LLC, San José, CA, USA), Humanglove (Humanware, Pisa, Italy), 5DT Data Glove (Fifth Dimension Technologies, Orlando, FL, USA), TCAS Glove (T.C.A.S. Effects Ltd.), StrinGlove (Teiken Limited, Osaka, Japan) and Didjiglove (Didjiglove Pty Ltd., Melbourne, Australia). Interestingly, CyberGlove has been a major reference in the domain but was recently discontinued (see Section 4), while 5DT is still active (see Section 5).
  • Beyond data gloves. This category gathers devices with no cloth, such as rings, that use new sensor technologies (e.g., infrared LEDs and changes in skin coloration, accelerometers, LED scanners), trying to support specific applications, particularly alphanumeric character entry.
The next survey, published nine years later in 2017 [18], was focused on wearable haptics for the finger and the hand and not on data capture capabilities. In contrast to grounded and bulky haptic devices, the paper highlights the efforts to provide “wearable haptic systems for the fingertip and the hand, focusing on those systems directly addressing wearability challenges”, such as the CyberGrasp exoskeleton (CyberGlove Systems LLC, San José, CA, USA) or the Rutgers Master (Burdea, Romania), which were, however, still too complex and expensive in consumer terms. This survey provides a classification distinguishing among the type of tactile stimuli provided to the wearer (kinesthetic, pressure, contact, vibration, curvature, softness); the area of the end-effector (fingertip or whole hand); the technologies (e.g., DC motors, air jet nozzles, servo motors, voice coils, vibrating motors, pneumatic actuators, dielectric elastomer actuators); and the level of wearability (weight and dimensions). It analyzes 23 fingertip devices and 23 haptic devices for the whole hand. These devices were prototypes described in papers available in the scientific literature and were used to provide kinesthetic and tactile (pressure, contact or vibration) stimuli. No commercial devices were analyzed.
The next year, 2018, haptic gloves were reviewed in [19], including a good number of commercial products. This survey distinguishes among traditional gloves, thimbles and exoskeletons. In total, 13 different devices were analyzed, considering that, although they belong to different classes, they all share the same objectives and constraints. In more detail:
  • The “traditional glove” refers to “a garment made of some sort of flexible fabric, which fits the shape of the hand and lets the fingers move individually”. “The sensors to measure the flexion of the fingers and the actuators to apply a feedback on the skin or skeleton are either sewn within the fabric or fixed on the outside of these gloves”. As examples of this type, the authors include the Avatar VR by Neurodigital Technologies (Seville, Spain), which evolved to become the Sensorial XR, included in this paper. Other examples described have been discontinued, such as the Maestro (Markham, ON, Canada).
  • A “thimble” is “a configuration with an actuator attached to a fingertip. It is possible to combine several thimbles in order to provide feedback on several fingers at the same time. In such a way, a function similar to that of a haptic glove can emerge.” In this paper, we do not consider these devices or their combinations as commercial gloves. In any case, some commercial solutions, such as Polhemus (Polhemus, Colchester, VT, USA), could be considered in this category.
  • Exoskeletons. “An exoskeleton is an articulated structure which the user wears over his/her hand and which transmits forces to the fingers…”, enabling in this way the provision of kinesthetic feedback. Examples of commercial exoskeletons are: CyberGrasp, HaptX (HaptX Inc., San Luis Obispo, CA, USA), Dexmo by Dexta Robotics (Hong Kong, China), VRGluv (VRgluv, GA, USA), Sense Glove DK1 (Sense Glove, Delft, The Netherlands) and HGlove (Haption SA, Soulgé-sur-Ouette, France). Some of them have been discontinued recently.
The next year, 2019, three papers were published that can be considered smart glove surveys, two of them by the same authors: Wang et al. The first one was focused on force feedback gloves [20]. It includes a detailed classification featuring motion tracking and kinesthetic feedback capabilities. The specifications used to quantify the performance of motion tracking are: degrees of freedom (DoF), motion range, sensing accuracy and update rate. For kinesthetic feedback, the following specifications were used: dimension (actuated DoF), range of applicable force, resolution and dynamic response of feedback forces. This work analyzed several research prototypes and two commercial gloves: CyberGrasp (CyberGlove Systems LLC, San José, CA, USA) and Dexmo (Dexta Robotics, Hong Kong, China). Gloves were also classified according to the location of the actuation into four sub-categories:
  • Ground-based systems. The base is fixed on the ground or a desk. From our point of view, these are not real smart gloves.
  • Dorsal-based systems. It is a wearable exoskeleton system grounded to the back of the hand.
  • Palm-based systems. Grounded to the users’ palm. The force is provided directly between the fingers and the palm to simulate palm opposition type grasping.
  • Digit-based systems. Grounded to the digit, provides forces directly between the finger and the thumb to simulate pad opposition or precision type grips.
The second review paper by Wang et al. focused on haptic displays for VR [21]. It distinguishes among desktop haptics, surface haptics, and wearable haptics. Haptic gloves are considered in the case of wearable haptics, providing both force and tactile feedback to the fingertips and the palm. This work references commercial gloves providing motion tracking, force feedback and tactile feedback. Nevertheless, as this work is about haptic displays in general, it does not provide a detailed analysis of the features of the commercial gloves.
Also in 2019, a survey about wearable technologies for hand joint monitoring for rehabilitation was published [22]. This survey introduces several smart gloves, some of them commercial, analyzing their capability to support human hand rehabilitation. The different gloves are classified according to their technology, distinguishing among the following:
  • Flex sensor-based technologies, referencing the commercial ones: CyberGlove III (CyberGlove Systems LLC, San José, CA, USA), 5DT Data Glove (Fifth Dimension Technologies, Orlando, FL, USA), X-IST Data Glove (SouVR International Trading Co. Ltd., Beijing, China) and DG5 VHand 2.0 Data Glove (DGTech Engineering Solutions, Bazzano, Italy).
  • Accelerometer-based technologies, referencing the commercial ones: KeyGlove (Jeff Rowberg, Roanoke, VA, USA) and AcceleGlove (Washington, DC, USA).
  • Hall-effect sensor-based technologies, referencing the commercial one Humanglove.
  • Stretch sensor-based technologies. No commercial gloves are referenced.
  • Magnetic sensor-based technologies. No commercial gloves are referenced.
  • Vision-based technologies. These cannot be considered as smart gloves, because they are based on the use of external cameras and gloves painted with different colors to facilitate the recognition of the fingers and hand position.
Last year (2020), a survey about hand pose estimation with wearable sensors and computer-vision-based methods was published [23]. It analyzes various types of gloves and computer-vision-based methods proposed for hand pose estimation in recent years. This paper introduces a sensor taxonomy for gloves, distinguishing among bend (flex) sensors, stretch (strain) sensors and other types, such as inertial measurement units (IMUs) and magnetic sensors. In addition, references to research and commercial gloves and the types of sensors used are included. For the commercial ones, they mention: CyberGlove III, 5DT and Hi5. This survey also describes computer-vision-based methods, based on the use of cameras to capture RGB images or commodity depth sensors, which enable the creation of depth maps. A major problem for these methods is occlusion, as they rely on line-of-sight observation; the hands are very likely to be blocked or partially blocked while performing activities and manipulating objects. In conclusion, this is a main argument in favor of developing smart gloves. This survey does not consider any kind of haptic feedback features in gloves.
Also in 2020, a paper about tactile feedback in VR was published [24], where four commercial gloves are referenced: Manus VR (Manus Machinae B.V., Geldrop, The Netherlands), VR Free (Sensory AG, Zürich, Switzerland), Plexus VR (Digital Kinematics, London, UK) and Dexmo VR (Dexta Robotics, Hong Kong, China). In any case, this cannot be considered a survey, as the gloves are neither described nor analyzed.
None of the existing surveys was conducted following a review methodology, such as PRISMA. In addition, previous surveys are focused on some specific kind of capability, such as hand or finger tracking, but not on the variety of capabilities. Moreover, some of the surveys do not involve gloves only, but also other kinds of devices, such as thimbles or rings. More importantly, no survey has focused on commercial devices exclusively. Therefore, taking into account the current interest in these devices, we consider a survey about smart gloves an interesting topic.

4. Methods

The present survey of smart gloves is aligned with the PRISMA guidelines for systematic reviews and meta-analyses [25]. As this is the first systematic review on this topic, the review protocol has not been registered. Data was sourced from published articles, see Figure 6. The primary databases searched were MDPI, Elsevier, Springer, Taylor and Francis, and IEEE Xplore. As a secondary source, Google Scholar was used. The primary keywords were “cyber glove”, “data glove”, “force-feedback glove”, “glove-based system”, “haptic glove”, “sensory glove”, “smart glove”, “virtual glove” and “VR glove”. In cases where the plural form provided additional relevant resources, it was also used. The inclusion and exclusion criteria were applied in a title and abstract screening followed by a full-text screening process. Although a large number of references were found in the primary and secondary sources, the screening process was quite straightforward because we focused on commercial devices. Many papers were discarded because no mention of commercial smart gloves was found. Next, 598 were discarded because they just mentioned some commercial glove, but did not describe its use or provide an analysis of it.
Searches were limited to the period 2015–2021 and only papers in the English language were included. The inclusion criterion was being an active commercial wearable device for the full hand. Therefore, the exclusion criteria were as follows:
  • Non-commercial devices. Many papers in the literature describe the development or proposal of smart gloves, based on the use of special sensors or actuators or developed towards a specific purpose. Some devices commercialized in restricted contexts were not included either, such as the NuGlove (Anthro Tronix, Silver Spring, MD, USA), only available for military purposes.
  • Recently discontinued devices. There are many commercial smart gloves that are no longer available in the market. The most relevant case is related to the four different solutions (CyberGlove, CyberTouch, CyberGrasp and CyberForce) produced by Cyberglove Systems, discontinued in 2019. Remarkably, this company has a long history in the smart gloves domain, including 446 references in the literature since 2015. Other examples of recently discontinued products are: HaptX, Keyglove, HumanGlove, Plexus, Maestro Gesture Glove, Teslasuit Gloves (VR Electronics Ltd., London, UK) and VRGluv. In some cases, discontinued gloves evolved into new products from different companies, such as the AcceleGlove, which evolved to become the NuGlove and the GoGlove. It is more common for a company to issue a new version of its device under a different name, such as Neurodigital with the Gloveone and Avatar VR, prior to the Sensorial XR. In the case of the Peregrine by IronWill, although not currently commercialized, it has been announced that it will be available in the second half of 2021.
  • Non-wearable devices. Some devices are not really wearable, but they are connected by strings or rigid structures to some ground system or device to be carried by the user. For example: the ExoHand by Festo (Esslingen, Germany) is attached to a pneumatic system, and the HGlove by Haption is connected to a rigid structure, as are the Gloreha Sinfonia (Lumezzane, Italy) and the Esoglove (Roceso Technologies, Bukit Merah, Singapore).
  • Devices that do not allow one to use the hands freely. For example: Microsoft Haptic Pivot and Valve Index controllers. Commercially these devices are very relevant, as they are linked to some of the main companies in the XR domain. Nevertheless, they are based on the use of vision-based solutions to recognize the hand and finger pose and movement. This limits user movements as the user can clash with objects while moving the hands.
  • Not full-hand devices. There are some proposals that consider just some fingers or a part of the hand, such as rings or thimbles, or only the wrist. For example: Fingertracking by ART (Advanced Realtime Tracking, Bayern, Germany), Polhemus, and EXOS wrist DK2 by Exiii (Exiii Inc., Tokyo, Japan).
  • Devices that do not provide real smart gloves capabilities. For example, some gloves just detect touches at specific hand/finger locations, such as the touch of the fingertips of the thumb and index finger. They are intended to be used as a kind of remote control, such as Saebo Glove (Saebo, Charlotte, NC, USA) and GoGlove (GoGlove, Los Angeles, CA, USA); or for rehabilitation purposes, such as the MusicGlove (Flint Rehab, Irvine, CA, USA). In other cases, the purpose is to quantify the pressure applied to and exerted by the hand, such as the TactileGlove (PPS, Boston, MA, USA). In any case, these gloves have some capability to enable hand interaction in XR environments.
In total, 29 devices have been discarded. They are referenced in Table 1.

5. Results

This section introduces the commercial smart gloves identified in the survey performed in accordance with the method described in the previous section. The identified gloves are shown in Table 2 and Figure 7. Despite the numerous discards, the number of gloves is quite large: 24. The table includes information about the country of the company, URL, glove type as described in Section 5.1, capabilities supported and price. Three main capabilities are recognized (see Section 5.2): hand and finger pose estimation and motion tracking, kinesthetic or force feedback, and tactile feedback. Additionally, Section 5.3 analyzes issues related to ergonomics and wearability.
Smart glove companies can be found all around the world, mainly in the USA and Europe, but also in China, Russia, Israel and New Zealand. The discarded gloves include products from other countries, such as Japan or Canada, showing the global interest in this type of device.
Many companies have different versions of their gloves active: 5DT, Dexmo, Synertial, Manus, Nansense, Noitom, SenseGlove and VMG. In many cases, they provide the same glove with a different number of sensors to enable the tracking of more DoF, such as 5DT, Cobra Glove, Dexmo, Nansense and VMG. In other cases, companies offer gloves with different capabilities, such as Manus and VMG, which offer one model supporting just tracking and positioning and another model providing kinesthetic feedback. As particular cases, the Chinese company Noitom sells the Hi5 VR and the Perception Neuron Studio Gloves (although different URLs may be shown in Table 2), and SenseGlove from the Netherlands has two gloves with similar features based on different technologies. Finally, some of the gloves are not the first ones developed by the company, but evolutions of previous models, such as the Manus and Neurodigital devices.
Many of the companies producing smart gloves are startups that have the gloves as their only product. Some examples are CaptoGlove, ManusPrime, SenseGlove, SensorialXR and VRGluv. Notice that VRGluv was launched in 2017 on a crowdfunding web page, but it has since been discontinued. In other cases, smart glove companies are involved in the motion capture business, and they have other products such as body suits to recognize human body movements. This is the case of Cobra Glove, Nansense, Perception Neuron and Rokoko. In many of these cases, the gloves are not sold separately. The New Zealand company StretchSense is based on a special stretch sensor technology that is also included in other products of the company.
Regarding prices, it is important to notice that these are approximations. We have tried to indicate the cost of a pair of gloves. Nevertheless, there are some variations depending on the accessories included, such as batteries or connectivity options, and also related to special guarantees or licenses. In any case, the prices are generally above €1000.

5.1. Glove Types

Commercial smart gloves are usually classified into two main categories: exoskeleton and fabric. In addition, when we pay attention to Figure 7, it is possible to identify other distinguishing features, such as strips and open fingertips. As a result, we propose the following classification:
  • Exoskeleton. This refers to a structure located on the back of the hand involving strings or rigid links attached to the fingers. They are used to provide kinesthetic feedback to the hands.
  • Fabric. This refers to a piece of fabric that covers the full hand and fingers. Inside the fabric, some sensors and actuators are included to perform the desired capability.
  • Strips of fabric, plastic or other materials. Some smart gloves do not completely cover the skin of the fingers and hand. Instead, they have fabric, plastic or other materials only in the locations where the sensors and actuators are placed. This kind of glove can facilitate fitting to hands and fingers of different shapes and sizes.
  • Open fingertips. Some smart gloves have open fingertips. This feature facilitates the use of touch screens and other activities where the finger sensitivity is important. It can also enable a better glove fitting.

5.2. Capabilities

Smart gloves can be used for different purposes and the following ones are generally recognized [20,26,27]:
  • Hand and finger pose estimation and motion tracking. This capability is also known as “hand posture reconstruction”, “hand movement tracking” and “hand movement synthesis”. It involves the capability to measure the position and movements of the fingers and the whole hand, as described in Section 2. Motion tracking is necessary to detect the user’s manipulation gestures and to drive the motion of a hand avatar in virtual environments. Another issue related to hand and finger pose estimation is gesture detection. Gestures can be determined from hand and finger tracking, but there are some gloves that also determine gestures by other means. For example, gestures such as the joining of two fingers can be detected with sensors located at the fingertips. High DoF and a large motion range are required for recognizing dexterous manipulation and grasping. Furthermore, high resolution and update rate are required for simulating fine manipulation and actions such as the pushing of a button. The specifications used to quantify the performance of motion tracking are [20]: DoF, motion range, resolution and sampling/update rate. In addition to position and motion, it would also be very interesting to measure the force exerted, but this is more complex and is not supported by existing commercial gloves.
  • Haptic feedback. This is related to the human perceptual system which includes various kinesthetic and cutaneous receptors in our body, located in the skin, muscles or tendons. Haptics technology simulates the sense of touch in computing [28] and involves two different features [19]:
    Kinesthetic or force feedback. This refers to providing the impression of movement and resistance through the muscles, such as the feeling of weight, inertia, or resistance. It involves the reproduction of movements and resistances by means of actuators, such as electric motors, to exert specific forces on the hand and fingers. This can be used to simulate the touch of immovable objects such as walls, the grasping of virtual objects, the use of triggers, etc. To this end, a sufficient range of force/torque and magnitude is required. In addition, it is also important to have a good force resolution in order to simulate subtle changes and contact with small objects. The following features are important for kinesthetic feedback [20]: dimension (actuated DoF), range of applicable force (e.g., maximum fingertip force), resolution and update rate. Notice that kinesthetic feedback requires hand tracking, but not vice versa.
    Tactile feedback. Tactile feedback devices provide input to the user’s skin [29] to recreate different sensations, such as shape, texture, thermal sensations, smoothness, etc. In haptic devices, this is achieved through different elements [21], such as mechanical vibration, surface shape changing and friction modulation. In the case of commercial smart gloves, mechanical vibration is the option in use, involving the use of motors, linear resonant actuators, voice coils, solenoids and piezoelectric actuators.

5.3. Ergonomics and Wearability

Ergonomics and wearability concepts are related to the usability of the smart glove. It is desirable that smart gloves are comfortable to wear, easy to put on and off and do not limit or restrict the activities performed by the user. In addition, to avoid users’ fatigue, gloves should be as lightweight as possible, including its battery and controller. Another requirement is related to safety, as gloves should never injure the user even in the occurrence of system failures, particularly in case of kinesthetic gloves.
There are many features that can be considered related to ergonomics and wearability: size, weight, power consumption, etc. Kinesthetic gloves [20] additionally involve the method of mounting the glove on the human hand and the way of transmitting forces and torques to the fingers, which makes them especially complex. From all the possible features, and taking into account the information available for the different products, we have gathered the following ones, see Table 3:
  • Size. Gloves should fit an arbitrary size and form of the hand or should be easily adaptable. This constraint can be addressed by offering a selection of sizes within a certain working range. In any case, some gloves just provide a unique size.
  • Weight. Gloves should not be heavy. From the data collected, weight varies from 50 to 300 g.
  • Battery and Autonomy. Most gloves include some kind of battery. Autonomy varies between 2 and 10 h, but this depends on the operation level.
  • Wireless. The ability to work without cables improves freedom of movement, especially in cases where manual activity measurement is required. Most gloves include some kind of wireless technology, whether Bluetooth or Wi-Fi based. Most models also include a cable connection and, in some cases (e.g., Exo Glove, MoCap Pro SuperSplay), an SD card to store the captured data.

6. Analysis

This section analyzes the identified commercial gloves paying attention to the three main capabilities recognized: hand and finger pose estimation and motion tracking, kinesthetic feedback and tactile feedback. Therefore, the next subsections describe the features of the smart gloves under each of these capabilities.

6.1. Gloves for Hand and Finger Pose Estimation and Motion Tracking

The main smart glove capability is hand and finger pose estimation and motion tracking. This can be achieved using different kinds of sensors (e.g., IMUs, stretch and strain sensors) located on gloves, or using visual-based solutions, such as the Leap Motion based on infrared sensors [29]. Visual-based solutions present important problems, such as occlusion, that cannot be easily solved, as they cannot capture hand and finger out-of-sight movements. In some cases, mixed solutions are proposed, with gloves including special markers that can be easily recognized by image sensors (e.g., the Vicon motion system described at vicon.com, last accessed on 27 February 2021). These systems offer higher precision and faster measurements than the markerless vision-based ones.
The approach to capture movements through sensors generally involves a mapping from the sensor output to hand and finger joint angles. Some devices allow direct measurement of all finger DoF. For others, inverse kinematics may be used, for example, to calculate internal joint angles from the position of the fingertips relative to the palm. Relationships between the DIP and PIP angles can also be enforced to reduce the active number of DoF [30].
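As an illustration of this last point, a fixed coupling between DIP and PIP flexion is often assumed in hand models (a commonly used approximation is DIP ≈ 2/3 · PIP). The following minimal Python sketch shows how such a relationship can be enforced so that DIP flexion does not need its own sensor; the 2/3 ratio and the angle values are assumptions used only for illustration, not parameters of any specific glove.

# Minimal sketch: enforcing a DIP-PIP coupling so DIP flexion does not
# need its own sensor.  The 2/3 ratio is a commonly assumed coupling,
# used here purely for illustration.

DIP_PIP_RATIO = 2.0 / 3.0  # assumed coupling coefficient

def estimate_dip_from_pip(pip_flexion_deg: float) -> float:
    """Estimate DIP flexion (degrees) from a measured PIP flexion angle."""
    return DIP_PIP_RATIO * pip_flexion_deg

def finger_flexion_angles(mcp_deg: float, pip_deg: float) -> dict:
    """Build the flexion angles of one finger from two measured joints,
    reducing the actively sensed DoF from three to two."""
    return {
        "MCP": mcp_deg,
        "PIP": pip_deg,
        "DIP": estimate_dip_from_pip(pip_deg),
    }

# Example: a PIP flexion of 60 degrees yields an estimated DIP flexion of 40 degrees.
print(finger_flexion_angles(mcp_deg=45.0, pip_deg=60.0))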
Table 4 includes the commercial smart gloves that support this capability, indicating the technology and sensors included. All the gloves identified as smart gloves support this capability, as it is the basis for supporting the other ones. Most gloves employ sensors from three categories: IMUs, bend (flex) sensors and strain (stretch) sensors. Other proposals in research have used magnetic sensing [31], capacitive sensors [32], or electromyography (EMG) [33] for gesture recognition. For a complete overview, we refer to the existing surveys [11,22]. In more detail, and regarding commercial smart gloves, these are the technologies involved:
  • IMU. This device can measure acceleration, rotational speed and orientation [27]. It is made up of several sensors: a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer. These sensors are relatively low cost and can have very high sampling rates. A main issue with IMUs is related to drift. The data collected from the sensors is integrated from a known starting configuration. As a result, errors accumulate, and even small variations can lead to large errors over time (a minimal sketch of this effect and a common mitigation is given after this list). Another major drawback of IMUs in the smart glove context is their rigidity and bulkiness compared to the size of human fingers. In any case, the use of IMUs is very common to estimate hand position and movement, mainly when the goal is related to motion capture, such as in the Cobra Glove, Exo Glove, Hi5, Nansense R2, Perception Neuron, Sensorial XR, Senso Glove DK3 and VR Free. A particular case is Rokoko, which uses an IMU without a magnetometer to ensure immunity from magnetic distortion. Indeed, vendors of smart gloves based on other technologies usually highlight that their gloves are immune to magnetic fields, in contrast to IMU-based gloves. Finally, in many cases, external IMUs can be attached to the gloves over the wrist to estimate the hand position and movement in space, as shown in the “External IMU” column of Table 4. Usually, commercial trackers provided by VR vendors, such as Oculus or HTC Vive, are attached. These devices help to make the gloves more modular, but they need to be calibrated in conjunction.
  • Bend (flex) sensors. These are piezoresistive elements that change their resistance as they are bent or flexed, creating variations in the transmitted electrical signal. Such variations can be measured and mapped to changes in joint angles. The ideal solution involves a linear behavior, where the change in voltage can be linearly related to the finger bend, as this facilitates the mapping between the signal and the joint angle (a minimal calibration sketch following this idea is also given after this list). Bend sensors should be placed in the gloves in such a way that they are exactly at the location of the joints of interest, such as the interphalangeal joints. In such a position, a one-to-one mapping between the joint bend and the sensor reading can provide an accurate measurement. It is particularly difficult to accurately measure joints involved in abduction and adduction movements [27]. Another problem for bend sensors is their short lifespan, as the continuous bending makes them break quite fast. In any case, bend sensors have been extensively applied in commercial smart gloves such as the 5DT, Capto Glove, Forte Data Glove, HandTutor, Manus Prime II, Rapael, VMG and VR Free.
  • Strain (stretch) sensors. Strain sensors provide a changing signal as they stretch. This effect can be obtained from resistance or capacitive elements. Capacitive sensors can be relatively smaller, facilitating the integration of a larger number in a reduced area, such as in the case of The StretchSense MoCap Pro.
  • Rotational sensors. There are some gloves that include mechanical rotational sensors, such as the Dexmo exoskeleton, or rotational encoders, such as the SenseGlove DK1. These sensors are able to convert the angular position of a shaft into a signal. They are quite bulky, but they can be embedded in these gloves because their exoskeleton structure facilitates it.
  • Hybrid approaches. There are several approaches that combine different technologies. The idea is to solve the issues present in some approaches and to take advantage of the strengths. For example, to mitigate the drift of IMU, the poor abduction tracking of bend sensors, or the interference of magnetic fields. Rokoko combines IMU with magnetic tracking. Manus Prime II combines bend sensors with IMU. SenseGlove Nova and VRFree combine IMU with a visual-based method.
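The drift issue mentioned in the IMU item above comes from integrating rate measurements over time. The following single-axis Python sketch, written under simplified assumptions (a constant gyroscope bias and an accelerometer-derived tilt used as an absolute reference), illustrates the effect and a generic complementary-filter mitigation; it is not the sensor fusion algorithm of any particular commercial glove.

# Minimal single-axis sketch of why IMU orientation drifts and how a
# complementary filter mitigates it.  Values and the 0.98 blending factor
# are illustrative; commercial gloves use more elaborate sensor fusion.

def integrate_gyro(angle_deg: float, gyro_deg_per_s: float, dt: float) -> float:
    """Pure integration of angular rate: small bias errors accumulate over time."""
    return angle_deg + gyro_deg_per_s * dt

def complementary_filter(angle_deg: float, gyro_deg_per_s: float,
                         accel_tilt_deg: float, dt: float,
                         alpha: float = 0.98) -> float:
    """Blend the integrated gyro angle with an absolute (drift-free but noisy)
    accelerometer tilt estimate to bound the accumulated error."""
    gyro_angle = integrate_gyro(angle_deg, gyro_deg_per_s, dt)
    return alpha * gyro_angle + (1.0 - alpha) * accel_tilt_deg

# Example: a constant 0.1 deg/s gyro bias drifts about 6 deg/min if only
# integrated, while the filtered estimate stays near the accelerometer reference.
angle_int, angle_filt = 0.0, 0.0
for _ in range(60 * 100):                       # one minute at 100 Hz
    angle_int = integrate_gyro(angle_int, 0.1, 0.01)
    angle_filt = complementary_filter(angle_filt, 0.1, 0.0, 0.01)
print(round(angle_int, 1), round(angle_filt, 1))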
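Similarly, the linear mapping mentioned in the bend (flex) sensor item can be sketched as a two-point calibration: a raw reading is recorded with the finger fully extended and fully flexed, and intermediate readings are linearly interpolated to a joint angle. The ADC values and the 0-90 degree range below are illustrative assumptions, not vendor data, and real sensors are rarely perfectly linear.

# Minimal sketch of a linear bend-sensor calibration, as described above.
# ADC readings and angle range are illustrative values, not vendor data.

def make_bend_calibration(raw_extended: float, raw_flexed: float,
                          angle_extended_deg: float = 0.0,
                          angle_flexed_deg: float = 90.0):
    """Return a function mapping a raw bend-sensor reading to a joint angle,
    assuming an approximately linear sensor response between the two
    calibration poses."""
    span = raw_flexed - raw_extended
    def raw_to_angle(raw: float) -> float:
        t = (raw - raw_extended) / span          # normalized 0..1
        t = min(max(t, 0.0), 1.0)                # clamp outside the calibrated range
        return angle_extended_deg + t * (angle_flexed_deg - angle_extended_deg)
    return raw_to_angle

# Example calibration: the ADC reads 310 with the finger straight, 742 fully bent.
pip_angle = make_bend_calibration(raw_extended=310, raw_flexed=742)
print(pip_angle(526))  # 45 degrees, halfway between the two calibration poses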
Some commercial gloves have not been included in Table 4 because we have not been able to get the information about their sensor technology. This is the case of Anika Rehap and Cynteract. Both of them are focused on supporting hand motor skills coordination and rehabilitation, and therefore the capture of specific DoF is not a main concern.
Based on the information provided by the vendors, the number of sensors and the technology, we have compiled Table 5, showing the DoF per hand for the commercial smart gloves. This table includes not just the global number of DoF, but also the precise DoF for the fingers and the hand according to the reference model described in Section 2. This table includes many annotations regarding the different ways in which DoF have been estimated, because the information provided by vendors is not completely clear in all cases.
A main observation needs to be noted regarding IMU-based smart gloves. In this case, the relationship between the sensor location and the DoF involved is not straightforward. This kind of sensor measures the global movement. Therefore, a reference to a fixed point needs to be established to estimate the actual DoF performed. In the case of bend or stretch sensors, the relationship between the movement and the DoF is clearer.
More information about the tracking and positioning capabilities of the smart gloves would be interesting, such as motion range, resolution and sampling/update rate. Nevertheless, many vendors do not publish this data, or such data is not provided in a standardized way. For example, in some cases, raw sensor values are provided, while in other cases a normalized value in the range 0–1 is provided. Similarly, some DoF are usually not provided as the result of a sensor measurement, but as an interpolation of related measurements.

6.2. Gloves for Kinesthetic Feedback

Actuator technologies for kinesthetic feedback can be classified into two modes [20]:
  • Passive actuation principle or impedance control. This involves the application of a resistance to the hand and fingers in accordance with their movement. An impedance glove has to detect the movement of the fingers (sensing of the motion) and to apply a resistance force to provide kinesthetic feedback. Therefore, these gloves only provide feedback when the user tries to move, but not when the user’s hand remains motionless. This technology is intrinsically safe as there is no chance of harm for the user, even in case of system failure. Some technologies used to provide passive kinesthetic feedback are: magnetorheological fluids (MRFs), brakes, clutches and springs, and pneumatic jamming.
  • Active actuation principle or admittance control. In this case, the smart glove applies force to the fingers to make them move. This technology can provide not only active motion, but also resistance force or torque. The advantage of the active solution is that it provides active control and can simulate active force/motion output at a high update rate, while its disadvantage is the potential risk of injuring the fingers in the event of a system failure. To mitigate this risk, most active gloves limit the maximum output force to about 10 Newtons. Some technologies used are: DC servo motors, hydraulic pumps or valves, pneumatic pumps or valves, dielectric elastomers, etc. A minimal sketch contrasting both actuation modes is given after this list.
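To make the two actuation modes above more concrete, the following minimal Python sketch contrasts them under illustrative assumptions: a passive impedance-style resistance computed from the user's own finger motion against a virtual surface, and an active admittance-style force command clamped to the roughly 10 N safety limit mentioned above. The stiffness and damping gains are arbitrary example values, not figures from any commercial glove.

# Minimal sketch of the two kinesthetic-feedback modes described above.
# Gains and limits are illustrative, not taken from any commercial glove.

MAX_FORCE_N = 10.0  # safety clamp mentioned in the text for active gloves

def passive_impedance_force(penetration_m: float, finger_velocity_m_s: float,
                            stiffness_n_m: float = 500.0,
                            damping_n_s_m: float = 2.0) -> float:
    """Impedance control: resist the user's own motion when the finger
    penetrates a virtual surface; zero force when there is no contact."""
    if penetration_m <= 0.0:
        return 0.0
    force = stiffness_n_m * penetration_m + damping_n_s_m * finger_velocity_m_s
    return min(force, MAX_FORCE_N)

def active_admittance_target(commanded_force_n: float) -> float:
    """Admittance control: the glove actively applies force to move the finger;
    output is clamped to bound the risk of injury in case of failure."""
    return max(-MAX_FORCE_N, min(commanded_force_n, MAX_FORCE_N))

# Example: 5 mm penetration into a virtual wall while closing at 2 cm/s.
print(passive_impedance_force(0.005, 0.02))   # 2.54 N of resistance
print(active_admittance_target(25.0))         # clamped to 10.0 N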
In the literature, most force-feedback gloves use the passive control principle [20]. Similarly, the commercial smart gloves providing kinesthetic feedback also adopt the passive mode, see Table 6. Notice that just three of all the selected gloves provide this feature. Two models have been recently discontinued: VR Gluv and TeslaSuit Gloves. The table shows the different technologies used and the actuated DoF. Despite the differences in technologies, a similar performance can be observed in terms of actuated DoF and force exerted. The discontinued VR Gluv provided 10 actuated DoF, with two actuated points per finger, on the distal and proximal phalanges.

6.3. Gloves for Tactile Feedback

A good number of smart gloves have focused on tactile sensations at the hands and fingers, see Table 7. The scientific literature describes many attempts to provide tactile feedback involving different features [18], such as perceived hardness (hard/soft), warmness (warm/cold), macro roughness (uneven/flat), fine roughness (rough/smooth), and friction (moist/dry, sticky/slippery). Several technologies have also been explored, mainly electric motors, but also microfluidic arrays or electrostatic attraction. In the case of the commercial solutions, vibration is the only technology in use. In particular, it is rather common to include linear resonant actuators (LRAs), similar to the vibrating motors used in game controllers and smartphones. In most cases, the sensations produced by the tactile actuators are not specified, but they can be programmed in accordance with the desired application.
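Since the tactile actuators are typically programmable rather than tied to fixed sensations, one simple way to think about them is as per-sample drive amplitudes sent to the actuator. The Python sketch below defines two illustrative envelopes (a short full-amplitude "click" and a sustained low-amplitude "buzz"); the sample rate, durations and amplitudes are assumptions for illustration only, and real gloves expose their own vendor-specific APIs for this.

# Minimal sketch of programmable vibrotactile patterns for an LRA, as
# discussed above.  The envelopes are illustrative; real gloves expose
# their own (vendor-specific) APIs for this purpose.

def click_pattern(duration_s: float = 0.03, rate_hz: int = 1000) -> list:
    """Short, full-amplitude burst, e.g., to signal a button click or collision."""
    n = int(duration_s * rate_hz)
    return [1.0] * n

def texture_buzz(duration_s: float = 0.5, amplitude: float = 0.3,
                 period_s: float = 0.05, rate_hz: int = 1000) -> list:
    """Low-amplitude on/off modulation, e.g., to suggest a rough texture."""
    samples = []
    period_samples = int(period_s * rate_hz)
    for i in range(int(duration_s * rate_hz)):
        on = (i % period_samples) < period_samples // 2
        samples.append(amplitude if on else 0.0)
    return samples

# Each list is a per-sample drive amplitude (0..1) that a glove runtime could
# forward to the actuator at the given rate.
print(len(click_pattern()), len(texture_buzz()))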
Table 7 gathers the nine out of the 24 commercial smart gloves that provide tactile feedback. As can be observed, they share many features. Most of them involve the use of LRAs on the fingertips, thumb and palm. Usually, they are programmable and, in some cases, they provide specific feedback, such as the detection of collisions, textures, button clicks, etc. The VR Free Haptics gloves have not been included because no information about their sensors and features was found.

6.4. Other Features

In addition to the main capabilities described in the previous sections, commercial smart gloves include some capabilities to facilitate the user interaction:
  • Pressure sensors. Several gloves include pressure sensors to measure the force exerted. The CaptoGlove includes this kind of sensor on the thumb’s fingertip, able to perceive pressures from 100 g to 10 kg, oriented towards detecting specific gestures, such as the pushing of a button. The Senso Glove also integrates pressure sensors to measure the grip pressure. The VMG 13, 30, 35 and PS gloves also include pressure sensors, one per finger, which can be used to emulate a mouse/keyboard or develop custom actions. The Cynteract also has a pressure sensor.
  • Screen interaction. Some gloves include a special fabric at the fingertips to facilitate interaction with touch screens, such as those of smartphones and tablets. The CaptoGlove also has this feature for the fingertips of the index finger and thumb.
  • Touch points. Some gloves include conductive points at certain locations of the fingers or hand that are activated when the user touches them. The Sensorial XR has conductive zones that enable users to trigger specific customized actions. The Peregrine, a previous version of the future Peregrine VR glove, includes 17 touch points: five points per finger (three on the pinky), each of which produces a keystroke when touched by the thumb tip. The MusicGlove includes six Nora-LX conductive metallized fabric leads (Ni/Cu plated plain weave fabric), located on all five fingertips and one on the proximal interphalangeal joint on the lateral aspect of the index finger. When the lead on the thumb touches any of the other five leads, an electrical connection is closed, which is then registered by the computer as an event. Notice that we discarded some devices as smart gloves because they only provide this feature (see Section 4).

7. Application Areas

The applications and projects in which the commercial smart gloves have been used, as well as the motivation for their use, give us an idea of the capacity for measurement or response that can be expected from them. For this reason, this section was conceived not only to provide a starting point regarding possible uses and applications, but also to learn more about the gloves in their study environments.
During our search for commercial smart gloves, we have come across a multitude of fields of application in which they are used, see Table 8. Many gloves provide a new range of applications in gaming, industry, surgery training, rehabilitation and education. Throughout this section, we will try to classify these fields of application, bearing in mind that, in many cases, these fields are not independent, but rather overlap, or more generic ones include more specific ones. For example, some gloves with haptic feedback are being proposed for hand and finger rehabilitation or surgery training, both fields belonging to the scope of medicine. Another example would be communication by means of gestures (motion capture) with a person with some disability during an emergency (medicine and health care). Taking this into account, we have classified the fields of application into six categories, which we will detail in the following subsections, together with the gloves most commonly used in each of them.

7.1. Medicine and Remote Health Care

This category encompasses all those gloves that are used in the field of medicine in general and, in particular, in issues directly related to remote health care. In this category, we find two main areas of application: remote manipulation and rehabilitation:
  • In the case of the remote manipulation of robot arms/hands, the main application is related to teaching the fundamental skills of robotic surgery to novice and experienced surgeons (usually through simulations) and to performing surgery on real patients.
  • In the field of rehabilitation, we can find two main types of applications. Firstly, those aimed at making a diagnosis of the functionality of the hand, that is, checking whether the movements of the hand when carrying out certain actions are adequate (for example, for patients who have suffered strokes). Secondly, we have applications dedicated to actively treating mobility problems, making users perform a series of exercises (usually through the development of specific video games for this purpose). In many cases, they are focused on the recovery of hand mobility after a stroke. In this field, one of the most outstanding smart gloves is Rapael. Some smart gloves have been specifically developed for rehabilitation purposes: Anika Rehap, CaptoGlove, Rapael, Handtutor and MusicGlove.

7.2. Motion Capture

Gloves whose main purpose is to capture movements belong to this category. Obviously, capturing the movement of the wearer’s hand is a requirement of virtually any application that uses the type of gloves discussed in the paper. However, in this category we will refer above all to two main areas of application: (i) motion capture that is carried out mainly for use in animations, digital avatars, etc.; and (ii) the specific analysis of hand gestures.
Motion capture has the field of entertainment as one of its main applications. This type of capture has been used profusely in recent years in the world of film and television, music concerts, as well as in video games, largely for the recreation of virtual characters. This capture of hand movement using gloves is usually accompanied, in certain cases, by other elements, such as full-body suits. Some smart gloves have been specifically developed for this purpose and, indeed, they are sold as part of a smart suit for whole-body tracking: Cobra Glove, Nansense, Perception Neuron and Rokoko.
Gestural analysis of hand movements is largely used for communication and sign language, which rely on hand poses that can be relatively more complex and involve close interaction of the fingers. These types of data capture can be applied in multiple areas, such as patient monitoring, virtual and augmented reality navigation and manipulation, home automation, robotics, vehicle interfaces, PC interfaces, and the translation of sign language lexicons, among others [95]. In particular, the use of smart gloves in the field of sign language has acquired special relevance in recent years [96], with 5DT being one of the most relevant smart gloves in this field.

7.3. Video Games

In this case, we will address exclusively those video games focused on the world of leisure, entertainment, or even sport. We will not take into account other types of video games in the style of serious games, such as those used to support rehabilitation tasks, simulation training, teaching, etc.
The field of video games is eminently transversal: as seen in the rest of the categories, it could be included in practically all of them (video games of one type or another are used in medicine, motion capture, simulators, manipulation of 3D objects, etc.). However, beyond that generic approach, which touches on so many fields of application, we wanted to highlight in this category its specific use as an independent field focused on the purely playful side of its application.

7.4. Simulation and Training

This category includes all those tasks oriented to carrying out some type of training or simulation that have a learning component, excluding those related to medicine (since they are dealt with in Section 7.1). Otherwise, they would be considered within the previous video game category.
In most cases, these tasks are related to training and learning in the use or simulation of different types of devices. We also consider here several industries, such as trucking, construction, mining, agriculture, aviation and many more.
In this category we also find gloves used in simulators for music learning, such as CaptoGlove.

7.5. Manipulation of 3D Objects (Both Real and Virtual)

Object manipulation relies on contact between the hand and the object. This contact can involve any or all of the fingers, ranging from the fingertip to the entire length of the finger (even involving the palm). For example, grasping is an important class of these manipulations.
This category considers gloves used for the manipulation of 3D objects in fully virtual environments, as well as objects found in the real world. In the latter case, most of the applications in which gloves are used are related to the remote manipulation of a robot arm/hand (and, as in the previous section, we do not take into account here those uses related to medicine). We also include in this category the gloves used both for design and for product testing.

7.6. XR Applications

This is a general category that accommodates those applications that do not have a clearly defined category, or which are not significant enough to constitute an independent category. Therefore, here we can find applications that, from a certain point of view, could have been assigned to any of the previous categories, but that we preferred not to categorize in such a specific way, or that could be assigned with the same weight to more than one category.
Examples of this type of application are: transmitting sensations of the virtual world with realism (such as textures or even raindrops on the hands); natural and accurate testing of pressures applied to and exerted by the hand; providing spatial guidance in 3D space; manipulation of multimodal data; virtual visits (e.g., to a zoo); interactivity in virtual worlds; etc.

8. Discussion

For a long time there has been great interest in the development of smart gloves. They are considered a natural means of human-computer interaction, particularly in XR environments, where user immersion and embodiment are given great importance. Such importance has attracted the interest of many companies, which have delivered numerous devices over the last years. Similarly, many researchers have attempted to take advantage of commercial devices to solve problems in multiple domains. Nevertheless, current technology does not seem mature enough to provide satisfactory results in all the fields considered. There are some smart gloves whose development is mature enough to be used in the fields of medicine, simulation, or motion capture. However, there is still a long way to go in other fields of application, such as video games or manipulation of 3D objects, or even in XR in general. Also, related to the fields of application, it is difficult to carry out a clear categorization. Except for some specific cases, most gloves, even when designed for a specific application, can actually be used for general purposes, and as a result the field of application becomes heterogeneous and difficult to define.
Regarding the possible applications, we consider that prices are still high for the general consumer market, and the market itself is rather volatile, with new companies being launched and others being discontinued. This has been especially noticeable during the last years, maybe fueled by the success of related products, such as smartwatches and HMDs, in the context of startup initiatives. On the one hand, some of the new companies described in this paper are still very active, such as Manus VR, SenseGlove, Neurodigital or Sensoryx, while others have already been discontinued, such as Plexus, Teslasuit or VRGluv. On the other hand, companies with a longer lifespan and a significant presence in the market and in the scientific literature have recently been discontinued, most remarkably Cyberglove Systems. In any case, from the number of products identified, the global and increasing interest in this type of device is clear, particularly from the view of potential applications.
The current smart glove market is also characterized by a variety of options and features that makes comparison among available products difficult. Firstly, a generally accepted name for this variety of devices does not exist. We opted for the term smart gloves, but other names are already used in the domain. Similarly, a variety of glove types exist, the most recognizable being exoskeletons. Secondly, there is no clear set of features to be supported. Clearly, hand and finger pose estimation and motion tracking should be provided, while kinesthetic and tactile feedback can be optional. Furthermore, other features such as pressure or touch capabilities should not be considered main capabilities, although they can be very useful for certain applications. In any case, it would be important to clarify these capabilities in the description of the products. Thirdly, regarding the main pose estimation and motion tracking capability, there is no reference model that enables comparison of the capabilities of the different products. As a clear example, some companies indicate that their gloves recognize more DoF than are available in the hand, at least according to the model described in Section 2. More commonly, it is not clear how many DoF are really supported because vendors do not provide all the needed information. Actually, different technologies do not provide the same level of performance. IMU-based solutions can be used to estimate pose and tracking, but, unlike bend or stretch sensors, they do not provide a direct measurement of specific DoF. Vision-based methods have not been analyzed in this survey, because they are not really part of the gloves and do not support tracking beyond the field of view. Nevertheless, some recent smart gloves also include this technology and some kind of sensor fusion, such as SenseGlove Nova or VRFree. As a result, it is very difficult to provide a clear comparison. In this review, we have struggled to analyze the DoF capabilities, but other features such as motion range, sensing accuracy and update rate would be needed in order to have a clear picture. Information about these features is particularly difficult to find for smart gloves and is usually heterogeneous in nature, as some manufacturers provide certain data while others do not, which makes comparison difficult. In any case, the features reported by vendors have been compiled in Table 9.
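To make this kind of comparison more systematic, the DoF reference model could be encoded in machine-readable form so that vendor claims can be checked against it automatically. The following Python sketch is a minimal illustration under our own simplifying assumptions (the joint grouping follows the 23-DoF decomposition used in Table 5, with E/F = extension/flexion, A/A = abduction/adduction and P/S = pronation/supination); it is not a vendor data format.

# Hand DoF reference model (simplified encoding of the model used in this review).
REFERENCE_DOF = {
    "thumb":  {"TM": ["E/F", "A/A"], "MCP": ["E/F"], "IP": ["E/F"]},
    "index":  {"MCP": ["E/F", "A/A"], "PIP": ["E/F"], "DIP": ["E/F"]},
    "middle": {"MCP": ["E/F", "A/A"], "PIP": ["E/F"], "DIP": ["E/F"]},
    "ring":   {"MCP": ["E/F", "A/A"], "PIP": ["E/F"], "DIP": ["E/F"]},
    "pinky":  {"MCP": ["E/F", "A/A"], "PIP": ["E/F"], "DIP": ["E/F"]},
    "wrist":  {"radiocarpal": ["E/F", "A/A", "P/S"]},
}

def total_dof(model=REFERENCE_DOF):
    """Count all DoF in the reference model (23 for the model above)."""
    return sum(len(moves) for joints in model.values() for moves in joints.values())

def check_vendor_claim(claimed_dof, model=REFERENCE_DOF):
    """Flag vendor DoF claims that exceed what the reference hand model can express."""
    maximum = total_dof(model)
    if claimed_dof > maximum:
        return f"claim of {claimed_dof} DoF exceeds the {maximum} DoF of the reference model"
    return f"claim of {claimed_dof} DoF fits within the {maximum} DoF of the reference model"

print(total_dof())             # -> 23
print(check_vendor_claim(28))  # a claim like the one discussed in Table 5, note 6

Such an encoding could be extended with per-joint motion ranges, accuracy and update rates, providing a common reference against which products could be described and compared.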
Much of this variability can be related to the intended application. IMU-based gloves are usually found in motion capture initiatives, where the key goal is to recognize the general position. Meanwhile, remote manipulation and video games are more interested in capturing precise hand and finger movements to support natural interaction and provide a fully immersive and embodied experience. On the contrary, medical applications are usually related to hand rehabilitation, where precise tracking of the movements is not so important. Some other products cannot be clearly framed in a particular application area, but are offered as general-purpose solutions, such as Hi5 VR or Manus. This review shows the existence of different approaches, and it is not clear whether, in the future, a general-purpose solution will be available or application-specific products will be adopted.
Finally, there are a few other issues that have not been analyzed in this review but that are important when selecting a smart glove. One issue is related to calibration, particularly in the case of using off-the-shelf IMUs. Smart gloves have to be fitted to the human hand and fingers, which have a great variety of sizes and shapes depending on the person. Therefore, companies usually provide different glove sizes. Nevertheless, because the sensor position along the hand and fingers is very important in order to detect the movements, a calibration process is needed before the gloves can be used. Usually, companies include specific procedures and special software to support this in a more or less autonomous way. As an interesting example, Exo gloves offer a hand scanner that facilitates the measurement of the human hand and speeds up the calibration process. Related to the previous issue, the reliability and lifespan of sensors is another main concern, particularly in the case of bend sensors, as they typically degrade with continuous operation. Another issue is related to interoperability with XR platforms, such as programming frameworks (e.g., Unreal, Unity) and platforms (e.g., Oculus Rift, HTC Vive, Windows Mixed Reality). In this case, there exists a great variability of options that change frequently over short periods of time. At this point, the development of open interoperable frameworks that facilitate the integration of third parties, particularly researchers, would be very interesting.
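As a simple illustration of what such a calibration step typically involves, the Python sketch below records raw bend-sensor readings while the user holds an open hand and then a closed fist, and afterwards maps live readings into a normalized 0 to 1 range per sensor. This is a generic, hypothetical example (sensor count and raw value ranges are assumptions), not the procedure of any particular vendor; IMU-based gloves usually require an additional posture-based orientation alignment.

class FlexCalibration:
    """Per-sensor min/max calibration for resistive bend sensors (illustrative only)."""

    def __init__(self, n_sensors=5):
        self.min_raw = [float("inf")] * n_sensors
        self.max_raw = [float("-inf")] * n_sensors

    def record(self, raw):
        """Feed raw readings captured while the user holds an open hand and then a fist."""
        for i, value in enumerate(raw):
            self.min_raw[i] = min(self.min_raw[i], value)
            self.max_raw[i] = max(self.max_raw[i], value)

    def normalize(self, raw):
        """Map a raw reading to 0 (fully extended) .. 1 (fully flexed) per sensor."""
        result = []
        for i, value in enumerate(raw):
            span = self.max_raw[i] - self.min_raw[i]
            scaled = 0.0 if span == 0 else (value - self.min_raw[i]) / span
            result.append(min(1.0, max(0.0, scaled)))  # clamp to the calibrated range
        return result

# Usage: capture a few frames in each reference posture, then normalize live data.
cal = FlexCalibration(n_sensors=5)
cal.record([512, 498, 505, 520, 509])   # open hand (hypothetical raw ADC values)
cal.record([830, 910, 905, 880, 870])   # closed fist
print(cal.normalize([670, 700, 705, 700, 690]))  # roughly half-flexed fingers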

9. Conclusions

This review, focused on active commercial smart gloves, shows that there is great interest in these devices. There is a large number of running initiatives and, most importantly, many pieces of research are being developed involving commercial products, from sign language gesture detection to XR interaction. There are important differences with respect to previous years, when adoption was not so clear, since previously published reviews about smart gloves had not identified such a large number of solutions and applications in the commercial sector. We conclude that there is a recent trend demonstrating improved technological performance, particularly that current devices are truly portable and autonomous, and that the maturity of the solutions attracts real interest from users and researchers.
In any case, we would like to note that this review covers just a part of all the research in this field: active commercial smart gloves. Many research prototypes are also under development, involving the use of new types of sensors or materials. Similarly, vision-based methods have not been considered in this paper, but they are also being explored to recognize hand and finger position and tracking. Despite some problems without a clear solution, particularly occlusion, they also offer good performance in certain situations.
The first contribution of this paper is the precise description of the human hand and finger DoF and the subsequent analysis of commercial smart gloves. This provides a clear reference to compare the solutions available in the market. In addition, it also helps to understand the features and limitations of the various technologies. Another contribution is the review itself. To the best of our knowledge, this is the first time a review methodology, such as PRISMA, has been applied to perform a review about commercial smart gloves. This will facilitate the reproduction of the study in the future and the comparison with the current situation. From the review, we have identified two new glove types, strips of fabric and open fingertips, in addition to the generally recognized ones, exoskeleton and fabric. Similarly, smart gloves have been classified in accordance with three main capabilities: hand and finger pose estimation and motion tracking, kinesthetic feedback and tactile feedback. The review of the technologies shows a predominance of IMU and resistive bend sensors, but with many variations and a trend towards the combination of technologies and sensor fusion. In addition, other related features are usually included: pressure sensors, screen interaction and touch points. All this will facilitate the comparison among smart gloves.
Finally, from our point of view, based on the depicted situation, it is very important to define frameworks that allow the different solutions available to be checked and compared against clear references. The identification of DoF in commercial smart gloves is an example of the difficulties involved, not only because of the partial and incomplete information provided by the vendors, but also because of the differences among technologies. In this context, some papers have been published comparing hand pose recognition for specific gloves [26,34]. In any case, other features beyond DoF should be considered, such as indicators to precisely compare performance, as well as interoperability and integration of solutions, enabling users to move from the solutions of one vendor to another. There is a need for standardization and open-source initiatives that would contribute to a better development of this market, mainly for the development of final applications and for application in research.

Author Contributions

Conceptualization, M.C.-R., I.O.-G.; investigation, M.C.-R., I.O.-G., F.A.M.-F., M.L.-N.; writing—original draft preparation, M.C.-R., F.A.M.-F.; writing—review and editing, M.C.-R., I.O.-G., F.A.M.-F., M.L.-N.; visualization, I.O.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

We acknowledge the atlanTTic research center for administrative and technical support given.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Slater, M.; Lotto, B.; Arnold, M.M.; Sanchez-Vives, M.V. How we experience immersive virtual environments: The concept of presence and its measurement. Anu. Psicol. 2009, 40, 193–210. [Google Scholar]
  2. Kim, M.; Jeon, C.; Kim, J. A study on immersion and presence of a portable hand haptic system for immersive virtual reality. Sensors 2017, 17, 1141. [Google Scholar] [CrossRef] [Green Version]
  3. Blanke, O.; Metzinger, T. Full-body illusions and minimal phenomenal selfhood. Trends Cogn. Sci. 2009, 13, 7–13. [Google Scholar] [CrossRef]
  4. Valkov, D.; Linsen, L. Vibro-tactile feedback for real-world awareness in immersive virtual environments. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; IEEE: New York, NY, USA, 2019; pp. 340–349. [Google Scholar] [CrossRef]
  5. Caserman, P.; Garcia-Agundez, A.; Konrad, R.; Göbel, S.; Steinmetz, R. Real-time body tracking in virtual reality using a Vive tracker. Virtual Real. 2019, 23, 155–168. [Google Scholar] [CrossRef]
  6. Bowman, D.A.; McMahan, R.P.; Ragan, E.D. Questioning naturalism in 3D user interfaces. Commun. ACM 2012, 55, 78–88. [Google Scholar] [CrossRef]
  7. Johnson-Glenberg, M.C. Immersive VR and education: Embodied design principles that include gesture and hand controls. Front. Robot. AI 2018, 5, 81. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  8. Bohil, C.J.; Alicea, B.; Biocca, F.A. Virtual reality in neuroscience research and therapy. Nat. Rev. Neurosci. 2011, 12, 752–762. [Google Scholar] [CrossRef] [PubMed]
  9. Steuer, J. Defining virtual reality: Dimensions determining telepresence. J. Commun. 1992, 42, 73–93. [Google Scholar] [CrossRef]
  10. Sheridan, T.B. Musings on telepresence and virtual presence. Presence Teleoper. Virtual Environ. 1992, 1, 120–126. [Google Scholar] [CrossRef]
  11. Dipietro, L.; Sabatini, A.M.; Dario, P. A survey of glove-based systems and their applications. IEEE Trans. Syst. Man Cybern. Part C-Appl. Rev. 2008, 38, 461–482. [Google Scholar] [CrossRef]
  12. Zimmerman, T.G.; VPL Research Inc. Optical Flex Sensor. U.S. Patent 4,542,291, 17 September 1985. [Google Scholar]
  13. Zimmerman, T.G.; Lanier, J.Z.; VPL Research Inc. Computer Data Entry and Manipulation Apparatus and Method. U.S. Patent 4,988,981, 29 January 1991. [Google Scholar]
  14. Mansor, N.N.; Jamaluddin, M.H.; Shukor, A.Z. Concept and application of virtual reality haptic technology: A review. J. Theor. Appl. Inf. Technol. 2017, 95, 3320–3326. [Google Scholar]
  15. Zhao, Y.; Bennett, C.L.; Benko, H.; Cutrell, E.; Holz, C.; Morris, M.R.; Sinclair, M. Enabling people with visual impairments to navigate virtual reality with a haptic and auditory cane simulation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–14. [Google Scholar] [CrossRef]
  16. Lee, J.; Sinclair, M.; Gonzalez-Franco, M.; Ofek, E.; Holz, C. TORC: A virtual reality controller for in-hand high-dexterity finger interaction. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–13. [Google Scholar] [CrossRef]
  17. Sturman, D.J.; Zeltzer, D. A survey of glove-based input. IEEE Comp. Graphl. Appl. 1994, 14, 30–39. [Google Scholar] [CrossRef]
  18. Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE Trans. Haptics 2017, 10, 580–600. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  19. Perret, J.; Vander Poorten, E. Touching virtual reality: A review of haptic gloves. In Proceedings of the ACTUATOR 2018: 16th International Conference on New Actuators, Bremen, Germany, 25–27 June 2018; VDE Verlag GmbH: Berlin, Germany, 2018; pp. 1–5. [Google Scholar]
  20. Wang, D.; Song, M.; Naqash, A.; Zheng, Y.; Xu, W.; Zhang, Y. Toward whole-hand kinesthetic feedback: A survey of force feedback gloves. IEEE Trans. Haptics 2019, 12, 189–204. [Google Scholar] [CrossRef] [PubMed]
  21. Wang, D.; Guo, Y.; Liu, S.; Zhang, Y.; Xu, W.; Xiao, J. Haptic display for virtual reality: Progress and challenges. Virtual Real. Intell. Hard. 2019, 1, 136–162. [Google Scholar] [CrossRef] [Green Version]
  22. Rashid, A.; Hasan, O. Wearable technologies for hand joints monitoring for rehabilitation: A survey. Microelectr. J. 2019, 88, 173–183. [Google Scholar] [CrossRef]
  23. Chen, W.; Yu, C.; Tu, C.; Lyu, Z.; Tang, J.; Ou, S.; Fu, Y.; Xue, Z. A Survey on Hand Pose Estimation with Wearable Sensors and Computer-Vision-Based Methods. Sensors 2020, 20, 1074. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Yoon, Y.; Moon, D.; Chin, S. Fine Tactile Representation of Materials for Virtual Reality. J. Sens. 2020. [Google Scholar] [CrossRef] [Green Version]
  25. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; Prisma Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Glauser, O.; Wu, S.; Panozzo, D.; Hilliges, O.; Sorkine-Hornung, O. Interactive hand pose estimation using a stretch-sensing soft glove. ACM Trans. Graph. 2019, 38, 1–15. [Google Scholar] [CrossRef] [Green Version]
  27. Jörg, S.; Ye, Y.; Mueller, F.; Neff, M.; Zordan, V. Virtual hands in VR: Motion capture, synthesis, and perception. In SIGGRAPH Asia 2019 Courses; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–32. [Google Scholar] [CrossRef]
  28. Chen, D.K.; Chossat, J.B.; Shull, P.B. Haptivec: Presenting haptic feedback vectors in handheld controllers using embedded tactile pin arrays. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–11. [Google Scholar] [CrossRef]
  29. Dzikri, A.; Kurniawan, D.E. Hand gesture recognition for game 3D object using the leap motion controller with backpropagation method. In Proceedings of the 2018 International Conference on Applied Engineering (ICAE), Batam, Indonesia, 3–4 October 2018; pp. 1–5. [Google Scholar] [CrossRef]
  30. Rijpkema, H.; Girard, M. Computer animation of knowledge-based human grasping. Acm Siggraph Comput. Graph. 1991, 25, 339–348. [Google Scholar] [CrossRef]
  31. Chen, K.Y.; Patel, S.N.; Keller, S. Finexus: Tracking precise motions of multiple fingertips using magnetic sensing. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 1504–1514. [Google Scholar] [CrossRef]
  32. Truong, H.; Zhang, S.; Muncuk, U.; Nguyen, P.; Bui., N.; Nguyen, A.; Lv, Q.; Chowdhury, K.; Dinh, T.; Vu, T. Capband Battery-free successive capacitance sensing wristband for hand gesture recognition. In Proceedings of the 16th ACM Conference on Embedded Networked Sensor Systems, Shenzhen, China, 4 November 2018; pp. 54–67. [Google Scholar] [CrossRef]
  33. Saponas, T.S.; Tan, D.S.; Morris, D.; Balakrishnan, R.; Turner, J.; Landay, J.A. Enabling always-available input with muscle-computer interfaces. In Proceedings of the 22nd Annual ACM symposium on User interface software and technology, Victoria, BC, Canada, 4–7 October 2009; pp. 167–176. [Google Scholar] [CrossRef]
  34. Mizera, C.; Delrieu, T.; Weistroffer, V.; Andriot, C.; Decatoire, A.; Gazeau, J.P. Evaluation of Hand-Tracking Systems in Teleoperation and Virtual Dexterous Manipulation. IEEE Sens. J. 2020, 20, 1642–1655. [Google Scholar] [CrossRef]
  35. Arkenbout, E.A.; de Winter, J.C.; Breedveld, P. Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements. Sensors 2015, 15, 31644–31671. [Google Scholar] [CrossRef] [PubMed]
  36. Samraj, A.; Kumarasamy, R.; Rajendran, K.; Selvaraj, K. High-speed gesture modelling through boundary analysis of active signals from wearable data glove. Int. J. Grid Util. Comput. 2019, 10, 29–35. [Google Scholar] [CrossRef]
  37. Brand, J.; Piccirelli, M.; Hepp-Reymond, M.C.; Eng, K.; Michels, L. Brain Activation during Visually Guided Finger Movements. Front. Hum. Neurosci. 2020, 14. [Google Scholar] [CrossRef] [PubMed]
  38. Oliveira, T.; Escudeiro, N.; Escudeiro, P.; Rocha, E.; Barbosa, F.M. The VirtualSign Channel for the Communication Between Deaf and Hearing Users. IEEE Rev. Iberoam. Tecnol. Del Aprendiz. 2019, 14, 188–195. [Google Scholar] [CrossRef]
  39. Yang, X.; Zhou, Y.; Liu, H. Wearable Ultrasound-based Decoding of Simultaneous Wrist/Hand Kinematics. IEEE Trans. Ind. Electron. 2020. [Google Scholar] [CrossRef]
  40. Gao, Q.; Jiang, S.; Shull, P.B. Simultaneous Hand Gesture Classification and Finger Angle Estimation via a Novel Dual-Output Deep Learning Model. Sensors 2020, 20, 2972. [Google Scholar] [CrossRef] [PubMed]
  41. Saquib, N.; Rahman, A. Application of Machine Learning Techniques for Real-Time Sign Language Detection Using Wearable Sensors. In Proceedings of the 11th ACM Multimedia Systems Conference (MMSys ′20), Istanbul, Turkey, 8–11 June 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 178–189. [Google Scholar] [CrossRef]
  42. Cao, Y.; ShangGuan, W.; Shang, X.; Qiu, W.; Du, Y. Dynamical Interaction based Multi-objects Operation Simulation for Hub Airport APM System. In Proceedings of the IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 900–905. [Google Scholar] [CrossRef]
  43. Inoue, Y.; Kato, F.; Tachi, S. Finger Motion Measurement System for Telexistence Hand Manipulation. In Proceedings of the 2019 IEEE International Symposium on Measurement and Control in Robotics (ISMCR), Houston, TX, USA, 19–21 September 2019; pp. C2-1-1–C2-1-4. [Google Scholar] [CrossRef]
  44. Gu, X.; Zhang, Y.; Sun, W.; Bian, Y.; Zhou, D.; Kristensson, P.O. Dexmo: An Inexpensive and Lightweight Mechanical Exoskeleton for Motion Capture and Force Feedback in VR. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; Association for Computing Machinery: New York, NY, USA, 2016. [Google Scholar] [CrossRef]
  45. Steed, A.; Friston, S.; Pawar, V.; Swapp, D. Docking Haptics: Dynamic Combinations of Grounded and Worn Devices. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 22–26 March 2020; pp. 626–627. [Google Scholar] [CrossRef]
  46. Malik, A.; Lhachemi, H.; Shorten, R. I-nteract: A Cyber-Physical System for Real-Time Interaction with Physical and Virtual Objects Using Mixed Reality Technologies for Additive Manufacturing. IEEE Access 2020, 8, 98761–98774. [Google Scholar] [CrossRef]
  47. Hsu, Y.H. Exploring the Effect of Using Vibrate-Type Haptic Glove in the VR Industrial Training Task; KTH, School of Electrical Engineering and Computer Science: Stockholm, Sweden, 2020; Volume 431. [Google Scholar]
  48. Baritz, M.I. Impact of Effort Degree Developed in Fingers-Hand-arm Assembly, on the Hand Dexterity; Case Study; Bulletin of the Transylvania, University of Brasov: Brasov, Romania, 2016; Volume 9. [Google Scholar]
  49. Jaskiewicz, F.; Kowalewski, D.; Starosta, K.; Cierniak, M.; Timler, D. Chest compressions quality during sudden cardiac arrest scenario performed in virtual reality: A crossover study in a training environment. Medicine 2020, 99. [Google Scholar] [CrossRef]
  50. EL-Qirem, F.; Malak, M.Z.; Bani Salameh, A. Virtual Reality (VR) in Nursing Education: Jordan Case Study. Adv. Intell. Syst. Comput. 2020, 1205. [Google Scholar] [CrossRef]
  51. Xiao, S.; Ye, X.; Guo, Y.; Gao, B.; Long, J. Transfer of Coordination Skill to the Unpracticed Hand in Immersive Environments. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 258–265. [Google Scholar] [CrossRef]
  52. Jahn, K.; Kordyaka, B.; Ressing, C.; Roeding, K.; Niehaves, B. Designing Self-presence in Immersive Virtual Reality to Improve Cognitive Performance—A Research Proposal. In Information Systems and Neuroscience; Lecture Notes in Information Systems and Organization; Davis, F., Riedl, R., vom Brocke, J., Leger, P.M., Randolph, A., Fischer, T., Eds.; Springer: Cham, Switzerland, 2020; Volume 32. [Google Scholar] [CrossRef]
  53. Ricca, A.; Chellali, A.; Otmane, S. Influence of hand visualization on tool-based motor skills training in an immersive VR simulator. In Proceedings of the 19th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2020), Recife, Brazil, 9–13 November 2020. [Google Scholar]
  54. Trinon, H. Immersive Technologies for Virtual Reality—Case Study: Flight Simulator for Pilot Training. Master’s Thesis, University of Liege, Liege, Belgium, 2019. Available online: https://matheo.uliege.be/handle/2268.2/6443 (accessed on 19 January 2021).
  55. Englmeier, D.; Schönewald, I.; Butz, A.; Höllerer, T. Feel the Globe: Enhancing the Perception of Immersive Spherical Visualizations with Tangible Proxies. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1693–1698. [Google Scholar] [CrossRef]
  56. Englmeier, D.; O’Hagan, J.; Zhang, M.; Alt, F.; Butz, A.; Höllerer, T.; Williamson, J. TangibleSphere—Interaction Techniques for Physical and Virtual Spherical Displays. In Proceedings of the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences, Shaping Society (NordiCHI ′20), Tallinn, Estonia, 25–29 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–11. [Google Scholar] [CrossRef]
  57. Meattini, R.; Biagiotti, L.; Palli, G.; Melchiorri, C. Grasp-Oriented Myoelectric Interfaces for Robotic Hands: A Minimal-Training Synergy-Based Framework for Intent Detection, Control and Perception. In Human-Friendly Robotics 2019 (HFR 2019); Springer Proceedings in Advanced Robotics; Ferraguti, F., Villani, V., Sabattini, L., Bonfè, M., Eds.; Springer: Cham, Switzerland, 2019; Volume 12. [Google Scholar] [CrossRef]
  58. Tseng, W.J.; Wang, L.Y.; Chan, L. FaceWidgets: Exploring Tangible Interaction on Face with Head-Mounted Displays. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST‘19), New Orleans, LA, USA, 20–23 October 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 417–427. [Google Scholar] [CrossRef]
  59. Freire, R.; Glowacki, B.R.; Williams, R.R.; Wonnacott, M.; Jamieson-Binnie, A.; Glowacki, D.R. OMG-VR: Open-Source Mudra Gloves for Manipulating Molecular Simulations in VR. 2020. Available online: https://arxiv.org/abs/1901.03532 (accessed on 20 January 2021).
  60. Kühn, V.; Abrami, G.; Mehler, A. WikNectVR: A Gesture-Based Approach for Interacting in Virtual Reality Based on WikNect and Gestural Writing. Lect. Notes Comput. Sci. 2020, 12190. [Google Scholar] [CrossRef]
  61. Rahmani Hanzaki, M. Surgical Training Using Proxy Haptics; A Pilot Study. Master’s Thesis, University of Alberta Libraries, Edmonton, AB, Canada, 2020. Available online: https://doi.org/10.7939/r3-6047-hz13 (accessed on 19 January 2021).
  62. Lacey, G.; Ridgway, P.; Bhattacharya, J. Measuring Surgical Knot Tying with 3D Vision and VR Gloves. 2019. Available online: https://www.semanticscholar.org/paper/Measuring-surgical-knot-tying-with-3D-vision-and-VR-Lacey-Ridgway/012bf54cc7728bd4cf96bb0892e6e4e1c32f9bb9?p2df (accessed on 20 January 2021).
  63. Balzano, W.; Minieri, M.; Stranieri, S. ManDri: A New Proposal of Manus VR Facility Integration in Everyday Car Driving. Adv. Intell. Syst. Comput. 2019, 927. [Google Scholar] [CrossRef]
  64. Black, J.M.; Bradshaw, J.K.; Cokas, C.A.; Ham, K.H.; McNair, E.D.; Rooney, B.T.; Swartz, J.M. ESPN VR Batting Cage. In SIGGRAPH ′20: ACM SIGGRAPH 2020 Immersive Pavilion; Article No. 16; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–2. [Google Scholar] [CrossRef]
  65. Olbrich, M.; Graf, H.; Keil, J.; Gad, R.; Bamfaste, S.; Nicolini, F. Virtual Reality Based Space Operations—A Study of ESA’s Potential for VR Based Training and Simulation. Lect. Notes Comput. Sci. 2018, 10909. [Google Scholar] [CrossRef]
  66. Gregory, J.M.; Reardon, C.; Lee, K.; White, G.; Ng, K.; Sims, C. Enabling Intuitive Human-Robot Teaming Using Augmented Reality and Gesture Control. In Proceedings of the Artificial Intelligence for Human-Robot Interaction AAAI Symposium Series (AI-HRI 2019), Arlington, Virginia, USA, 7–9 November 2019. [Google Scholar]
  67. Allspaw, J.; Heinold, L.; Yanco, H.A. Design of Virtual Reality for Humanoid Robots with Inspiration from Video Games. Lect. Notes Comput. Sci. 2019, 11575. [Google Scholar] [CrossRef]
  68. Capelle, E.; Benson, W.N.; Anderson, Z.; Weinberg, J.B.; Gorlewicz, J.L. Design and Implementation of a Haptic Measurement Glove to CreateRealistic Human-Telerobot Interactions. In Proceedings of the International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020. [Google Scholar]
  69. Fermoselle, L.; Gunkel, S.; ter Haar, F.; Dijkstra-Soudarissanane, S.; Toet, A.; Niamut, O.; van der Stap, N. Let’s Get in Touch! Adding Haptics to Social VR. In Proceedings of the ACM International Conference on Interactive Media Experiences (IMX ‘20), Barcelona, Spain, 17–19 June 2020; pp. 174–179. [Google Scholar] [CrossRef]
  70. Parger, M.; Tang, C.; Xu, Y.; Twigg, C.; Tao, L.; Li, Y.; Wang, R.; Steinberger, M. UNOC: Understanding Occlusion for Embodied Presence in Virtual Reality. 2020. Available online: https://arxiv.org/abs/2012.03680 (accessed on 20 January 2021).
  71. Crisan, S. A Novel Perspective on Hand Vein Patterns for Biometric Recognition: Problems, Challenges, and Implementations. In Biometric Security and Privacy; (Signal Processing for Security Technologies); Jiang, R., Al-maadeed, S., Bouridane, A., Crookes, P., Beghdadi, A., Eds.; Springer: Cham, Switzerland; New York, NY, USA, 2017. [Google Scholar] [CrossRef]
  72. Wei, X.; Wan, X.; Huang, S.; Sun, W. The Application of Motion Capture and 3D Skeleton Modeling in Virtual Fighting. Lect. Notes Comput. Sci. 2017, 10582. [Google Scholar] [CrossRef]
  73. Kang, M.G.; Yun, S.J.; Lee, S.Y.; Oh, B.M.; Lee, H.H.; Lee, S.U.; Seo, H.G. Effects of Upper-Extremity Rehabilitation Using Smart Glove in Patients with Subacute Stroke: Results of a Prematurely Terminated Multicenter Randomized Controlled Trial. Front. Neurol. 2020, 11. [Google Scholar] [CrossRef]
  74. Lee, H.J.; Chang, W.H.; Lee, A.; Kim, H.G.; Ko, S.H.; Seong, H.Y.; Shin, Y.I.; Kim, Y.H. A Smart Glove Digital System Promotes Restoration of Upper Limb Motor Function and Enhances Cortical Neuroplastic Changes in Subacute Stroke Patients: A Randomized Controlled Trial. 2019. Available online: https://doi.org/10.21203/rs.2.15307/v1 (accessed on 20 January 2021).
  75. Kim, Y.H.; Park, S.Y.; Jung, J.H. Effect of ‘RAPAEL Smart Glove’s on Cognitive Function and Activities of Daily Living in Mild Cognitive Impairment. J. Soc. Occup. Aged Dement. 2018, 12, 75–85. [Google Scholar] [CrossRef]
  76. Kim, K. Effect of Virtual Reality Rehabilitation Program with RAPAEL Smart Glove on Stroke Patient’s Upper Extremity Functions and Activities of Daily Living. J. Korean Soc. Intergr. Med. 2019, 7, 69–76. [Google Scholar] [CrossRef]
  77. Kim, H.; Lee, A.; Shin, Y.I.; Chang, W.H.; Koo, K.H.; Seong, H.; Kim, Y.H. Effects of digital smart glove system on motor recovery of upper extremity in subacute stroke patient. Ann. Phys. Rehabil. Med. 2018, 61, e28. [Google Scholar] [CrossRef]
  78. Jung, H.; Kim, H.; Jeong, J.; Jeon, B.; Ryu, T.; Kim, Y. Feasibility of using the RAPAEL Smart Glove in upper limb physical therapy for patients after stroke: A randomized controlled trial. In Proceedings of the 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Seogwipo, Korea, 11–15 July 2017; pp. 3856–3859. [Google Scholar] [CrossRef]
  79. Lee, H.S.; Lim, J.H.; Jeon, B.H.; Song, C.S. Non-immersive Virtual Reality Rehabilitation Applied to a Task-oriented Approach for Stroke Patients: A Randomized Controlled Trial. Restor. Neurol. Neurosci. 2020, 38, 165–172. [Google Scholar] [CrossRef] [PubMed]
  80. Dyck, F.; Stöcklein, J.; Eckertz, D.; Dumitrescu, R. Mixed Mock-up—Development of an Interactive Augmented Reality System for Assembly Planning. Lect. Notes Comput. Sci. 2020, 12190. [Google Scholar] [CrossRef]
  81. Kuling, I.A.; Gijsbertse, K.; Krom, B.N.; van Teeffelen, K.J.; van Erp, J.B.F. Haptic Feedback in a Teleoperated Box & Blocks Task. Lect. Notes Comput. Sci. 2020, 12272. [Google Scholar] [CrossRef]
  82. Henny, J. Designing a Haptic Palm Strap for the SenseGlove. Master’s Thesis, Delft University of Technology, Delft, The Netherlands, 2019. Available online: http://resolver.tudelft.nl/uuid:e5e74dfd-0b78-44dc-bf23-d23100fc460f (accessed on 20 January 2021).
  83. Shor, D.; Zaaijer, B.; Ahsmann, L.; Weetzel, M.; Immerzeel, S.; Eikelenboom, D.; Hartcher-O’Brien, J.; Aschenbrenner, D. Designing Haptics: Improving a Virtual Reality Glove with Respect to Realism, Performance, and Comfort. Int. J. Autom. Technol. 2019, 13, 453–463. [Google Scholar] [CrossRef]
  84. Zaaijer, B. Designing a Haptic Interface from a Vision-centred Approach. Master’s Thesis, Delft University of Technology, Delft, The Netherlands, 17 July 2020. Available online: http://resolver.tudelft.nl/uuid:f7c60e77-6c3a-4d33-8192-12ceeff43c48 (accessed on 20 January 2021).
  85. Kreimeier, J.; Karg, P.; Götzelmann, T. Tabletop virtual haptics: Feasibility study for the exploration of 2.5D virtual objects by blind and visually impaired with consumer data gloves. In Proceedings of the 13th ACM International Conference on Pervasive Technologies Related to Assistive Environments (PETRA ‘20), New York, NY, USA, 30 June–3 July 2020; pp. 1–10. [Google Scholar] [CrossRef]
  86. Soccini, A.M.; Grangetto, M.; Inamura, T.; Shimada, S. Virtual Hand Illusion: The Alien Finger Motion Experiment. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1165–1166. [Google Scholar] [CrossRef] [Green Version]
  87. Nainggolan, F.; Siregar, B.; Fahmi, F. Design of Interactive Virtual Reality for Erection Steel Construction Simulator System Using Senso Gloves. J. Phys. Conf. Ser. 2019, 1542, 012019. [Google Scholar] [CrossRef]
  88. Fahmi, F.; Nainggolan, F.; Siregar, B.; Soeharwinto; Zarlis, M. User experience study on crane operator erection simulator using senso glove in a virtual reality environment. In Proceedings of the 2020 International Conference on Information Technology and Engineering Management (ITEM 2020), Batam, Indonesia, 2–4 April 2020. [Google Scholar]
  89. Chan, J.C.P.; Irimia, A.S.; Ho, E.S.L. Emotion Transfer for 3D Hand Motion using StarGAN. In Proceedings of the EG UK Computer Graphics & Visual Computing 2020 (CGVC 2020), London, UK, 10–11 September 2020. [Google Scholar]
  90. Dutta, D.; Modak, S.; Kumar, A.; Roychowdhury, J.; Mandal, S. Bayesian network aided grasp and grip efficiency estimation using a smart data glove for post-stroke diagnosis. Biocybern. Biomed. Eng. 2017, 37, 44–58. [Google Scholar] [CrossRef]
  91. Agada, R.; Akinlaja, T.; Yan, J. Multi Sensor Performance Driven Data Fusion for Sign Language Synthesis. In Proceedings of the 2018 International Conference on Computational Science and Computational Intelligence (CSCI), Las Vegas, NV, USA, 12–14 December 2018; pp. 458–461. [Google Scholar] [CrossRef]
  92. Simmons, S.; Clark, K.; Tavakkoli, A.; Loffredo, D. Sensory Fusion and Intent Recognition for Accurate Gesture Recognition in Virtual Environments. Lect. Notes Comput. Sci. 2018, 11241. [Google Scholar] [CrossRef]
  93. Masoud, S.; Chowdhury, B.; Son, Y.J.; Kubota, C.; Tronstad, R. A dynamic modelling framework for human hand gesture task recognition. In Proceedings of the 2018 Institute of Industrial and Systems Engineers Annual Conference and Expo, IISE 2018, Orlando, FL, USA, 19–22 May 2018; pp. 563–568. [Google Scholar]
  94. Tu, D.; Bein, D.; Gofman, M. Designing a Unity Game Using the Haptic Feedback Gloves, VMG 30 Plus. In Proceedings of the 17th International Conference on Information Technology–New Generations (ITNG 2020), Las Vegas, NV, USA, 12 May 2020; Volume 1134. [Google Scholar] [CrossRef]
  95. Galván-Ruiz, J.; Travieso-González, C.M.; Tejera-Fettmilch, A.; Pinan-Roescher, A.; Esteban-Hernández, L.; Domínguez-Quintana, L. Perspective and Evolution of Gesture Recognition for Sign Language: A Review. Sensors 2020, 20, 3571. [Google Scholar] [CrossRef]
  96. Ahmed, M.A.; Zaidan, B.B.; Zaidan, A.A.; Salih, M.M.; Lakulu, M.M.B. A Review on Systems-Based Sensory Gloves for Sign Language Recognition State of the Art between 2007 and 2017. Sensors 2018, 18, 2208. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Human hand model.
Figure 2. Finger movements (except thumb).
Figure 3. Thumb movements.
Figure 4. Wrist movements.
Figure 5. Published reviews about smart gloves timeline.
Figure 6. Main results of the PRISMA literature review stages.
Figure 7. Pictures of commercial smart gloves (the VMG 35 Haptic is not included because it is like the VMG-8).
Table 1. Discarded gloves.
Smart Glove | Country | URL 1
AcceleGlove | USA | anthrotronix.com
Avatar VR | Spain | neurodigital.es
CyberGlove | USA | cyberglovesystems.com
Cybertouch | USA | cyberglovesystems.com
CyberGrasp | USA | cyberglovesystems.com
CyberForce | USA | cyberglovesystems.com
Esoglove | Singapore | roceso.com/esoglove-pro
ExoHand | Global | festo.com/group/en/cms/10233.htm
EXOS wrist DK2 | Japan | exiii.jp
FingerTracking | Germany | ar-tracking.com/en/product-program/fingertracking
Glohera Sinfomia | Italy | gloreha.com/sinfonia/
GloveOne | Spain | neurodigital.es
GoGlove | USA | goglove.io
Haptic Pivot | Global | microsoft.com/en-us/research/publication/haptic-pivot-on-demand-handhelds-in-vr
Haptx | USA | haptx.com
HGlove | France | haption.com/en/products-en/hglove-en.html
HumanGlove | Italy | hmw.it/en
KeyGlove | USA | keyglove.net
Maestro Gesture Glove | Canada | maestroglove.com
Music Glove | USA | flintrehab.com/product/musicglove-hand-therapy
Nuglove | USA | anthrotronix.com
Peregrine | Canada | peregrineglove.com
Plexus | UK | plexus.in
Polhemus | Canada | polhemus.com/motion-tracking/hand-and-finger-trackers
SaeboGlove | USA | saebo.com/shop/saeboglove
Tactile Glove | Global | pressureprofile.com/body-pressure-mapping/tactile-glove
TeslaSuit Gloves | USA | teslasuit.io/blog/vr-glove-by-teslasuit/
Valve Index Controllers | USA | valvesoftware.com/es/index/controllers
VRGluv | USA | vrgluv.com/enterprise
1 All URLs were last accessed on 27 February 2021.
Table 2. Commercial smart gloves (column abbreviations: Tr. = “Hand and finger pose estimation and motion tracking”; KF = “Kinesthetic Feedback”; TF = “Tactile Feedback”; v. = “versions”).
Smart Glove | Country | URL 1 | Glove Type | Tr. | KF | TF | Price
5DT (2 v.) | USA | 5dt.com | Open tips | X | | | $2990–$5495
Anika Rehap | Russia | zarya-med.com | Strips | X | | | $1300
CaptoGlove | Italy | captoglove.com | Fabric | X | | | $315
Cobra Glove | Germany | synertial.com | Open tips | X | | | $7450
Cynteract | Germany | cynteract.com/en | Fabric | X | | X | €500
Dexmo (3 v.) | China | dextarobotics.com | Exoskeleton | X | X | X | $36,000
Exo Glove (3 v.) | Germany | synertial.com | Strips 2 | X | | | $4980
Forte Data Glove | USA | bebopsensors.com | Strips | X | | X | $3000
HandTutor | Israel | handtutor.com | Fabric | X | | | €3400
Hi5 VR | China | noitom.com | Open tips | X | | X | $999
Manus Prime II | The Netherlands | manus-vr.com | Open tips | X | | | €1499
Manus Prime II Haptics | The Netherlands | manus-vr.com | Open tips | X | | X | €2499
MoCap Pro SuperSplay | New Zealand | stretchsense.com/product/mocap-pro-super-splay | Fabric | X | | | $7150
Nansense R2 (3 v.) | USA | nansense.com | Fabric | X | | | $4798
Perception Neuron Studio Gloves | China | neuronmocap.com/content/product/perception-neuron-studio-gloves | Open tips | X | | | $1499
Rapael | Germany | neofect.com | Strips | X | | | $1925
Rokoko | Denmark | rokoko.com | Open tips | X | | | $995
SenseGlove DK1 | The Netherlands | senseglove.com | Exoskeleton | X | X | X | €2999
SenseGlove Nova | The Netherlands | senseglove.com | Exoskeleton 3 | X | X | X | €4500
Sensorial XR | Spain | neurodigital.es | Fabric | X | | X | €11,995
Senso Glove DK3 | USA | senso.me/order | Open tips | X | | X | $999
VMG (4 v.) | USA | virtualmotionlabs.com | Fabric | X | | | $1000–
VMG 35 Haptic | USA | virtualmotionlabs.com | Fabric | X | | X | –
VRFree 4 | Switzerland | sensoryx.com | Open tips | X | | | CHF750
1 All URLs were last accessed on 27 February 2021. 2 Includes rings for each of the fingers and thumb. 3 The SenseGlove Nova is actually a kind of tight exoskeleton, wrapped around the hand, similar to a strips glove. 4 VRFree also offers a Haptic version, but no information was found about its features.
Table 3. Ergonomics and wearability features in commercial smart gloves.
Smart Glove | Size | Weight (Grams) | Battery and Autonomy | Connection | Other Features
5DT | Unique | NA | Battery pack | USB, RS232, Bluetooth | Black stretch Lycra; wireless kit
Anika Rehap | Adaptable | 200 | Wired (provided) | USB |
CaptoGlove | Unique | NA | Li-Ion Polymer, 10 h | BLE 4.0 | Fabric glove; washable and breathable
Cobra Glove | 4 (SS, S, M, L) | 70–150 | AA batteries | Wi-Fi | Detachable electronics
Cynteract | 3 (S, M, L) | NA | NA | USB |
Dexmo | Unique | 300 | Li-Ion Polymer, 5 h | Wi-Fi 2.4, USB | Memory foam hand pad; mechanism to minimize sweating
Exo Glove | 3 (S, M, L) | 145 | External AA batteries | Wi-Fi 2.4, BLE 5.0, SD card | Modularity; finger freedom with ring system
Forte Data Glove | Unique | 103.5 | Li-Polymer, 6–8 h | BLE, USB | Neoprene, nylon and Lycra
HandTutor | 5 | 200 | No | USB |
Hi5 VR | 2 (S, M) | 105 | AA batteries, 3 h | Wi-Fi 2.4 | Antibacterial, breathable elastic textile
Manus Prime II 1 | Unique | 60 | Batteries, 5 h | Wi-Fi 2.4 | Antibacterial; sports polyester
MoCap Pro SuperSplay | 2 (S/M, M/L) | 110 | Battery, 8 h | Bluetooth, Wi-Fi, USB-C, SD card | Antibacterial, breathable, stretchy fabric; rubberized palm grips; Velcro for optical markers
Nansense R2 | 3 (S, M, L) | 255 | Battery, 6–8 h | Wi-Fi 2.4/5, USB-A | Single piece of fabric; Velcro for markers; no calibration (automatic sensor compensation)
Perception Neuron | 3 (S, M, L) | 105 | AA batteries, 5 h | Wi-Fi |
Rapael | Unique | 132 | Battery | Bluetooth | Elastomer material; easy cleaning
Rokoko | 4 (S, M, L, XL) | 70 | External power bank | Wi-Fi 2.4, Wi-Fi 5 | Tight fit to keep sensors in place
SenseGlove DK1 | Unique | 300 | Li-Ion battery, 2 h | USB, Bluetooth | Plastic and fabric; wireless kit
SenseGlove Nova | Unique | 315 | Battery, 4 h | Bluetooth | Kind of armored glove
SensorialXR | Unique | 140 | 600 mAh Li-Po, 6–8 h | BLE 5.0 | Lycra with antibacterial and fire-resistant treatments
Senso Glove DK3 | 5 (S, M, ML, L, XL) | 300 | Li-Ion Polymer, 1.5 h | USB, RF, BLE |
VMG 1 | NA | NA | Li-Po battery, 5–6 h | USB, Bluetooth |
VRFree | 4 (S, M, L, XL) | 40 | Replaceable rechargeable battery | Wireless, USB-C | Multiple sensor types; a module has to be clipped onto the HMD headset
1 All versions share the same features.
Table 4. Commercial smart gloves for hand and finger pose estimation and motion tracking.
Smart Glove | Sensor Technology | IMU | Sensors | External IMU
5DT (2 v.) | Fiber optic bend sensors | No | 5/14 | No
CaptoGlove | Bend sensors | 1 | 5 | No
Cobra Glove (3 v.) | IMU | 7/13/16 | No | Yes
Dexmo (3 v.) | Mechanical rotational sensors | No | 5 | Yes
Exo Glove | IMU | 6 | No | Yes
Forte Data Glove | Bend sensors and IMU | 1 | 10 | Yes
HandTutor | Bend sensors | No | 5 | No
Hi5 VR | IMU/optical hybrid | 6 | No | Yes
Manus Prime II | Resistive bend sensors and IMU | 1 | 10 | Yes
M. Prime II Haptics | Resistive bend sensors and IMU | 1 | 10 | Yes
MoCap Pro SuperSplay | Splay sensors (3 sensing zones) | No | 6 | Yes
Nansense R2 (3 v.) | IMU | 7/12/15 | No | Yes
Perception Neuron | IMU | 6 | No | No
Rapael | Resistive bend sensors and IMU | 1 | 5 | No
Rokoko | IMU without magnetometers | 7 | No | No
SenseGlove DK1 | IMU and rotation encoders | 1 | 20 | Yes
SenseGlove Nova | IMU + vision (Pico Neo 2) | 1 | 5 | Yes
SensorialXR | IMU | 7 | No | No
Senso Glove DK3 | IMU | 8 | No | No
VMG (4 v.) | IMU and bend sensors | 5/5/16/0 | 1 | No
VMG 35 Haptic | IMU and bend sensors | 2 | 11 | No
VRFree 1 | 6 sensor types: bend, IMU, etc. | NA | NA | No
1 VRFree refers to the use of six different, complementary sensor types that are fully integrated, without detailing their types, number or location.
Table 5. DoF for commercial smart gloves (av. indicates that the average movement of two joints is measured).
Smart Glove | DoF | Four Fingers | Thumb | Other Ones
5DT 5 sensors | 5 1 | E/F (av. PIP, MCP) | E/F (av. PIP, MCP) | No
5DT 14 sensors | 14 1 | E/F × 2 (PIP, MCP) + A/A | E/F × 2 (PIP, MCP) | No
CaptoGlove | 5 2 | E/F (av. DIP, PIP) | E/F × 2 (av. PIP, MCP) | No
Cobra Glove 7 | 7 3 | E/F (PIP) | E/F × 2 (PIP, MCP) | Palm 5
Cobra Glove 13 | 13 3 | E/F × 2 (PIP, MCP) | E/F × 3 + A/A | Palm 5 × 2
Cobra Glove 16 | 16 3 | E/F × 3 − 2 4 | E/F × 3 + A/A | Palm 5 × 2
Dexmo | 10 | E/F (MCP) + A/A | E/F (MCP) + A/A | No
Exo Glove | 6 3 | E/F (av. PIP, MCP) | E/F × 2 ((av. PIP, MCP), TM) | No
Forte Data Glove | 13 1,6 | E/F + A/A | E/F + A/A | Wrist E/F + A/A + P/S
HandTutor | 5 1 | E/F (av. DIP, PIP, MCP) | E/F (av. PIP, MCP) | No
Hi5 VR | 8 3 | E/F | E/F | Wrist E/F + A/A + P/S
Manus Prime II | 14 1 | E/F × 2 | E/F × 2 + A/A | Wrist E/F + A/A + P/S
Manus Prime II Haptics | 14 1 | E/F × 2 | E/F × 2 + A/A | Wrist E/F + A/A + P/S
MoCap Pro SuperSplay | 11 | E/F + A/A | E/F + A/A | Wrist A/A
Nansense R2 7 | 7 3 | E/F (PIP) | E/F × 2 + A/A | Palm 5
Nansense R2 12 | 12 3 | E/F × 2 (PIP, MCP) | E/F × 3 + A/A | Palm 5
Nansense R2 15 | 15 3 | E/F × 3 − 2 4 | E/F × 3 + A/A | Palm 5
Perception Neuron | 8 3 | E/F | E/F | Wrist E/F + A/A + P/S
Rapael | 5 | E/F (av. DIP, PIP, MCP) | E/F (av. PIP, MCP) | Wrist E/F + A/A + P/S
Rokoko | 9 3 | E/F | E/F + A/A | Wrist E/F + A/A + P/S
SenseGlove DK1 | 23 7 | E/F × 3 + A/A | E/F × 3 + A/A | Wrist E/F + A/A + P/S
SenseGlove Nova | 8 8 | E/F | E/F + A/A | Wrist E/F + A/A + P/S
SensorialXR | 9 3 | E/F | E/F + A/A | Wrist E/F + A/A + P/S
Senso Glove DK3 | 9 3 | E/F | E/F + A/A | Wrist E/F + A/A + P/S
VMG 8 | 8 1 | E/F | E/F | Wrist E/F + A/A + P/S
VMG 13 | 8 1 | E/F | E/F | Wrist E/F + A/A + P/S
VMG 30 | 19 1 | E/F × 2 + A/A | E/F × 2 + A/A | Wrist E/F + A/A + P/S + Palm 5
VMG 35 Haptic | 23 1 | E/F × 3 + A/A | E/F × 3 + A/A | Wrist E/F + A/A + P/S + Palm 5
VMG PS | 3 1 | No | No | Wrist E/F + A/A + P/S
VRFree | 23 9 | E/F × 3 + A/A | E/F × 3 + A/A | Wrist E/F + A/A + P/S
1 Estimated, taking into account the number of sensors and their location. 2 The CaptoGlove includes five sensors situated over the fingers (PIP and DIP joints) and the vendor indicates that it provides 10 DoF, with two DoF per finger; most likely, each sensor is used to provide the average value of the two DoF. 3 Estimated, based on IMUs. In the case of the Cobra Glove versions with 7 and 13 sensors, interpolation is used to approximate the untracked finger joints. In the case of the Manus Prime II, an IMU sensor is located at the thumb MCP. 4 All fingers have three E/F movements, except for the pinky fingertip, which has just two E/F movements (the DIP is not included). 5 Palm bending is detected. 6 The Forte Data Glove vendor indicates 28 DoF; according to our model, this is beyond the possible DoF of the human hand. 7 The SenseGlove DK1 has 23 DoF, which have been described in [34]. 8 The SenseGlove Nova’s eight DoF are estimated considering just the sensors available; the vendor indicates that additional DoF can be obtained by fusion with proprietary vision-based algorithms. 9 VRFree indicates that its 23 DoF are obtained by fusion of different sensor technologies, including vision-based algorithms.
Table 6. Commercial smart gloves for kinesthetic feedback.
Smart Glove | Mode | Technology | Actuated DoF | Force
Dexmo | Active | Servo motors | 5 | 0.3 N·m
SenseGlove DK1 | Passive | Magnetic friction brakes and strings | 5 | 40 N
SenseGlove Nova | Passive | Brakes and mechanical wires | 4 1 | 20 N
1 From thumb to ring finger.
Table 7. Commercial smart gloves for tactile feedback.
Smart Glove | Technology | Number of Actuators | Location | Type
Dexmo | LRA | 6 | Fingertips, thumb and palm | Programmable
Forte Data Glove | Non-resonant actuators | 6 | Fingertips, thumb and palm | Programmable
Hi5 VR | Vibration “rumbler” | 1 | Wrist | Programmable
Manus Prime II Haptics | LRA | 5 | Fingertips and thumb | Programmable
SenseGlove DK1 | Vibration motors | 6 | Fingertips, thumb and palm | Collisions, textures, button clicks
SenseGlove Nova | 2 LRA, 1 voice coil | 3 | Thumb, index finger and hand 1 | Feel shapes, textures, stiffness, impacts and button clicks
SensorialXR | Customized low-latency LRA | 10 | Palm, thumb, index and middle finger | 1024 vibration profiles
Senso Glove DK3 | LRA vibration motor | 6 | Fingers and thumb (under the last phalange) | More than 100 haptic effects
VMG 35 Haptic | Vibro-tactile actuators | 5 | Fingers and thumb | Programmable
1 The thumb and the index finger have one vibro-tactile actuator each; the voice coil is located at the hub of the glove.
Table 8. Commercial smart gloves by application area (cells contain an X or the references in which the glove is used in that area).
Smart Glove | Medicine & Remote Healthcare | Motion Capture | Video Games | Simulation & Training | Manipulation of 3D Objects | XR Applications
5DT[35,36,37][36,38,39,40,41] [42]
Anika RehapX
CaptoGloveX XXXX
Cobra GlovesX[43] XX
CynteractX X X
DexmoX[44]XX[45,46]
Exo Gloves X XX
Forte Data GloveX [47]XX
HandTutor[48]
Hi5 VR[49,50,51][52]X[53,54][55,56,57,58,59][60,61]
Manus[62][63][64][65][63,66,67,68][69]
MoCap Pro S. X
Nansense R2XXX [70]
Perception Neuron[71][72]X X
Rapael[73,74,75,76,77,78,79]
Rokoko X
SenseGlove DK1X [80][81,82,83][84,85]
Sense Glove Nova X
Senso GloveX[86]X[87,88] [89]
SensorialXRX XXX
VMG[90][91,92,93][94]XXX
VRfree XXX[34]X
Table 9. Commercial smart gloves metrological features.
Smart Glove | Features
5DT
- Continuous data for each sensor: 0–1
- Minimum sampling rate for the full hand (all available sensors): 75 Hz
- Flexure resolution: 12-bit A/D for each sensor
- Minimum dynamic range: 8 bits (256 angular values) per joint
CaptoGlove
- Extension/flexion movement resolution: <1 degree
- Tactile sensor: 1 pressure sensor for the thumb’s fingertip, 100 g–10 kg
Cobra Glove
- Internal update rate: 500 Hz
- Gyro range: 2000 degrees/s
- Accelerometer range: 1–6 g
Dexmo
- Kinesthetic feedback: 1 DoF per finger, with a maximum force of 0.3 N·m
- Frequency transmission range: 2.4 GHz
- Accuracy: +/−0.5 degrees
Exo Glove
- Gyro rotation vector: 1000 times/s
- Rotation vector: 400 times/s
- Gravity: 400 times/s
- Linear acceleration: 400 times/s
- Accelerometer: 500 times/s
- Gyroscope: 400 times/s
- Magnetometer: 100 times/s
Forte Data Glove
- Accuracy and repeatability: +/−1.5 degrees
- Latency: 150 frames/s (<6 ms)
- Sensor performance sample rate: 200 Hz
- Frequency response: 100–2000 Hz
HandTutor
- Sensitivity: 0.05 mm for wrist and fingers
- Motion capture speed: up to 1 m/s
Hi5 VR
- Latency: <5 ms
- Data rate: up to 180 Hz
Manus Prime II 1
- Sensor sampling rate: 90 Hz
- Orientation accuracy: +/−2.5 degrees
- Signal latency: <5 ms
- Finger flexible sensor repeatability: >1,000,000 cycles
- Orientation sensor accuracy: +/−2.5 degrees
Nansense R2
- Data rate: 240 fps
- Latency: +/−30 ms
Perception Neuron
- Accelerometer range: +/−8 g
- Gyroscope range: +/−2000 dps
- Resolution: 0.02 degree
- Frequency: 2400–2483 MHz
- Accuracy: roll 0.7°/pitch 0.7°/yaw 2°
- Internal processing rate: 800 Hz
- Output rate: 60/90/96/100 Hz
Rokoko
- Frequency: 400 Hz
- 3D orientation accuracy: +/−1 degree
- Data rate: 100 fps
- Latency: +/−20 ms
SenseGlove DK1
- Force feedback output: 40 N
Senso Glove DK3
- Frequency: 400 Hz
- Latency: 15 ms
VMG 1
- Finger sensing resolution: 12 bit (4096 steps)
- Sampling rate: 10–100 Hz
- Accuracy: roll +/−0.01°/pitch +/−0.01°/yaw +/−0.05°
- Scale range: +/−2 g, +/−4 g, +/−8 g
VRFree
- Data rate: 100 MHz
- Orientation resolution: 0.01 degree
- Displacement resolution: 0.3 mm
- Frequency: 120 Hz (8 ms)
1 All versions share the same features.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
