Article

Smart Glove: A Cost-Effective and Intuitive Interface for Advanced Drone Control

1 Department of Theoretical and Applied Sciences, eCampus University, Via Isimbardi 10, 22060 Novedrate, Italy
2 Department of Computer, Control and Management Engineering, Sapienza University of Rome, Via Ariosto 25, 00185 Roma, Italy
3 Institute for Systems Analysis and Computer Science, Italian National Research Council, Via dei Taurini 19, 00185 Roma, Italy
4 Department of Computational Intelligence, Czestochowa University of Technology, ul. Dąbrowskiego 69, 42-201 Częstochowa, Poland
* Author to whom correspondence should be addressed.
Drones 2025, 9(2), 109; https://doi.org/10.3390/drones9020109
Submission received: 20 December 2024 / Revised: 16 January 2025 / Accepted: 28 January 2025 / Published: 1 February 2025

Abstract
Recent years have witnessed the development of human-unmanned aerial vehicle (UAV) interfaces to meet the growing demand for intuitive and efficient solutions for UAV piloting. In this paper, we propose a novel Smart Glove v1.0 prototype for advanced gesture-based drone control, built from key low-cost components: an Arduino Nano to process data, an MPU6050 to detect hand movements, a flexible sensor for easy throttle control, and an nRF24L01 module for wireless communication. The paper presents the design methodology and reports flight tests alongside simulation findings to demonstrate that Smart Glove v1.0 provides an intuitive, responsive, and hands-free gesture interface for piloting. We aim to make the drone piloting experience more enjoyable and more ergonomic by adapting to the pilot’s preferred hand position. The overall research project is a seedbed for future solutions, eventually extending its applications to medicine, space, and the metaverse.

1. Introduction

At present, the application of unmanned aerial vehicles (UAVs), or drones, has increased steeply, garnering growing attention not just from the military but also in important sectors such as agriculture, cinematography, surveillance, and industrial inspection [1,2]. Although originally developed for defense purposes, drones have reached a level of technological maturity that allows them to be employed in many non-military domains, such as aerospace projects, without significant limitations [3,4,5]. Consequently, the adoption of drones in non-military sectors, such as agriculture, surveying, and mapping, has brought numerous advantages.
For example, in agriculture, drones save operators considerable time scanning and monitoring vast farmland that would otherwise have to be covered on foot [6]. Drones have revolutionized practices such as targeted spraying and crop monitoring, increasing efficiency in favor of better yields and reduced waste [7]. They also benefit other operational fields by providing cost-effective solutions that reduce the need for personnel and equipment compared with traditional methods. Above all, drones improve safety by taking on risky tasks or responding to emergencies in dangerous places without unnecessarily risking human life [8,9]. UAVs have also proven to be a modern and irreplaceable tool in the energy sector in several specific applications, such as pipeline inspections and infrastructure monitoring.
Drones now incorporate advanced features such as high-resolution cameras, lidar, and thermal sensors, which give them high precision in data collection and control and allow them to operate efficiently and safely across their many uses [10,11]. Modern UAV systems integrate multiple subsystems, encompassing not only the aerial vehicle but the whole operational setup, including control stations and data links [5,12]. Acronyms such as RPAS (Remotely Piloted Aircraft Systems) and UAS (Unmanned Aircraft Systems) emphasize that drones are considered parts of larger systems. This view is also advocated by Eurocontrol and the International Civil Aviation Organization (ICAO), a United Nations agency [13,14,15]. ICAO, in particular, plays a key role in defining global standards for safety and efficiency in aviation, focusing on the seamless integration of drones into regulated airspace. These efforts highlight the complexity of managing these devices within current regulatory frameworks, underlining the need for continuous and fruitful cooperation between governments, industries, and regulatory bodies.
This technological advancement extends beyond the operational scope and is reflected in the human–drone interaction (HDI) field. Researchers have been investigating how to make drones a seamless part of human workflows by evaluating these systems [6]. A notable innovation in this sector is the use of gloves as a gesture-based apparatus designed for user-friendliness with state-of-the-art technology [16]. Equipped with the latest sensor technology, these gloves recognize hand and finger motions and transmit them as control cues over radio, satellite links, or the Internet. This approach offers a smoother, more engaging way to fly drones that does not require traditional remotes or complicated controls.
By employing innovative systems, such gloves let users control drones with natural hand and finger movements, creating a very intuitive piloting experience [17,18]. Their inherent ergonomics not only improves usability but also lowers the barrier to drone operation, enabling even users without previous experience to fly. This could allow their incorporation into various other sectors, such as precision medical procedures [19,20], sports fan experiences [14], and even initiatives that promote social inclusion [5,12].
The development of human–drone interactions (HDIs) is exemplified by the smart glove, which enhances the operational aspect of these systems and improves control and precision. Several studies in the scientific literature [21,22,23] examine in detail the creation of devices capable of interacting with humans through simple hand and finger movements using smart gloves.
Many studies focus on hand gesture recognition; for example, in [21], the authors discuss the development of a novel piezoelectric sensory glove designed for human–machine interactions. Its sensors detect hand and finger movements, which are then processed by a machine learning algorithm for gesture recognition. Another study [22] used a glove for hand and arm movement tracking with gesture recognition. A further work [24] implements a wearable drone controller based on hand gesture recognition and vibrotactile feedback. These approaches are useful but require very accurate execution and recognition of hand gestures. In [17], the authors present CaptAinGlove, a glove-based system utilizing capacitive textiles and inertial sensors designed for real-time hand gesture recognition aimed at drone control. In [25], the authors use a Raspberry Pi Zero W and an MPU-6050 to capture hand movements and process them in real time; the data are then fed to neural networks to classify the different hand gestures.
Less attention has been paid to a direct approach in which the drone is moved simply by wearing a glove and turning the hand. In this paper, we propose a simple, functional, and low-cost approach to controlling a drone that does not rely on complicated interpretation of hand gestures; instead, the glove directly issues movement commands when the hand is bent up, down, right, or left.
The remainder of this paper is structured as follows: Section 2 details the research methodology used to address the problem and describes the enhancements made to the Smart Glove v1.0 prototype. Section 3 presents and discusses the results obtained from the prototype’s development. Finally, in Section 4, we conclude our study by suggesting new and potential developments that could broaden the scope of possible applications for Smart Glove v1.0.

2. Methods

In the present study, we integrated hardware design and software development principles to develop the Smart Glove v1.0 prototype. The design phase involved critical and challenging activities, including formulating efficient algorithms for data processing, carefully selecting quality motion sensors, and defining the ergonomic requirements to be satisfied. These efforts aimed to improve the functionality of the glove while maximizing user comfort and ensuring an effective and intuitive interface for the intended applications. The final realization yielded a user interface that is very intuitive in responding to commands, offering the pilot a more natural feeling when managing the flight.
Figure 1 illustrates a block diagram of the smart glove–drone system, showcasing the communication between the transmitter (glove-mounted), the receiver (drone-mounted), and the drone. The receiver (RX) and the flight controller communicate via signals. In specific configurations, we adopted the PWM technology to transmit information based on the pilot’s commands to the flight controller, which processed these data to manage the motors and ensure the drone’s stability and control.
The smart glove system is thus composed of two main parts: the transmitter (TX) and the receiver (RX) (Figure 2 and Figure 3). The receiver is a radio frequency (RF) module that demodulates the incoming signals and delivers them to the drone’s flight controller as PWM (pulse-width-modulated) signals. The glove (TX) interprets the pilot’s movements using the MPU6050 sensor, particularly its accelerometer and gyroscope, along with the flex sensor (placed on the glove’s index finger) for throttle data.
Next, the transmitter processes the data and sends them via the nRF24L01 wireless module to the receiver (RX), which then forwards them to the drone’s flight controller over six PWM channels. Furthermore, a button on the transmitter (TX) triggers sensor calibration based on the glove’s current position.
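The paper does not specify the radio packet format; as an illustrative sketch of the TX-to-RX data flow, the six channel values could be packed into a single nRF24L01 payload as below. Python stands in for the Arduino C++ firmware, and the function names and 8-bit-per-channel layout are assumptions, not details reported for the prototype.

```python
import struct

def pack_channels(throttle, roll, pitch, yaw, aux1=0, aux2=0):
    """Pack six 0-255 channel values into a 6-byte radio payload.

    Hypothetical layout: throttle, roll, pitch, yaw plus two spare
    channels, matching the six PWM channels fed to the flight controller.
    """
    channels = [throttle, roll, pitch, yaw, aux1, aux2]
    if any(not 0 <= c <= 255 for c in channels):
        raise ValueError("channel values must fit in one byte (0-255)")
    return struct.pack("6B", *channels)

def unpack_channels(payload):
    """Inverse operation performed on the receiver before PWM output."""
    return struct.unpack("6B", payload)
```

On the real hardware, the same fixed-size structure would be handed to the nRF24L01 driver as its payload, keeping the radio traffic small and the per-packet latency low.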
Among other things, the low cost of the hardware of the Smart Glove v1.0 prototype is a remarkable feature of its hardware design. The prototype includes the following low-cost hardware components:
  • Two Arduino Nano boards with Atmega 328P processors for data management and processing, one mounted on the glove and the other mounted on the drone.
  • Two nRF24L01+PA+LNA 2.4 GHz radio modules (TX and RX) for transmitting and receiving communication signals. This device can transmit the signal up to a distance of 1100 m, a significant increase compared with the 100 m of the basic version typically used in this type of prototype. The extended range is achieved only when the SMA connector with its antenna and the RFX2401C range-extender chip are present; the latter comprises the PA, the LNA, and the switching circuits between transmitter and receiver.
The PA (power amplifier) simply boosts the transmitted signal. Conversely, the LNA (low-noise amplifier) takes the very weak signal arriving from the antenna (practically below the microvolt level, i.e., approximately −100 dBm) and amplifies it to a usable voltage (typically about 0.5 to 1 V). A duplexer interconnects the LNA in the receiver path and the PA in the transmission path while keeping the two signals separate, so that the powerful output of the PA does not overload the sensitive input of the LNA (Figure 4).
  • An MPU6050, a MEMS-based (Micro-Electro-Mechanical System) Motion Processing Unit that integrates a 6-degrees-of-freedom (6-DOF) sensor, namely a three-axis accelerometer and a three-axis gyroscope, in a single compact chip. We integrated this sensor into the Smart Glove v1.0 device to simultaneously measure linear acceleration and angular velocity on the x, y, and z axes. The device communicates efficiently with microcontrollers like the Arduino via serial data transmission over the I2C bus.
The accelerometer measures acceleration by exploiting Newton’s second law (F = ma) and the inertial forces resulting from movement or gravity. Built with MEMS technology, the device behaves like a spring–mass system, determining acceleration from the elastic deformation of its microstructures; at rest it reads the gravitational acceleration of 9.8 m/s² in the z component.
The integrated gyroscope measures angular velocity using the Coriolis force, which results from rotation. It is built on the principle of the conservation of angular momentum, with MEMS-based microscopic structures acting as the rotating elements. The gyroscope’s data are especially valuable for determining the object’s rotation and orientation. What distinguishes the MPU6050 is its Digital Motion Processor (DMP). The DMP is the first link in the sensing chain: it internally filters the data and performs fairly complex calculations autonomously. Hence, the microcontroller has to process a smaller amount of data, and the software is simplified because tasks like data fusion and sensor calibration are offloaded to the chip.
Therefore, the MPU6050 provides accurate orientation and motion data and can be used in various applications such as drone control.
  • One Flex sensor, placed on the glove’s index finger, detects deflection or flexion. The flex sensor is a device that changes its resistance based on its curvature or bending. These sensors are typically made of a thin layer of resistive material, such as graphene or conductive ink, deposited on a flexible substrate. Graphene-based materials are particularly noteworthy for their high conductivity and flexibility among piezoresistive materials.
  • One LiPo battery, 1200 mAh, 7.2 V, whose capacity was chosen to ensure more than 10 h of continuous Smart Glove v1.0 operation, far exceeding the drone’s flight time of approximately 45 min.
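At the MPU6050’s default full-scale ranges (±2 g and ±250 °/s), the raw 16-bit readings convert to physical units using the fixed sensitivities given in the datasheet. A minimal Python sketch of that conversion (the firmware itself is Arduino C++; the helper names are ours):

```python
# MPU6050 default full-scale sensitivities (datasheet values):
# at ±2 g,     16384 LSB correspond to 1 g;
# at ±250 °/s,   131 LSB correspond to 1 degree per second.
ACCEL_LSB_PER_G = 16384.0
GYRO_LSB_PER_DPS = 131.0

def accel_g(raw):
    """Convert a raw 16-bit accelerometer reading to g."""
    return raw / ACCEL_LSB_PER_G

def gyro_dps(raw):
    """Convert a raw 16-bit gyroscope reading to degrees per second."""
    return raw / GYRO_LSB_PER_DPS
```

For example, a glove at rest should read roughly +16384 raw counts (1 g) on the axis aligned with gravity, which is the value exploited for the accelerometer-based angle estimates below.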
Figure 5 shows the prototype of the TX Smart Glove v1.0.
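The flex sensor’s resistance change is typically read through a voltage divider into the microcontroller’s ADC and then mapped to a throttle value. The sketch below illustrates that mapping in Python; `flex_to_throttle` and the calibration readings `adc_flat`/`adc_bent` are hypothetical placeholders, not values reported for the prototype.

```python
def flex_to_throttle(adc_reading, adc_flat=760, adc_bent=520):
    """Map a flex-sensor ADC reading to a 0-255 throttle value.

    adc_flat / adc_bent are hypothetical 10-bit ADC readings for the
    index finger fully extended and fully bent; a real build would
    record them during the calibration step.
    """
    # Linear interpolation between the two calibration points,
    # clamped so out-of-range readings cannot produce invalid throttle.
    span = adc_flat - adc_bent
    t = (adc_flat - adc_reading) / span
    t = min(max(t, 0.0), 1.0)
    return round(t * 255)
```

A finger fully extended thus yields zero throttle and a fully bent finger yields maximum throttle, with intermediate bends interpolated linearly.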
The firmware, the heart of the system’s functionality, was developed using the Arduino Integrated Development Environment (IDE). Its implementation had the dual objective of managing the acquisition of data from inertial sensors and real-time processing for distinguishing hand and wrist gestures and managing the drone’s wireless communication.
  • A. Sensor Calibration
In applications that rely on sensors such as accelerometers and gyroscopes, the raw data often contain minor inaccuracies, typically due to noise or stochastic fluctuations that can hurt measurement accuracy. For the acceleration measurement, we therefore used the average over 200 readings, improving the stability and reliability of the measurements. This averaging was performed by a dedicated loop that computed the rotation angles from both the accelerometer and gyroscope data across 200 readings and returned the arithmetic mean of the collected values.
Specifically, Equation (1) computes the average pitch angle derived from the accelerometer values as follows:
acc_angle_avg = ( Σ_{i=1}^{200} θ_i ) / 200    (1)
where θ, depending on the angle, is calculated using the following trigonometric Equations (2) and (3):
θ_pitch = atan( A_x / √(A_y² + A_z²) )    (2)
θ_roll = atan( A_y / √(A_x² + A_z²) )    (3)
For the gyroscope raw-data-based angle averages, a generic Equation (4) was used for both the pitch and roll angles, as well as yaw:
Gyro_angle_avg = ( Σ_{i=1}^{200} Gyro_angle_i ) / 200    (4)
where Gyro_angle represents the angle obtained by multiplying the raw gyroscope rate by the time elapsed between measurements.
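The averaging and accelerometer-angle equations above can be sketched as follows. Python stands in for the Arduino C++ firmware, and `atan2` is used where the paper writes atan, for numerical robustness when the denominator approaches zero.

```python
import math

N_SAMPLES = 200  # averaging window used in the calibration loop

def acc_pitch(ax, ay, az):
    """Pitch from accelerometer, Eq. (2): atan(Ax / sqrt(Ay^2 + Az^2)), in degrees."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def acc_roll(ax, ay, az):
    """Roll from accelerometer, Eq. (3): atan(Ay / sqrt(Ax^2 + Az^2)), in degrees."""
    return math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))

def calibrate(samples):
    """Arithmetic mean of per-sample pitch angles, Eq. (1).

    samples is a list of (ax, ay, az) accelerometer readings in g.
    """
    return sum(acc_pitch(*s) for s in samples) / len(samples)
```

With the glove held flat (acceleration of 1 g on the z axis only), both angles are zero, which is the reference pose captured at calibration time.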
  • B. Complementary Filter and Angle Estimation
During initial flight tests, it was evident that enhancing the glove’s orientation precision during the drone flight was imperative. Therefore, we implemented a complementary filter to obtain the glove’s Euler angles from the gyroscope and accelerometer data acquired by the MPU6050 sensor (IMU).
Total_angle = α × ( Total_angle(t−1) + Gyro_angle ) + β × Acc_angle    (5)
where
  • Total_angle represents the total Euler angle calculated.
  • Gyro_angle denotes the rotational angle measured by the gyroscope.
  • Acc_angle signifies the rotation angle derived from the accelerometer.
Considering that the sum of α and β is always equal to 1, with Total_angle(t − 1) representing the previous value, we set α = 0.98 and β = 0.02 for roll and pitch stabilization. Equation (5) therefore computes the total Euler angle by fusing the gyroscope and accelerometer measurements. Thanks to this filtering, the system achieved greater precision and reliability during flight maneuvers, because it suppressed unintentional hand movements or tremors that could introduce noise and lead to unintentional or erratic drone commands.
This methodology significantly improved the precision and stability of determining the orientation, which is essential for effectively controlling the drone during the various flight maneuvers.
To further improve safety, the software limited the angle reading between the glove and the zero point obtained during the calibration phase to a maximum range of 180 degrees.
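A minimal sketch of one step of Equation (5), combined with the angle limiting just described, could look like this in Python (the firmware is Arduino C++; clamping to ±90° per axis, i.e., a 180° total range, is our reading of the safety limit in the text):

```python
ALPHA, BETA = 0.98, 0.02  # weights from Eq. (5); alpha + beta = 1

def complementary_filter(prev_angle, gyro_rate_dps, acc_angle, dt):
    """One step of Eq. (5): trust the integrated gyroscope in the
    short term, the accelerometer in the long term.

    prev_angle and acc_angle in degrees, gyro_rate_dps in deg/s,
    dt in seconds (the time between two IMU readings).
    """
    angle = ALPHA * (prev_angle + gyro_rate_dps * dt) + BETA * acc_angle
    # Safety limit: keep the estimate within +/-90 degrees of the
    # calibrated zero point (180-degree total range).
    return min(max(angle, -90.0), 90.0)
```

Because β is small, a sudden accelerometer spike (e.g., from vibration) moves the estimate only slightly per step, while the slow accelerometer average continuously corrects the gyroscope’s drift.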

3. Results and Discussion

Creating simple and low-cost drone control equipment, such as Smart Glove v1.0, can bring intuitive, hands-free piloting into the digital world. By tightly combining hardware and software, Smart Glove v1.0 uses low-cost components to connect the pilot and the drone. The performance of Smart Glove v1.0, described in the present report, shows its potential for advancing human–drone interactions (HDIs) through an engaging and user-friendly control mechanism. The hardware comprises low-cost elements, namely two Arduino Nano microcontrollers, an MPU-6050 inertial measurement sensor, flex sensors, and two nRF24L01 radio modules for wireless connectivity, costing around EUR 70 in total, which makes it readily available and accessible to everyone. Smart Glove v1.0 is therefore a good option in many cases, including for small budgets.
This process led to a user interface that is both user-friendly and responsive, allowing the user to interact with the UAV easily. The Smart Glove v1.0 system comprises a glove acting as transmitter (TX) and a receiver (RX). The MPU6050 sensor [23,24] is the most important component of Smart Glove v1.0, as it accurately translates the movements of the user’s hand into precise drone commands. The glove also integrates flex sensors that capture finger movements, enabling the completion of complex maneuvers. These flex sensors are piezoresistive; according to the 2020 research by Dong W. et al. [26], piezoresistive sensors outperform conventional strain gauges in terms of deformation range (>30%), sensitivity to finger bending, and the control of finger movements.
The drone communicates wirelessly using nRF24L01 modules that enable seamless and timely communication, the most critical aspect of any drone. Smart Glove v1.0 has the further advantage of being light and thus very easy to carry: it weighs only 56 g without the battery, giving the user a lighter load in the hand and making it more comfortable and easier to use. Besides beginners, the glove is also recommended for experienced users, as it allows them to stay energized and acquire skills faster. With the cut in weight and the increase in functionality, this kind of wearable device technology passes a vital stage for aerospace and robotic applications.
  • A. Sensor Calibration results
One of the first things discovered about the Smart Glove v1.0 prototype was the necessity for sensor calibration to establish an error-free Cartesian reference system and correct measurement errors. Consequently, each sensor was subjected to a moving average strategy during calibration to identify trends in the collected raw data from the gyroscope and accelerometer in the IMU sensor, which then generated a more precise deviation from the initial angle. This is shown in Figure 6; in particular, raw gyroscope data initially displayed errors that prevented the proper centering of the Cartesian reference system at around zero, underscoring the importance of this calibration step.
This results in raw data that are subsequently corrected by the average of the data, which are, thus, approximately centered around zero, as indicated by the blue line in Figure 7.
Finally, the angular velocity data were multiplied by the sampling interval to obtain the degrees of the initial angle.
This correction was very helpful in lowering noise and random variation, thus proportionally increasing the overall measurement accuracy. The results were evident from the relevant tests right after this improvement, which strongly impacted precision-demanding applications like control or navigation systems.
  • B. Complementary Filter and Angle Estimation results
For effective flight control of the drone once airborne, the complementary filter fusing the accelerometer and gyroscope data proved most advantageous in estimating the movement angle. Over time, the gyroscope started to drift, producing erroneous readings. The accelerometer, on the other hand, provided exact values under slow accelerations, but when the device vibrated it reported wrong and excessive angle values. By fusing the two data sources, the gyroscope and the accelerometer, the filter rectifies the estimate, eliminating noise and unexpected motions. In Figure 8, comparing the two curves, the correction provided by the complementary filter is evident. The detailed insets show that the unfiltered data curve exhibits a series of ripples, while the filtered data curve is smoother and more linear.
Thanks to the integration of the complementary filter, it was possible to effectively reduce the impact of noise and sudden hand movements, obtaining a more stable and reliable control capacity, an essential characteristic for precision maneuvers. Integrating movement constraints in the Smart Glove v1.0 system also significantly improved the stability of drone control during flights, especially in particularly stressful flight conditions. Limiting pitch and roll detection to a range of ±90° very effectively mitigated the effect of sudden, exaggerated pilot gestures, reducing the probability of involuntary or irregular behaviors from the drone. Furthermore, the tests demonstrated that the movement limitation effectively reduced flight instability, especially in delicate emergency maneuvers. As shown in Figure 9, the default signals (255 above +90° and 0 below −90°) provided clear input limits, allowing the drone to respond to commands more smoothly and predictably.
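The saturating mapping described for Figure 9 can be sketched as follows (Python for illustration; the linear 0–255 mapping between the limits is an assumption consistent with the default signals described, and the function name is ours):

```python
def angle_to_command(angle_deg):
    """Map a pitch/roll angle in [-90, +90] degrees to an 8-bit
    command value, saturating at the limits: 0 below -90 degrees,
    255 above +90 degrees, as in Figure 9."""
    if angle_deg >= 90.0:
        return 255
    if angle_deg <= -90.0:
        return 0
    # Assumed linear mapping between the two saturation limits.
    return round((angle_deg + 90.0) / 180.0 * 255)
```

Saturation guarantees that even a wildly exaggerated gesture produces a bounded, predictable command, which is exactly the overcorrection-prevention behavior discussed above.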
These constraints prevent overcorrection and abrupt reactions, which are common when users make exaggerated gestures during high-pressure situations. Experienced users highlight the importance of proper training. Additionally, the system’s reliance on precise IMU calibration underscores the need for robust hardware to maintain accuracy in varying environmental conditions.
The ±90° movement limitation in the Smart Glove v1.0 system proves an effective solution for stabilizing drone control during critical situations. By reducing exaggerated input and improving safety, this approach enhances the usability and reliability of gesture-based interfaces, marking a significant step forward in drone piloting technology.

4. Conclusions

This paper introduced Smart Glove v1.0, a novel, trailblazing, and low-cost interface for advanced drone control. Using motion sensors and wireless communication modules mounted on a wearable glove, we have shown how a gesture-driven user interface can augment human–drone interactions (HDIs). Recent studies focus on developing devices that interact with humans through hand and finger movements using smart gloves [24]. Although there is clear evidence of advancement in this field, a more unified research approach is needed, as the variety of methodologies used to explore these interactions indicates that the development process is still in progress. The development of Smart Glove v1.0 marks a significant advancement in UAV control interfaces. Thanks to an accurate and meticulous combination of hardware design and sensor calibration phases, we developed an integrated device capable of providing intuitive flight control using just one hand, leaving the other free. Using low-cost hardware such as the Arduino Nano boards, the MPU6050 sensor, and the nRF24L01 modules, we created a prototype that maximizes efficiency while minimizing development costs. All the field tests of Smart Glove v1.0 produced very positive and encouraging results, showing the effectiveness of the device in converting hand gestures into specific commands to pilot drones correctly. Particular attention has been paid to ensuring maximum precision and stability through dedicated sensor calibration methodologies and the implementation of a complementary filtering algorithm. These measures provide a solid basis for further development of the device. In general, the functionalities offered by Smart Glove v1.0 are not limited to drones but can also be used in other contexts, such as robotics or virtual reality [27], where the precision and safety of movements are equally essential.
Its versatility and adaptability make it suitable for even more specialized fields such as medicine [28], the metaverse [29], and aerospace [30]. To further improve this versatility, the ability to dynamically adjust the gesture thresholds of the glove could be implemented through adaptive threshold mechanisms. Furthermore, we strongly believe that further improvements can be explored by integrating our technology with other emerging technologies, such as augmented reality [31] and artificial intelligence [32,33,34,35]. Finally, Smart Glove v1.0 can contribute significantly to a range of human–machine communications. Simplifying the handling of complex situations during drone control can usher in a new era and give more people a reason to adopt drones. We are hopeful that Smart Glove v1.0 will open doors for new progress in HDIs and the development of innovative applications soon.

Author Contributions

Conceptualization and investigation, C.R.; methodology, C.R.; software, A.P. (Andrea Pollina) and A.P. (Adriano Puglisi); validation, C.N.; formal analysis, C.N.; resources, C.N.; data curation, C.R., A.P. (Andrea Pollina) and A.P. (Adriano Puglisi); writing—original draft preparation, C.R., A.P. (Andrea Pollina) and A.P. (Adriano Puglisi); writing—review and editing, C.R., A.P. (Andrea Pollina), A.P. (Adriano Puglisi) and C.N.; supervision, C.N.; project administration, C.R. and C.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

This work has been developed at is.Lab() Intelligent Systems Laboratory of the Department of Computer, Control, and Management Engineering, Sapienza University of Rome.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tezza, D.; Andujar, M. The State-of-the-Art of Human–Drone Interaction: A Survey. IEEE Access 2019, 7, 167438–167454. [Google Scholar] [CrossRef]
  2. Hassanalian, M.; Abdelkefi, A. Classifications, applications, and design challenges of drones: A review. Prog. Aerosp. Sci. 2017, 91, 99–131. [Google Scholar] [CrossRef]
  3. Hassanalian, M.; Rice, D.; Abdelkefi, A. Evolution of space drones for planetary exploration: A review. Prog. Aerosp. Sci. 2018, 97, 61–105. [Google Scholar] [CrossRef]
  4. Zeng, Y.; Zhang, R.; Lim, T.J. Wireless communications with unmanned aerial vehicles: Opportunities and challenges. IEEE Commun. Mag. 2016, 54, 36–42. [Google Scholar] [CrossRef]
  5. Ryan, A.; Zennaro, M.; Howell, A.; Sengupta, R. An overview of emerging results in cooperative UAV control. In Proceedings of the 43rd IEEE Conference on Decision and Control, Nassau, Bahamas, 14–17 December 2004. [Google Scholar] [CrossRef]
  6. Nunes, E.C. Employing Drones in Agriculture: An Exploration of Various Drone Types and Key Advantag. arXiv 2023, arXiv:2307.04037. [Google Scholar]
  7. Karar, M.E.; Alotaibi, F.; Rasheed, A.A.; Reyad, O. A pilot study of smart agricultural irrigation using unmanned aerial vehicles and IoT-based cloud system. arXiv 2021, arXiv:2101.01851. [Google Scholar]
Figure 1. Smart glove–drone system block diagram: The smart glove system (purple) and drone system (orange) work in synergy for seamless control and operation.
Figure 2. Electrical diagram of the smart glove Tx system (Smart Glove TX) showing the interconnections of the Arduino Nano board with the nRF24L01, the MPU6050 module, and the flex sensor, all of which are located on the glove.
Figure 3. Electrical diagram of the smart glove Rx system (Smart Glove RX) showing the interconnections of the Arduino Nano board with the nRF24L01 and the flight control modules located on the drone.
Figure 4. RFX2401C chip block diagram featuring the power amplifier (PA) and the low-noise amplifier (LNA). The PA boosts outgoing signals for transmission, while the LNA amplifies weak received signals. The duplexer separates the two paths, preventing the PA’s powerful output from overloading the LNA’s sensitive input.
Figure 5. The prototype of Smart Glove v1.0 worn during a testing phase of the system.
Figure 6. Initial raw data from the gyroscope, sampled every 40 ms. Smart Glove v1.0 requires sensor calibration using a moving average to correct IMU errors and establish an accurate Cartesian reference system.
Figure 7. Movement angle obtained by multiplying the calibrated gyroscope data by the elapsed time, where the calibrated data are computed by subtracting the moving average from the raw data shown in Figure 6.
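The calibration and integration steps described for Figures 6 and 7 can be sketched as follows. This is an illustrative reconstruction, not the authors’ firmware: the function names (`estimateBias`, `integrateAngle`) are hypothetical, and only the 40 ms sampling interval is taken from the paper.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

const float DT = 0.040f; // 40 ms sampling interval, as in Figure 6

// Estimate the zero-rate bias as the average of samples taken while
// the glove is held still (the "moving average" correction of Figure 6).
float estimateBias(const float* samples, size_t n) {
    float sum = 0.0f;
    for (size_t i = 0; i < n; ++i) sum += samples[i];
    return sum / n;
}

// Integrate bias-corrected angular rates (deg/s) over the sampling
// period to obtain the movement angle (deg), as in Figure 7.
float integrateAngle(const float* rates, size_t n, float bias) {
    float angle = 0.0f;
    for (size_t i = 0; i < n; ++i) angle += (rates[i] - bias) * DT;
    return angle;
}
```

With a constant residual rate of 10 deg/s after bias removal, five 40 ms samples accumulate to an angle of about 2 degrees, matching the rate-times-time relation the caption describes.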
Figure 8. Comparison of unfiltered (blue curve) and filtered (orange curve) data using the complementary filter. The filter merges gyroscope and accelerometer data to suppress noise and vibration, yielding a smoother and more accurate angle estimate during movement for effective drone flight control.
Figure 9. Digital values as a function of the glove pitch angle. Pitch and roll are limited in software to a digital range of 0 to 255, corresponding to ±90°; this clipping suppresses excessive pilot gestures and improves the drone’s stability.
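The software limit of Figure 9 amounts to clamping the angle to ±90° and linearly mapping it onto the 0–255 digital range, mirroring Arduino’s `constrain()`/`map()` idiom. A hypothetical sketch (the function name is illustrative):

```cpp
#include <cassert>

// Clamp the pitch (or roll) angle to +/-90 degrees and map it linearly
// to the 0-255 digital value transmitted to the drone (Figure 9).
int angleToDigital(float angleDeg) {
    if (angleDeg < -90.0f) angleDeg = -90.0f;
    if (angleDeg >  90.0f) angleDeg =  90.0f;
    // -90 deg -> 0, 0 deg -> 128, +90 deg -> 255 (rounded to nearest)
    return (int)((angleDeg + 90.0f) * 255.0f / 180.0f + 0.5f);
}
```

Because the output saturates at 0 and 255, any gesture beyond ±90° produces the same command as ±90°, which is the stability-improving behavior the caption describes.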