Article

Usability Comparison between 2D and 3D Control Methods for the Operation of Hovering Objects

Daeseong Lee, Hajun Kim, Heesoo Yoon and Wonsup Lee
1 Swen, Inc., Seoul 06178, Republic of Korea
2 AI Research Center, Neowiz, Inc., Seongnam 13487, Republic of Korea
3 Meissa, Inc., Seoul 06127, Republic of Korea
4 School of Global Entrepreneurship and Information Communication Technology, Handong Global University, Pohang 37554, Republic of Korea
* Author to whom correspondence should be addressed.
Drones 2023, 7(8), 520; https://doi.org/10.3390/drones7080520
Submission received: 15 July 2023 / Revised: 2 August 2023 / Accepted: 4 August 2023 / Published: 8 August 2023

Abstract

This paper experimentally analyzed the cognitive load of users based on different methods of operating hovering objects, such as drones. The traditional gamepad-type control method (2D) was compared with a control method that mapped the movement directions of the drone to the natural manipulation gestures of the user using a Leap Motion device (3D). Twenty participants operated the drone on an obstacle course using the two control methods. The drone’s trajectory was measured using motion-capture equipment with a reflective marker. The distance traveled by the drone, operation time, and trajectory smoothness were calculated and compared between the two control methods. The results showed that when the drone’s movements were mapped to the user’s natural directional gestures, the drone’s 3D movements were perceived as more natural and smoother. A more intuitive drone control method can reduce cognitive load and minimize operational errors, making it more user friendly and efficient. However, due to the users’ lack of familiarity with the Leap Motion, the 3D method resulted in longer travel distances, longer operation times, and lower subjective satisfaction; therefore, a 3D control method improved over the Leap Motion is needed to address these limitations.

1. Introduction

Vertical take-off and landing (VTOL) aircraft, such as helicopters or drones, move by using propulsion devices such as propellers while hovering in the air. Unlike fixed-wing aircraft, which gain propulsion in one direction, VTOL aircraft can move in six directions (up, down, left, right, forward, and backward) or rotate around three axes (longitudinal, lateral, and vertical) while hovering in one place. The rotations of VTOL aircraft about these axes are referred to as roll (rotation about the longitudinal axis), pitch (rotation about the lateral axis), and yaw (rotation about the vertical axis). VTOL aircraft are typically operated through four controls [1,2,3]: (1) moving vertically by controlling the rotation speed of the propellers, (2) moving forward and backward by tilting the body through pitch rotation, (3) moving left and right by tilting the body through roll rotation, and (4) rotating around the vertical axis through yaw rotation. Since VTOL aircraft can hover precisely in the air using their propellers, they are effectively used for military operations, observation and surveillance, firefighting, and so forth [4]. In particular, because VTOL aircraft with multiple propellers can take off and land vertically by precisely controlling the propellers, which enables stable and smooth flight, and can respond quickly to emergency situations through sudden stops or sharp turns, they are expected to serve as a means of transportation in urban air mobility (UAM), the next generation of urban transportation.
The existing methods of controlling VTOL aircraft, which move in four ways (up–down, front–back, and left–right movement, and yaw rotation) in 3D space, involve nonintuitive eye–hand coordination because the mapping between the movement of the aircraft and the interface is unnatural [5,6,7,8]. In a helicopter, the collective control (moved up and down with the left hand) controls the body up and down, the cyclic control (moved front–back and left–right with the right hand) controls the body front–back and left–right, and two rudder pedals operated by the feet control yaw rotation. The movement directions of the helicopter are thus mapped to the pilot’s arms and legs, and the pilot needs good eye–hand coordination to control the helicopter’s movements precisely while moving quickly in the air [9]. The eye–hand coordination problem is more complex for a drone, and one of the main causes is the pilot’s point of view. In a helicopter, the pilot is inside the aircraft, so the direction of the pilot’s control input is the same as the direction of the helicopter’s movement. Drones, however, are controlled by an operator from an external (third-person) perspective. Therefore, if the drone rotates its yaw to change direction, the control direction and the drone’s movement direction become mismatched (e.g., when the drone has rotated 90 degrees clockwise and the operator pushes the joystick upward to move the drone forward, the drone moves to the right, not forward) [10]. Even excluding such perspective-related problems, usability issues related to eye–hand coordination have been reported due to the unnatural mapping between the drone’s movement and the drone-control interface. In general, a drone is controlled using a device with two joysticks: the left joystick, operated with the left hand, moves the drone forward, backward, leftward, and rightward, while the right joystick, operated with the right hand, controls the drone’s altitude (by pushing the joystick up and down) and yaw rotation (by pushing the joystick left and right). Both joysticks are manipulated in the same manner (i.e., two-dimensional (2D) control), but the mapping of the joystick’s upward, downward, leftward, and rightward directions to the drone’s movement differs between the two joysticks. For example, pushing the left joystick to the left makes the drone fly leftward, whereas pushing the right joystick to the left makes the drone rotate counterclockwise. Therefore, due to the lack of an intuitive mapping between the drone’s movement directions and the way the controller is manipulated, cognitive errors can occur in controlling the drone, and a great deal of practice is required to develop eye–hand coordination [2,10,11,12].
Previous studies have mainly focused on interfaces for intuitive drone control, but experimental studies that analyze drone-control performance in terms of the usability of the control interface (i.e., the mapping between the drone’s movements and the interface) are lacking. To address the difficulty of two-handed drone control, more intuitive one-handed controllers have been studied from the perspective of eye–hand coordination [2,12,13], and related products have been introduced in the market (e.g., Shift, this is engineering Inc., Seongnam, Republic of Korea; FT Aviator, Fluidity Technologies Inc., Houston, TX, USA). Previous studies [14,15,16,17,18,19] have proposed methods utilizing hand gestures to verify the assumption that one-handed controllers or hand-gesture-based control can provide intuitive control over the up–down, front–back, and left–right movements of drones in three-dimensional (3D) space. However, few studies have quantitatively compared the usability of these intuitive drone-control methods with that of two-handed controllers.
This study compares the performance of drone control using two different methods by experimentally analyzing the effect of the mapping between the manipulation and the movement of a drone. An interface was implemented that allows the drone to be controlled intuitively by hand gestures using the Leap Motion (Ultraleap Inc., Mountain View, CA, USA), a hand-gesture recognition device. The usability difference between the Leap Motion (3D controller) and a commonly used drone-control device (a gamepad-type device consisting of two joysticks; 2D controller) was compared from the perspective of the drone’s movement trajectory with 20 participants. The drone’s trajectory was measured using a motion analysis system, and the distance traveled, the movement time, and the smoothness of the trajectory were calculated from the trajectory data. Since the mapping between the direction of interface operation and the direction in which the drone moves becomes unnatural in the drone’s yaw rotation [7,8,13], this study analyzed only the remaining three movements (up–down, front–back, and left–right) among the four movements of the drone.

2. Methods

2.1. Participants

The participants in the experiment were 20 university students (age range: 20 to 27), with 10 males and 10 females. To exclude the effect of previous drone piloting experience and proficiency on the experiment, participants who had no prior drone piloting experience were recruited.

2.2. Apparatus

This study used a program developed to control the drone with two controllers. The drone and controllers used for the experiment were a Tello (DJI Technology Co. Ltd., Shenzhen, Guangdong, China; Figure 1a), a DualShock 4 gamepad (Sony Corp., Tokyo, Japan; Figure 1b), and a Leap Motion hand-gesture recognition sensor (Figure 1c), respectively. The drone can hover precisely in place when no control input is given, under windless conditions, using the Vision Positioning System (VPS) developed by the manufacturer. The VPS operates effectively as long as the drone flies at speeds below 5 m/s and at altitudes above 3.5 ft (1 m), avoids inappropriate surfaces (e.g., monochrome, highly reflective, or transparent surfaces, or surfaces with identical repeating textures such as tiles), and has suitable ambient brightness between 10 lux and 100,000 lux without drastic light changes. The drone-control program was developed using Gobot (https://github.com/hybridgroup/gobot/, accessed on 3 August 2023), a Go-language library that provides various functions for sensors and controllers. Using this in-house program, the drone was controlled with different input methods while issuing the same commands through the same algorithm. With the gamepad (2D controller), the drone was moved forward, backward, leftward, and rightward by pushing the left joystick up, down, left, and right, and up and down (altitude) by pushing the right joystick up and down; the drone’s yaw rotation was not implemented. When the hand is released from the joysticks, the drone’s speed decreases and it eventually stops (hovering). The Leap Motion device (3D controller) controls the drone by tracking the absolute position (x, y, and z coordinates) and gesture of the hand within a specific cuboid space (left–right: −200 < x < 200 mm, front–rear: −200 < z < 200 mm, and top–bottom: 150 < y < 350 mm) above the device. If the hand moves forward over the device, the z coordinate of the palm increases, making the drone move forward. The drone can be switched to hovering mode with the gamepad by releasing the hand from the joysticks, or with the Leap Motion device by moving the hand to the center area of the 3D space; in addition, a fist gesture was used to immediately switch the drone to hovering mode anywhere over the device. Other actions such as takeoff and landing were performed via keyboard commands. The maximum flight speed of the drone was limited to 0.5 m/s (6.25% of its maximum speed of 8 m/s), a level suitable for beginners navigating obstacles.
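To make the mapping concrete, the following is a minimal sketch, not the authors’ actual Gobot program, of how a palm position tracked within the cuboid described above could be converted into directional drone commands with a central hovering zone and a fist-to-hover gesture. The Command struct, the function names, and the 50 mm dead-zone half-width are illustrative assumptions; only the cuboid ranges and the fist/hover behavior come from the text.

```go
package main

import "fmt"

// Axis ranges of the tracked cuboid above the Leap Motion device (mm),
// as described in Section 2.2.
const (
	xMin, xMax = -200.0, 200.0 // left–right
	zMin, zMax = -200.0, 200.0 // front–rear
	yMin, yMax = 150.0, 350.0  // top–bottom
)

// deadZone is the half-width of the central hovering region (mm).
// The exact size is not reported in the paper; 50 mm is an assumption.
const deadZone = 50.0

// Command holds normalized velocity components in [-1, 1] for each axis and a
// hover flag. The struct and scaling are illustrative, not the authors' code.
type Command struct {
	Right, Forward, Up float64
	Hover              bool
}

// mapPalmToCommand converts a palm position (x, y, z in mm) and a fist-gesture
// flag into a drone command: a fist, or a palm resting near the center of the
// cuboid, means hover; otherwise the offset from the center is scaled to a
// normalized velocity on each axis.
func mapPalmToCommand(x, y, z float64, fist bool) Command {
	if fist {
		return Command{Hover: true} // fist gesture: hover immediately
	}
	// Offsets from the center of the cuboid.
	dx := x - (xMin+xMax)/2
	dz := z - (zMin+zMax)/2
	dy := y - (yMin+yMax)/2
	if abs(dx) < deadZone && abs(dz) < deadZone && abs(dy) < deadZone {
		return Command{Hover: true} // hand near the center: hover
	}
	return Command{
		Right:   clamp(dx / ((xMax - xMin) / 2)),
		Forward: clamp(dz / ((zMax - zMin) / 2)), // palm forward -> larger z -> fly forward
		Up:      clamp(dy / ((yMax - yMin) / 2)),
	}
}

func abs(v float64) float64 {
	if v < 0 {
		return -v
	}
	return v
}

func clamp(v float64) float64 {
	if v > 1 {
		return 1
	}
	if v < -1 {
		return -1
	}
	return v
}

func main() {
	// Palm held 100 mm forward of center and slightly above mid-height.
	fmt.Printf("%+v\n", mapPalmToCommand(0, 280, 100, false))
	// Fist gesture anywhere in the cuboid.
	fmt.Printf("%+v\n", mapPalmToCommand(150, 200, -120, true))
}
```

In the actual system, such normalized commands would presumably be translated into Tello velocity commands through the Gobot library mentioned above, with the same command logic shared by both input devices.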
The experiment was conducted by setting up an obstacle course with three sections and collecting the drone’s movement trajectory using the OptiTrack motion-capture system (NaturalPoint Inc., Corvallis, OR, USA; Figure 1d), which is composed of 12 cameras sampling at 60 Hz. The obstacle course was arranged in a rectangular formation (Figure 2), and the drone was flown clockwise once to pass through the three sections. The course consisted of the A section, in which the drone passed through rings in a straight line toward the 12 o’clock direction from the participant; the B section, in which the drone flew toward the 3 o’clock direction from the end of the A section and passed through a high obstacle; the C section, in which the drone flew toward the 6 o’clock direction from the end of the B section and avoided obstacles in a zigzag manner; and, lastly, a final section with no obstacles, in which the drone returned to the starting position toward the 9 o’clock direction from the end of the C section. Because yaw rotation was excluded, the drone was flown to the right during the B section and backward during the C section. The 3D position of the drone was tracked using the motion-capture system with a reflective marker attached to the drone; the trajectory refers to the sequence of 3D positions (x, y, and z) of the drone over time.

2.3. Experiment Design

The experiment followed a four-step procedure (S1: introduction, S2: practice, S3: main experiment, and S4: debriefing). First, the purpose and process of the experiment were introduced to each participant, and informed consent was obtained. Second, the participant became familiar with the two drone-control methods for about 30 min; the drone was not flown on the obstacle course during this practice. Third, the main experiment was conducted after the participant reported being familiar with the operation of both controllers; the evaluation order of the two controllers was counterbalanced. Lastly, the participant completed a survey on the subjective usability of each controller and was compensated for their participation.

2.4. Analysis Methods

The drone’s trajectory data measured by the motion-capture system were divided into the three sections (A, B, and C), and each section was analyzed in terms of three performance measures (movement distance, movement time, and trajectory smoothness) and six subjective usability measures (controllability, intuitiveness, accuracy, mapping, learnability, and overall satisfaction) rated on a 7-point Likert scale (1: very dissatisfied, 4: neutral, 7: very satisfied) (Table 1). The smoothness of the movement trajectory was defined as the cumulative angle change in each section divided by the movement distance. In some trials the drone hovered temporarily between sections; these data, which were not related to drone operation, were excluded from the analysis. The statistical differences between the two controllers in the three quantitative performance measures and the six subjective usability scores were analyzed using paired t-tests (α = 0.05).
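As a rough illustration of the objective measures in Table 1, the sketch below computes movement distance, movement time, and a trajectory-smoothness value from a sampled trajectory. It is a simplified example rather than the study’s analysis code: the 60 Hz rate comes from the OptiTrack setup in Section 2.2, the minStep threshold for detecting a moving frame is an assumed value, and the angle change is accumulated over full 3D displacement vectors, whereas the paper reports smoothness separately for each axis.

```go
package main

import (
	"fmt"
	"math"
)

// Point is one motion-capture sample of the drone's 3D position (meters).
type Point struct{ X, Y, Z float64 }

const sampleRate = 60.0 // Hz, per the OptiTrack setup in Section 2.2

func sub(a, b Point) Point   { return Point{a.X - b.X, a.Y - b.Y, a.Z - b.Z} }
func norm(v Point) float64   { return math.Sqrt(v.X*v.X + v.Y*v.Y + v.Z*v.Z) }
func dot(a, b Point) float64 { return a.X*b.X + a.Y*b.Y + a.Z*b.Z }

// movementDistance accumulates the Euclidean distance between consecutive frames.
func movementDistance(traj []Point) float64 {
	d := 0.0
	for i := 1; i < len(traj); i++ {
		d += norm(sub(traj[i], traj[i-1]))
	}
	return d
}

// movementTime counts the frames in which the drone moves (displacement above
// minStep) and converts the count to seconds.
func movementTime(traj []Point, minStep float64) float64 {
	frames := 0
	for i := 1; i < len(traj); i++ {
		if norm(sub(traj[i], traj[i-1])) > minStep {
			frames++
		}
	}
	return float64(frames) / sampleRate
}

// smoothness returns the accumulated angle change (radians) between consecutive
// displacement vectors divided by the distance traveled; larger values indicate
// sharper turns, i.e., a less smooth trajectory.
func smoothness(traj []Point) float64 {
	angle := 0.0
	for i := 2; i < len(traj); i++ {
		v1, v2 := sub(traj[i-1], traj[i-2]), sub(traj[i], traj[i-1])
		if norm(v1) == 0 || norm(v2) == 0 {
			continue // skip hovering frames with no displacement
		}
		c := dot(v1, v2) / (norm(v1) * norm(v2))
		angle += math.Acos(math.Max(-1, math.Min(1, c)))
	}
	return angle / movementDistance(traj)
}

func main() {
	// A short synthetic trajectory for demonstration only.
	traj := []Point{{0, 1, 0}, {0.1, 1, 0}, {0.2, 1, 0.02}, {0.3, 1, 0.08}}
	fmt.Println(movementDistance(traj), movementTime(traj, 0.001), smoothness(traj))
}
```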

3. Results

Figure 3 illustrates the trajectories of the drone measured by the motion-capture system when controlled by the 20 participants using the gamepad (blue lines) and the Leap Motion (red lines). The top view is shown for the A and C sections, and the front view for the B section. In the A section, the drone followed a straighter trajectory with the Leap Motion than with the gamepad.
The average movement distances produced by the gamepad and Leap Motion differed slightly by section (Figure 4a and Table 2). In the A section, the average movement distance produced by the Leap Motion was significantly shorter (mean difference = 0.6 m) compared to that produced by the gamepad (p-value = 0.04). In contrast, the B section showed a significantly longer average movement distance (mean difference = 1.2 m) produced by the Leap Motion (p-value = 0.03). The C section also showed a longer average movement distance (mean difference = 0.3 m) produced by the Leap Motion, though the difference was not significant (p-value = 0.49). As for the average movement distance in a total of three sections, there was no significant difference observed (mean difference = 0.8 m, p-value = 0.35).
The results indicate that the Leap Motion had a longer average movement time compared to the gamepad (Figure 4b). In the A section, the average movement time produced by the Leap Motion was shorter (mean difference = 4.9 s) on average, though the difference was not statistically significant (p-value = 0.16). In contrast, the B and C sections showed a significantly longer movement time on average for the Leap Motion, with mean differences of 12.0 s and 12.4 s, respectively (p-value = 0.02 and p-value < 0.001). The overall average movement time across all sections also showed a longer duration for the Leap Motion compared to the gamepad (mean difference = 19.5 s; p-value = 0.048).
The results for trajectory smoothness, defined as the accumulated angle change divided by the movement distance, showed that the Leap Motion produced different outcomes by section (Figure 5 and Table 2); this measure takes a high value when the drone makes sharp turns. In the A and B sections, the Leap Motion produced a smoother trajectory on average, with a smaller accumulated angle change; significant differences were observed on the X- and Z-axes in the A section and on the Z-axis in the B section. In contrast, in the C section, the Leap Motion showed a less smooth trajectory on average, with significant differences on the Y- and Z-axes compared to the gamepad. Although the overall differences between the two methods were not significant, the average smoothness across all sections indicated a smoother trajectory for the Leap Motion than for the gamepad, except on the Y-axis. This was because two of the twenty participants using the Leap Motion made unnecessary movements along the Y-axis in the C section; they were not treated as outliers, and their data were included in the results.
Lastly, regarding the subjective evaluation, the scores for the gamepad interface were 5% to 29% higher on average than those for the Leap Motion (Figure 6 and Table 2). The differences were statistically significant at α = 0.05 for most of the usability measures (controllability, accuracy, mapping, learnability, and overall satisfaction), except for intuitiveness.

4. Discussion

This study aimed to compare the performance of two interfaces, the 2D (gamepad) and 3D (Leap Motion), in controlling a VTOL aircraft. The results showed that the 3D control method, an intuitive interface for controlling an object in three-dimensional space, was superior in several critical aspects. First, it can reduce the operator’s cognitive fatigue by providing a more natural and ergonomic control experience, resulting in fewer cognitive errors, and it may allow quicker responses than the two-joystick interface. This highlights the ability of an intuitive interface to respond rapidly to changing circumstances, which is particularly important for the safe operation of a VTOL aircraft. Second, the 3D method can improve the smoothness of the drone’s movement, as the drone is operated by following natural human motion. Lastly, the 3D interface can also provide a more immersive experience for the operator, enabling better visualization of the VTOL aircraft’s movements in 3D space. Based on these findings, it can be concluded that an intuitive 3D interface (e.g., Leap Motion) with a clear control–manipulation mapping would be not only a more effective and efficient solution but also a more intuitive and safer one for controlling VTOL aircraft compared to 2D interfaces [6,13].
The results of this study provide insights into the advantages of using a 3D controller, such as the Leap Motion, for controlling VTOL aircraft in 3D space. The Leap Motion interface demonstrated several benefits in terms of mapping and control. By mapping the drone’s movements to the user’s natural directional gestures, it allows for more intuitive and precise control, and this natural mapping resulted in smoother control with smaller angle changes during both straight (A section) and up–down (B section) movements. Previous studies have also noted that mapping the drone’s movement directions to the control inputs allows for intuitive and precise control. Kim and Chang (2022) developed a handheld controller that the user holds and moves in three dimensions to control the drone, and found that it provided more natural control than a gamepad, resulting in fewer failures in drone-control tasks such as collisions [13]. Di Vincenzo et al. (2022) argued that using hand gestures enables faster and more accurate control of the drone in terms of eye–hand coordination [20]. A 3D controller that effectively maps the direction of manipulation to the movement of the drone enhances the overall user experience and facilitates more accurate control, which is particularly important in scenarios requiring precise control and smooth movements. These advantages highlight the 3D controller’s potential for enhancing the control capabilities of VTOL aircraft and opening up new possibilities for applications in various industries and domains.
The use of the Leap Motion for controlling a drone also presents certain limitations compared to the gamepad. One limitation is the slower speed of control with the Leap Motion, which can be attributed to the users’ unfamiliarity with, and difficulty in, manipulating the drone’s movements using hand gestures within a specific range in the air. This resulted in lower subjective usability scores, indicating challenges in user adaptation and proficiency with the Leap Motion interface. In particular, there was a discrepancy between the users’ subjective opinions of the control method and the actual drone-control results regarding intuitiveness: the Leap Motion produced better trajectory smoothness, suggesting more intuitive control, yet it was perceived as less intuitive in the subjective ratings. Moreover, the performance of the Leap Motion was hindered in the zigzag movements (C section) due to users’ lack of proficiency, leading to suboptimal control characterized by sharp turns and unnecessary up–down movements. Participants also reported that the Leap Motion was not easy to learn and use, especially for novice users, as it lacks physical feedback mechanisms [6,13,20]. These limitations highlight the need for further exploration and development of effective, user-friendly interfaces to enhance the control of VTOL aircraft. It is crucial to optimize the speed, accuracy, and user familiarity of gesture-based drone control, thereby improving overall performance and usability. To address these limitations, future research can explore physical devices that provide intuitive control of the drone in 3D space by incorporating handheld or manipulative actions in six directions. Consideration of different types of hand gestures is also necessary. In this study, the drone was controlled with the Leap Motion by extending the palm and moving it in six directions; however, the limited space for arm movement above the Leap Motion sometimes caused the hand to go beyond the sensing range, leading to control errors. Previous studies have explored alternative methods for drone control, such as wrist rotations (e.g., abduction, adduction, flexion, extension, and rotation) or specific hand shapes (e.g., clenched fist, thumb-up, pointing, and V gesture), which do not require extensive arm movement [3,17].
This study is limited to analyzing the influence of eye–hand coordination (specifically, mapping) on the control of hovering objects by novice users. It does not compare a wide range of controllers and does not analyze hovering objects other than drones; thus, the results should not be hastily generalized. Furthermore, advanced control laws used in real-world drone operations were not considered. By focusing on simply moving the drone in the vertical, horizontal, and lateral directions, this study aimed to analyze the impact of eye–hand coordination on the smoothness and precision of hovering-object control; therefore, factors other than the control of the drone’s movement directions were fixed in the experiment to eliminate unexpected influences on the results. This study also aimed to analyze the effects of the control methods on users without prior drone-operating experience. Lastly, while the focus on participants with no previous drone-control experience avoided bias from prior experience, participants who are familiar with various drone-control methods may show different results.
Future research should address the limitations of this study and further explore the best interfaces for controlling VTOL aircraft. One potential area of investigation is yaw rotation, which was excluded from this study. A possible solution could be to include yaw rotation in the mapping of the interface and assess how it impacts the performance and experience of the operator. This could be achieved by combining hand gestures with body movements, such as rotating the head or body toward the direction of the drone’s heading. Alternatively, a VR device could be used to offer users a first-person perspective, and the motion of wrist abduction/adduction could be naturally mapped to yaw rotation for controlling the drone [3,6,21]. Additionally, the comparison between interfaces could be extended to more devices or methods, such as touchscreen [22], hand gesture [3,16,17,23,24], body gesture [5], electromyography [21], gaze [20,25,26], brain-signal [1], and speech [16,24] interfaces. This would provide a more comprehensive understanding of the strengths and weaknesses of different interfaces for controlling VTOL aircraft. Moreover, as previous studies emphasize the importance of training for safe drone operation [27,28], future research needs to investigate not only eye–hand coordination but also learnability.
The findings of this study have important implications for the design and development of intuitive interfaces for controlling various hovering objects. The superiority of the Leap Motion in terms of reducing cognitive fatigue, providing better movement performance, quicker response in emergency situations, and a more immersive experience suggests that an intuitive interface could be useful in various domains. These include the control of VTOL aircraft such as drones and helicopters, underwater vehicles, construction robots, and robot arms in manufacturing settings, as well as dangerous tasks such as bomb disposal and spacecraft operation. The precise mapping of the interface to the object’s movement is critical for the successful handling of such tasks and requires further exploration.

5. Conclusions

This study compared the use of 2D and 3D methods for controlling a drone, a type of vertical take-off and landing (VTOL) aircraft. The results showed that the 3D controller provided smoother and more intuitive control than the 2D controller. The intuitive hand gestures used with the 3D controller allowed for natural movements of the VTOL aircraft in 3D space; therefore, the 3D controller has the potential to enhance the user experience and improve the overall control of VTOL aircraft. It is important to note that further research on 3D control methods is needed to overcome the limitations of the Leap Motion device and provide a more user-friendly experience. Future research should optimize the ease of control and feedback mechanisms to enhance user proficiency and achieve better control of VTOL aircraft. Overall, the findings of this study highlight the importance of exploring and developing innovative interfaces for controlling VTOL aircraft to achieve smoother, more precise, and more satisfying flight performance.

Author Contributions

Conceptualization, H.K.; methodology, all coauthors; software, H.K.; formal analysis, D.L. and H.Y.; investigation, H.K. and H.Y.; data curation, D.L.; writing—original draft preparation, D.L., H.K. and H.Y.; writing—review and editing, W.L.; visualization, D.L.; supervision, W.L.; project administration, W.L.; funding acquisition, W.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was jointly supported by Handong Global University Research Grants (No. 202300900001) and the National Research Foundation of Korea (NRF) grants funded by the Korean Government (2020R1F1A1050076).

Data Availability Statement

The dataset containing the performance (e.g., movement distance, movement time, trajectory smoothness) and subjective measures of 20 participants for two drone control interfaces can be found at https://doi.org/10.6084/m9.figshare.23822760 (accessed on 6 June 2023).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Kim, S.; Lee, S.; Kang, H.; Kim, S.; Ahn, M. P300 Brain–Computer Interface-Based Drone Control in Virtual and Augmented Reality. Sensors 2021, 21, 5765.
2. Seok, J.; Han, J.; Baek, J.; Chang, W.; Kim, H. Development of an intuitive motion-based drone controller. J. Korea Soc. Die Mold Eng. 2017, 11, 41–45.
3. Konstantoudakis, K.; Christaki, K.; Tsiakmakis, D.; Sainidis, D.; Albanis, G.; Dimou, A.; Daras, P. Drone Control in AR: An Intuitive System for Single-Handed Gesture Control, Drone Tracking, and Contextualized Camera Feed Visualization in Augmented Reality. Drones 2022, 6, 43.
4. Park, N.; Ahn, Y.; Hwang, Y. A study on the development of a remote control drone for disaster response. J. Soc. Disaster Inf. 2019, 15, 578–589.
5. Sanna, A.; Lamberti, F.; Paravati, G.; Manuri, F. A Kinect-based natural interface for quadrotor control. Entertain. Comput. 2013, 4, 179–186.
6. Tezza, D.; Andujar, M. The State-of-the-Art of Human–Drone Interaction: A Survey. IEEE Access 2019, 7, 167438–167454.
7. Aretz, A.J.; Wickens, C.D. The Mental Rotation of Map Displays. Hum. Perform. 2009, 5, 303–328.
8. Gugerty, L.; Brooks, J. Reference-Frame Misalignment and Cardinal Direction Judgments: Group Differences and Strategies. J. Exp. Psychol. Appl. 2004, 10, 75–88.
9. Park, C.; Kim, S.; Tak, H.; Shin, S.; Choi, Y. The correlation between flight training factors in helicopter pilot training course and learning achievement. J. Korean Soc. Aviat. Aeronaut. 2019, 27, 45–53.
10. Williams, K.W. Human Factors Implications of Unmanned Aircraft Accidents: Flight-Control Problems; DOT/FAA/AM-06/8; Federal Aviation Administration: Washington, DC, USA, 2006.
11. Kim, H.; Lim, S.; Lee, W. Comparative analysis of controlling drone. In Proceedings of the 10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019), Washington, DC, USA, 24–28 July 2019.
12. Jeon, J.; Yu, S.; Cho, K. The embodiment of the drone controller: Single-handed drone controller to fly a drone safely and freely. In Proceedings of the Human-Computer Interaction (HCI) Korea 2017, Jeongseon, Republic of Korea, 8–10 February 2017.
13. Kim, H.; Chang, W. Intuitive Drone Control using Motion Matching between a Controller and a Drone. Arch. Des. Res. 2022, 35, 93–113.
14. Zhao, Z.; Luo, H.; Song, G.-H.; Chen, Z.; Lu, Z.-M.; Wu, X. Web-based interactive drone control using hand gesture. Rev. Sci. Instrum. 2018, 89, 014707.
15. Smeragliuolo, A.H.; Hill, N.J.; Disla, L.; Putrino, D. Validation of the Leap Motion Controller using markered motion capture technology. J. Biomech. 2016, 49, 1742–1750.
16. Fernández, R.A.S.; Sanchez-Lopez, J.L.; Sampedro, C.; Bavle, H.; Molina, M.; Campoy, P. Natural user interfaces for human-drone multi-modal interaction. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016.
17. Yoo, M.; Na, Y.; Song, H.; Kim, G.; Yun, J.; Kim, S.; Moon, C.; Jo, K. Motion Estimation and Hand Gesture Recognition-Based Human–UAV Interaction Approach in Real Time. Sensors 2022, 22, 2513.
18. Budiyanto, A.; Ramadhan, M.I.; Burhanudin, I.; Triharminto, H.H.; Santoso, B. Navigation control of drone using hand gesture based on complementary filter algorithm. In Proceedings of the 5th International Conference on Advanced Material for Better Future (ICAMBF 2020), Surakarta, Indonesia, 13–14 October 2020.
19. Mutalib, M.K.Z.B.A.; Mohd, M.N.H.; Tomari, M.R.B.M.; Sari, S.B.; Ambar, R.B. Flying Drone Controller by Hand Gesture Using Leap Motion. Int. J. Adv. Trends Comput. Sci. Eng. 2020, 9, 111–116.
20. Di Vincenzo, M.; Palini, F.; De Marsico, M.; Borghi, A.M.; Baldassarre, G. A Natural Human-Drone Embodied Interface: Empirical Comparison With a Traditional Interface. Front. Neurorobotics 2022, 16, 898859.
21. Miehlbradt, J.; Cherpillod, A.; Mintchev, S.; Coscia, M.; Artoni, F.; Floreano, D.; Micera, S. Data-driven body–machine interface for the accurate control of drones. Proc. Natl. Acad. Sci. USA 2018, 115, 7913–7918.
22. De Marsico, M.; Spagnoli, A. Using hands as an easy UAV joystick for entertainment applications. In Proceedings of the 13th Biannual Conference of the Italian SIGCHI Chapter: Designing the Next Interaction, Padova, Italy, 23–25 September 2019; pp. 1–9.
23. Bachmann, D.; Weichert, F.; Rinkenauer, G. Review of Three-Dimensional Human-Computer Interaction with Focus on the Leap Motion Controller. Sensors 2018, 18, 2194.
24. Xiang, X.; Tan, Q.; Zhou, H.; Tang, D.; Lai, J. Multimodal Fusion of Voice and Gesture Data for UAV Control. Drones 2022, 6, 201.
25. Yuan, L.; Reardon, C.; Warnell, G.; Loianno, G. Human Gaze-Driven Spatial Tasking of an Autonomous MAV. IEEE Robot. Autom. Lett. 2019, 4, 1343–1350.
26. Zhou, T.; Li, B.; Liu, Y.; Chen, S.; Wang, Y. Gaze-assisted remote control for quad-rotor UAV. In Proceedings of the 2021 International Conference on High Performance Computing and Communications (HPCCE 2021), Guangzhou, China, 3–5 December 2021.
27. Covaciu, F.; Iordan, A.-E. Control of a Drone in Virtual Reality Using MEMS Sensor Technology and Machine Learning. Micromachines 2022, 13, 521.
28. Koç, D.; Seçkin, A.Ç.; Satı, Z.E. Evaluation of Participant Success in Gamified Drone Training Simulator Using Brain Signals and Key Logs. Brain Sci. 2021, 11, 1024.
Figure 1. Apparatus of this study: (a) Drone (DJI’s Tello), (b) Gamepad (Sony DualShock 4), (c) 3D controller (Leap Motion), and (d) setup of the experiment place with a 12-camera motion-capture system (OptiTrack).
Figure 2. The obstacle course consists of three sections.
Figure 3. Movement trajectory of the drone controlled by the gamepad (blue) and the Leap Motion device (red), shown with obstacles (dotted lines). In the B section, the Leap Motion exhibited a smoother trajectory than the gamepad; in the C section, there was no visually distinct difference between the Leap Motion and the gamepad.
Figure 4. Performance analysis results in (a) movement distance and (b) movement time (* p < 0.05).
Figure 5. Performance analysis results in trajectory smoothness (* p < 0.05).
Figure 6. Subjective usability analysis results (* p < 0.05).
Table 1. Performance and usability measures for drone control.

Objective measures:
- Movement distance: the accumulation of the Euclidean distances between the positions of the drone in consecutive frames.
- Movement time: the cumulative duration of the drone’s movement, calculated from the number of frames in which the drone moves.
- Trajectory smoothness: the level of smoothness of the drone’s movement, determined by the accumulated angle change divided by the distance traveled in each axis.

Subjective measures:
- Controllability: the degree to which the drone can be controlled and directed by the operator in a precise manner.
- Intuitiveness: the degree to which the operation of the drone is understandable to the operator while it is controlled.
- Accuracy: the degree to which the drone’s movement matches the intended movement, as specified by the operator.
- Mapping: the degree to which the relationship between the operator’s inputs and the drone’s resulting movements is established and understood.
- Learnability: the degree to which the operator can acquire the skills necessary to control the drone effectively through experience or training.
- Overall satisfaction: the degree to which the operator is overall satisfied with the drone operation.
Table 2. Performance analysis results by measure (mean difference and p-value).

Objective measures (by section):
- Movement distance: A: 0.6, p = 0.04; B: −1.2, p = 0.03; C: −0.3, p = 0.49
- Movement time: A: 4.9, p = 0.16; B: −12.0, p = 0.02; C: −12.4, p < 0.001
- Trajectory smoothness (X-axis): A: 194.8, p = 0.02; B: 97.7, p = 0.14; C: −87.5, p = 0.03
- Trajectory smoothness (Y-axis): A: 178.8, p = 0.13; B: 83.4, p = 0.13; C: −264.6, p = 0.08
- Trajectory smoothness (Z-axis): A: 167.5, p = 0.04; B: 81.4, p = 0.03; C: −76.3, p = 0.01

Subjective measures:
- Controllability: 1.4, p < 0.001
- Intuitiveness: 0.3, p = 0.28
- Accuracy: 1.0, p = 0.03
- Mapping: 1.4, p < 0.001
- Learnability: 0.9, p = 0.02
- Overall satisfaction: 1.0, p = 0.01
