Search Results (5)

Search Parameters:
Keywords = Tello EDU drone

19 pages, 10725 KiB  
Article
Fractional-Order Control Algorithm for Tello EDU Quadrotor Drone Safe Landing during Disturbance on Propeller
by Nurfarah Hanim Binti Rosmadi, Kishore Bingi, P. Arun Mozhi Devan, Reeba Korah, Gaurav Kumar, B Rajanarayan Prusty and Madiah Omar
Drones 2024, 8(10), 566; https://doi.org/10.3390/drones8100566 - 10 Oct 2024
Cited by 2 | Viewed by 2005
Abstract
Quadcopter drones have become increasingly popular because of their versatility and usefulness in various applications, such as surveillance, delivery, and search and rescue operations. Weather conditions and obstacles can undoubtedly pose challenges for drone flights, sometimes causing the loss of one or two propellers. This is a significant challenge as the loss of one or more propellers leads to a sudden loss of control, potentially resulting in a crash, which must be addressed through advanced control strategies. Therefore, this article develops and implements a fractional-order control algorithm to enhance quadrotor drones’ safety and resilience during propeller failure scenarios. The research encompasses the complexities of quadrotor dynamics, fractional-order control theory, and existing methodologies for ensuring safe drone landings. The study emphasizes validation on experimental results, in which four distinct cases were tested using PID and fractional-order PID (FOPID) controllers. These cases involve various simulated failure conditions to assess the performance and adaptability of the developed control algorithms. The results show the proposed FOPID controller’s superior robustness and adaptability compared to traditional PID controllers, offering significant advancements in navigating dynamic environments and managing the disruptive elements introduced during propeller-failure simulations.
(This article belongs to the Special Issue Advances in Perception, Communications, and Control for Drones)
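The abstract above centers on a fractional-order PID (FOPID) controller, i.e. a PID law whose integral and derivative act with non-integer orders λ and μ. As a rough illustration of the idea, and not the authors' implementation, the sketch below discretizes the fractional operators with a truncated Grünwald–Letnikov approximation; the gains, orders, sample time, and memory window are placeholder values.

```python
# Minimal sketch of a discrete fractional-order PID (FOPID) controller using a
# truncated Grünwald–Letnikov approximation. All numeric values are illustrative
# placeholders, not parameters from the paper.

class FOPID:
    def __init__(self, kp, ki, kd, lam, mu, dt, window=200):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.lam, self.mu = lam, mu            # integral order λ, derivative order μ
        self.dt = dt
        self.window = window                   # truncation length of the GL memory
        self.errors = []                       # error history e(t), newest last
        # Precompute Grünwald–Letnikov weights for D^{-λ} (integral) and D^{μ} (derivative)
        self.w_int = self._gl_weights(-lam, window)
        self.w_der = self._gl_weights(mu, window)

    @staticmethod
    def _gl_weights(alpha, n):
        # w_0 = 1, w_j = w_{j-1} * (1 - (alpha + 1) / j)
        w = [1.0]
        for j in range(1, n):
            w.append(w[-1] * (1.0 - (alpha + 1.0) / j))
        return w

    def _gl_operator(self, weights, alpha):
        # D^alpha e(t) ≈ dt^{-alpha} * sum_j w_j * e(t - j*dt), truncated to the window
        hist = self.errors[::-1][: self.window]
        acc = sum(w * e for w, e in zip(weights, hist))
        return acc / (self.dt ** alpha)

    def update(self, error):
        self.errors.append(error)
        p = self.kp * error
        i = self.ki * self._gl_operator(self.w_int, -self.lam)   # fractional integral
        d = self.kd * self._gl_operator(self.w_der, self.mu)     # fractional derivative
        return p + i + d

# Example: altitude correction signal for a simulated landing step
ctrl = FOPID(kp=1.2, ki=0.8, kd=0.3, lam=0.9, mu=0.7, dt=0.02)
cmd = ctrl.update(error=0.5)   # error = desired altitude - measured altitude
```

Setting λ = μ = 1 recovers an ordinary PID, which is what makes the two controller families directly comparable in the four test cases the abstract mentions.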

33 pages, 11948 KiB  
Article
Deep Learning for Indoor Pedestal Fan Blade Inspection: Utilizing Low-Cost Autonomous Drones in an Educational Setting
by Angel A. Rodriguez, Mason Davis, Joshua Zander, Edwin Nazario Dejesus, Mohammad Shekaramiz, Majid Memari and Mohammad A. S. Masoum
Drones 2024, 8(7), 298; https://doi.org/10.3390/drones8070298 - 5 Jul 2024
Cited by 3 | Viewed by 1677
Abstract
This paper introduces a drone-based surrogate project aimed at serving as a preliminary educational platform for undergraduate students in the Electrical and Computer Engineering (ECE) fields. Utilizing small Unmanned Aerial Vehicles (sUAVs), this project serves as a surrogate for the inspection of wind turbines, using scaled-down pedestal fans in place of actual turbines. This approach significantly reduces the costs, risks, and logistical complexities, enabling feasible and safe on-campus experiments. Through this project, students engage in hands-on applications of Python programming, computer vision, and machine learning algorithms to detect and classify simulated defects in pedestal fan blade (PFB) images. The primary educational objectives are to equip students with foundational skills in autonomous systems and data analysis, critical for their progression to larger-scale projects involving professional drones and actual wind turbines in wind farm settings. This surrogate setup not only provides practical experience in a controlled learning environment, but also prepares students for real-world challenges in renewable energy technologies, emphasizing the transition from theoretical knowledge to practical skills.
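The abstract mentions Python, computer vision, and machine-learning classification of simulated defects in pedestal fan blade (PFB) images, but does not spell out a pipeline. The sketch below is a minimal, assumed example of such a classifier in PyTorch; the folder layout `pfb_images/healthy` and `pfb_images/defective`, the two-class labels, and the tiny CNN are illustrative choices, not the authors' model.

```python
# Assumed minimal defect-classification sketch: blade images sorted into
# pfb_images/healthy/*.jpg and pfb_images/defective/*.jpg are classified
# with a small CNN. Architecture and hyperparameters are placeholders.

import torch
import torch.nn as nn
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])

dataset = datasets.ImageFolder("pfb_images", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 128 -> 64
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 64 -> 32
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 2),       # two classes: healthy / defective
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```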

31 pages, 7166 KiB  
Article
Computer Vision-Based Path Planning with Indoor Low-Cost Autonomous Drones: An Educational Surrogate Project for Autonomous Wind Farm Navigation
by Angel A. Rodriguez, Mohammad Shekaramiz and Mohammad A. S. Masoum
Drones 2024, 8(4), 154; https://doi.org/10.3390/drones8040154 - 17 Apr 2024
Cited by 8 | Viewed by 3212
Abstract
The application of computer vision in conjunction with GPS is essential for autonomous wind turbine inspection, particularly when the drone navigates through a wind farm to detect the turbine of interest. Although drones for such inspections use GPS, our study focuses only on the computer vision aspect of navigation, which can be combined with GPS information for better navigation in a wind farm. Here, we employ an affordable, non-GPS-equipped drone within an indoor setting to serve educational needs, enhancing its accessibility. To address navigation without GPS, our solution leverages visual data captured by the drone’s front-facing and bottom-facing cameras. We utilize the Hough transform, object detection, and QR codes to control drone positioning and calibration. This approach facilitates accurate navigation in a traveling salesman experiment, where the drone visits each wind turbine and returns to a designated launching point without relying on GPS. To perform experiments and investigate the performance of the proposed computer vision technique, the DJI Tello EDU drone and pedestal fans are used to represent commercial drones and wind turbines, respectively. Our detailed and timely experiments demonstrate the effectiveness of computer vision-based path planning in guiding the drone through a small-scale surrogate wind farm, ensuring energy-efficient paths, collision avoidance, and real-time adaptability. Although our efforts do not replicate the actual scenario of wind turbine inspection using drone technology, they provide valuable educational contributions for those willing to work in this area and for educational institutions seeking to integrate projects like this into courses such as autonomous systems.
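The abstract names the Hough transform, object detection, and QR codes as the building blocks for GPS-free positioning. The sketch below shows only the QR-code step, using OpenCV's `QRCodeDetector` on a frame from the Tello EDU video stream via the community `djitellopy` wrapper; the wrapper choice and the marker payload are assumptions for illustration, not the paper's exact implementation.

```python
# Assumed sketch: decode a QR waypoint marker from the Tello EDU front camera.

import cv2
from djitellopy import Tello   # community Python wrapper for the Tello SDK

tello = Tello()
tello.connect()
tello.streamon()

detector = cv2.QRCodeDetector()
frame = tello.get_frame_read().frame          # BGR frame from the front camera

payload, points, _ = detector.detectAndDecode(frame)
if payload:
    # e.g. the payload could encode a waypoint label such as "turbine_3"
    print(f"QR marker seen: {payload}, corners: {points.reshape(-1, 2)}")
else:
    print("No QR marker in view")

tello.streamoff()
tello.end()
```

In a traveling-salesman-style run, each decoded payload would identify the turbine just visited, letting the planner tick it off its visit list before heading to the next one.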

33 pages, 33738 KiB  
Article
Real-Time Human Motion Tracking by Tello EDU Drone
by Anuparp Boonsongsrikul and Jirapon Eamsaard
Sensors 2023, 23(2), 897; https://doi.org/10.3390/s23020897 - 12 Jan 2023
Cited by 12 | Viewed by 10623
Abstract
Human movement tracking is useful in a variety of areas, such as search-and-rescue activities. CCTV and IP cameras are popular as front-end sensors for tracking human motion; however, they are stationary and have limited applicability in hard-to-reach places, such as those where disasters have occurred. Using a drone to discover a person is challenging and requires an innovative approach. In this paper, we present the design and implementation of a human motion tracking method using a Tello EDU drone. The design methodology is carried out in four steps: (1) control panel design; (2) human motion tracking algorithm; (3) notification systems; and (4) communication and distance extension. Intensive experimental results show that the drone implemented with the proposed algorithm performs well in tracking a human at a distance of 2–10 m moving at a speed of 2 m/s. In an experimental field of size 95 × 35 m², the drone tracked human motion throughout a whole day, with the best tracking results observed in the morning. The drone was controlled from a laptop via a Wi-Fi router, with a maximum horizontal tracking distance of 84.30 m and a maximum vertical distance of 13.40 m. The experiments showed an accuracy rate for human movement detection of between 96.67% and 100%.
(This article belongs to the Special Issue Image Processing and Analysis for Object Detection)
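As a hedged illustration of the kind of tracking loop the abstract describes (detect a person in the video feed, then steer the drone to keep them centered), the sketch below pairs OpenCV's stock HOG pedestrian detector with a simple proportional yaw command through `djitellopy`. Both choices are assumptions made here for illustration; the paper's own detector, control panel, and notification system are not reproduced.

```python
# Assumed sketch of a person-following loop on a Tello EDU.

import cv2
from djitellopy import Tello

tello = Tello()
tello.connect()
tello.streamon()
tello.takeoff()

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

try:
    for _ in range(300):                       # a short tracking session
        frame = tello.get_frame_read().frame
        frame = cv2.resize(frame, (640, 480))
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        if len(boxes) > 0:
            x, y, w, h = boxes[0]
            # Horizontal offset of the person from the image center drives the yaw rate.
            error_x = (x + w / 2) - 320
            yaw_speed = int(max(-100, min(100, 0.25 * error_x)))
            tello.send_rc_control(0, 0, 0, yaw_speed)
        else:
            tello.send_rc_control(0, 0, 0, 0)  # hover when no person is detected
finally:
    tello.land()
    tello.streamoff()
    tello.end()
```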

11 pages, 5838 KiB  
Technical Note
The Development of a Visual Tracking System for a Drone to Follow an Omnidirectional Mobile Robot
by Jie-Tong Zou and Xiang-Yin Dai
Drones 2022, 6(5), 113; https://doi.org/10.3390/drones6050113 - 29 Apr 2022
Cited by 14 | Viewed by 4701
Abstract
This research aims to develop a visual tracking system that guides a drone to track a mobile robot and accurately land on it when it stops moving. Two LEDs of different colors were installed on the bottom of the drone. The visual tracking system on the mobile robot can detect the heading angle and the distance between the drone and the mobile robot. The heading angle and the flight velocity in the pitch and roll directions of the drone were adjusted by PID control so that the flying speed and angle are more accurate and the drone can land quickly. The PID tuning parameters were also adjusted according to the height of the drone. The embedded system on the mobile robot, which runs Linux Ubuntu and processes images with OpenCV, sends control commands (SDK 2.0) to the Tello EDU drone over Wi-Fi using the UDP protocol. The drone can thus auto-track the mobile robot, and after the mobile robot stops, the drone can land on top of it. The experimental results show that the drone can take off from the top of the mobile robot, visually track the mobile robot, and finally land on top of the mobile robot accurately.
(This article belongs to the Special Issue Advances in UAV Detection, Classification and Tracking)
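The abstract states that control commands reach the Tello EDU over Wi-Fi using UDP and the SDK 2.0 text protocol. The sketch below shows that command path with a plain UDP socket; the drone's default address 192.168.10.1:8889 and the `command`/`takeoff`/`land` strings come from the published Tello SDK, while the fixed sleep is only a stand-in for the paper's visual-tracking and landing logic.

```python
# Minimal sketch of sending Tello SDK 2.0 text commands over UDP.

import socket
import time

TELLO_ADDR = ("192.168.10.1", 8889)          # default Tello AP address and SDK port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 9000))                        # local port for the drone's "ok"/"error" replies
sock.settimeout(5.0)

def send(cmd: str) -> str:
    """Send one SDK command and wait for the drone's text reply."""
    sock.sendto(cmd.encode("ascii"), TELLO_ADDR)
    try:
        reply, _ = sock.recvfrom(1024)
        return reply.decode("ascii", errors="ignore")
    except socket.timeout:
        return "timeout"

print(send("command"))    # enter SDK mode
print(send("takeoff"))
time.sleep(5)             # placeholder for the visual-tracking loop on the mobile robot
print(send("land"))       # land once the mobile robot has stopped
sock.close()
```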
