Open Access Article

Autonomous, Onboard Vision-Based Trash and Litter Detection in Low Altitude Aerial Images Collected by an Unmanned Aerial Vehicle

Faculty of Control, Robotics and Electrical Engineering, Institute of Robotics and Machine Intelligence, Poznań University of Technology, 60-965 Poznań, Poland
* Author to whom correspondence should be addressed.
Academic Editor: Bahram Salehi
Remote Sens. 2021, 13(5), 965; https://doi.org/10.3390/rs13050965
Received: 29 January 2021 / Revised: 24 February 2021 / Accepted: 26 February 2021 / Published: 4 March 2021
Public littering and discarded trash remain, despite efforts to limit them, a serious ecological, aesthetic, and social problem. The problematic waste is usually localised and picked up by designated personnel, which is a tiresome, time-consuming task. This paper proposes a low-cost solution that enables the localisation of trash and litter objects in low-altitude imagery collected by an unmanned aerial vehicle (UAV) during an autonomous patrol mission. The objects of interest are detected in the acquired images and placed on a global map using a set of onboard sensors commonly found in typical UAV autopilots. The core object detection algorithm is based on deep convolutional neural networks. Since the task is domain-specific, a dedicated dataset of images containing objects of interest was collected and annotated. The dataset is made publicly available and is described in the paper. It was used to evaluate a range of embedded devices capable of running deep neural network inference onboard the UAV. Measured detection accuracy and processing speed are reported, and recommendations for the neural network model and hardware platform are given based on the obtained values. The complete system can be assembled from inexpensive, off-the-shelf components and performs autonomous localisation of discarded trash, relieving human personnel of this burdensome task and enabling automated pickup planning.
Keywords: deep learning; object detection; image processing; trash; litter; UAV; YOLO
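The abstract describes detecting litter in low-altitude aerial frames and placing the detections on a global map using onboard sensor data. As a rough illustration of that geolocation step only (the detector itself is omitted), the Python sketch below projects a detected bounding-box centre onto geographic coordinates under strong simplifying assumptions: a nadir-pointing camera, flat ground, a known field of view, and UAV yaw ignored. This is not the authors' implementation; all function names, parameters, and numeric values are hypothetical.

```python
# Minimal, illustrative sketch (not from the paper) of projecting a detection
# in a nadir-pointing UAV image onto a global map using the aircraft's GNSS
# position and altitude. Flat ground is assumed and UAV yaw is ignored
# (the camera's up axis is taken as north); all values are hypothetical.
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius


def pixel_to_ground_offset(u, v, image_w, image_h, altitude_m, hfov_deg, vfov_deg):
    """Convert a pixel (u, v) to metric offsets (east, north) from the point
    directly below the UAV, assuming a nadir-pointing camera over flat ground."""
    # Ground footprint of the image at the current altitude above ground.
    footprint_w = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    footprint_h = 2.0 * altitude_m * math.tan(math.radians(vfov_deg) / 2.0)
    # Normalised offsets from the image centre, in the range [-0.5, 0.5].
    dx = (u / image_w) - 0.5
    dy = 0.5 - (v / image_h)  # image y grows downwards, north grows upwards
    return dx * footprint_w, dy * footprint_h


def offset_to_latlon(lat_deg, lon_deg, east_m, north_m):
    """Shift a geographic coordinate by small metric offsets (equirectangular
    approximation, adequate at the few-metre scale of a single image)."""
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon


if __name__ == "__main__":
    # Hypothetical detection: bounding-box centre of a litter object in a frame.
    east, north = pixel_to_ground_offset(u=1520, v=430, image_w=1920, image_h=1080,
                                         altitude_m=15.0, hfov_deg=69.0, vfov_deg=42.0)
    lat, lon = offset_to_latlon(52.4064, 16.9252, east, north)  # example coordinates
    print(f"Estimated litter position: {lat:.6f}, {lon:.6f}")
```

In a complete onboard pipeline of the kind the abstract outlines, the pixel coordinates would come from the neural network detector, and the pose inputs (position, altitude, attitude) from the autopilot's sensors; a proper implementation would also account for camera tilt and yaw rather than assuming a perfectly nadir view.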
MDPI and ACS Style

Kraft, M.; Piechocki, M.; Ptak, B.; Walas, K. Autonomous, Onboard Vision-Based Trash and Litter Detection in Low Altitude Aerial Images Collected by an Unmanned Aerial Vehicle. Remote Sens. 2021, 13, 965. https://doi.org/10.3390/rs13050965

AMA Style

Kraft M, Piechocki M, Ptak B, Walas K. Autonomous, Onboard Vision-Based Trash and Litter Detection in Low Altitude Aerial Images Collected by an Unmanned Aerial Vehicle. Remote Sensing. 2021; 13(5):965. https://doi.org/10.3390/rs13050965

Chicago/Turabian Style

Kraft, Marek, Mateusz Piechocki, Bartosz Ptak, and Krzysztof Walas. 2021. "Autonomous, Onboard Vision-Based Trash and Litter Detection in Low Altitude Aerial Images Collected by an Unmanned Aerial Vehicle" Remote Sensing 13, no. 5: 965. https://doi.org/10.3390/rs13050965
