Article

How the Small Object Detection via Machine Learning and UAS-Based Remote-Sensing Imagery Can Support the Achievement of SDG2: A Case Study of Vole Burrows

by Haitham Ezzy, Motti Charter, Antonello Bonfante and Anna Brook
1 Spectroscopy & Remote Sensing Laboratory, Center for Spatial Analysis Research (UHCSISR), Department of Geography and Environmental Studies, University of Haifa, Abba Khoushy Ave 199, Haifa 3498838, Israel
2 The Shamir Research Institute, University of Haifa, Katzrin 12900, Israel
3 Department of Geography and Environmental Studies, University of Haifa, Abba Khoushy Ave 199, Haifa 3498838, Israel
4 Institute for Mediterranean Agricultural and Forestry Systems, National Research Council, Via Patacca 85, I-80056 Ercolano, Italy
* Author to whom correspondence should be addressed.
Academic Editor: Joanne N. Halls
Remote Sens. 2021, 13(16), 3191; https://doi.org/10.3390/rs13163191
Received: 19 July 2021 / Revised: 9 August 2021 / Accepted: 10 August 2021 / Published: 12 August 2021
(This article belongs to the Special Issue Monitoring Sustainable Development Goals)
Small mammals, particularly rodents, are common inhabitants of farmlands, where they play key roles in the ecosystem. When overabundant, however, they can become major pests, reducing crop production and farmers’ incomes, with tangible effects on the achievement of United Nations Sustainable Development Goal 2 (SDG2, Zero Hunger). Farmers currently lack a standardized, accurate method for detecting the presence, abundance, and locations of rodents in their fields, and hence lack environmentally efficient rodent-control methods that promote sustainable agriculture and reduce the environmental impacts of cultivation. New developments in unmanned aerial system (UAS) platforms and sensor technology enable cost-effective, simultaneous multimodal data collection at very high spatial resolutions in environmental and agricultural contexts. Object detection from remote-sensing images has been an active research topic over the last decade, and with recent increases in computational resources and data availability, deep learning-based object detection methods are beginning to play an important role in advancing commercial and scientific remote-sensing applications. However, the performance of current detectors on various UAS-based datasets, including multimodal spatial and physical datasets, remains limited for small objects. In particular, the ability to quickly detect small objects across a large observed scene (at field scale) is still an open question. In this paper, we compare one- and two-stage detector models applied to a single UAS-based image and to a UAS-based orthophoto product processed with the Pix4Dmapper photogrammetric program, in order to detect rodent burrows for agricultural and environmental applications that support farmers’ activities toward SDG2. Our results indicate that multimodal data from low-cost UASs, used within a self-training YOLOv3 model, can provide relatively accurate and robust detection of small objects (mAP of 0.86 and an F1-score of 93.39%) and can deliver spatially precise insights for field management, reducing the environmental costs of crop production in line with precision agriculture.
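To make the headline numbers concrete, the snippet below is a minimal Python sketch, not the authors’ pipeline: it shows (a) the common pattern of tiling a large orthophoto into overlapping patches so a one-stage detector such as YOLOv3 can resolve small objects at field scale, and (b) the arithmetic behind an F1-score such as the 93.39% reported above. The tile size, overlap, the `detect` callable, and the TP/FP/FN counts are all hypothetical placeholders introduced here for illustration.

```python
from typing import Callable, List, Tuple

Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in pixels

def tile_offsets(width: int, height: int, tile: int = 640, overlap: int = 64):
    """Yield the top-left corner of overlapping tiles covering the image."""
    step = tile - overlap
    for y0 in range(0, max(height - overlap, 1), step):
        for x0 in range(0, max(width - overlap, 1), step):
            yield x0, y0

def detect_over_tiles(image_size: Tuple[int, int],
                      detect: Callable[[int, int, int], List[Box]],
                      tile: int = 640, overlap: int = 64) -> List[Box]:
    """Run a per-tile detector and shift its boxes back to orthophoto coordinates."""
    width, height = image_size
    boxes: List[Box] = []
    for x0, y0 in tile_offsets(width, height, tile, overlap):
        for x1, y1, x2, y2 in detect(x0, y0, tile):  # boxes in tile coordinates
            boxes.append((x0 + x1, y0 + y1, x0 + x2, y0 + y2))
    return boxes  # a real pipeline would apply non-maximum suppression here

def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 is the harmonic mean of precision and recall: 2PR / (P + R)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Arbitrary counts chosen only to show the arithmetic: 240 burrows found,
# 20 false alarms, and 14 misses gives F1 ≈ 0.9338, i.e., about 93.4%.
print(f"F1 = {f1_score(240, 20, 14):.4f}")
```

Overlapping tiles keep burrows that straddle tile borders detectable; in a full pipeline the merged boxes would be de-duplicated (e.g., by non-maximum suppression) before precision and recall are computed.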
Keywords: small object detection; UAS; YOLOv3; Faster R-CNN; EfficientNet; RetinaNet

MDPI and ACS Style

Ezzy, H.; Charter, M.; Bonfante, A.; Brook, A. How the Small Object Detection via Machine Learning and UAS-Based Remote-Sensing Imagery Can Support the Achievement of SDG2: A Case Study of Vole Burrows. Remote Sens. 2021, 13, 3191. https://doi.org/10.3390/rs13163191

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
