Article

Real-Time Automated Classification of Sky Conditions Using Deep Learning and Edge Computing

Geosystems Research Institute, Mississippi State University, Mississippi State, MS 39762, USA
* Author to whom correspondence should be addressed.
Academic Editors: Karem Chokmani, Yacine Bouroubi and Saeid Homayouni
Remote Sens. 2021, 13(19), 3859; https://doi.org/10.3390/rs13193859
Received: 11 August 2021 / Revised: 18 September 2021 / Accepted: 23 September 2021 / Published: 27 September 2021
(This article belongs to the Special Issue Applications of Deep Learning in Smart Agriculture)
The radiometric quality of remotely sensed imagery is crucial for precision agriculture applications because estimates of plant health rely on the underlying image quality. Sky conditions, and specifically shadowing from clouds, are critical determinants of the quality of images that can be obtained from low-altitude sensing platforms. In this work, we first compare common deep learning approaches to classify sky conditions with regard to cloud shadows in agricultural fields using a visible-spectrum camera. We then develop an artificial-intelligence-based edge computing system to fully automate the classification process. Training data consisting of 100 oblique-angle images of the sky were provided to a convolutional neural network and two deep residual neural networks (ResNet18 and ResNet34) to facilitate learning two classes, namely (1) good image quality expected, and (2) degraded image quality expected. The expectation of quality stemmed from the sky condition (i.e., density, coverage, and thickness of clouds) present at the time of image capture. These networks were tested using a set of 13,000 images. Our results demonstrated that the ResNet18 and ResNet34 classifiers produced better classification accuracy than the convolutional neural network classifier. The best overall accuracy was obtained by ResNet34, which was 92% accurate, with a Kappa statistic of 0.77. These results demonstrate a low-cost solution to quality control for future autonomous farming systems that will operate without human intervention and supervision.
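The Kappa statistic reported above (Cohen's kappa) measures classifier agreement beyond what chance alone would produce, which is why it is reported alongside raw accuracy for the two-class problem. As a minimal sketch (the confusion-matrix counts below are hypothetical, not the paper's actual results), kappa for a binary classifier can be computed from the 2x2 confusion matrix as follows:

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa from a 2x2 confusion matrix.

    tp/tn: correct predictions for each class;
    fp/fn: misclassifications.
    """
    total = tp + fp + fn + tn
    # Observed agreement: fraction of all predictions that were correct.
    p_observed = (tp + tn) / total
    # Expected chance agreement, from the marginal frequencies
    # of predicted and actual labels for each class.
    p_pos = ((tp + fp) / total) * ((tp + fn) / total)
    p_neg = ((fn + tn) / total) * ((fp + tn) / total)
    p_expected = p_pos + p_neg
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical counts: 85% accuracy yields kappa = 0.7 here,
# because half of the agreement is expected by chance.
print(round(cohens_kappa(tp=40, fp=5, fn=10, tn=45), 3))
```

A kappa of 0.77 with 92% accuracy, as reported for ResNet34, indicates strong agreement even after discounting the chance agreement that an imbalanced good/degraded class split would inflate.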
Keywords: autonomous systems; cloud detection; low-altitude remote sensing; ResNet; UAS image quality
MDPI and ACS Style

Czarnecki, J.M.P.; Samiappan, S.; Zhou, M.; McCraine, C.D.; Wasson, L.L. Real-Time Automated Classification of Sky Conditions Using Deep Learning and Edge Computing. Remote Sens. 2021, 13, 3859. https://doi.org/10.3390/rs13193859

