Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area and Plant Count Collection
2.2. UAS Data Collection
2.3. Geospatial Object Detection Framework for Plant Counting
2.3.1. UAS Data Pre-Processing Using Structure from Motion (SfM)
2.3.2. YOLOv3-Based Plant Detection
2.3.3. Fine-Tuning of Object Center
2.3.4. Image Coordinates to Ground Surface Projection
3. Results and Discussion
3.1. Quality of Input Data
3.2. Object Detection Accuracy Using YOLOv3
3.3. Canopy and Non-Canopy Classification
3.4. Determination of Optimal Parameters for Screening and Clustering Plant Centers
3.5. Accuracy of Plant Counting
3.6. Processing Time
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Reddy, K.R.; Reddy, V.R.; Hodges, H.F. Temperature effects on early season cotton growth and development. Agron. J. 1992, 84, 229–237.
- Reddy, K.R.; Brand, D.; Wijewardana, C.; Gao, W. Temperature effects on cotton seedling emergence, growth, and development. Agron. J. 2017, 109, 1379–1387.
- Briddon, R.W.; Markham, P.G. Cotton leaf curl virus disease. Virus Res. 2000, 71, 151–159.
- Wheeler, T.R.; Craufurd, P.Q.; Ellis, R.H.; Porter, J.R.; Vara Prasad, P.V. Temperature variability and the yield of annual crops. Agric. Ecosyst. Environ. 2000, 82, 159–167.
- Hopper, N.; Supak, J.; Kaufman, H. Evaluation of several fungicides on seedling emergence and stand establishment of Texas high plains cotton. In Proceedings of the Beltwide Cotton Production Research Conference, New Orleans, LA, USA, 5–8 January 1988.
- Wrather, J.A.; Phipps, B.J.; Stevens, W.E.; Phillips, A.S.; Vories, E.D. Cotton planting date and plant population effects on yield and fiber quality in the Mississippi Delta. J. Cotton Sci. 2008, 12, 1–7.
- UC IPM Pest Management Guidelines: Cotton. Available online: http://ipm.ucanr.edu/PDF/PMG/pmgcotton.pdf (accessed on 3 July 2020).
- Hunt, E.R.; Daughtry, C.S.T. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376.
- Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237.
- Roth, L.; Streit, B. Predicting cover crop biomass by lightweight UAS-based RGB and NIR photography: An applied photogrammetric approach. Precis. Agric. 2018, 19, 93–114.
- Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58.
- Berni, J.A.J.; Zarco-Tejada, P.J.; Suárez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738.
- Duan, T.; Zheng, B.; Guo, W.; Ninomiya, S.; Guo, Y.; Chapman, S.C. Comparison of ground cover estimates from experiment plots in cotton, sorghum and sugarcane based on images and ortho-mosaics captured by UAV. Funct. Plant Biol. 2017, 44, 169–183.
- Jung, J.; Maeda, M.; Chang, A.; Landivar, J.; Yeom, J.; McGinty, J. Unmanned aerial system assisted framework for the selection of high yielding cotton genotypes. Comput. Electron. Agric. 2018, 152, 74–81.
- Ashapure, A.; Jung, J.; Yeom, J.; Chang, A.; Maeda, M.; Maeda, A.; Landivar, J. A novel framework to detect conventional tillage and no-tillage cropping system effect on cotton growth and development using multi-temporal UAS data. ISPRS J. Photogramm. Remote Sens. 2019, 152, 49–64.
- Chen, A.; Orlov-Levin, V.; Meron, M. Applying high-resolution visible-channel aerial imaging of crop canopy to precision irrigation management. Agric. Water Manag. 2019, 216, 196–205.
- Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Zhang, L.; Wen, S.; Jiang, Y.; Suo, G.; Chen, P. A two-stage classification approach for the detection of spider mite-infested cotton using UAV multispectral imagery. Remote Sens. Lett. 2018, 9, 933–941.
- Wang, T.; Alex Thomasson, J.; Yang, C.; Isakeit, T. Field-region and plant-level classification of cotton root rot based on UAV remote sensing. In Proceedings of the 2019 ASABE Annual International Meeting, Boston, MA, USA, 7–10 July 2019.
- Yeom, J.; Jung, J.; Chang, A.; Maeda, M.; Landivar, J. Automated open cotton boll detection for yield estimation using unmanned aircraft vehicle (UAV) data. Remote Sens. 2018, 10, 1895.
- Ehsani, R.; Wulfsohn, D.; Das, J.; Lagos, I.Z. Yield estimation: A low-hanging fruit for application of small UAS. Resour. Eng. Technol. Sustain. World 2016, 23, 16–18.
- Chen, R.; Chu, T.; Landivar, J.A.; Yang, C.; Maeda, M.M. Monitoring cotton (Gossypium hirsutum L.) germination using ultrahigh-resolution UAS images. Precis. Agric. 2018, 19, 161–177.
- Feng, A.; Sudduth, K.A.; Vories, E.D.; Zhou, J. Evaluation of cotton stand count using UAV-based hyperspectral imagery. In Proceedings of the 2019 ASABE Annual International Meeting, Boston, MA, USA, 7–10 July 2019.
- Jin, X.; Liu, S.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114.
- Gnädinger, F.; Schmidhalter, U. Digital counts of maize plants by unmanned aerial vehicles (UAVs). Remote Sens. 2017, 9, 544.
- Liu, T.; Wu, W.; Chen, W.; Sun, C.; Zhu, X.; Guo, W. Automated image-processing for counting seedlings in a wheat field. Precis. Agric. 2016, 17, 392–406.
- Kalantar, B.; Mansor, S.B.; Shafri, H.Z.M.; Halin, A.A. Integration of template matching and object-based image analysis for semi-automatic oil palm tree counting in UAV images. In Proceedings of the 37th Asian Conference on Remote Sensing, ACRS 2016, Colombo, Sri Lanka, 17–21 October 2016.
- Salamí, E.; Gallardo, A.; Skorobogatov, G.; Barrado, C. On-the-Fly Olive Tree Counting Using a UAS and Cloud Services. Remote Sens. 2019, 11, 316.
- Gu, J.; Grybas, H.; Congalton, R.G. Individual Tree Crown Delineation from UAS Imagery Based on Region Growing and Growth Space Considerations. Remote Sens. 2020, 12, 2363.
- De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285.
- Wetz, M.S.; Hayes, K.C.; Fisher, K.V.B.; Price, L.; Sterba-Boatwright, B. Water quality dynamics in an urbanizing subtropical estuary (Oso Bay, Texas). Mar. Pollut. Bull. 2016, 104, 44–53.
- Schonberger, J.L.; Frahm, J.M. Structure-from-Motion Revisited. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016.
- Westoby, M.J.; Brasington, J.; Glasser, N.F.; Hambrey, M.J.; Reynolds, J.M. “Structure-from-Motion” photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 2012, 179, 300–314.
- Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
- Hess, M.; Green, S. Structure from motion. In Digital Techniques for Documenting and Preserving Cultural Heritage; Bentkowska-Kafel, A., MacDonald, L., Eds.; Arc Humanities Press: York, UK, 2017; pp. 243–246.
- Furukawa, Y.; Ponce, J. Accurate, dense, and robust multiview stereopsis. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 1362–1376.
- Haala, N.; Rothermel, M. Dense multi-stereo matching for high quality digital elevation models. Photogramm. Fernerkund. Geoinf. 2012, 2012, 331–343.
- YOLOv3: An Incremental Improvement. Available online: https://pjreddie.com/media/files/papers/YOLOv3.pdf (accessed on 3 July 2020).
- Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017, Honolulu, HI, USA, 21–26 July 2017.
- Tong, K.; Wu, Y.; Zhou, F. Recent advances in small object detection based on deep learning: A review. Image Vis. Comput. 2020, 97, 103910.
- Torralba, A.; Fergus, R.; Freeman, W.T. 80 million tiny images: A large data set for nonparametric object and scene recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 1958–1970.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
- Varoquaux, G.; Buitinck, L.; Louppe, G.; Grisel, O.; Pedregosa, F.; Mueller, A. Scikit-learn. GetMobile Mob. Comput. Commun. 2015, 19.
- Patrignani, A.; Ochsner, T.E. Canopeo: A powerful new tool for measuring fractional green canopy cover. Agron. J. 2015, 107, 2312–2320.
- Chung, Y.S.; Choi, S.C.; Silva, R.R.; Kang, J.W.; Eom, J.H.; Kim, C. Case study: Estimation of sorghum biomass using digital image analysis with Canopeo. Biomass Bioenerg. 2017, 105, 207–210.
- Di Stefano, L.; Bulgarelli, A. A simple and efficient connected components labeling algorithm. In Proceedings of the 10th International Conference on Image Analysis and Processing, Venice, Italy, 27–29 September 1999.
- Image Processing Review, Neighbors, Connected Components, and Distance. Available online: http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/MORSE/connectivity.pdf (accessed on 3 July 2020).
- Enciso, J.M.; Colaizzi, P.D.; Multer, W.L. Economic analysis of subsurface drip irrigation lateral spacing and installation depth for cotton. Trans. Am. Soc. Agric. Eng. 2005, 48, 197–204.
- Khan, N.; Usman, K.; Yazdan, F.; Din, S.U.; Gull, S.; Khan, S. Impact of tillage and intra-row spacing on cotton yield and quality in wheat–cotton system. Arch. Agron. Soil Sci. 2015, 61, 581–597.
- Yazgi, A.; Degirmencioglu, A. Optimisation of the seed spacing uniformity performance of a vacuum-type precision seeder using response surface methodology. Biosyst. Eng. 2007, 97, 347–356.
- Nichols, S.P.; Snipes, C.E.; Jones, M.A. Cotton growth, lint yield, and fiber quality as affected by row spacing and cultivar. J. Cotton Sci. 2004, 8, 1–12.
- Murtagh, F.; Legendre, P. Ward’s Hierarchical Agglomerative Clustering Method: Which Algorithms Implement Ward’s Criterion? J. Classif. 2014, 31, 274–295.
- Bouguettaya, A.; Yu, Q.; Liu, X.; Zhou, X.; Song, A. Efficient agglomerative hierarchical clustering. Expert Syst. Appl. 2015, 42, 2785–2797.
Time and Date of UAS Data Collection | Altitude above Ground (m) | Side/Front Overlap (%) | Physical Seedling/Plant Separation | Ground Wetness, Brightness |
---|---|---|---|---|
14:02, 15 April 2019 | 10 | 75 | Separable | Dry, bright |
15:34, 16 April 2019 | 10 | 75 | Separable | Dry, bright |
16:52, 18 April 2019 | 10 | 75 | Generally separable | Dry/moist, darker |
14:52, 24 April 2019 | 10 | 75 | Hard to separate | Dry, bright |
YOLOv3 Model | UAS Data Used for Training and Testing (See Table 1) | No. of Training Images | No. of Testing Images |
---|---|---|---|
Model 1 | April 15 | 200 | 100 |
Model 2 | April 16 | 200 | 100 |
Model 3 | April 18 | 200 | 100 |
Model 4 | April 24 | 200 | 100 |
Model 5 | April 15, 16 | 400 | 200 |
Model 6 | April 15, 16, 18 | 600 | 300 |
Model 7 | April 15, 16, 18, 24 | 800 | 400 |
Model 8 | April 16, 18 | 400 | 200 |
Model 9 | April 16, 18, 24 | 600 | 300 |
Model 10 | April 18, 24 | 400 | 200 |
YOLOv3 Model | mAP * (%), Own-Date Testing Data | F1 Score, Own-Date Testing Data | mAP (%), 15 April Testing Data | F1 Score, 15 April Testing Data | mAP (%), 18 April Testing Data | F1 Score, 18 April Testing Data
---|---|---|---|---|---|---
Model 1 | 88.16 ** | 0.85 | 88.16 ** | 0.85 | 50.38 | 0.48 |
Model 2 | 86.91 ** | 0.84 | 48.79 | 0.50 | 57.74 | 0.53 |
Model 3 | 78.76 | 0.77 | 59.56 | 0.64 | 78.76 | 0.77 |
Model 4 | 67.56 | 0.58 | 5.51 | 0.06 | 43.37 | 0.42 |
Model 5 | 86.02 ** | 0.82 | 85.48 ** | 0.81 | 66.02 | 0.59 |
Model 6 | 83.60 ** | 0.81 | 86.85 ** | 0.83 | 80.35 ** | 0.78 |
Model 7 | 79.31 | 0.72 | 83.08 ** | 0.75 | 78.09 | 0.70 |
Model 8 | 79.21 | 0.77 | 69.59 | 0.68 | 77.04 | 0.77 |
Model 9 | 77.78 | 0.72 | 54.92 | 0.58 | 81.25 ** | 0.79 |
Model 10 | 71.28 | 0.73 | 53.28 | 0.60 | 79.66 | 0.78 |
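The F1 scores in the table above combine precision and recall over the detected plant bounding boxes. As an illustration only (the counts below are hypothetical, not taken from the paper), the F1 score reduces to 2·TP / (2·TP + FP + FN):

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 score: harmonic mean of precision and recall over detections."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical tile: 85 true positives, 10 false detections, 20 missed plants.
# Equivalent closed form: 2*85 / (2*85 + 10 + 20) = 170 / 200 = 0.85
print(f1_score(85, 10, 20))  # → 0.85
```

mAP additionally averages precision over recall levels (and here over the single "plant" class) at a fixed IoU threshold, which is why it can diverge from F1 for the same model.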
Maximum Radial Distance from Fiducial Center (Pixels) | Minimum Class Confidence Score (%) | Maximum Distance between Clusters by Ward’s Minimum Variance Method (m², Equation (1)) | Average RMSE of Plant Count Derived by Models 1 to 10 (Plants per Linear Meter of Row)
---|---|---|---|
820 | 25 | 0.05 | 1.8 |
820 | 25 | 0.1 | 2.13 |
1320 | 25 | 0.05 | 1.43 * |
1320 | 25 | 0.1 | 1.34 * |
1980 | 25 | 0.05 | 2.37 |
1980 | 25 | 0.1 | 1.02 * |
820 | 50 | 0.05 | 2.6 |
820 | 50 | 0.1 | 2.83 |
1320 | 50 | 0.05 | 1.88 |
1320 | 50 | 0.1 | 2.03 |
1980 | 50 | 0.05 | 1.98 |
1980 | 50 | 0.1 | 1.56 |
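The clustering step above merges nearby projected plant centers with Ward's minimum variance linkage, cut at a maximum inter-cluster distance. A minimal sketch using scikit-learn's `AgglomerativeClustering` (which the paper cites) is shown below; the coordinates are hypothetical, and the exact distance definition depends on the paper's Equation (1), which is not reproduced here:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Hypothetical projected plant-center coordinates (meters): two detections
# of the same plant close together, plus one plant farther along the row.
centers = np.array([[0.00, 0.00], [0.04, 0.02], [0.50, 0.01]])

# Ward's minimum-variance linkage with a maximum merge distance; 0.1 mirrors
# the best-performing threshold in the table above.
clustering = AgglomerativeClustering(
    n_clusters=None, distance_threshold=0.1, linkage="ward"
).fit(centers)

# Number of clusters = estimated plant count for these detections
print(clustering.n_clusters_)  # → 2
```

Setting `n_clusters=None` with `distance_threshold` tells scikit-learn to stop merging once the Ward linkage distance reaches the threshold, so duplicate detections of one plant collapse to a single center while distinct plants stay separate.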
YOLOv3 Model | UAS Data Used for Training | RMSE (Plants per Linear Meter of Row) | R2 |
---|---|---|---|
Model 1 | April 15 | 0.56 | 0.97 |
Model 2 | April 16 | 1.14 | 0.82 |
Model 3 | April 18 | 0.81 | 0.92 |
Model 4 | April 24 | 3.11 | Negative |
Model 5 | April 15, 16 | 0.50 | 0.97 |
Model 6 | April 15, 16, 18 | 0.50 | 0.97 |
Model 7 | April 15, 16, 18, 24 | 0.60 | 0.96 |
Model 8 | April 16, 18 | 0.85 | 0.91 |
Model 9 | April 16, 18, 24 | 1.06 | 0.83 |
Model 10 | April 18, 24 | 1.09 | 0.83 |
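The RMSE and R² values above compare model-derived counts against ground-truth counts per linear meter of row. A minimal sketch with hypothetical counts (not the paper's data) shows how both are computed, and why R² can come out negative, as reported for Model 4:

```python
import numpy as np

def rmse_r2(observed: np.ndarray, predicted: np.ndarray) -> tuple[float, float]:
    """RMSE and coefficient of determination for plant counts.

    R² is negative whenever the predictions fit worse than simply
    predicting the mean of the observations.
    """
    residuals = observed - predicted
    rmse = float(np.sqrt(np.mean(residuals ** 2)))
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((observed - observed.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    return rmse, r2

# Hypothetical counts (plants per linear meter of row)
obs = np.array([8.0, 10.0, 12.0, 9.0, 11.0])
pred = np.array([8.5, 9.5, 12.5, 9.0, 10.5])
print(rmse_r2(obs, pred))  # → (≈0.447, 0.9)
```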
Sub-Process | Approximate Processing Time (min) |
---|---|
SfM process and surface mesh generation | 30 |
Splitting original UAS images | 3 |
Labeling data (300 images) | 120 |
Training YOLOv3 network | 300 |
Applying YOLOv3 network | 15 |
Canopy and non-canopy classification | 5 |
Image coordinates to ground surface projection | 15 |
Clustering and grid-wise stand counting | 1 |
Total estimated time when pretrained YOLOv3 model does not exist | 489 |
Total estimated time when pretrained YOLOv3 model exists | 69 |
Total estimated time when pretrained YOLOv3 model, ground surface model, and RTK GPS measurement of UAS is available | 39 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Oh, S.; Chang, A.; Ashapure, A.; Jung, J.; Dube, N.; Maeda, M.; Gonzalez, D.; Landivar, J. Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework. Remote Sens. 2020, 12, 2981. https://doi.org/10.3390/rs12182981