Identification and Counting of Sugarcane Seedlings in the Field Using Improved Faster R-CNN
Abstract
1. Introduction
2. Experimental Data
2.1. Experimental Sugarcane Field
2.2. UAV Imagery Data Acquisition
2.3. COCO Dataset
3. Methodology
3.1. Overall Program
3.2. Data Preprocessing
3.3. Improved Faster R-CNN Detection Algorithm
3.3.1. Feature Extraction Using ResNet-50
3.3.2. Attention Network SN-Block
3.3.3. Multi-Scale Feature Fusion
3.3.4. Anchor Optimization for the Region Proposal Network (RPN)
3.4. Construction of a System for the Automatic Recognition and Counting of Sugarcane Seedlings
3.5. Assessment Metrics
3.5.1. Metrics for Detection Accuracy Assessment
- True Positive (TP): the number of correctly detected sugarcane seedlings
- False Positive (FP): the number of weeds or other features incorrectly detected as sugarcane seedlings
- False Negative (FN): the number of sugarcane seedlings that were not detected
- Q: the total number of images to be verified
- Recall (q): the recall computed on a single image q in the dataset (a minimal sketch of how these counts yield precision, recall, and F1-score follows this list)
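To make the use of these counts concrete, the following is a minimal sketch, not the authors' code, of how precision, recall, and F1-score are conventionally derived from TP, FP, and FN; the example counts at the bottom are arbitrary and not taken from the paper.

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall, and F1-score from raw detection counts.

    tp: correctly detected sugarcane seedlings
    fp: weeds or other features wrongly detected as seedlings
    fn: seedlings the model missed
    """
    precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0
    recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) > 0 else 0.0)
    return precision, recall, f1


# Example with arbitrary counts pooled over the Q verification images
p, r, f1 = detection_metrics(tp=930, fp=73, fn=53)
print(f"Precision {p:.2%}, Recall {r:.2%}, F1 {f1:.2%}")
```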
3.5.2. Counting Accuracy Assessment Metrics
- n = the total number of test images
- aᵢ = the ground-truth number of sugarcane seedlings in the i-th aerial image
- cᵢ = the number predicted by the model for the i-th image (the total number of detected bounding boxes)
- ā = the mean ground-truth number of seedlings over all test images (see the sketch of the counting metrics after this list)
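For the counting metrics reported in Section 4.2.2, the sketch below shows the standard formulations that these symbols suggest: MAE and R² follow their usual definitions with ā as the mean ground-truth count, while the CA variant shown (one minus the mean relative counting error) is an assumption, since the exact formula is not reproduced here. The example counts are arbitrary.

```python
import numpy as np

def counting_metrics(a, c):
    """Counting metrics from ground-truth counts a_i and predicted counts c_i.

    a, c: sequences of length n (one entry per test image).
    Returns (CA, MAE, R2).
    """
    a = np.asarray(a, dtype=float)
    c = np.asarray(c, dtype=float)
    mae = np.mean(np.abs(a - c))                       # mean absolute error
    ca = 1.0 - np.mean(np.abs(a - c) / a)              # assumed CA formulation
    r2 = 1.0 - np.sum((a - c) ** 2) / np.sum((a - a.mean()) ** 2)
    return ca, mae, r2


# Example with arbitrary counts (not taken from the paper)
ca, mae, r2 = counting_metrics(a=[120, 98, 143, 110], c=[118, 101, 139, 112])
print(f"CA {ca:.2%}, MAE {mae:.2f}, R2 {r2:.4f}")
```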
3.6. Training Platform and Parameter Settings
4. Results and Discussion
4.1. Model Performance Comparison
4.1.1. Comparison of Detection Performance before and after Model Improvement
4.1.2. Performance Comparison with Other Object Detection Models
4.2. The Influence of IoU Threshold on Detection and Counting Results of Sugarcane Seedlings in Original Aerial Images
4.2.1. Analysis of Detection Performance for Sugarcane Seedlings in Original Images
4.2.2. Analysis of Counting Performance
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Parameter | Value |
---|---|
Learning rate | 0.001 |
Weight decay | 0.0005 |
Momentum | 0.9 |
Epochs | 150 |
Regularization factor | 0.0001 |
Optimizer | Momentum Optimizer |
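As an illustration only, the hyperparameters in this table map onto a standard momentum-SGD setup roughly as follows. This is a sketch assuming a PyTorch-style training loop, not the authors' implementation; how the separate regularization factor (0.0001) enters the loss is an assumption and is shown as an explicit L2 penalty term.

```python
import torch

# Hypothetical model standing in for the improved Faster R-CNN backbone and heads.
model = torch.nn.Linear(1024, 2)

# Momentum optimizer with the learning rate, momentum, and weight decay from the table.
optimizer = torch.optim.SGD(
    model.parameters(), lr=0.001, momentum=0.9, weight_decay=0.0005
)

def total_loss(task_loss, model, reg_factor=0.0001):
    """Task loss plus an assumed explicit L2 regularization term."""
    l2 = sum(p.pow(2).sum() for p in model.parameters())
    return task_loss + reg_factor * l2

# Training would then run for 150 epochs, stepping the optimizer each batch.
```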
Method | Number of Anchor Boxes | AP (%) | AR (%) |
---|---|---|---|
Original Faster R-CNN | 9 | 84.98 | 83.56 |
Faster R-CNN | 20 | 86.58 | 84.91 |
Faster R-CNN+ResNet50 | 20 | 89.88 | 87.22 |
Faster R-CNN+ResNet50+SN-Block | 20 | 90.59 | 87.65 |
Faster R-CNN+ResNet50+SN-Block+FPN | 20 | 93.67 | 89.78 |
Model | AP (%) | AR (%) |
---|---|---|
YOLO v2 | 79.75 | 76.84 |
YOLO v3 | 82.46 | 80.33 |
SSD | 77.50 | 74.21 |
Faster R-CNN | 84.98 | 83.56 |
Improved Faster R-CNN | 93.67 | 89.78 |
IoU Threshold | Precision (%) | Recall (%) | F1-Score (%) |
---|---|---|---|
λ = 0.05 | 95.32 | 86.93 | 90.93 |
λ = 0.10 | 93.14 | 92.76 | 92.95 |
λ = 0.15 | 92.70 | 94.65 | 93.66 |
λ = 0.20 | 91.34 | 95.20 | 93.23 |
λ = 0.25 | 89.93 | 96.24 | 92.98 |
λ = 0.30 | 88.02 | 96.66 | 92.14 |
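As a consistency check, assuming the usual F1-score definition, the λ = 0.15 row can be reproduced from its precision and recall:

```latex
F_1 = \frac{2PR}{P + R}
    = \frac{2 \times 92.70 \times 94.65}{92.70 + 94.65}
    \approx 93.66\%
```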
IoU Threshold | CA (%) | MAE | R² |
---|---|---|---|
λ = 0.05 | 87.63 | 21.55 | 0.9687 |
λ = 0.10 | 94.08 | 10.15 | 0.9824 |
λ = 0.15 | 96.83 | 4.60 | 0.9905 |
λ = 0.20 | 92.73 | 10.30 | 0.9880 |
λ = 0.25 | 86.75 | 19.15 | 0.9784 |
λ = 0.30 | 81.06 | 28.05 | 0.9746 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).