Cognitive Computing Advancements: Improving Precision Crop Protection through UAV Imagery for Targeted Weed Monitoring
Abstract
1. Introduction
- Develop a comprehensive dataset of weed species in their early growth stages, intended for training, validating, and testing the ViT model’s classification performance on both maize and tomato crops.
- Assess the performance of the previously tested ViT model on the dataset featuring the same weed species but at later growth stages, under the hypothesis that performance could decrease significantly compared with the early growth-stage findings.
- Implement GAN-based data augmentation techniques to enhance the spatial resolution of the early growth-stage dataset, then train new classification models with the ViT architecture and evaluate weed species classification performance on the subsequent growth-stage dataset.
- Evaluate and quantify the impact of increased training data on the performance of an object detection model for weed detection by training three models on incrementally larger datasets.
2. Materials and Methods
2.1. Dataset Generation
2.2. Vision Transformer Neural Network for Weed and Crop Classification
2.3. Model Training and Inference Details
- Preprocessing: The PyTorch library was used to resize images to 224 × 224 pixels and to apply various transformations, including random horizontal flipping with a 50% probability and normalization using two sets of values: per-channel means (0.485, 0.456, 0.406) and standard deviations (0.229, 0.224, 0.225) for the red, green, and blue channels, computed from a reference dataset such as ImageNet (Figure 3b). A minimal code sketch of this pipeline follows the list.
- Custom sampler: Because of class imbalance in our dataset, we developed a custom sampler based on the number of samples per class in the training set and inverse class weights. Weights were assigned inversely proportional to class frequency, so classes with fewer samples received higher weights and classes with more samples received lower weights. The sampler was used to select batches during training, helping the model focus on underrepresented classes and mitigating bias toward the majority classes (see the sampler sketch after this list).
- Hyperparameter optimization: In the cross-validation process, we used the AdamW optimizer [31] with a learning rate of 6 × 10⁻⁴ and a weight decay of 1 × 10⁻⁴. The loss function was sparse categorical cross-entropy, with a batch size of 32 and 50 training epochs (see the training sketch below).
- Performance metrics: Several metrics were used to evaluate the classification model. Accuracy (ACC) is the percentage of correctly predicted images relative to the total number of images; it is reliable only when the class distribution is balanced. Precision (P) is the fraction of images labeled positive by the model that are truly positive. Recall (R) is the proportion of actual positive cases that the model correctly identifies. The F1-score combines precision and recall into a single value, providing a comprehensive measure of classification performance that is particularly useful under class imbalance (a metrics sketch closes the examples below).
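To make the preprocessing concrete, here is a minimal sketch using torchvision’s standard transform API. It mirrors the description above (resize, random horizontal flip, ImageNet normalization) but is an illustration, not the authors’ exact code:

```python
from torchvision import transforms

# ImageNet channel statistics quoted in the text
IMAGENET_MEAN = (0.485, 0.456, 0.406)
IMAGENET_STD = (0.229, 0.224, 0.225)

train_transform = transforms.Compose([
    transforms.Resize((224, 224)),           # resize every image to 224 x 224 pixels
    transforms.RandomHorizontalFlip(p=0.5),  # flip horizontally with 50% probability
    transforms.ToTensor(),                   # PIL image -> float tensor in [0, 1]
    transforms.Normalize(IMAGENET_MEAN, IMAGENET_STD),  # per-channel normalization
])
```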
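The custom sampler is described only at the level of inverse class-frequency weights. One common PyTorch realization of that idea is WeightedRandomSampler; the sketch below assumes that approach, and the helper name make_balanced_loader is illustrative:

```python
from collections import Counter

import torch
from torch.utils.data import DataLoader, WeightedRandomSampler

def make_balanced_loader(train_dataset, labels, batch_size=32):
    """Build a loader whose batches oversample minority classes.

    labels: integer class index for every sample in train_dataset.
    """
    counts = Counter(labels)                                # samples per class
    class_weight = {c: 1.0 / n for c, n in counts.items()}  # inverse class frequency
    sample_weights = torch.tensor([class_weight[y] for y in labels], dtype=torch.double)
    sampler = WeightedRandomSampler(sample_weights, num_samples=len(labels), replacement=True)
    return DataLoader(train_dataset, batch_size=batch_size, sampler=sampler)
```

With replacement=True, minority-class images are drawn more often than their raw frequency would suggest, which is what exposes the model to underrepresented classes in roughly balanced proportions.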
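The stated hyperparameters map directly onto PyTorch primitives. Note that "sparse categorical cross-entropy" is Keras terminology; for integer class labels, the PyTorch equivalent is nn.CrossEntropyLoss. The ViT backbone below is instantiated via the timm library purely as an assumption, since this section does not name the implementation:

```python
import timm
import torch

# assumed ViT backbone; 12 classes = 10 weed species + maize + tomato
model = timm.create_model("vit_base_patch16_224", pretrained=True, num_classes=12)

optimizer = torch.optim.AdamW(model.parameters(), lr=6e-4, weight_decay=1e-4)
criterion = torch.nn.CrossEntropyLoss()  # cross-entropy over integer class labels

# train_loader is the balanced loader from the sampler sketch above
for epoch in range(50):                   # 50 training epochs
    for images, targets in train_loader:  # batches of 32
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
```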
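Finally, the four reported metrics can be computed from the test-set predictions, for instance with scikit-learn (again an assumption; the evaluation library is not named in this section). Macro averaging weights every class equally, which is what makes the F1-score informative under class imbalance:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def evaluate(y_true, y_pred):
    acc = accuracy_score(y_true, y_pred)  # fraction of correct predictions
    p, r, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0
    )
    return {"accuracy": acc, "precision": p, "recall": r, "f1": f1}
```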
2.4. Generative Adversarial Neural Network for Image Augmentation
2.5. Vision Transformer Neural Network for Weed Detection
3. Results
3.1. Classification Inference Using the Original Datasets in Both Early and Subsequent Growth Stages
3.2. Data Augmentation with GFPGAN on the Early Growth-Stage Dataset
3.3. Classification Inference after Data Augmentation on the Subsequent Growth-Stage Dataset
3.4. Detection of Weeds
3.4.1. Model Training
3.4.2. Model Inference
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Horvath, D.P.; Clay, S.A.; Swanton, C.J.; Anderson, J.V.; Chao, W.S. Weed-induced crop yield loss: A new paradigm and new challenges. Trends Plant Sci. 2023, 28, 567–582.
- Fernández-Quintanilla, C.; Dorado, J.; Andújar, D.; Peña, J.M. Site-Specific Based Models. In Decision Support Systems for Weed Management; Chantre, G.R., González-Andújar, J.L., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 143–157. ISBN 978-3-030-44402-0.
- Andújar, D.; Ribeiro, A.; Carmona, R.; Fernández-Quintanilla, C.; Dorado, J. An assessment of the accuracy and consistency of human perception of weed cover. Weed Res. 2010, 50, 638–647.
- Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240.
- Aghav-Palwe, S.; Gunjal, A. Chapter 1-Introduction to cognitive computing and its various applications. In Cognitive Computing for Human-Robot Interaction; Mittal, M., Shah, R.R., Roy, S., Eds.; Cognitive Data Science in Sustainable Computing; Academic Press: Cambridge, MA, USA, 2021; pp. 1–18. ISBN 978-0-323-85769-7.
- Sreedevi, A.G.; Nitya Harshitha, T.; Sugumaran, V.; Shankar, P. Application of cognitive computing in healthcare, cybersecurity, big data and IoT: A literature review. Inf. Process. Manag. 2022, 59, 102888.
- Dong, Y.; Hou, J.; Zhang, N.; Zhang, M. Research on How Human Intelligence, Consciousness, and Cognitive Computing Affect the Development of Artificial Intelligence. Complexity 2020, 2020, e1680845.
- Lytras, M.D.; Visvizi, A. Artificial Intelligence and Cognitive Computing: Methods, Technologies, Systems, Applications and Policy Making. Sustainability 2021, 13, 3598.
- Remya, R.; Sumithra, M.G.; Krishnamoorthi, M. Foundation of cognitive computing. In Deep Learning for Cognitive Computing Systems: Technological Advancements and Applications; Sumithra, M., Kumar Dhanaraj, R., Iwendi, C., Manoharan, A., Eds.; De Gruyter: Berlin, Germany; Boston, MA, USA, 2023; pp. 19–33.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444.
- Zaidi, S.S.A.; Ansari, M.S.; Aslam, A.; Kanwal, N.; Asghar, M.; Lee, B. A survey of modern deep learning based object detection models. Digit. Signal Process. 2022, 126, 103514.
- Singh, V.; Prasad, S. Speech emotion recognition system using gender dependent convolution neural network. Procedia Comput. Sci. 2023, 218, 2533–2540.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is All you Need. In Proceedings of the Advances in Neural Information Processing Systems; Curran Associates, Inc.: New York, NY, USA, 2017; Volume 30.
- Lin, T.; Wang, Y.; Liu, X.; Qiu, X. A survey of transformers. AI Open 2022, 3, 111–132.
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. arXiv 2021, arXiv:2010.11929.
- Yang, Y.; Jiao, L.; Liu, X.; Liu, F.; Yang, S.; Feng, Z.; Tang, X. Transformers Meet Visual Learning Understanding: A Comprehensive Review. arXiv 2022, arXiv:2203.12944.
- Liu, Y.; Zhang, Y.; Wang, Y.; Hou, F.; Yuan, J.; Tian, J.; Zhang, Y.; Shi, Z.; Fan, J.; He, Z. A Survey of Visual Transformers. IEEE Trans. Neural Netw. Learn. Syst. 2023, 1–21.
- Lonij, V.P.A.; Fiot, J.-B. Chapter 8-Cognitive Systems for the Food–Water–Energy Nexus. In Handbook of Statistics; Gudivada, V.N., Raghavan, V.V., Govindaraju, V., Rao, C.R., Eds.; Cognitive Computing: Theory and Applications; Elsevier: Amsterdam, The Netherlands, 2016; Volume 35, pp. 255–282.
- Huang, Y.; Lan, Y.; Thomson, S.J.; Fang, A.; Hoffmann, W.C.; Lacey, R.E. Development of soft computing and applications in agricultural and biological engineering. Comput. Electron. Agric. 2010, 71, 107–127.
- Mourhir, A.; Papageorgiou, E.I.; Kokkinos, K.; Rachidi, T. Exploring Precision Farming Scenarios Using Fuzzy Cognitive Maps. Sustainability 2017, 9, 1241.
- Munteanu, S.; Sudacevschi, V.; Ababii, V.; Branishte, R.; Turcan, A.; Leashcenco, V. Cognitive Distributed Computing System for Intelligent Agriculture. Int. J. Progress. Sci. Technol. (IJPSAT) 2021, 24, 334–342.
- Maes, W.H.; Steppe, K. Perspectives for Remote Sensing with Unmanned Aerial Vehicles in Precision Agriculture. Trends Plant Sci. 2019, 24, 152–164.
- Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402.
- Rejeb, A.; Abdollahi, A.; Rejeb, K.; Treiblmaier, H. Drones in agriculture: A review and bibliometric analysis. Comput. Electron. Agric. 2022, 198, 107017.
- Mumuni, A.; Mumuni, F. Data augmentation: A comprehensive survey of modern approaches. Array 2022, 16, 100258.
- Meier, U. Growth Stages of Mono- and Dicotyledonous Plants: BBCH Monograph; Open Agrar Repositorium: Quedlinburg, Germany, 2018; ISBN 978-3-95547-071-5.
- Tzutalin. LabelImg. 2015. Available online: https://github.com/tzutalin/labelImg (accessed on 15 August 2024).
- Mesías-Ruiz, G.A.; Borra-Serrano, I.; Peña Barragán, J.M.; de Castro, A.I.; Fernández-Quintanilla, C.; Dorado, J. Unmanned Aerial Vehicle Imagery for Early Stage Weed Classification and Detection in Maize and Tomato Crops. DIGITAL.CSIC. 2024. Available online: https://digital.csic.es/handle/10261/347533 (accessed on 15 August 2024).
- Liu, Z.; Lin, Y.; Cao, Y.; Hu, H.; Wei, Y.; Zhang, Z.; Lin, S.; Guo, B. Swin Transformer: Hierarchical Vision Transformer using Shifted Windows. In Proceedings of the 2021 IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 10–17 October 2021; pp. 9992–10002.
- Selvaraju, R.R.; Cogswell, M.; Das, A.; Vedantam, R.; Parikh, D.; Batra, D. Grad-CAM: Visual Explanations From Deep Networks via Gradient-Based Localization. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 618–626.
- Loshchilov, I.; Hutter, F. Decoupled Weight Decay Regularization. arXiv 2019, arXiv:1711.05101.
- Olaniyi, E.; Chen, D.; Lu, Y.; Huang, Y. Generative Adversarial Networks for Image Augmentation in Agriculture: A Systematic Review. arXiv 2022, arXiv:2204.04707.
- Wang, X.; Li, Y.; Zhang, H.; Shan, Y. Towards Real-World Blind Face Restoration with Generative Facial Prior. arXiv 2021, arXiv:2101.04061.
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2017, arXiv:1412.6980.
- Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
- Carion, N.; Massa, F.; Synnaeve, G.; Usunier, N.; Kirillov, A.; Zagoruyko, S. End-to-End Object Detection with Transformers. arXiv 2020, arXiv:2005.12872.
- dos Santos Ferreira, A.; Matte Freitas, D.; Gonçalves da Silva, G.; Pistori, H.; Theophilo Folhes, M. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324.
- Valente, J.; Doldersum, M.; Roers, C.; Kooistra, L. Detecting Rumex obtusifolius weed plants in grasslands from UAV RGB imagery using deep learning. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, IV-2/W5, 179–185.
- Petrich, L.; Lohrmann, G.; Neumann, M.; Martin, F.; Frey, A.; Stoll, A.; Schmidt, V. Detection of Colchicum autumnale in drone images, using a machine-learning approach. Precis. Agric. 2020, 21, 1291–1303.
- Zhang, R.; Wang, C.; Hu, X.; Liu, Y.; Chen, S.; Su, B. Weed location and recognition based on UAV imaging and deep learning. Int. J. Precis. Agric. Aviat. 2020, 3, 23–29.
- Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Wen, S.; Zhang, H.; Zhang, Y. Accurate Weed Mapping and Prescription Map Generation Based on Fully Convolutional Networks Using UAV Imagery. Sensors 2018, 18, 3299.
- Lam, O.H.Y.; Dogotari, M.; Prüm, M.; Vithlani, H.N.; Roers, C.; Melville, B.; Zimmer, F.; Becker, R. An open source workflow for weed mapping in native grassland using unmanned aerial vehicle: Using Rumex obtusifolius as a case study. Eur. J. Remote Sens. 2021, 54, 71–88.
- Espejo-Garcia, B.; Panoutsopoulos, H.; Anastasiou, E.; Rodríguez-Rigueiro, F.J.; Fountas, S. Top-tuning on transformers and data augmentation transferring for boosting the performance of weed identification. Comput. Electron. Agric. 2023, 211, 108055.
- Reedha, R.; Dericquebourg, E.; Canals, R.; Hafiane, A. Transformer Neural Network for Weed and Crop Classification of High Resolution UAV Images. Remote Sens. 2022, 14, 592.
- Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Iqbal, J.; Alam, M. A novel semi-supervised framework for UAV based crop/weed classification. PLoS ONE 2021, 16, e0251008.
- Shahi, T.B.; Dahal, S.; Sitaula, C.; Neupane, A.; Guo, W. Deep Learning-Based Weed Detection Using UAV Images: A Comparative Study. Drones 2023, 7, 624.
- Gallo, I.; Rehman, A.U.; Dehkordi, R.H.; Landro, N.; La Grassa, R.; Boschetti, M. Deep Object Detection of Crop Weeds: Performance of YOLOv7 on a Real Case Dataset from UAV Images. Remote Sens. 2023, 15, 539.
- Gašparović, M.; Zrinjski, M.; Barković, Đ.; Radočaj, D. An automatic method for weed mapping in oat fields based on UAV imagery. Comput. Electron. Agric. 2020, 173, 105385.
| Species | Labels (Early) | Train (Early) | Validation (Early) | Test (Early) | Labels (Subsequent) |
|---|---|---|---|---|---|
| Atriplex patula | 1000 | 720 | 180 | 100 | 1459 |
| Chenopodium album | 1200 | 880 | 220 | 100 | 2175 |
| Convolvulus arvensis | 1200 | 880 | 220 | 100 | 1102 |
| Cyperus rotundus | 3090 | 2392 | 598 | 100 | 134 |
| Datura ferox | 683 | 466 | 117 | 100 | 589 |
| Lolium rigidum | 1000 | 720 | 180 | 100 | 80 |
| Portulaca oleracea | 1875 | 1420 | 355 | 100 | 177 |
| Salsola kali | 1200 | 880 | 220 | 100 | 1216 |
| Solanum nigrum | 1900 | 1440 | 360 | 100 | 2175 |
| Sorghum halepense | 1600 | 1200 | 300 | 100 | 103 |
| Maize | 12,364 | 9811 | 2453 | 100 | 24,614 |
| Tomato | 3890 | 3032 | 758 | 100 | 2732 |
| Species | Accuracy (%) | Precision (%) | Recall (%) | F1-Score (%) | Support |
|---|---|---|---|---|---|
| Atriplex patula | 98.0 | 97.0 | 98.0 | 97.5 | 100 |
| Chenopodium album | 98.0 | 99.0 | 98.0 | 98.5 | 100 |
| Convolvulus arvensis | 99.0 | 100.0 | 99.0 | 99.5 | 100 |
| Cyperus rotundus | 99.0 | 100.0 | 99.0 | 99.5 | 100 |
| Datura ferox | 99.0 | 98.0 | 99.0 | 98.5 | 100 |
| Lolium rigidum | 98.0 | 100.0 | 98.0 | 99.0 | 100 |
| Portulaca oleracea | 100.0 | 99.0 | 100.0 | 99.5 | 100 |
| Salsola kali | 98.0 | 98.0 | 98.0 | 98.0 | 100 |
| Solanum nigrum | 100.0 | 99.0 | 100.0 | 99.5 | 100 |
| Sorghum halepense | 100.0 | 100.0 | 100.0 | 100.0 | 100 |
| Maize | 100.0 | 99.0 | 100.0 | 99.5 | 100 |
| Tomato | 100.0 | 100.0 | 100.0 | 100.0 | 100 |
| Accuracy | 99.1 | | | | |
| Macro average | 99.1 | 99.1 | 99.1 | 99.1 | |
| Weighted average | 99.1 | 99.1 | 99.1 | 99.1 | |
| Species | Precision (%) | Recall (%) | F1-Score (%) | Support |
|---|---|---|---|---|
| Atriplex patula | 96.3 | 53.3 | 68.6 | 1459 |
| Chenopodium album | 91.9 | 82.0 | 86.7 | 2175 |
| Convolvulus arvensis | 55.6 | 97.3 | 70.7 | 1102 |
| Cyperus rotundus | 65.4 | 52.2 | 58.1 | 134 |
| Datura ferox | 88.7 | 91.7 | 90.2 | 589 |
| Lolium rigidum | 37.1 | 90.0 | 52.6 | 80 |
| Portulaca oleracea | 85.0 | 89.8 | 87.4 | 177 |
| Salsola kali | 98.5 | 88.3 | 93.2 | 1216 |
| Solanum nigrum | 98.2 | 100.0 | 99.1 | 2175 |
| Sorghum halepense | 67.8 | 100.0 | 80.8 | 103 |
| Maize | 99.0 | 98.0 | 98.5 | 24,614 |
| Tomato | 91.0 | 98.6 | 94.7 | 2732 |
| Accuracy | | | 94.7 | 36,556 |
| Macro average | 81.2 | 86.8 | 81.7 | |
| Weighted average | 95.9 | 94.7 | 94.8 | |
All values are F1-scores (%).

| Species | Original (Ori) | Ori + GFPGAN×1 | Ori + GFPGAN×2 | Ori + GFPGAN×3 | Ori + GFPGAN×1 + ×2 | Ori + GFPGAN×1 + ×2 + ×3 |
|---|---|---|---|---|---|---|
| Atriplex patula | 68.6 | 71.5 | 65.0 | 62.1 | 72.1 | 70.6 |
| Chenopodium album | 86.7 | 86.4 | 81.8 | 79.5 | 86.5 | 79.3 |
| Convolvulus arvensis | 70.7 | 70.7 | 72.4 | 64.7 | 74.6 | 72.18 |
| Cyperus rotundus | 58.1 | 60.8 | 57.4 | 51.4 | 58.5 | 63.8 |
| Datura ferox | 90.2 | 87.5 | 71.2 | 85.2 | 79.3 | 85.5 |
| Lolium rigidum | 52.6 | 67.0 | 53.0 | 69.1 | 69.5 | 63.1 |
| Portulaca oleracea | 95.4 | 88.6 | 73.0 | 81.0 | 84.8 | 83.5 |
| Salsola kali | 87.4 | 96.3 | 92.9 | 96.0 | 96.6 | 94.2 |
| Solanum nigrum | 99.1 | 99.3 | 98.9 | 98.0 | 97.9 | 97.1 |
| Sorghum halepense | 80.8 | 89.2 | 96.7 | 95.4 | 97.6 | 95.4 |
| Maize | 98.5 | 98.5 | 97.2 | 97.8 | 97.9 | 97.8 |
| Tomato | 94.7 | 96.8 | 95.2 | 95.6 | 97.0 | 96.7 |
| Macro average | 81.7 | 84.4 | 79.6 | 81.3 | 84.4 | 83.3 |
| Weighted average | 94.8 | 95.3 | 93.3 | 93.5 | 94.8 | 94.1 |
| Metric | Original | Ori + GFPGAN×1 | Ori + GFPGAN×1 + ×2 |
|---|---|---|---|
| mAP | 0.060 | 0.191 | 0.234 |
| mAP@[IoU = 0.50] | 0.158 | 0.444 | 0.513 |
| mAP@[IoU = 0.75] | 0.030 | 0.129 | 0.171 |
| mAP@[area = small] | 0.031 | 0.072 | 0.096 |
| mAP@[area = medium] | 0.069 | 0.206 | 0.259 |
| mAP@[area = large] | 0.185 | 0.237 | 0.393 |
| Recall@[maxDets = 1] | 0.096 | 0.084 | 0.095 |
| Recall@[maxDets = 10] | 0.274 | 0.255 | 0.252 |
| Recall@[maxDets = 100] | 0.336 | 0.340 | 0.339 |
| Recall@[area = small] | 0.215 | 0.162 | 0.149 |
| Recall@[area = medium] | 0.351 | 0.366 | 0.372 |
| Recall@[area = large] | 0.408 | 0.447 | 0.522 |
Inference time is reported in seconds for each crop and growth stage.

| Model | Trainable Parameters (×10⁶) | Size on Disk (MB) | Maize, Early (BBCH14) | Tomato, Early (BBCH501) | Maize, Subsequent (BBCH17) | Tomato, Subsequent (BBCH509) |
|---|---|---|---|---|---|---|
| Original | 41.5 | 166.5 | 80.28 | 103.64 | 110.56 | 165.88 |
| Ori + GFPGAN×1 | 41.5 | 166.5 | - | - | 115.28 | 166.14 |
| Ori + GFPGAN×1 + ×2 | 41.5 | 166.5 | - | - | 117.45 | 169.83 |