Drone-Assisted Plant Stress Detection Using Deep Learning: A Comparative Study of YOLOv8, RetinaNet, and Faster R-CNN
Abstract
1. Introduction
- The paper provides a detailed comparative analysis of three state-of-the-art deep learning models, YOLOv8, Faster R-CNN, and RetinaNet, together with their variants, on aerial images of potato crops. Through this performance evaluation, the study identifies the method that best balances detection accuracy and computational cost.
- The paper demonstrates how data augmentation mitigates data scarcity: it enlarges the training dataset, supports more robust training, and reduces the time and effort required for manual data collection and labeling.
- The research emphasizes lightweight, fast, and accurate deep learning models for real-world agricultural settings, enabling more automated, cost-effective, and scalable drone-based crop health monitoring.
2. Related Work
2.1. Drones for Precision Agriculture
2.2. Deep Learning Models in Agriculture
3. Methodology
3.1. Dataset
3.2. Data Preprocessing and Augmentation
- Pixel intensity adjustments to simulate lighting variations;
- Contrast modification using sigmoid-based enhancement;
- Brightness changes via gamma correction;
- Controlled random noise addition to simulate sensor artifacts and environmental conditions;
- Geometric transformations including horizontal/vertical flips and rotations (at 90°, 180°, and 270°) to improve spatial diversity.
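The transformations above can be sketched as simple array operations. The following is a minimal illustration of one possible implementation (not the authors' pipeline), assuming 8-bit RGB images stored as NumPy arrays; parameter values such as `gamma` and `sigma` are placeholders:

```python
import numpy as np

def gamma_brightness(img, gamma=0.8):
    """Brightness change via gamma correction (gamma < 1 brightens)."""
    norm = img.astype(np.float32) / 255.0
    return np.clip((norm ** gamma) * 255.0, 0, 255).astype(np.uint8)

def sigmoid_contrast(img, gain=10.0, cutoff=0.5):
    """Sigmoid-based contrast enhancement around the cutoff intensity."""
    norm = img.astype(np.float32) / 255.0
    out = 1.0 / (1.0 + np.exp(gain * (cutoff - norm)))
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)

def add_noise(img, sigma=8.0, rng=None):
    """Controlled Gaussian noise to simulate sensor artifacts."""
    rng = rng or np.random.default_rng(0)
    noisy = img.astype(np.float32) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def geometric_variants(img):
    """Horizontal/vertical flips and rotations at 90, 180, and 270 degrees."""
    return [np.fliplr(img), np.flipud(img),
            np.rot90(img, 1), np.rot90(img, 2), np.rot90(img, 3)]
```

In an object detection setting, the geometric transforms must also be applied consistently to the bounding-box coordinates, whereas the photometric transforms leave the boxes unchanged.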
3.3. Exploratory Data Analysis
3.4. Model Selection
3.4.1. YOLOv8 Variants
3.4.2. RetinaNet Variants
3.4.3. Faster R-CNN Variants
3.5. Evaluation Metrics
3.6. Experiment Setup
4. Results and Discussion
4.1. Results of Detection by YOLOv8
4.2. Results of Detection by RetinaNet
4.3. Results of Detection by Faster R-CNN
4.4. Comparison
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
CNN | Convolutional Neural Network |
DL | Deep Learning |
ML | Machine Learning |
YOLO | You Only Look Once |
R-CNN | Region-based Convolutional Neural Network |
UAVs | Unmanned Aerial Vehicles |
LSTM | Long Short-Term Memory |
ResNet | Residual Network |
mAP | Mean Average Precision |
References
- Ning, Y.; Liu, W.; Wang, G.-L. Balancing Immunity and Yield in Crop Plants. Trends Plant Sci. 2017, 22, 1069–1079. [Google Scholar] [CrossRef]
- Seelan, S.K.; Laguette, S.; Casady, G.M.; Seielstad, G.A. Remote Sensing Applications for Precision Agriculture: A Learning Community Approach. Remote Sens. Environ. 2003, 88, 157–169. [Google Scholar] [CrossRef]
- Akbari, Y.; Almaadeed, N.; Al-Maadeed, S.; Elharrouss, O. Applications, Databases and Open Computer Vision Research from Drone Videos and Images: A Survey. Artif. Intell. Rev. 2021, 54, 3887–3938. [Google Scholar] [CrossRef]
- Bilodeau, M.F.; Esau, T.J.; Zaman, Q.U.; Heung, B.; Farooque, A.A. Using Drones to Predict Degradation of Surface Drainage on Agricultural Fields: A Case Study of the Atlantic Dykelands. AgriEngineering 2025, 7, 112. [Google Scholar] [CrossRef]
- Miyoshi, K.; Hiraguri, T.; Shimizu, H.; Hattori, K.; Kimura, T.; Okubo, S.; Endo, K.; Shimada, T.; Shibasaki, A.; Takemura, Y. Development of Pear Pollination System Using Autonomous Drones. AgriEngineering 2025, 7, 68. [Google Scholar] [CrossRef]
- Toscano, F.; Fiorentino, C.; Santana, L.S.; Magalhães, R.R.; Albiero, D.; Tomáš, Ř.; Klocová, M.; D’Antonio, P. Recent Developments and Future Prospects in the Integration of Machine Learning in Mechanised Systems for Autonomous Spraying: A Brief Review. AgriEngineering 2025, 7, 142. [Google Scholar] [CrossRef]
- Kim, H.; Kim, W.; Kim, S.D. Damage Assessment of Rice Crop after Toluene Exposure Based on the Vegetation Index (VI) and UAV Multispectral Imagery. Remote Sens. 2020, 13, 25. [Google Scholar] [CrossRef]
- Rábago, J.; Portuguez-Castro, M. Use of Drone Photogrammetry as An Innovative, Competency-Based Architecture Teaching Process. Drones 2023, 7, 187. [Google Scholar] [CrossRef]
- Abbas, A.; Zhang, Z.; Zheng, H.; Alami, M.M.; Alrefaei, A.F.; Abbas, Q.; Naqvi, S.A.H.; Rao, M.J.; Mosa, W.F.A.; Abbas, Q.; et al. Drones in Plant Disease Assessment, Efficient Monitoring, and Detection: A Way Forward to Smart Agriculture. Agronomy 2023, 13, 1524. [Google Scholar] [CrossRef]
- Martinelli, F.; Scalenghe, R.; Davino, S.; Panno, S.; Scuderi, G.; Ruisi, P.; Villa, P.; Stroppiana, D.; Boschetti, M.; Goulart, L.R.; et al. Advanced Methods of Plant Disease Detection. A Review. Agron. Sustain. Dev. 2015, 35, 1–25. [Google Scholar] [CrossRef]
- Meshram, V.; Patil, K.; Meshram, V.; Hanchate, D.; Ramkteke, S.D. Machine Learning in Agriculture Domain: A State-of-Art Survey. Artif. Intell. Life Sci. 2021, 1, 100010. [Google Scholar] [CrossRef]
- Gadiraju, K.K.; Ramachandra, B.; Chen, Z.; Vatsavai, R.R. Multimodal Deep Learning Based Crop Classification Using Multispectral and Multitemporal Satellite Imagery. In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Virtual Event, 6–10 July 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 3234–3242. [Google Scholar]
- Oikonomidis, A.; Catal, C.; Kassahun, A. Deep Learning for Crop Yield Prediction: A Systematic Literature Review. N. Z. J. Crop Hortic. Sci. 2023, 51, 1–26. [Google Scholar] [CrossRef]
- Heidari, A.; Jafari Navimipour, N.; Unal, M.; Zhang, G. Machine Learning Applications in Internet-of-Drones: Systematic Review, Recent Deployments, and Open Issues. ACM Comput. Surv. 2023, 55, 1–45. [Google Scholar] [CrossRef]
- Geetha, V.; Punitha, A.; Abarna, M.; Akshaya, M.; Illakiya, S.; Janani, A. An Effective Crop Prediction Using Random Forest Algorithm. In Proceedings of the 2020 International Conference on System, Computation, Automation and Networking (ICSCAN), Pondicherry, India, 3–4 July 2020; pp. 1–5. [Google Scholar]
- Nalini, T.; Rama, A. Impact of Temperature Condition in Crop Disease Analyzing Using Machine Learning Algorithm. Meas. Sens. 2022, 24, 100408. [Google Scholar] [CrossRef]
- Nishankar, S.; Pavindran, V.; Mithuran, T.; Nimishan, S.; Thuseethan, S.; Sebastian, Y. ViT-RoT: Vision Transformer-Based Robust Framework for Tomato Leaf Disease Recognition. AgriEngineering 2025, 7, 185. [Google Scholar] [CrossRef]
- Wang, J.; Li, J.; Meng, F. Recognition of Strawberry Powdery Mildew in Complex Backgrounds: A Comparative Study of Deep Learning Models. AgriEngineering 2025, 7, 182. [Google Scholar] [CrossRef]
- Dang, L.M.; Wang, H.; Li, Y.; Min, K.; Kwak, J.T.; Lee, O.N.; Park, H.; Moon, H. Fusarium Wilt of Radish Detection Using RGB and Near Infrared Images from Unmanned Aerial Vehicles. Remote Sens. 2020, 12, 2863. [Google Scholar] [CrossRef]
- Su, J.; Yi, D.; Su, B.; Mi, Z.; Liu, C.; Hu, X.; Xu, X.; Guo, L.; Chen, W.-H. Aerial Visual Perception in Smart Farming: Field Study of Wheat Yellow Rust Monitoring. IEEE Trans. Ind. Inform. 2021, 17, 2242–2249. [Google Scholar] [CrossRef]
- Shah, S.A.; Lakho, G.M.; Keerio, H.A.; Sattar, M.N.; Hussain, G.; Mehdi, M.; Vistro, R.B.; Mahmoud, E.A.; Elansary, H.O. Application of Drone Surveillance for Advance Agriculture Monitoring by Android Application Using Convolution Neural Network. Agronomy 2023, 13, 1764. [Google Scholar] [CrossRef]
- Zhao, S.; Chen, H.; Zhang, D.; Tao, Y.; Feng, X.; Zhang, D. SR-YOLO: Spatial-to-Depth Enhanced Multi-Scale Attention Network for Small Target Detection in UAV Aerial Imagery. Remote Sens. 2025, 17, 2441. [Google Scholar] [CrossRef]
- Fuentes, A.; Yoon, S.; Kim, S.C.; Park, D.S. A Robust Deep-Learning-Based Detector for Real-Time Tomato Plant Diseases and Pests Recognition. Sensors 2017, 17, 2022. [Google Scholar] [CrossRef] [PubMed]
- Hamidisepehr, A.; Mirnezami, S.V.; Ward, J.K. Comparison of Object Detection Methods for Corn Damage Assessment Using Deep Learning. Trans. ASABE 2020, 63, 1969–1980. [Google Scholar] [CrossRef]
- Butte, S.; Vakanski, A.; Duellman, K.; Wang, H.; Mirkouei, A. Potato Crop Stress Identification in Aerial Images Using Deep Learning-based Object Detection. Agron. J. 2021, 113, 3991–4002. [Google Scholar] [CrossRef]
- Dawod, R.G.; Dobre, C. Upper and Lower Leaf Side Detection with Machine Learning Methods. Sensors 2022, 22, 2696. [Google Scholar] [CrossRef]
- Hughes, D.; Salathé, M. An Open Access Repository of Images on Plant Health to Enable the Development of Mobile Disease Diagnostics. arXiv 2015, arXiv:1511.08060. [Google Scholar]
- Amarasingam, N.; Ashan Salgadoe, A.S.; Powell, K.; Gonzalez, L.F.; Natarajan, S. A Review of UAV Platforms, Sensors, and Applications for Monitoring of Sugarcane Crops. Remote Sens. Appl. Soc. Environ. 2022, 26, 100712. [Google Scholar] [CrossRef]
- Zhang, T.; Xu, Z.; Su, J.; Yang, Z.; Liu, C.; Chen, W.-H.; Li, J. Ir-UNet: Irregular Segmentation U-Shape Network for Wheat Yellow Rust Detection by UAV Multispectral Imagery. Remote Sens. 2021, 13, 3892. [Google Scholar] [CrossRef]
- Barbosa Júnior, M.R.; Santos, R.G.D.; Sales, L.D.A.; Martins, J.V.D.S.; Santos, J.G.D.A.; Oliveira, L.P.D. Designing and Implementing a Ground-Based Robotic System to Support Spraying Drone Operations: A Step Toward Collaborative Robotics. Actuators 2025, 14, 365. [Google Scholar] [CrossRef]
- Yadav, S.S.; Jadhav, S.M. Deep Convolutional Neural Network Based Medical Image Classification for Disease Diagnosis. J. Big Data 2019, 6, 113. [Google Scholar] [CrossRef]
- Fatih, B.; Kayaalp, F. Review of Machine Learning and Deep Learning Models in Agriculture. Int. Adv. Res. Eng. J. 2021, 5, 309–323. [Google Scholar] [CrossRef]
- Sarker, I.H. Deep Learning: A Comprehensive Overview on Techniques, Taxonomy, Applications and Research Directions. SN Comput. Sci. 2021, 2, 420. [Google Scholar] [CrossRef]
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep Learning in Agriculture: A Survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef]
- Latha, R.S.; Sreekanth, G.R.; Suganthe, R.C.; Geetha, M.; Swathi, N.; Vaishnavi, S.; Sonasri, P. Automatic Fruit Detection System Using Multilayer Deep Convolution Neural Network. In Proceedings of the 2021 International Conference on Computer Communication and Informatics (ICCCI), Coimbatore, India, 27–29 January 2021; IEEE: Coimbatore, India, 2021; pp. 1–5. [Google Scholar]
- Ramos-Giraldo, P.; Reberg-Horton, C.; Locke, A.M.; Mirsky, S.; Lobaton, E. Drought Stress Detection Using Low-Cost Computer Vision Systems and Machine Learning Techniques. IT Prof. 2020, 22, 27–29. [Google Scholar] [CrossRef]
- An, J.; Li, W.; Li, M.; Cui, S.; Yue, H. Identification and Classification of Maize Drought Stress Using Deep Convolutional Neural Network. Symmetry 2019, 11, 256. [Google Scholar] [CrossRef]
- Tran, T.-T.; Choi, J.-W.; Le, T.-T.H.; Kim, J.-W. A Comparative Study of Deep CNN in Forecasting and Classifying the Macronutrient Deficiencies on Development of Tomato Plant. Appl. Sci. 2019, 9, 1601. [Google Scholar] [CrossRef]
- Anami, B.S.; Malvade, N.N.; Palaiah, S. Classification of Yield Affecting Biotic and Abiotic Paddy Crop Stresses Using Field Images. Inf. Process. Agric. 2020, 7, 272–285. [Google Scholar] [CrossRef]
- Albahli, S. AgriFusionNet: A Lightweight Deep Learning Model for Multisource Plant Disease Diagnosis. Agriculture 2025, 15, 1523. [Google Scholar] [CrossRef]
- Mohanty, S.P.; Hughes, D.P.; Salathé, M. Using Deep Learning for Image-Based Plant Disease Detection. Front. Plant Sci. 2016, 7, 1419. [Google Scholar] [CrossRef] [PubMed]
- Amara, J.; Bouaziz, B.; Algergawy, A. A Deep Learning-Based Approach for Banana Leaf Diseases Classification. Agric. Food Sci. Comput. Sci. 2017, 1, 79. [Google Scholar]
- Ramcharan, A.; Baranowski, K.; McCloskey, P.; Ahmed, B.; Legg, J.; Hughes, D.P. Deep Learning for Image-Based Cassava Disease Detection. Front. Plant Sci. 2017, 8, 1852. [Google Scholar] [CrossRef]
- Yamamoto, K.; Togami, T.; Yamaguchi, N. Super-Resolution of Plant Disease Images for the Acceleration of Image-Based Phenotyping and Vigor Diagnosis in Agriculture. Sensors 2017, 17, 2557. [Google Scholar] [CrossRef]
- Liu, B.; Zhang, Y.; He, D.; Li, Y. Identification of Apple Leaf Diseases Based on Deep Convolutional Neural Networks. Symmetry 2017, 10, 11. [Google Scholar] [CrossRef]
- Ghosal, S.; Blystone, D.; Singh, A.K.; Ganapathysubramanian, B.; Singh, A.; Sarkar, S. An Explainable Deep Machine Vision Framework for Plant Stress Phenotyping. Proc. Natl. Acad. Sci. USA 2018, 115, 4613–4618. [Google Scholar] [CrossRef]
- Rançon, F.; Bombrun, L.; Keresztes, B.; Germain, C. Comparison of SIFT Encoded and Deep Learning Features for the Classification and Detection of Esca Disease in Bordeaux Vineyards. Remote Sens. 2018, 11, 1. [Google Scholar] [CrossRef]
- Tian, H.; Wang, P.; Tansey, K.; Zhang, J.; Zhang, S.; Li, H. An LSTM Neural Network for Improving Wheat Yield Estimates by Integrating Remote Sensing Data and Meteorological Data in the Guanzhong Plain, PR China. Agric. For. Meteorol. 2021, 310, 108629. [Google Scholar] [CrossRef]
- Dharani, M.K.; Thamilselvan, R.; Natesan, P.; Kalaivaani, P.; Santhoshkumar, S. Review on Crop Prediction Using Deep Learning Techniques. J. Phys. Conf. Ser. 2021, 1767, 012026. [Google Scholar] [CrossRef]
- Koirala, A.; Walsh, K.; Wang, Z.; McCarthy, C. Deep Learning for Real-Time Fruit Detection and Orchard Fruit Load Estimation: Benchmarking of ‘MangoYOLO’. Precis. Agric. 2019, 20, 1107–1135. [Google Scholar] [CrossRef]
- Villacrés, J.F.; Auat Cheein, F. Detection and Characterization of Cherries: A Deep Learning Usability Case Study in Chile. Agronomy 2020, 10, 835. [Google Scholar] [CrossRef]
- Chi, H.-C.; Sarwar, M.A.; Daraghmi, Y.-A.; Lin, K.-W.; Ik, T.-U.; Li, Y.-L. Smart Self-Checkout Carts Based on Deep Learning for Shopping Activity Recognition. In Proceedings of the 2020 21st Asia-Pacific Network Operations and Management Symposium (APNOMS), Daegu, Republic of Korea, 22–25 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 185–190. [Google Scholar]
- Sarwar, M.A.; Daraghmi, Y.-A.; Liu, K.-W.; Chi, H.-C.; Ik, T.-U.; Li, Y.-L. Smart Shopping Carts Based on Mobile Computing and Deep Learning Cloud Services. In Proceedings of the 2020 IEEE Wireless Communications and Networking Conference (WCNC), Seoul, Republic of Korea, 25–28 May 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
- Yaseen, M. What is YOLOv8: An in-Depth Exploration of the Internal Features of the Next-Generation Object Detector. arXiv 2024, arXiv:2408.15857. [Google Scholar]
- Lin, T.-Y.; Goyal, P.; Girshick, R.; He, K.; Dollár, P. Focal Loss for Dense Object Detection. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 2980–2988. [Google Scholar]
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149. [Google Scholar] [CrossRef] [PubMed]
- Saffarini, R.; Khamayseh, F.; Awwad, Y.; Sabha, M.; Eleyan, D. Dynamic Generative R-CNN. Neural Comput. Appl. 2025, 37, 7107–7120. [Google Scholar] [CrossRef]
Ref. | Context | Method | Evaluation Metric | Value | Inference Time |
---|---|---|---|---|---|
[25] | Potato stress detection | Retina-UNet | Dice score coefficient (DSC) | 74% | NA |
[19] | Fusarium wilt of radish detection | Customized deep learning | Recall | 96.0% | NA |
[21] | Advanced agriculture monitoring | EfficientNet-B3 (modified CNN) | Precision | 97.0% | NA |
[42] | Leaf disease detection | ResNet125 | Recall | 85.0% | NA |
[29] | Wheat yellow rust detection | Ir-UNet | Recall | 95.0% | NA |
[38] | Macronutrient deficiencies in tomato plant development | Inception-ResNet | Recall | 91.22% | NA |
[39] | Biotic and abiotic paddy crop stresses | Back Propagation Neural Network (BPNN) | Accuracy | 89.12% | NA |
[24] | Corn damage assessment | YOLOv2 | Precision | 97.0% | NA |
[24] | Corn damage assessment | Faster R-CNN | Precision | 77.29% | NA |
[24] | Corn damage assessment | RetinaNet | Precision | 93.87% | NA |
[40] | Multisource plant disease diagnosis | AgriFusionNet | Precision | 94.1% | 28.5 |
Class Name | Small Scale | Medium Scale | Large Scale | Total |
---|---|---|---|---|
Healthy | 1812 | 216 | 13 | 2041 |
Stressed | 2931 | 186 | 0 | 3117 |
Total | 4743 | 402 | 13 | 5158 |
Dataset Subset | Healthy | Stressed | Total |
---|---|---|---|
Training original data | 1640 | 2384 | 4024 |
Training augmented data | 8200 | 11,920 | 20,120 |
Testing | 401 | 734 | 1135 |
YOLOv8 Variant | Precision | Recall | mAP@50 | mAP@50–95 |
---|---|---|---|---|
Nano—Original dataset | 0.754 | 0.738 | 0.798 | 0.619
Nano—Augmented dataset | 0.806 | 0.740 | 0.826 | 0.719
Small—Original dataset | 0.758 | 0.748 | 0.809 | 0.661
Small—Augmented dataset | 0.787 | 0.792 | 0.847 | 0.771
Medium—Original dataset | 0.817 | 0.744 | 0.847 | 0.730
Medium—Augmented dataset | 0.865 | 0.764 | 0.869 | 0.796
Large—Original dataset | 0.821 | 0.775 | 0.854 | 0.750
Large—Augmented dataset | 0.836 | 0.805 | 0.881 | 0.819
X-Large—Original dataset | 0.828 | 0.769 | 0.861 | 0.763
X-Large—Augmented dataset | 0.878 | 0.773 | 0.885 | 0.820
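The mAP@50 and mAP@50–95 figures reported in these tables rest on two building blocks: intersection-over-union (IoU) matching of predicted boxes to ground truth, and average precision (AP) over a confidence-ranked list of detections. A simplified single-class sketch follows (all-point interpolation rather than the COCO 101-point variant, and assuming at least one ground-truth box):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def average_precision(tp_flags, num_gt):
    """AP as the area under the precision-recall curve.

    tp_flags: True/False per detection, already sorted by descending
    confidence, where True means the detection matched an unmatched
    ground-truth box at the chosen IoU threshold (e.g., 0.5 for mAP@50).
    """
    tp = fp = 0
    ap, prev_recall = 0.0, 0.0
    for is_tp in tp_flags:
        tp += is_tp
        fp += not is_tp
        recall = tp / num_gt
        precision = tp / (tp + fp)
        ap += (recall - prev_recall) * precision  # rectangle under the curve
        prev_recall = recall
    return ap
```

mAP@50 averages AP over classes at an IoU threshold of 0.50; the stricter mAP@50–95 additionally averages over IoU thresholds from 0.50 to 0.95 in steps of 0.05, which is why its values in the tables are consistently lower.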
YOLOv8 Variant | Inference Time (ms) Original Dataset | Inference Time (ms) Augmented Dataset |
---|---|---|
Nano | 11.8 | 11.7 |
Small | 12.4 | 13.4 |
Medium | 25.8 | 25.7 |
Large | 40.7 | 40.9 |
X-Large | 134.3 | 133.3 |
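Per-image inference times like those above are typically obtained by averaging many timed forward passes after a few warm-up runs. A generic sketch is shown below; `model_fn` is a hypothetical placeholder for any callable detector, not the authors' pipeline:

```python
import time

def mean_inference_ms(model_fn, inputs, warmup=5, repeats=50):
    """Average wall-clock latency in milliseconds of model_fn over inputs.

    Warm-up iterations are discarded so one-time costs (weight loading,
    JIT compilation, allocator growth) do not inflate the average.
    """
    for x in inputs[:warmup]:
        model_fn(x)
    start = time.perf_counter()
    n = 0
    for _ in range(repeats):
        for x in inputs:
            model_fn(x)
            n += 1
    return (time.perf_counter() - start) / n * 1000.0
```

On a GPU, an explicit device synchronization (e.g., `torch.cuda.synchronize()` in PyTorch) is needed before each clock read; otherwise asynchronously queued kernels make the measured latency look unrealistically small.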
RetinaNet Variant | mAP@50 | mAP@75 | mAP@50–95 |
---|---|---|---|
ResNet50—Original dataset | 0.587 | 0.319 | 0.339
ResNet50—Augmented dataset | 0.657 | 0.420 | 0.430
ResNet101—Original dataset | 0.621 | 0.354 | 0.362
ResNet101—Augmented dataset | 0.695 | 0.460 | 0.467
ResNeXt101—Original dataset | 0.628 | 0.364 | 0.368
ResNeXt101—Augmented dataset | 0.690 | 0.460 | 0.466
Model | Inference Time (ms) Original Dataset | Inference Time (ms) Augmented Dataset |
---|---|---|
ResNet50 | 118.7 | 133.9 |
ResNet101 | 156.1 | 188.2 |
ResNeXt101 | 158.8 | 190.2 |
Faster R-CNN Variant | mAP@50 | mAP@75 | mAP@50–95 |
---|---|---|---|
ResNet50—Original dataset | 0.565 | 0.278 | 0.293 |
ResNet50—Augmented dataset | 0.666 | 0.405 | 0.411 |
ResNet101—Original dataset | 0.605 | 0.367 | 0.366 |
ResNet101—Augmented dataset | 0.670 | 0.467 | 0.464
ResNeXt101—Original dataset | 0.626 | 0.391 | 0.381 |
ResNeXt101—Augmented dataset | 0.663 | 0.452 | 0.457 |
Model | Inference Time (ms) Original Dataset | Inference Time (ms) Augmented Dataset |
---|---|---|
ResNet50 | 265.2 | 266.2 |
ResNet101 | 280.2 | 287.0 |
ResNeXt101 | 288.1 | 290.3 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Daraghmi, Y.-A.; Naser, W.; Daraghmi, E.Y.; Fouchal, H. Drone-Assisted Plant Stress Detection Using Deep Learning: A Comparative Study of YOLOv8, RetinaNet, and Faster R-CNN. AgriEngineering 2025, 7, 257. https://doi.org/10.3390/agriengineering7080257