Smart Agriculture Drone for Crop Spraying Using Image-Processing and Machine Learning Techniques: Experimental Validation
Abstract
1. Introduction
2. Literature Review
3. Methodology
3.1. Image-Processing System
3.1.1. Image Capture and Preprocessing
3.1.2. Annotation and Data Setup
3.1.3. Model Training and Evaluation
3.1.4. Deployment and Testing
3.2. Crop-Spraying System
3.2.1. Hardware Setup and Calibration
3.2.2. Spray Integration
3.3. Drone Flight System
3.3.1. Hardware Setup and Flight Control System
3.3.2. Software Setup and Flight Control
3.3.3. Calibration and Pre-Flight Testing
3.3.4. Flight and Mission Testing
3.4. Overall System
3.4.1. System Integration Process
3.4.2. System Testing
3.4.3. Evaluation and Optimisation
4. Results and Discussion
5. Conclusions and Future Works
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
ANN | Artificial Neural Network |
BLDC | Brushless DC Motor |
CNN | Convolutional Neural Network |
CNN-GRU | Convolutional Neural Network-Gated Recurrent Unit |
CMS | Crop-Monitoring System |
CWSI | Crop Water Stress Index |
DJI | Da-Jiang Innovations |
DL | Deep Learning |
DNN | Deep Neural Network |
FANET | Flying Ad Hoc Network |
FPS | Frames Per Second |
GRU | Gated Recurrent Unit |
IMU | Inertial Measurement Unit |
IoT | Internet of Things |
LiPo | Lithium Polymer |
LR | Logistic Regression |
ML | Machine Learning |
MLP | Multilayer Perceptron |
MP | Mission Planner |
MS | Multispectral |
MSI | Multispectral Imaging |
NMD | Number Median Diameter |
PA | Precision Agriculture |
PID | Proportional-Integral-Derivative |
PLSR | Partial Least Squares Regression |
RF | Random Forest |
RFR | Random Forest Regression |
RGB | Red Green Blue |
RSMPA-DLFCC | Robust Softmax with Maximum Probability Analysis-Deep Local Feature Cross-Correlation |
SBODL-FCC | Sigmoid-Based Objective Detection Layer-Feature Cross-Correlation |
SVM | Support Vector Machine |
TF | TensorFlow |
TIR | Thermal Infrared |
UAV | Unmanned Aerial Vehicle |
VGG | Visual Geometry Group |
VMD | Volume Median Diameter |
References
- Pajon, C. Food Systems in the Pacific: Addressing Challenges in Cooperation with Europe; Briefings de l’Ifri: Paris, France, 2023.
- Berry, F.; Bless, A.; Davila Cisneros, F. Evidence Brief UNFSS Pacific Country Food System Pathways Analysis. In Proceedings of the United Nations Food Systems Summit (UNFSS), New York, NY, USA, 23 September 2021.
- Karunathilake, E.M.B.M.; Le, A.T.; Heo, S.; Chung, Y.S.; Mansoor, S. The Path to Smart Farming: Innovations and Opportunities in Precision Agriculture. Agriculture 2023, 13, 1593.
- Kumar, S.; Lal, R.; Ali, A.; Rollings, N.; Mehta, U.; Assaf, M. A Real-Time Fish Recognition Using Deep Learning Algorithms for Low-Quality Images on an Underwater Drone. In Artificial Intelligence and Sustainable Computing, Proceedings of the ICSISCET 2023, Gwalior, India, 21–22 October 2023; Pandit, M., Gaur, M.K., Kumar, S., Eds.; Springer: Singapore; Berlin/Heidelberg, Germany, 2024; pp. 67–79.
- Verma, G.; Gupta, Y.; Malik, A.M.; Chapman, B. Performance Evaluation of Deep Learning Compilers for Edge Inference. In Proceedings of the 2021 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), New Orleans, LA, USA, 18–22 May 2020; pp. 858–865.
- Tan, M.; Pang, R.; Le, Q.V. EfficientDet: Scalable and Efficient Object Detection. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020; pp. 10778–10787.
- Chand, A.A.; Prasad, K.A.; Mar, E.; Dakai, S.; Mamun, K.A.; Islam, F.R.; Mehta, U.; Kumar, N.M. Design and Analysis of Photovoltaic Powered Battery-Operated Computer Vision-Based Multi-Purpose Smart Farming Robot. Agronomy 2021, 11, 530.
- Bouguettaya, A.; Zarzour, H.; Kechida, A.; Taberkit, A.M. Design and Development of Object Detection System in Augmented Reality Based Indoor Navigation Application. In Proceedings of the 2022 International Conference on Electrical Engineering and Informatics, Banda Aceh, Indonesia, 27–28 September 2022; pp. 89–94.
- Renella, N.T.; Bhandari, S.; Raheja, A. Machine learning models for detecting and isolating weeds from strawberry plants using UAVs. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII, Orlando, FL, USA, 30 April–5 May 2023; Bauer, C., Thomasson, J.A., Eds.; SPIE: Orlando, FL, USA, 2023; p. 8.
- Yallappa, D.; Veerangouda, M.; Maski, D.; Palled, V.; Bheemanna, M. Development and evaluation of drone mounted sprayer for pesticide applications to crops. In Proceedings of the 2017 IEEE Global Humanitarian Technology Conference (GHTC), San Jose, CA, USA, 19–22 October 2017; pp. 1–7.
- Hafeez, A.; Husain, M.A.; Singh, S.; Chauhan, A.; Khan, M.T.; Kumar, N.; Chauhan, A.; Soni, S. Implementation of drone technology for farm monitoring & pesticide spraying: A review. Inf. Process. Agric. 2023, 10, 192–203.
- Jubair, M.A.; Hossain, S.; Masud, M.A.A.; Hasan, K.M.; Newaz, S.H.S.; Ahsan, M.S. Design and development of an autonomous agricultural drone for sowing seeds. In Proceedings of the 7th Brunei International Conference on Engineering and Technology 2018 (BICET 2018), Bandar Seri Begawan, Brunei, 12–14 November 2018; p. 101.
- Piriyasupakij, J.; Prasitphan, R. Design and Development of Autonomous Drone to Detect and Alert Intruders in Surveillance Areas. In Proceedings of the 2023 8th International Conference on Business and Industrial Research (ICBIR), Bangkok, Thailand, 18–19 May 2023; pp. 1094–1099.
- Naufal, R.I.; Karna, N.; Shin, S.Y. Vision-based Autonomous Landing System for Quadcopter Drone Using OpenMV. In Proceedings of the 2022 13th International Conference on Information and Communication Technology Convergence (ICTC), Jeju Island, Republic of Korea, 19–21 October 2022; pp. 1233–1237.
- Ukaegbu, U.; Mathipa, L.; Malapane, M.; Tartibu, L.K.; Olayode, I.O. Deep Learning for Smart Plant Weed Applications Employing an Unmanned Aerial Vehicle. In Proceedings of the 2022 IEEE 13th International Conference on Mechanical and Intelligent Manufacturing Technologies (ICMIMT), Cape Town, South Africa, 25–27 May 2022; pp. 321–325.
- Kiropoulos, K.; Tsouros, D.C.; Dimaraki, F.; Triantafyllou, A.; Bibi, S.; Sarigiannidis, P.; Angelidis, P. Monitoring Saffron Crops with UAVs. Telecom 2022, 3, 301–321.
- Vijayalakshmi, K.; Al-Otaibi, S.; Arya, L.; Almaiah, M.A.; Anithaashri, T.P.; Karthik, S.S.; Shishakly, R. Smart Agricultural–Industrial Crop-Monitoring System Using Unmanned Aerial Vehicle–Internet of Things Classification Techniques. Sustainability 2023, 15, 11242.
- Aliyari, M.; Droguett, E.L.; Ayele, Y.Z. UAV-Based Bridge Inspection via Transfer Learning. Sustainability 2021, 13, 11359.
- Munawar, H.S.; Ullah, F.; Qayyum, S.; Khan, S.I.; Mojtahedi, M. UAVs in Disaster Management: Application of Integrated Aerial Imagery and Convolutional Neural Network for Flood Detection. Sustainability 2021, 13, 7547.
- Shahi, T.B.; Xu, C.Y.; Neupane, A.; Guo, W. Recent Advances in Crop Disease Detection Using UAV and Deep Learning Techniques. Remote Sens. 2023, 15, 2450.
- Som-ard, J.; Atzberger, C.; Izquierdo-Verdiguier, E.; Vuolo, F.; Immitzer, M. Remote Sensing Applications in Sugarcane Cultivation: A Review. Remote Sens. 2021, 13, 4040.
- Zualkernan, I.; Abuhani, D.A.; Hussain, M.H.; Khan, J.; ElMohandes, M. Machine Learning for Precision Agriculture Using Imagery from Unmanned Aerial Vehicles (UAVs): A Survey. Drones 2023, 7, 382.
- de Queiroz Otone, J.D.; Theodoro, G.d.F.; Santana, D.C.; Teodoro, L.P.R.; de Oliveira, J.T.; de Oliveira, I.C.; da Silva Junior, C.A.; Teodoro, P.E.; Baio, F.H.R. Hyperspectral Response of the Soybean Crop as a Function of Target Spot (Corynespora cassiicola) Using Machine Learning to Classify Severity Levels. AgriEngineering 2024, 6, 330–343.
- Mohidem, N.A.; Che’Ya, N.N.; Juraimi, A.S.; Fazlil Ilahi, W.F.; Mohd Roslim, M.H.; Sulaiman, N.; Saberioon, M.; Mohd Noor, N. How Can Unmanned Aerial Vehicles Be Used for Detecting Weeds in Agricultural Fields? Agriculture 2021, 11, 1004.
- Almasoud, A.S.; Mengash, H.A.; Saeed, M.K.; Alotaibi, F.A.; Othman, K.M.; Mahmud, A. Remote Sensing Imagery Data Analysis Using Marine Predators Algorithm with Deep Learning for Food Crop Classification. Biomimetics 2023, 8, 535.
- Ampatzidis, Y.; Partel, V. UAV-Based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence. Remote Sens. 2019, 11, 410.
- De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285.
- Mao, Y.; Li, H.; Wang, Y.; Wang, H.; Shen, J.; Xu, Y.; Ding, S.; Wang, H.; Ding, Z.; Fan, K. Rapid monitoring of tea plants under cold stress based on UAV multi-sensor data. Comput. Electron. Agric. 2023, 213, 108176.
- Xue, X.; Lan, Y.; Sun, Z.; Chang, C.; Hoffmann, W.C. Develop an unmanned aerial vehicle based automatic aerial spraying system. Comput. Electron. Agric. 2016, 128, 58–66.
- Primicerio, J.; Di Gennaro, S.F.; Fiorillo, E.; Genesio, L.; Lugato, E.; Matese, A.; Vaccari, F.P. A flexible unmanned aerial vehicle for precision agriculture. Precis. Agric. 2012, 13, 517–523.
- Shankaran, V.P.; Azid, S.I.; Mehta, U.; Fagiolini, A. Improved Performance in Quadrotor Trajectory Tracking Using MIMO PIμ-D Control. IEEE Access 2022, 10, 110646–110660.
Ref. | Focus Area | ML Method | Equipment | Performance |
---|---|---|---|---|
[16] | Saffron | RF, MLP | RGB, MS cameras | Accuracy: RF 100%, MLP 95% |
[17] | Agricultural weed, soil, and crop analysis | CMS, FANET, CNN, data fusion | Electrochemical sensors, DJI Inspire 2 | Accuracy: CMS 84%, FANET 87%, CNN 91%, data fusion 94% |
[18] | Bridge inspection (non-agricultural) | VGG16, ResNet50, ResNet152v2, Xception, InceptionV3, DenseNet201 | Spectroradiometer, MS camera | Accuracy: up to 83% |
[19] | Flood detection (environmental monitoring) | CNN | Spectroradiometer, MS camera | Accuracy: 91% |
[20] | Multiple crops (wheat, maize, potato, grape, banana, coffee, radish, sugar, cotton, soybean, tea) | RF, SVM, ANN, CNN | RGB, MS, hyperspectral, thermal | Accuracy: ML 72–98%, DL 85–100% |
[21] | Sugarcane cultivation analysis | RF, CNN | Landsat, Sentinel-2 MSI | Accuracy: RF 80–98%, CNN 95% |
[22] | General crop and weed detection | SVM, CNN | Imaging sensors, RGB, MS, hyperspectral, thermal | Accuracy: SVM 88.01%, CNN 96.3% |
[23] | Soybean cultivation and health analysis | LR, SVM | Spectroradiometer, MS camera | Accuracy: RF 76%, SVM 85% |
[24] | Weed identification | AlexNet, CNN, SVM | RGB, MS, hyperspectral, thermal | Accuracy: AlexNet 99.8%, CNN 98.8%, SVM 94% |
[25] | Various crops (maize, banana, forest, legume, structural, others) analysis | RSMPA-DLFCC, SBODL-FCC, DNN, AlexNet, VGG-16, ResNet, SVM | UAV dataset | Accuracy: RSMPA-DLFCC 98.22%, SBODL-FCC 97.43%, DNN 86.23%, AlexNet 90.49%, VGG-16 90.35%, ResNet 87.7%, SVM 86.69%
[26] | Citrus tree health and growth monitoring | CNNs | MS camera (RedEdge-M) | Accuracy: overall 99.8%, tree gaps 94.2%, canopy 85.5% |
[27] | Sunflower and cotton cultivation analysis | RF | RGB camera (Sony ILCE-6000) | Accuracy: sunflower 81%, cotton 84% |
[28] | Tea plant cold injury assessment | CNN-GRU, GRU, PLSR, SVR, RFR | MS, TIR, RGB | RMSEP = 0.138; RPD = 2.220
[29] | Automatic aerial spraying system | MSP430 single-chip microcomputer | GPS, spray uniformity tests | Route deviation: 0.2 m; coefficient of variation: 25% |
[30] | Vineyard management | NDVI-based classification | Six-rotor UAV, pitch-and-roll-compensated multispectral camera | Good agreement with ground-based observations |
Parameter | Value | Unit |
---|---|---|
Drone mass | 1 | kg |
Arm length | 0.25 | m |
Moment of inertia about x | 0.063 | kg·m2
Moment of inertia about y | 0.063 | kg·m2
Moment of inertia about z | 0.063 | kg·m2
Motor velocity constant (KV) | 1293 | RPM/V
Motor torque (at 100% throttle) | 0.18 | N·m
Gravitational acceleration | 9.8 | m/s2
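As a quick consistency check on the parameters above, the thrust required at hover follows directly from the drone mass and gravitational acceleration. The sketch below assumes a quadcopter layout with four identical rotors (the per-motor split is our illustrative assumption, not a figure from the paper):

```python
# Hover-thrust sanity check using the drone parameters from the table above.
M = 1.0       # drone mass (kg)
G = 9.8       # gravitational acceleration (m/s^2)
N_MOTORS = 4  # assumed quadcopter configuration

def hover_thrust(mass_kg: float, g: float = G) -> float:
    """Total thrust (N) needed to balance gravity at hover."""
    return mass_kg * g

total = hover_thrust(M)       # total thrust at hover, in newtons
per_motor = total / N_MOTORS  # even split across the four rotors
print(f"Total hover thrust: {total:.2f} N ({per_motor:.3f} N per motor)")
```

Each motor must therefore supply well under its maximum thrust at hover, leaving headroom for the added spray payload and attitude corrections.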
Model Architecture | Pineapple (%) | Papaya (%) | Cabbage (%) |
---|---|---|---|
EfficientDet-Lite0 | 70.74 | 76.39 | 88.53
EfficientDet-Lite1 | 75.34 | 79.11 | 89.71
EfficientDet-Lite2 | 76.89 | 83.13 | 91.86
EfficientDet-Lite3 | 78.07 | 84.95 | 92.30
Model Architecture | Inference Time (ms), without Edge TPU | File Size (MB), without Edge TPU | Inference Time (ms), with Edge TPU | File Size (MB), with Edge TPU |
---|---|---|---|---|
EfficientDet-Lite0 | 125 | 5.12 | 56 | 5.57
EfficientDet-Lite1 | 185 | 7.38 | 91 | 7.65
EfficientDet-Lite2 | 370 | 9.41 | 250 | 9.74
EfficientDet-Lite3 | 1250 | 11.95 | 500 | 12.32
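Latency figures like those above are typically obtained by timing repeated inference calls after a few warm-up runs. The harness below is a generic sketch using only the standard library; `run_inference` is a hypothetical stub standing in for an actual TFLite interpreter invocation, not the authors' benchmarking code:

```python
import time
from statistics import mean

def benchmark(fn, warmup: int = 5, runs: int = 50) -> float:
    """Return the mean latency of fn() in milliseconds, after warm-up runs."""
    for _ in range(warmup):  # warm-up: excludes one-off setup costs (caches, JIT, etc.)
        fn()
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - t0) * 1e3)  # seconds -> ms
    return mean(samples)

def run_inference():
    # Hypothetical stand-in for interpreter.invoke() on a loaded
    # EfficientDet-Lite model; replace with a real TFLite call in practice.
    sum(i * i for i in range(1000))

print(f"mean latency: {benchmark(run_inference):.3f} ms")
```

Averaging over many runs and discarding warm-up iterations matters here, since the first invocation of an Edge TPU-delegated model includes model transfer overhead that would skew a single-shot measurement.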
Parameters | X | Y | Z | Roll | Pitch | Yaw |
---|---|---|---|---|---|---|
Proportional | 1.5 | 1.5 | 3.99 | 0.225 | 0.329 | 1.036 |
Integral | 0.0001 | 0.0001 | 1.56 | 0.225 | 0.329 | 0.104 |
Derivative | 0.05 | 0.05 | 3.10 | 0.009 | 0.011 | 0 |
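The gains above plug into the standard discrete PID control law, u = Kp·e + Ki·∫e dt + Kd·de/dt. The class below is a minimal illustrative sketch of that law using the roll-axis gains from the table; it is not the flight controller's actual implementation:

```python
class PID:
    """Discrete PID controller: u = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None  # no derivative term on the first step

    def update(self, error: float, dt: float) -> float:
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Roll-axis gains from the table above, stepped once at a 100 Hz loop rate.
roll = PID(kp=0.225, ki=0.225, kd=0.009)
u = roll.update(error=0.1, dt=0.01)  # control output for a 0.1 rad roll error
```

Note the zero derivative gain on yaw in the table: pure PI control is a common choice there, since yaw-rate measurements tend to be noisy and a derivative term would amplify that noise.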
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Singh, E.; Pratap, A.; Mehta, U.; Azid, S.I. Smart Agriculture Drone for Crop Spraying Using Image-Processing and Machine Learning Techniques: Experimental Validation. IoT 2024, 5, 250-270. https://doi.org/10.3390/iot5020013