Development of a Light-Weight Unmanned Aerial Vehicle for Precision Agriculture
Abstract
1. Introduction
1.1. Unmanned Aerial Vehicles (UAV)
1.2. Quadcopter
- Frame: It provides the physical structure and houses the electric motors and other components.
- Electric motors: There are two types of motors, brushed and brushless. For a quadcopter, brushless motors are preferred due to their high thrust-to-weight ratio. The brushless motor spins the propeller, which generates lift.
- Propellers: They come in different sizes and materials and are specified by diameter and pitch in the format diameter × pitch. The pitch is the distance the propeller would travel forward in one revolution, while the diameter is the length of the propeller from tip to tip.
- Electronic speed controller: These are electronic devices that receive PWM (pulse-width modulation) signals from the flight controller and regulate the power delivered to the electric motors accordingly (a minimal sketch of this signal mapping follows this list).
- Batteries: The battery supplies direct current (DC) power to the electric motors and all other quadcopter components via the electronic speed controllers. Batteries come in different shapes, sizes, and chemistries, including alkaline, lead-acid, nickel-cadmium, lithium-sulphur (Li-S), etc. Lithium polymer (Li-Po) batteries are extensively used in quadcopters due to their high energy density.
- Power distribution board: It distributes power from the battery evenly to the electronic speed controllers.
- Flight controller: This is the brain of the UAV, as it controls the power delivered to each electric motor based on signals received from the transmitter and onboard sensors.
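To make the ESC signal relationship concrete, the following is a minimal illustrative sketch, not part of the build described in this paper (where the APM flight controller, not the Raspberry Pi, drives the ESCs). It shows how a throttle command maps to the 1000–2000 µs pulse train a typical ESC expects; the GPIO pin number and the use of the RPi.GPIO software PWM are assumptions for demonstration only.

```python
# Illustrative only: map a throttle command to the 1000-2000 us ESC pulse width.
# In the actual build the APM 2.8 flight controller generates these signals;
# the pin number and library choice are assumptions.
import RPi.GPIO as GPIO
import time

ESC_PIN = 18          # hypothetical GPIO pin wired to an ESC signal lead
FRAME_HZ = 50         # standard ESC/servo frame rate (20 ms period)

GPIO.setmode(GPIO.BCM)
GPIO.setup(ESC_PIN, GPIO.OUT)
pwm = GPIO.PWM(ESC_PIN, FRAME_HZ)
pwm.start(0)

def throttle_to_duty(throttle):
    """Map throttle in [0, 1] to a 1000-2000 us pulse expressed as % duty cycle."""
    pulse_us = 1000 + 1000 * max(0.0, min(1.0, throttle))
    return pulse_us / 20000 * 100   # 20,000 us frame at 50 Hz

try:
    pwm.ChangeDutyCycle(throttle_to_duty(0.0))   # arm at minimum throttle
    time.sleep(2)
    pwm.ChangeDutyCycle(throttle_to_duty(0.3))   # spin up to 30% throttle
    time.sleep(5)
finally:
    pwm.stop()
    GPIO.cleanup()
```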
2. Computer Vision Technology and Machine Learning
2.1. Deep Learning Algorithm
2.2. Embedded Systems
2.3. Raspberry Pi
2.4. Sprayer Module
3. Related Research Works
- To train a CNN model through transfer learning on a pre-trained ResNet50 model (a minimal illustrative sketch of this step follows this list).
- To deploy the trained model on a Raspberry Pi 3B and incorporate a sprayer module.
- To build a quadcopter from a readily available kit.
- To mount the Raspberry Pi together with the sprayer module on the quadcopter and test run.
- To proffer noteworthy recommendations.
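The paper's training code is not reproduced here; as an illustration of the first two objectives, the sketch below shows transfer learning on a pre-trained ResNet50 in TensorFlow/Keras followed by conversion to TensorFlow Lite for the Raspberry Pi 3B. The dataset directory name, image size, and hyperparameters are assumptions and may differ from the configuration actually used by the authors.

```python
# Minimal sketch of transfer learning on a pre-trained ResNet50 (TensorFlow/Keras).
# Dataset path, image size, and hyperparameters are assumptions for illustration.
import tensorflow as tf

IMG_SIZE = (224, 224)                 # ResNet50's native input resolution
DATA_DIR = "soybean_dataset"          # hypothetical folder: one sub-folder per class

train_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, validation_split=0.2, subset="training", seed=42,
    image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, validation_split=0.2, subset="validation", seed=42,
    image_size=IMG_SIZE, batch_size=32)
num_classes = len(train_ds.class_names)

# Frozen ResNet50 backbone (ImageNet weights) with a new classification head.
base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                      input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.resnet50.preprocess_input(inputs)
x = base(x, training=False)
outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)

# Convert to TensorFlow Lite for deployment on the Raspberry Pi 3B.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("weed_classifier.tflite", "wb") as f:
    f.write(tflite_model)
```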
4. Materials and Methods
4.1. Acquisition of Datasets
4.2. Training the CNN Model
4.3. Embedded System for the Smart Herbicide Sprayer
4.4. Setup for the Raspberry Pi for DL Applications
4.5. Materials for the Smart Herbicide Sprayer
4.6. Calculation for Component Compatibility
4.6.1. ESC Calculation
4.6.2. Battery Calculation
4.6.3. Thrust Calculation
4.7. Assembling the Smart Herbicide Sprayer
4.7.1. Procedures for Assembling the Smart Herbicide Sprayer
4.8. Principle of Operation
5. Results
5.1. Outcome of the Model Training Procedure
5.2. Outcome of Incorporating the Sprayer Module with the Raspberry Pi
5.3. Assembling the Quadcopter Kit
5.4. Test-Running the Smart Weed Detector and Selective Herbicide Sprayer
5.5. Discussion
6. Conclusions and Recommendations
6.1. Conclusions
- A CNN model was trained through transfer learning on the soybean dataset, converted to TensorFlow Lite format, and deployed on the Raspberry Pi 3. Training and validation accuracies of 99.98% and 98.4%, respectively, were obtained with an insignificant variance error. Moreover, training and validation losses of 0.0039 and 0.0323, respectively, indicate that the proposed model was appropriate.
- A quadcopter was built, and its components were assembled from light-weight, strong materials to improve its efficiency. It consists of the electric motors, ESCs, propellers, frame, Li-Po battery, flight controller, GPS, and receiver, and it performed as planned.
- A sprayer module consisting of a relay, a Raspberry Pi 3, a spray pump, a 12 V DC source, a water hose, and a tank was built. When a weed was detected by the deep learning algorithm deployed on the Raspberry Pi, GPIO 17 or GPIO 18 was activated to supply 3.3 V, which turned on a DC relay to spray the corresponding herbicide (a minimal sketch of this control flow follows this list).
- The sprayer module was mounted on the quadcopter, and during the test-run, broadleaf and grass weeds were accurately detected and the herbicide corresponding to each weed type was sprayed in less than a second.
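The sketch below illustrates the control flow described above: classify a camera frame with the TensorFlow Lite model and pulse GPIO 17 or GPIO 18 to drive the relay and pump. The model file name, class label ordering, capture method (picamera with the Raspberry Pi Camera V2), and spray duration are assumptions for illustration, not the authors' deployment code.

```python
# Minimal sketch of the sprayer control flow: classify a camera frame with the
# TFLite model and pulse GPIO 17 or GPIO 18 to drive the relay/pump.
# Model path, class ordering, capture method, and spray duration are assumptions.
import time
import numpy as np
import RPi.GPIO as GPIO
from tflite_runtime.interpreter import Interpreter
from picamera import PiCamera

BROADLEAF_PIN, GRASS_PIN = 17, 18                         # relay input pins
CLASS_PINS = {"broadleaf": BROADLEAF_PIN, "grass": GRASS_PIN}
CLASS_NAMES = ["broadleaf", "grass", "soil", "soybean"]   # assumed label order

GPIO.setmode(GPIO.BCM)
GPIO.setup([BROADLEAF_PIN, GRASS_PIN], GPIO.OUT, initial=GPIO.LOW)

interpreter = Interpreter(model_path="weed_classifier.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
_, height, width, _ = inp["shape"]

camera = PiCamera(resolution=(width, height))

def classify(frame):
    """Run one forward pass and return the predicted class name."""
    interpreter.set_tensor(inp["index"], frame[np.newaxis].astype(np.float32))
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return CLASS_NAMES[int(np.argmax(scores))]

try:
    frame = np.empty((height, width, 3), dtype=np.uint8)
    while True:
        camera.capture(frame, format="rgb")
        label = classify(frame)
        if label in CLASS_PINS:                           # weed detected
            GPIO.output(CLASS_PINS[label], GPIO.HIGH)     # 3.3 V turns the relay on
            time.sleep(0.5)                               # brief spray burst
            GPIO.output(CLASS_PINS[label], GPIO.LOW)
finally:
    GPIO.cleanup()
```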
6.2. Recommendations for Future Research
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Parameters | Value Obtained |
---|---|
Minimum amperage | 14.4 A |
Maximum amperage | 18 A |
Discharge current | 247.5 A |
Maximum current drawn by the motors | 80 A |
Thrust provided by the propellers | 340.47 N |
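As a cross-check of the table above, the battery discharge current follows directly from the listed battery specifications (5500 mAh at 45 C). The 1.2× and 1.5× ESC headroom factors in the sketch below are an assumption on my part that happens to reproduce the tabulated minimum and maximum amperage from the motor's 12 A rating; they are not taken from the paper.

```python
# Illustrative check of figures in the table above, using the component
# specifications listed later in this paper. Only the discharge-current
# relation follows directly from the battery label; the 1.2x/1.5x ESC
# headroom factors are an assumption that reproduces the tabulated values.
battery_capacity_ah = 5.5      # ZOP Power battery: 5500 mAh
battery_c_rating = 45          # continuous discharge rate: 45 C
motor_rated_current_a = 12.0   # XXD A2212 current rating: 12 A

discharge_current = battery_capacity_ah * battery_c_rating
print(f"Battery discharge current: {discharge_current:.1f} A")      # 247.5 A

min_esc_amperage = 1.2 * motor_rated_current_a                      # 14.4 A
max_esc_amperage = 1.5 * motor_rated_current_a                      # 18.0 A
print(f"ESC amperage range: {min_esc_amperage:.1f}-{max_esc_amperage:.1f} A")
```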
S/N | Name of Component | Specification
---|---|---
1 | XXD A2212 brushless outrunner electric motor | KV: 1000; Current rating: 12 A/60 s; No-load current (10 V): 0.5 A; Number of cells: 2–3-cell lithium polymer battery; Dimensions: Ø27.5 × 30 mm
2 | Electronic speed controller | Continuous current: 30 A; Burst current: 40 A; Input voltage: 2–3-cell lithium polymer battery; BEC: 2 A/5 V (linear mode); Size: 45 mm (L) × 24 mm (W) × 11 mm (H); Weight: 25 g
3 | APM 2.8 flight controller | 3-axis gyroscope, accelerometer, and barometer; 4 MB dataflash chip; optional off-board GPS LEA-6CH module with compass
4 | Propeller | Diameter: 10″; Pitch: 4.5″
5 | Flysky FS-i6 6-channel transmitter/receiver set | Transmitter: 6 channels; Modulation type: GFSK; RF range: 2.408–2.475 GHz; Bandwidth: 500 kHz; Receiver: FS-iA6, 6 channels
6 | BMT mini submersible water pump | Working voltage: 12 V DC; Working current: 400 mA; Max flow: 240 L/h; Pump life: >30,000 h; Inlet diameter: 8 mm; Outlet diameter: 8 mm
7 | ZOP Power 11.1 V 3S lithium polymer battery | Capacity: 5500 mAh; Continuous discharge rate: 45 C; Size: 40 × 46.5 × 138 mm
8 | B6 V3 smart balance charger | DC input voltage: 11–18 V; Charge power: 80 W; Discharge power: 10 W; Charge current range: 0.1–6 A; Discharge current range: 0.1–2 A
9 | Single-channel 5 V DC 10 A relay module development board | Control voltage: 5 V DC; Max control capacity: 10 A @ 250 V AC or 10 A @ 30 V DC; Size: 41 × 16 × 16 mm; Weight: 12 g
S/N | Components | Cost (Rands)
---|---|---
1 | Quadcopter kit | 3120
2 | ZOP Power 5500 mAh 11.1 V Li-Po battery | 2250
3 | Balance smart battery charger | 675
4 | Raspberry Pi 3B kit | 1400
5 | Raspberry Pi Camera V2 | 582
6 | CMU Flysky FS-i6 6-channel transmitter/receiver set | 1451
7 | Single-channel 5 V DC 10 A relay module development board | 61
8 | BMT mini submersible water pump | 153
9 | Anker PowerCore 13,000 mAh power bank | 805
10 | Universal tail landing gear skid for DJI F450 | 236
11 | Pressure washer spray nozzle | 311
12 | Pixnor APM 2.6 MWC GPS compass antenna folding fixed mount bracket | 214
13 | Mirthhobby RC anti-vibration plate | 328
14 | Battery holder, 8× AA, with leads, black | 21
15 | Battery connector male plug | 19
16 | AA-size battery, 1.5 V alkaline, 24 pieces/pack | 120
 | Total | R11,746 (≈790 USD)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).