TinyML Classification for Agriculture Objects with ESP32
Abstract
1. Introduction
- Insufficient spatial and temporal resolution of data [8];
- Incorrect use of neural network models [9];
- Poor reliability of systems in various operating conditions [10];
- Low energy efficiency [11].
1. A lightweight hardware and software complex for local, energy-efficient image classification with support for Internet of Things protocols is presented.
2. The effectiveness of the TinyML model is evaluated for the task of classifying agricultural objects from low-resolution data ranging from 32 × 32 to 800 × 600 pixels.
3. The dependence of neural network training dynamics on data quality is shown for a system with limited resources.
2. Materials and Methods
2.1. Consideration of the Terms of Reference
- The means to locally capture images of sufficient quality to be analyzed by a neural network.
- Non-blocking code that asynchronously acquires data from the output neurons and broadcasts it to the user (a minimal main-loop sketch of this kind is given after this list).
- A user interface displaying the captured image and its classification (using Wi-Fi or Bluetooth to communicate with the user).
- The ability to detect three classes: plant-free zone, weed, and wheat. In classification, the classes “plant-free zone” and “weed” are combined for the user into an enlarged class, the “attention zone.” Thus, support for neural computing with a CNN architecture is required. The classification accuracy should be no less than 90%, considering the computational power of the microcontroller.
- A subroutine for local accumulation of the dataset, which will allow us to generate unique data for additional training of the applied neural model.
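These requirements can be illustrated with the following minimal sketch (not the authors' firmware) of a non-blocking capture–classify–serve loop for an ESP32-CAM-class board. The Wi-Fi credentials, the classify_frame() helper, the /status endpoint, and the one-second inference period are assumptions introduced for the example; board-specific camera initialization is omitted, and the actual TinyML inference step is sketched at the end of Section 2.2.

```cpp
// Minimal sketch (not the authors' firmware): non-blocking capture/classify/serve loop.
// classify_frame(), the label ordering and the JSON layout are placeholders.
#include <WiFi.h>
#include <WebServer.h>
#include "esp_camera.h"

static const char* kSsid = "your-ssid";      // assumption: station-mode Wi-Fi
static const char* kPass = "your-password";

WebServer server(80);
static String last_label = "none";
static float  last_score = 0.0f;

// Placeholder for the TinyML inference step (see the interpreter sketch in Section 2.2).
static int classify_frame(const camera_fb_t* fb, float* score) {
  *score = 0.0f;
  return 0;  // example ordering: 0 = plant-free zone, 1 = weed, 2 = wheat
}

static void handle_status() {
  // Report the latest classification to the user as JSON.
  String json = "{\"class\":\"" + last_label + "\",\"score\":" + String(last_score, 3) + "}";
  server.send(200, "application/json", json);
}

void setup() {
  Serial.begin(115200);
  // Camera initialization (pin map, frame size) is board-specific and omitted here.
  WiFi.begin(kSsid, kPass);
  while (WiFi.status() != WL_CONNECTED) delay(100);
  server.on("/status", handle_status);
  server.begin();
}

void loop() {
  static const char* kLabels[] = {"plant-free zone", "weed", "wheat"};
  static uint32_t last_inference_ms = 0;

  server.handleClient();  // keep the web interface responsive

  // Throttle inference so the HTTP handler is never starved by a long forward pass.
  if (millis() - last_inference_ms >= 1000) {
    last_inference_ms = millis();
    camera_fb_t* fb = esp_camera_fb_get();
    if (fb != nullptr) {
      float score = 0.0f;
      int cls = classify_frame(fb, &score);
      last_label = kLabels[cls];
      last_score = score;
      esp_camera_fb_return(fb);
    }
  }
}
```

Calling server.handleClient() on every loop iteration while throttling inference with millis() keeps the web interface responsive even if a single forward pass takes hundreds of milliseconds.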
2.2. Definition of the Hardware and Software Complex
- Extraction and classification of data samples from external datasets for subsequent use in neural network training;
- Preprocessing and augmentation of data, including resizing to multiple resolutions ranging from 512 × 512 to 80 × 60 pixels;
- Construction of training, testing, and validation datasets;
- Visual inspection of compiled datasets to ensure correctness and consistency;
- Definition and implementation of the neural network architecture using TensorFlow, as well as model compilation and training;
- Analysis of training dynamics and evaluation of prediction accuracy;
- Selection of representative datasets, followed by model optimization and quantization for deployment efficiency;
- Assessment of the accuracy of the optimized model;
- Conversion of trained model parameters into a binary format suitable for use in embedded C++ environments;
- Development of microcontroller firmware supporting TensorFlow Lite [38] and implementation of a basic web interface for interaction;
- Integration of the trained neural network into the firmware codebase (a minimal interpreter-setup sketch is given after this list);
- Testing of image classification performance on a physical hardware prototype.
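The firmware-integration step can be illustrated with the canonical interpreter setup of the tflite-micro library [38]. This is a minimal sketch under stated assumptions: the quantized model is assumed to be exported as a C array named g_model_data, the operator list and the 96 KB tensor arena are placeholders that depend on the actual network, and older library versions additionally require a MicroErrorReporter argument in the interpreter constructor.

```cpp
// Minimal tflite-micro integration sketch [38]; symbol names and sizes are assumptions.
#include <cstdint>
#include <cstring>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_data[];   // produced by converting the .tflite file

constexpr int kArenaSize = 96 * 1024;        // working memory for tensors (SRAM/PSRAM)
alignas(16) static uint8_t tensor_arena[kArenaSize];

static tflite::MicroInterpreter* interpreter = nullptr;

bool model_init() {
  const tflite::Model* model = tflite::GetModel(g_model_data);
  if (model->version() != TFLITE_SCHEMA_VERSION) return false;

  // Register only the operators the network actually uses.
  static tflite::MicroMutableOpResolver<5> resolver;
  resolver.AddConv2D();
  resolver.AddMaxPool2D();
  resolver.AddReshape();
  resolver.AddFullyConnected();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter static_interpreter(model, resolver,
                                                     tensor_arena, kArenaSize);
  interpreter = &static_interpreter;
  return interpreter->AllocateTensors() == kTfLiteOk;
}

int model_classify(const int8_t* pixels, size_t n) {
  // Copy the (already quantized) image into the input tensor and run inference.
  TfLiteTensor* input = interpreter->input(0);
  memcpy(input->data.int8, pixels, n);
  if (interpreter->Invoke() != kTfLiteOk) return -1;

  // Pick the output neuron with the highest activation,
  // e.g. 0 = plant-free zone, 1 = weed, 2 = wheat.
  TfLiteTensor* output = interpreter->output(0);
  int best = 0;
  for (int i = 1; i < output->dims->data[output->dims->size - 1]; ++i) {
    if (output->data.int8[i] > output->data.int8[best]) best = i;
  }
  return best;
}
```

Registering only the required operators via MicroMutableOpResolver, rather than a full operator set, keeps the firmware footprint small, which matters on a microcontroller with limited flash and RAM.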
3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
TinyML | Tiny Machine Learning |
CNN | Convolutional Neural Network |
IoT | Internet of Things |
PSRAM | Pseudo Static Random Access Memory |
GPS | Global Positioning System |
GLONASS | Global Navigation Satellite System |
References
- Adisa, O.; Ilugbusi, B.S.; Adewunmi, O.; Franca, O.; Ndubuisi, L. A comprehensive review of redefining agricultural economics for sustainable development: Overcoming challenges and seizing opportunities in a changing world. World J. Adv. Res. Rev. 2024, 21, 2329–2341. [Google Scholar] [CrossRef]
- Kumar, V.; Sharma, K.V.; Kedam, N.; Patel, A.; Kate, T.R.; Rathnayake, U. A comprehensive review on smart and sustainable agriculture using IoT technologies. Smart Agric. Technol. 2024, 8, 100487. [Google Scholar] [CrossRef]
- Zhang, C.; Di, L.; Lin, L.; Zhao, H.; Li, H.; Yang, A.; Yang, Z. Cyberinformatics tool for in-season crop-specific land cover monitoring: Design, implementation, and applications of iCrop. Comput. Electron. Agric. 2023, 213, 108199. [Google Scholar] [CrossRef]
- Anam, I.; Arafat, N.; Hafiz, M.S.; Jim, J.R.; Kabir, M.M.; Mridha, M.F. A systematic review of UAV and AI integration for targeted disease detection, weed management, and pest control in precision agriculture. Smart Agric. Technol. 2024, 9, 100647. [Google Scholar] [CrossRef]
- Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240. [Google Scholar] [CrossRef]
- Kanning, M.; Kühling, I.; Trautz, D.; Jarmer, T. High-resolution UAV-based hyperspectral imagery for LAI and chlorophyll estimations from wheat for yield prediction. Remote Sens. 2018, 10, 2000. [Google Scholar] [CrossRef]
- Waqas, M.; Naseem, A.; Humphries, U.W.; Hlaing, P.T.; Dechpichai, P.; Wangwongchai, A. Applications of machine learning and deep learning in agriculture: A comprehensive review. Green Technol. Sustain. 2025, 3, 100199. [Google Scholar] [CrossRef]
- Lacerda, C.F.; Ampatzidis, Y.; Neto, A.D.O.C.; Partel, V. Cost-efficient high-resolution monitoring for specialty crops using AgI-GAN and AI-driven analytics. Comput. Electron. Agric. 2025, 237, 110678. [Google Scholar] [CrossRef]
- Castillo-Girones, S.; Munera, S.; Martínez-Sober, M.; Blasco, J.; Cubero, S.; Gómez-Sanchis, J. Artificial Neural Networks in Agriculture, the core of artificial intelligence: What, When, and Why. Comput. Electron. Agric. 2025, 230, 109938. [Google Scholar] [CrossRef]
- Dhanush, G.; Khatri, N.; Kumar, S.; Shukla, P.K. A comprehensive review of machine vision systems and artificial intelligence algorithms for the detection and harvesting of agricultural produce. Sci. Afr. 2023, 21, e01798. [Google Scholar] [CrossRef]
- Paris, B.; Vandorou, F.; Balafoutis, A.T.; Vaiopoulos, K.; Kyriakarakos, G.; Manolakos, D.; Papadakis, G. Energy use in open-field agriculture in the EU: A critical review recommending energy efficiency measures and renewable energy sources adoption. Renew. Sustain. Energy Rev. 2022, 158, 112098. [Google Scholar] [CrossRef]
- Sunil, G.C.; Upadhyay, A.; Sun, X. Development of software interface for AI-driven weed control in robotic vehicles, with time-based evaluation in indoor and field settings. Smart Agric. Technol. 2024, 9, 100678. [Google Scholar] [CrossRef]
- Kariyanna, B.; Sowjanya, M. Unravelling the use of artificial intelligence in management of insect pests. Smart Agric. Technol. 2024, 8, 100517. [Google Scholar] [CrossRef]
- Kuznetsov, P.; Kotelnikov, D.; Voronin, D.; Evstigneev, V.; Yakimovich, B.; Kelemen, M. Intelligent monitoring of the physio-logical state of agricultural products using UAV. MM Sci. J. 2024, 2024, 7772–7781. [Google Scholar] [CrossRef]
- Kaldarova, M.; Akanova, A.; Nazyrova, A.; Mukanova, A.; Tynykulova, A. Identification of weeds in fields based on computer vision technology. East.-Eur. J. Enterp. Technol. 2023, 4, 44–52. [Google Scholar] [CrossRef]
- Valladares, S.; Toscano, M.; Tufiño, R.; Morillo, P.; Vallejo-Huanga, D. Performance Evaluation of the Nvidia Jetson Nano Through a Real-Time Machine Learning Application. Intell. Hum. Syst. Integr. 2021, 1322, 343–349. [Google Scholar] [CrossRef]
- Mukhamediev, R.I.; Smurygin, V.; Symagulov, A.; Kuchin, Y.; Popova, Y.; Abdoldina, F.; Tabynbayeva, L.; Gopejenko, V.; Oxenenko, A. Fast Detection of Plants in Soybean Fields Using UAVs, YOLOv8x Framework, and Image Segmentation. Drones 2025, 9, 547. [Google Scholar] [CrossRef]
- Gao, X.; Wang, G.; Qi, J.; Wang, Q.; Xiang, M.; Song, K.; Zhou, Z. Improved YOLO v7 for Sustainable Agriculture Significantly Improves Precision Rate for Chinese Cabbage (Brassica pekinensis Rupr.) Seedling Belt (CCSB) Detection. Sustainability 2024, 16, 4759. [Google Scholar] [CrossRef]
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. SSD: Single Shot Multibox Detector. In Proceedings of the European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 8–16 October 2016; Volume 1, pp. 21–37. [Google Scholar] [CrossRef]
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. Neural Inf. Process. Syst. 2015, 28, 1137–1149. [Google Scholar] [CrossRef]
- Brunelli, D.; Albanese, A.; d’Acunto, D.; Nardello, M. Energy neutral machine learning based IoT device for pest detection in precision agriculture. IEEE Internet Things Mag. 2019, 2, 10–13. [Google Scholar] [CrossRef]
- Ivliev, E.; Demchenko, V.; Obukhov, P. Automatic Monitoring of Smart Greenhouse Parameters and Detection of Plant Diseases by Neural Networks. In Robotics, Machinery and Engineering Technology for Precision Agriculture; Shamtsyan, M., Pasetti, M., Beskopylny, A., Eds.; Smart Innovation, Systems and Technologies; Springer: Singapore, 2022; Volume 247, pp. 37–45. [Google Scholar] [CrossRef]
- Langer, T.; Widra, M.; Beyer, V. TinyML Towards Industry 4.0: Resource-Efficient Process Monitoring of a Milling Machine. arXiv 2025, arXiv:2508.16553. [Google Scholar]
- Vu, T.H.; Tu, N.H.; Huynh-The, T.; Lee, K.; Kim, S.; Voznak, M.; Pham, Q.V. Integration of TinyML and LargeML: A Survey of 6G and Beyond. arXiv 2025, arXiv:2505.15854. [Google Scholar]
- Dockendorf, C.; Mitra, A.; Mohanty, S.P.; Kougianos, E. Lite-Agro: Exploring Light-Duty Computing Platforms for IoAT-Edge AI in Plant Disease Identification. In Proceedings of the IFIP International Internet of Things Conference, Denton, TX, USA, 2–3 November 2023; Springer: Berlin/Heidelberg, Germany, 2023; pp. 371–380. [Google Scholar] [CrossRef]
- Gookyi, D.; Wulnye, F.; Arthur, E.; Ahiadormey, R.; Agyemang, J.; Aguekum, K.; Gyaang, R. TinyML for Smart Agriculture: Comparative Analysis of TinyML Platforms and Practical Deployment for Maize Leaf Disease Identification. Smart Agric. Technol. 2024, 8, 100490. [Google Scholar] [CrossRef]
- Choudhary, V.; Guha, P.; Pau, G.; Mishra, S. An overview of smart agriculture using internet of things (IoT) and web services. Environ. Sustain. Indic. 2025, 26, 100607. [Google Scholar] [CrossRef]
- Sabovic, A.; Fontaine, J.; De Poorter, E.; Famaey, J. Energy-aware tinyML model selection on zero energy devices. Internet Things 2025, 30, 101488. [Google Scholar] [CrossRef]
- Sumari, A.; Annurroni, I.; Ayuningtyas, A. The Internet-of-Things-based Fishpond Security System Using NodeMCU ESP32-CAM Microcontroller. J. RESTI (Rekayasa Sist. Dan Teknol. Inf.) 2025, 9, 51–61. [Google Scholar] [CrossRef]
- Adi, P.D.P.; Wahyu, Y. Performance evaluation of ESP32 Camera Face Recognition for various projects. Internet Things Artif. Intell. J. 2022, 2, 10–21. [Google Scholar] [CrossRef]
- Panara, U.; Pandya, R.; Rayja, M. Crop and Weed Detection Data with Bounding Boxes. 2020. Available online: https://www.kaggle.com/datasets/ravirajsinh45/crop-and-weed-detection-data-with-bounding-boxes (accessed on 6 June 2025).
- Steininger, D.; Trondl, A.; Croonen, G.; Simon, J.; Widhalm, V. The CropAndWeed Dataset: A Multi-Modal Learning Approach for Efficient Crop and Weed Manipulation. In Proceedings of the 2023 IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 3–7 January 2023. [Google Scholar]
- Haug, S.; Ostermann, J. A Crop/Weed Field Image Dataset for the Evaluation of Computer Vision Based Precision Agriculture Tasks. In Proceedings of the Computer Vision—ECCV 2014 Workshops, Zurich, Switzerland, 6, 7 and 12 September 2014. [Google Scholar]
- Lameski, P. Weed-Datasets. Available online: https://github.com/zhangchuanyin/weed-datasets?ysclid=m9b389hlew273226771 (accessed on 6 June 2025).
- David, E.; Madec, S.; Sadeghi-Tehran, P.; Aasen, H.; Zheng, B.; Liu, S.; Kirchgessner, N.; Ishikawa, G.; Nagasawa, K.; Badhon, M.A. Global Wheat Head Detection (GWHD) dataset: A large and diverse dataset of high-resolution RGB-labelled images to develop and benchmark wheat head detection methods. Sci. Partn. J. 2020, 2020, 1–15. [Google Scholar] [CrossRef]
- Olsen, A.; Konovalov, D.A.; Philippa, B. DeepWeeds: A Multiclass Weed Species Image Dataset for Deep Learning. Sci. Rep. 2019, 9, 2058. [Google Scholar] [CrossRef]
- WeedSeeker 2 Spot Spraying System. Available online: https://ru.ptxtrimble.com/product/sistema-tochechnogo-opriskivania-weedseeker2/ (accessed on 6 June 2025).
- TensorFlow Lite for Microcontrollers (Tflite-Micro). Available online: https://github.com/tensorflow/tflite-micro/tree/main (accessed on 6 June 2025).
- Donskoy, D.Y.; Lukyanov, A.D. Implementation of neural networks in IoT based on ESP32 microcontrollers. In Proceedings of the XVII International Scientific and Technical Conference «Dynamics of Technical Systems» (DTS-2021), Rostov-on-Don, Russia, 9–11 September 2021. [Google Scholar]
- Arduino-Style TensorFlow Lite Micro Library (ArduTFLite). Available online: https://github.com/spaziochirale/ArduTFLite (accessed on 6 June 2025).
- Rudoy, D.V.; Chigvintsev, V.V.; Olshevskaya, A.V. Use of neural networks for agrochemical analysis of soil. In Proceedings of the IV International Forum «Youth in Agribusiness», Rostov-on-Don, Russia, 5–8 November 2024. [Google Scholar]
- Rawat, W.; Wang, Z. Deep Convolutional Neural Networks for Image Classification: A Comprehensive Review. Neural Comput. 2017, 29, 2352–2449. [Google Scholar] [CrossRef] [PubMed]
- Chechkin, A.; Pleshakova, E.; Gataullin, S. A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity. Technologies 2025, 13, 223. [Google Scholar] [CrossRef]
- Jamali, A.; Roy, S.K.; Hong, D.; Lu, B.; Ghamisi, P. How to Learn More? Exploring Kolmogorov–Arnold Networks for Hyperspectral Image Classification. Remote Sens. 2024, 16, 4015. [Google Scholar] [CrossRef]
| Model | Quantized | Accuracy (320 × 240), % | Accuracy (160 × 120), % | Accuracy (80 × 60), % |
|---|---|---|---|---|
| The first CNN architecture | No | 82.5 | 79.32 | 74.25 |
| The first CNN architecture | Yes | 76.6 | 71.43 | 60.98 |
| The second CNN architecture | No | 94.83 | 90.21 | 84.58 |
| The second CNN architecture | Yes | 96.15 | 92.86 | 87.50 |
| Model | Memory (320 × 240), KB | Memory (160 × 120), KB | Memory (80 × 60), KB |
|---|---|---|---|
| The first CNN architecture | 3584 | 1536 | 384 |
| The second CNN architecture | 5120 | 2304 | 562 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).