Edge Computing for Vision-Based, Urban-Insects Traps in the Context of Smart Cities
Abstract
1. Introduction
2. Materials and Methods
2.1. The Edge Devices
2.2. The Images
3. Results
3.1. Building the Reference Database
- We take a random image from the folder of backgrounds (null_image.jpg); this folder contains background images that differ slightly from one another.
- We draw a random number between 0 and 6 and select that many images at random from the ‘insect for the training set’ folder.
- Each insect image is rotated by a random angle between 0 and 360 degrees and placed on the background without overlapping the others, forming a single composite image. We store the composed image together with its reference label (ground truth), i.e., the total number of insects it contains.
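The composition steps above can be sketched in Python. The function names, image dimensions, and the fixed crop size are illustrative assumptions, and the actual rotation and pasting of the insect crops onto the background (e.g., with Pillow) is omitted; only the non-overlapping placement and labeling logic is shown:

```python
import random

def place_without_overlap(n, bg_w=320, bg_h=240, box=48, max_tries=100):
    """Pick up to n top-left corners for box-by-box insect crops so that
    no two crops overlap; gives up on a crop after max_tries rejections."""
    placed = []
    for _ in range(n):
        for _ in range(max_tries):
            x = random.randint(0, bg_w - box)
            y = random.randint(0, bg_h - box)
            # Two axis-aligned boxes of equal size overlap only if they
            # are closer than one box width on BOTH axes.
            if all(abs(x - px) >= box or abs(y - py) >= box
                   for px, py in placed):
                placed.append((x, y))
                break
    return placed

def compose_sample():
    """Draw a random insect count in 0-6 and return the crop positions
    together with the ground-truth label (the number actually placed)."""
    n = random.randint(0, 6)
    positions = place_without_overlap(n)
    return positions, len(positions)
```

The label is taken from the number of crops actually placed, so a rare placement failure cannot desynchronize image and ground truth.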
3.2. Verification Experiments
3.3. Operational Conditions
- (a) They wake up following a pre-stored schedule and load the DL model weights;
- (b) They take one picture per day, at night, using the flash;
- (c) They infer the number of insects in the picture;
- (d) If the insect count in the current picture differs from the previous count, the image is uploaded to a server over WiFi via an HTTP POST request;
- (e) They store the last picture on the SD card (optional);
- (f) They enter deep-sleep mode and then repeat steps (a)–(e).
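The decision and upload logic of steps (d)–(e) can be sketched in Python as it might run on the Raspberry Pi. The server URL and the header carrying the count are illustrative assumptions, not the paper's actual API; capture and inference are device-specific and left out:

```python
import urllib.request

SERVER_URL = "http://example.local/upload"  # hypothetical endpoint

def should_upload(current_count, previous_count):
    """Step (d): upload only when the inferred count changed
    (or when there is no previous count yet)."""
    return previous_count is None or current_count != previous_count

def upload_image(jpeg_bytes, count, url=SERVER_URL):
    """Send the picture to the server with an HTTP POST request;
    the X-Insect-Count header name is an assumption."""
    req = urllib.request.Request(
        url,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg",
                 "X-Insect-Count": str(count)},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status
```

Uploading only on a changed count keeps the WiFi radio off on most days, which is what makes the deep-sleep duty cycle of steps (a)–(f) energy-efficient.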
4. Concluding Remarks and Further Steps
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
| ESP32 | Raspberry Pi 4 Model B | Coral Dev Board Mini |
---|---|---|---|
CPU | Xtensa® dual-core 32-bit LX6 microprocessor, up to 600 MIPS, 160 MHz | Broadcom BCM2711, quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5 GHz | MediaTek 8167s SoC, quad-core ARM Cortex-A35 @ 1.5 GHz |
GPU | Imagination PowerVR GE8300 | ||
TPU | Google Edge TPU ML accelerator | ||
RAM | 520 KB SRAM + 4 MB PSRAM | 4 GB LPDDR4 | 2 GB LPDDR3 |
Flash | 4 MB | | 8 GB eMMC |
WiFi | 802.11 b/g/n | 802.11 b/g/n/ac wireless LAN | 802.11 b/g/n/ac (WiFi 5) |
Ethernet | Gigabit Ethernet port (supports PoE with add-on PoE HAT) | ||
Bluetooth | Bluetooth 4.2 BR/EDR BLE | Bluetooth 5.0 with BLE | Bluetooth 5.0 |
SD Card | TF card (microSD) | microSD card | SD/SDIO 3.0 compliant |
Camera | SCCB interface | 1× Raspberry Pi 2-lane MIPI CSI camera connector and 1× 2-lane MIPI DSI display connector | MIPI-CSI2 camera input |
Operating System | FreeRTOS | Raspbian | Mendel Linux |
Power | Flash lamp off: 180 mA @ 5 V; flash lamp on: 310 mA @ 5 V; deep sleep: 6 mA @ 5 V | Typical: 800 mA; stress: 1200 mA; idling: 600 mA; halt: 23 mA | Accelerator module (TPU in use): 425 mA; 212 mA @ 3.3 V; typical idle: 114–121 mA @ 3.3 V; PMIC digital I/O supply (AON): 10 mA |
Sources | Copyright© 2022 Shenzhen Ai-Thinker Technology Co., Ltd. All Rights Reserved | Raspberry Pi 4 Model B Datasheet Copyright Raspberry Pi (Trading) Ltd. 2019 | Dev Board Mini datasheet |
Price | US$7.99 | US$55.00 | US$110.95 |
| OV2640 | Raspberry Pi Camera Module | Coral Camera |
Chip | OV2640 CAMERA CHIP | Sony IMX 219 PQ CMOS image sensor in a fixed-focus module. | 5-megapixel OmniVision sensor |
Resolution | 2-megapixel | 8-megapixel | 5-megapixel |
Max Resolution | 1600 × 1200 | 3280 × 2464 | 2582 × 1933 |
Video | UXGA (1600 × 1200) @ 15 fps; SVGA (800 × 600) @ 30 fps; CIF (352 × 288) @ 60 fps | 1080p (1920 × 1080) @ 30 fps; 720p (1280 × 720) @ 60 fps; 640 × 480 @ 60/90 fps | |
Image area | 3.59 × 2.684 mm | 3.68 × 2.76 mm (4.6 mm diag.) | 2.5 mm |
Pixel size | 2.2 μm × 2.2 μm | 1.12 μm × 1.12 μm | 1.4 μm × 1.4 μm |
Picture formats | YUV(422/420) YCbCr422, RGB565/555, 8-bit Compressed data, 8~10 bit Raw RGB data | JPEG, JPEG + DNG (raw), BMP, PNG, YUV420, RGB888 | |
Lens Size | 1/4″ | 1/4″ | 1/4″ |
Sensitivity | 600 mV/lux-sec | 680 mV/lux-sec | |
Camera Cost | US$7.99 (camera and board) | US$25.00 | US$24.95 |
Appendix B
- model_count_insects_final.h5: the model trained offline.
- model_count_insects_final.tflite: a smaller version of model_count_insects_final.h5, to be executed by TensorFlow Lite.
- model_count_insects_final_quant.tflite: a smaller version of model_count_insects_final.h5, to be executed by TensorFlow Lite, now quantized (8-bit).
- model_count_insects_final_quant.cc: the model structure and quantized weights as a C array, to be used with TensorFlow Lite Micro on the ESP32-CAM.
- model_count_insects_final_quant_edgetpu.tflite: the Coral version of model_count_insects_final_quant.tflite, compiled to run on Coral's Edge TPU.
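The chain producing these files could look roughly like the sketch below; it requires TensorFlow, and the representative-dataset generator (needed by the converter to calibrate full-integer quantization) is an assumption, not the paper's published pipeline:

```python
# Sketch only: converts a trained Keras model to the float and 8-bit
# quantized TensorFlow Lite files listed above. Paths match Appendix B.
try:
    import tensorflow as tf
except ImportError:  # keep the sketch importable without TF installed
    tf = None

def convert_models(h5_path="model_count_insects_final.h5",
                   representative_dataset=None):
    if tf is None:
        raise RuntimeError("TensorFlow is required for the conversion")
    model = tf.keras.models.load_model(h5_path)

    # Plain float conversion -> model_count_insects_final.tflite
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_float = converter.convert()

    # Full-integer (8-bit) post-training quantization -> ..._quant.tflite
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    if representative_dataset is not None:
        converter.representative_dataset = representative_dataset
        converter.target_spec.supported_ops = [
            tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    tflite_quant = converter.convert()
    return tflite_float, tflite_quant
```

The ESP32-CAM array is then typically generated with `xxd -i model_count_insects_final_quant.tflite > model_count_insects_final_quant.cc`, and the Edge TPU file with Coral's `edgetpu_compiler model_count_insects_final_quant.tflite`.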
# Images | Training (70%) | Validation (30%) | 0 | 1 | 2 | 3 | 4 | 5 | 6
---|---|---|---|---|---|---|---|---|---
Train: 14,000 | 9800 | 4200 | 2000 | 2000 | 2000 | 2000 | 2000 | 2000 | 2000
Test: 1400 | | | 200 | 200 | 200 | 200 | 200 | 200 | 200
Software Back-End | Model Name | Acc. (α = 1 − |Mc − Ac|/Mc) | Model Size (MB)
---|---|---|---|
TensorFlow | model_count_final.h5 | 0.951 | 5.9 |
TensorFlow Lite | model_count_final.tflite | 0.951 | 2 |
TensorFlow Lite Quantization | model_count_final_quant.tflite | 0.950 | 0.5 |
TensorFlow Lite Quantization TPU (Coral) | model_count_final_quant_edgetpu.tflite | 0.950 | 0.55 |
Number of Insects Per Image | Accuracy (α = 1 − |Mc − Ac|/Mc)
---|---|
0 | 0.991 |
1 | 0.942 |
2 | 0.931 |
3 | 0.945 |
4 | 0.945 |
5 | 0.953 |
6 | 0.951 |
Mean Accuracy | 0.950 |
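The accuracy α used in the two tables above can be written as a small helper. Reading Mc as the manual (ground-truth) count and Ac as the automatic count per image is our interpretation of the formula, and the Mc = 0 case, where the ratio is undefined, is handled by an assumed convention (an exact match scores 1, anything else 0):

```python
def count_accuracy(m_c, a_c):
    """alpha = 1 - |Mc - Ac| / Mc for Mc > 0; for Mc == 0 we assume a
    perfect score only when the automatic count is also zero."""
    if m_c == 0:
        return 1.0 if a_c == 0 else 0.0
    return 1.0 - abs(m_c - a_c) / m_c
```

For example, an image with 4 insects counted as 3 scores 1 − 1/4 = 0.75, and the reported per-class figures would be means of such per-image scores.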
Edge Device | Model | Load Model and Initialize | Inference Time |
---|---|---|---|
ESP32-CAM | TensorFlow Lite Micro Quant. | 51 s | |
Raspberry Pi 4 | TensorFlow Lite Quant. | 70.550 μs | 88.868 μs |
Coral Dev Board Mini | TensorFlow Lite Quant. | 6.726 μs | 132.546 μs |
Coral Dev Board Mini | TensorFlow Lite Quant. TPU (Coral) | 385.854 μs | 31.531 μs |
Edge Device | Stand-By/Deep Sleep Avg. Current | | Load Model Avg. Current | | Inference Avg. Current | | Store in SD/WiFi Upload Avg. Current | | Other Functions Avg. Current | mA in 63 s
---|---|---|---|---|---|---|---|---|---|---
| mA | Sec | mA | Sec | mA | Sec | mA | Sec | mA | mA
ESP32-CAM TensorFlow Lite Micro | 6 | 2 | 180 | 51 | 85 | 3.5 | 150 | 6.5 | 70 | 5595
Raspberry Pi 4 B TensorFlow Lite | 410 | 0.914 | 470 | 0.174 | 560 | 0.918 | 490 | 60.994 | 410 | 25984.38
Coral Dev Board Mini TensorFlow Lite | 240 | 0.062 | 400 | 0.135 | 460 | 0.08 | 450 | 62.723 | 240 | 15176.42
Coral Dev Board Mini TensorFlow Lite TPU (Coral) | 240 | 0.062 | 400 | 0.036 | 460 | 0.076 | 450 | 62.826 | 240 | 15153.8
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Saradopoulos, I.; Potamitis, I.; Ntalampiras, S.; Konstantaras, A.I.; Antonidakis, E.N. Edge Computing for Vision-Based, Urban-Insects Traps in the Context of Smart Cities. Sensors 2022, 22, 2006. https://doi.org/10.3390/s22052006