Design and Implementation of a Low-Cost, Linear Robotic Camera System, Targeting Greenhouse Plant Growth Monitoring
Abstract
1. Introduction
2. Materials and Methods
2.1. System’s Design and Construction
- The system should be able to automatically detect and accurately localize individual plants growing spaced apart in plastic pots, positioned in a fully randomized design along a 3 m long row;
- The system as a whole should be lightweight and portable, but also easily scalable if needed, so that it can be repositioned to monitor other rows inside the greenhouse;
- To ensure that the linear robotic camera system can precisely reach each of the target plants, the actuator must exhibit sufficient precision in its movement. Upon reaching the target, the system should capture a high-resolution, color, top-view image with the plant ideally centered;
- With no or only minor modifications and/or software adjustments, the linear robotic camera system should also be applicable to monitoring other plant species in hydroponic systems for research and investigation purposes (e.g., spinach, sweet peppers, herbs, and strawberries, to name a few);
- The system should not, in any manner, obstruct the plants’ growth;
- Detected and localized plant images, along with all the plant growth-related data measurements performed and the corresponding datalogging information, should be stored locally automatically, but also transmitted to a remote location and synced in a secure and transparent manner (one possible mechanism is sketched after this list);
- No human intervention should be necessary at any stage of plant detection and monitoring.
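As an illustration of the storage-and-synchronization requirement above, the following is a minimal sketch of how unattended, secure mirroring could be scripted on the Raspberry Pi. It assumes rsync and SSH are available; the local path and remote host are hypothetical placeholders, and this is only one way to meet the requirement, not necessarily the mechanism the system actually employs.

```python
import subprocess
from pathlib import Path

LOCAL_DIR = Path("/home/pi/plant_data")      # hypothetical local data folder
REMOTE = "user@lab-server:/srv/plant_data/"  # hypothetical remote target

def sync_measurements() -> None:
    """Mirror locally stored images, measurements, and logs to a remote host.

    rsync transfers only files that changed since the last run, and tunneling
    it over ssh keeps the transfer encrypted, so repeated unattended syncs
    stay both cheap and secure."""
    subprocess.run(
        ["rsync", "-az", "--partial", "-e", "ssh",
         f"{LOCAL_DIR}/", REMOTE],
        check=True,
    )

if __name__ == "__main__":
    sync_measurements()  # e.g., triggered by cron after each monitoring run
```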
2.1.1. Hardware: Stable and Moving Parts
2.1.2. Hardware: Motor, Controller, and Transmission
2.1.3. Hardware: Sensors, Auxiliary Electronics, and Raspberry Pi
2.2. System’s Software Programming and Workflow
2.2.1. NIR and RGB Cameras’ Calibration
2.2.2. Software: System Pipeline
2.2.3. Plant Finding
- The initial step in the process is crucial as it involves determining the starting position of the carriage. To achieve this, the carriage is moved automatically to the far-right edge of the beam until the right end switch is activated. This position is then labeled as “position zero” (home position).
- Subsequently, the user must indicate the number of plants to be identified. A plant counter is then initialized and used to determine the number of procedural iterations to be carried out. In instances where the number of plants remains constant (as is typically the case in practical scenarios, where a specific number of plants exist in a row), this step is omitted and the previously set number of plants is reused.
- The RGB camera that is utilized for plant recognition is then initialized, and an image is captured for further processing.
- Since the RGB camera has a wide field of view (95° diagonal), it is necessary to crop the image to ensure that only one plant (and potentially small segments of other plants) is present. This operation results in a square frame (cropped image) with the same center as the original image. This not only helps in focusing on a single plant, but also reduces the image processing time for the smaller image. The amount of cropping in pixels (a border of equal width on the left and right sides, and of equal height on the top and bottom sides) can be set by the user through dedicated buttons in the GUI (left side in Figure 9a). If $c_w$ denotes the number of pixels cropped from each of the left and right sides (width dimension) and, similarly, $c_h$ denotes the number of pixels cropped from each of the top and bottom sides (height dimension), then for a frame of width $W$ and height $H$ the square result requires $W - 2c_w = H - 2c_h$. For example, with the camera's native 1920 × 1080 frame, choosing $c_h = 140$ (and hence $c_w = 560$) yields an 800 × 800 pixel crop. (In this study, the experiments were conducted using square cropped images of two different dimensions.)
- To filter out any high-frequency noise that may exist, a Gaussian blur (with 11 × 11 kernel size) is applied to the image.
- The image is converted to HSV color space.
- The automatic identification of the plant relies primarily on analyzing its color. To achieve this, users are required to predefine specific thresholds for the Hue (H), Saturation (S), and Value (V); these thresholds determine the range of colors that will be considered as the plant’s color. In this process, the undesired colors are masked out, resulting in a black color in the empty space of the image. To facilitate this procedure, a dedicated menu called “Color Picker” has been incorporated into the graphical user interface (GUI). By pressing the “Tweak Parameters” button, the user can access the “Color Picker” window, where they can adjust the sliders to carefully select the appropriate upper and lower thresholds for the Hue, Saturation, and Value. This ensures that only the colors of the plant are retained, while all other colors are filtered out, leaving a black background. Through an empirical analysis, specific lower and upper thresholds for Hue, Saturation, and Value were determined for the examination of lettuces; the chosen thresholds proved to be effective and remained unchanged for all the experiments in this study. Another crucial setting is the exposure value, which can range from 0 to 255 and should be selected based on the lighting conditions. In the current application, a value of 200 was chosen during the experiments, which also remained unchanged (the installed LEDs used for illumination proved to be sufficient during all experiments). Additionally, a real-time video is displayed below the sliders to provide users with a convenient visual representation (Figure 8b).
- Having available the upper and lower thresholds for the color values in the HSV space from the previous step, the algorithm proceeds by tracking the selected color (i.e., green) in the image, defining a mask using the inRange() method in OpenCV. After this, a series of erosion and dilation operations is performed to enhance the solidity and distinguishability of the plants, which may otherwise appear as fragmented small blobs in the images. The erosion morphological operation is performed twice with a 3 × 3 (8-neighborhood) structuring element; the dilation operation is performed once with a similar structuring element. The objects present in the image are then determined using the findContours() function in OpenCV. In case no objects are found, the process is repeated from the beginning, after a slight shift of the carriage to the left (equal to the number of steps set by the user as Jump Steps).
- If multiple contours are detected, the contourArea() function in OpenCV is applied to discard the smaller ones, retaining only the largest contour. If the identified contour is deemed too small to represent a plant, the process is repeated, with a slight shift of the carriage to the left (empirically set to 2000 steps).
- Once a contour that potentially represents a plant is identified, the minEnclosingCircle() function is utilized to obtain the coordinates of the circle’s center and its radius. This information is then used to draw a circle around the plant.
- If the center of the plant does not align with the center of the image, the carriage is shifted either to the left or to the right so that the camera is aligned precisely with the plant (i.e., brought exactly on top of it), and the entire process is repeated from the beginning. Otherwise, the position of the carriage (and consequently, the plant’s position) is recorded in a file, and the plant counter is incremented by one.
- If there are no more plants to search for, the procedure concludes. (A consolidated code sketch of this detection loop is given directly below.)
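The steps above map closely onto standard OpenCV primitives. The following is a minimal sketch of one iteration of the detection loop, assuming the camera’s native 1920 × 1080 frame; the crop borders, HSV thresholds, minimum contour area, centering tolerance, pixel-to-step factor, and the move_carriage() stub are illustrative placeholders (in the actual system these values are set through the GUI and the stepper driver).

```python
import cv2
import numpy as np

# Illustrative placeholders; the real values are set through the GUI.
CROP_W, CROP_H = 560, 140            # border pixels: left/right, top/bottom
HSV_LOW = np.array([30, 40, 40])     # assumed lower H, S, V thresholds
HSV_HIGH = np.array([90, 255, 255])  # assumed upper H, S, V thresholds
MIN_AREA = 500                       # smaller contours are not plants (px^2)
CENTER_TOL = 10                      # tolerance for "plant is centered" (px)
JUMP_STEPS = 2000                    # carriage shift when nothing is found

def move_carriage(steps):
    """Hypothetical stub for the stepper-motor driver (negative = left)."""

def find_plant(frame):
    """One iteration of the detection loop.

    Returns (dx, radius), where dx is the horizontal offset of the plant's
    center from the image center in pixels, or None if no plant-sized
    contour is found."""
    h, w = frame.shape[:2]
    # Center-crop to a square so that (ideally) a single plant is in view.
    crop = frame[CROP_H:h - CROP_H, CROP_W:w - CROP_W]

    # Suppress high-frequency noise, then switch to HSV color space.
    blurred = cv2.GaussianBlur(crop, (11, 11), 0)
    hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)

    # Mask everything outside the plant's color range.
    mask = cv2.inRange(hsv, HSV_LOW, HSV_HIGH)

    # Two erosions and one dilation with a 3x3 (8-neighborhood) kernel.
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.erode(mask, kernel, iterations=2)
    mask = cv2.dilate(mask, kernel, iterations=1)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < MIN_AREA:
        return None

    # The minimum enclosing circle gives the plant's center and radius.
    (cx, _), radius = cv2.minEnclosingCircle(largest)
    return cx - crop.shape[1] / 2.0, radius

# Usage outline: shift the carriage until the plant sits under the camera.
# result = find_plant(frame)                # frame: 1920x1080 BGR image
# if result is None:
#     move_carriage(-JUMP_STEPS)            # nothing found: jump left, retry
# elif abs(result[0]) > CENTER_TOL:
#     move_carriage(int(-result[0] * PX_TO_STEPS))  # hypothetical factor
# else:
#     ...                                   # centered: record position
```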
2.2.4. Image Acquisition, Plant Identification, and Localization
2.3. Experimental Setup Evaluation and Measurements on Plant Samples
2.3.1. Experimental Setup for System’s Accuracy and Repeatability Evaluation
2.3.2. Laboratory Experimental Setup on Lettuce Plants: Image Acquisition and Evaluation Metrics
3. Results and Discussion
3.1. Evaluation of the Proposed Robotic System Setup and Its Movement’s Accuracy and Repeatability
3.2. Experimental Results Example on Plant’s Identification and Localization
3.3. Measurements Related to Morphological Characteristics of the Plants
3.4. Limitations, Future Outlooks, and Related Studies
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Item # | Component | Short Usage Description | Quantity |
---|---|---|---|
1 | Camera tripod | Whole structure support | 2 |
2 | V-Slot 2060 1000 mm: natural anodized | Horizontal support for the carriage | 2 |
3 | V-Slot 2060 1500 mm: natural anodized | Horizontal support for the carriage | 1 |
4 | Cable drag chain 10 × 10 mm (m) | Cable management | 3 |
5 | Drag/cable chain mounting bracket | - | 1 |
6 | Aluminum 248 × 40 × 2 mm | Cable drag chain support | 2 |
7 | Double tee nut | V-slot assembly | 14 |
8 | Spring-loaded tee nuts | V-slot assembly | 10 |
9 | Stepper Motor Driver TMC2209 V2.0 | - | 1 |
10 | Plywood 250 × 150 × 6 mm | Carriage support | 1 |
11 | V-Slot Gantry Set 2020 × 2080 Xtreme | For the movement of the carriage | 1 |
12 | Stepper motor 2.8 kg.cm 42BYGHW208 | - | 1 |
13 | Timing belt: GT2 (m) | - | 4 |
14 | Aluminum GT2 timing pulley: 12T—5 mm | Toothed pulley for the precise movement of the carriage | 1 |
15 | Power Supply 12 V 2 A: output 5.5 × 2.1 PSU-1602 | - | 1 |
16 | Power supply 5 V 3 A for Raspberry Pi 4 | - | 1 |
17 | Raspberry Pi 4-Model B-8 GB | - | 1 |
18 | Raspberry Pi 4 B Heatsink: black (set of 3) | RPi 4B cooling | 1 |
19 | DC Fan 50 × 50 × 10 mm 12 V: MagLev | RPi 4B cooling | 1 |
20 | Memory card microSDHC 64 GB Class 10: SanDisk Ultra | RPi 4B OS and programming | 1 |
21 | Microswitch Mini SPDT ON-(ON): with roller lever | Determination of the two ends of the carriage | 2 |
22 | Adafruit Time-of-Flight (ToF) sensor VL53L0X | - | 1 |
23 | Humidity and temperature sensor (AM2301) | Temperature, humidity | 1 |
24 | GPIO Header for Raspberry Pi 2 × 20 stackable | - | 1 |
25 | Infrared LED board | Illuminates the plants in the infrared spectrum | 2
26 | Arducam stereo USB camera, synchronized, 2MP 1080P | RGB and IR images capture | 1 |
27 | PCB board electronics and other components | JST XH Connectors (2 pin/3 pin/4 pin), P2N222ANPN Transistor, resistors, capacitors, Relay DPDT 5 V, 1N4007 Diode, cables, multi-socket, bolts, nuts, washers, screws, cable insulation, etc. | 1 |
Camera | Calibration Parameters | Equation | Calculated Values
---|---|---|---
RGB, NIR | Intrinsic parameters * | |
RGB, NIR | Radial distortions | |
RGB, NIR | Tangential distortions | |
RGB, NIR | Rotation of RGB camera relative to NIR camera | |
RGB, NIR | Translation of RGB camera relative to NIR camera ** | |
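The quantities in the table above (camera intrinsics, radial and tangential distortion coefficients, and the rotation and translation of the RGB camera relative to the NIR camera) are the standard outputs of OpenCV’s chessboard-based calibration. The following is a minimal sketch under assumed settings (a 9 × 6 inner-corner board with 25 mm squares and synchronized image pairs); the board geometry and image counts actually used in the study are not restated here.

```python
import cv2
import numpy as np

PATTERN = (9, 6)          # inner corners of the chessboard (assumed)
SQUARE_MM = 25.0          # square size in mm (assumed)
IMG_SIZE = (1920, 1080)   # native resolution of both camera modules

# One 3D template of board corners, replicated for each calibration image.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

def detect_corners(images):
    """Return per-image chessboard corner detections (sub-pixel refined)."""
    pts = []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            pts.append(corners)
    return pts

# rgb_images / nir_images: synchronized views of the same board poses.
# rgb_pts, nir_pts = detect_corners(rgb_images), detect_corners(nir_images)
# obj_pts = [objp] * len(rgb_pts)
#
# Per-camera intrinsic matrix K and distortion vector (k1, k2, p1, p2, k3),
# i.e., the radial (k*) and tangential (p*) terms of the table above:
# _, K_rgb, dist_rgb, _, _ = cv2.calibrateCamera(obj_pts, rgb_pts, IMG_SIZE,
#                                                None, None)
# _, K_nir, dist_nir, _, _ = cv2.calibrateCamera(obj_pts, nir_pts, IMG_SIZE,
#                                                None, None)
#
# Rotation R and translation T of one camera relative to the other:
# _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
#     obj_pts, rgb_pts, nir_pts, K_rgb, dist_rgb, K_nir, dist_nir,
#     IMG_SIZE, flags=cv2.CALIB_FIX_INTRINSIC)
```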
Target | Steps | Absolute Distance from “Home” in X-Axis (mm) | Calculated Distance from the Target’s Center (mm), Red Laser: Iteration 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | Mean | STD
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
2 | 1667 | 100.2 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0 |
3 | 5000 | 300 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
4 | 6500 | 390 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0 |
5 | 11,000 | 660 | 1 | 1 | 0.5 | 1 | 0.5 | 1 | 0.5 | 1 | 0.5 | 1 | 0.8 | 0.26 |
6 | 15,000 | 900 | 1.5 | 1.5 | 1.5 | 1.5 | 1 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.45 | 0.16 |
7 | 19,000 | 1140 | 1.5 | 1.5 | 2 | 2 | 2 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.65 | 0.24 |
8 | 22,000 | 1320 | 1 | 1.5 | 1 | 1.5 | 1 | 1 | 1 | 1 | 1 | 1 | 1.1 | 0.21 |
9 | 26,000 | 1560 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
10 | 30,000 | 1800 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0 |
11 | 33,000 | 1980 | 0.5 | 0 | 0.5 | 0 | 0 | 0.5 | 0.5 | 0.5 | 0 | 0.5 | 0.3 | 0.26 |
12 | 38,000 | 2280 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
13 | 43,000 | 2580 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 0 |
14 | 46,000 | 2760 | 1 | 1.5 | 1 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1 | 1.5 | 1.35 | 0.24 |
15 | 50,100 | 3000 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
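As a consistency check on the first three columns, the belt drive’s step-to-distance mapping is linear at approximately 0.06 mm per step; a quick worked example from the table’s own rows:

$$
\frac{300\ \text{mm}}{5000\ \text{steps}} = 0.0600\ \tfrac{\text{mm}}{\text{step}},
\qquad
\frac{3000\ \text{mm}}{50{,}100\ \text{steps}} \approx 0.0599\ \tfrac{\text{mm}}{\text{step}},
\qquad
1667\ \text{steps} \times 0.0601\ \tfrac{\text{mm}}{\text{step}} \approx 100.2\ \text{mm}.
$$

This figure is consistent with the 12-tooth GT2 pulley listed in Appendix A (12 teeth × 2 mm pitch = 24 mm of belt travel per revolution), which at 0.06 mm/step would correspond to about 400 effective (micro)steps per pulley revolution.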
Plant No. | Plant Localization Steps (Mean/Std) |
---|---|
A | |
B | |
C | |
D | |
E | |
F |
Evaluation Metric | Result |
---|---|
IoU | |
Precision | |
Recall | |
F1 score |
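The IoU, precision, recall, and F1 score reported here (and per plant in the next table) are standard pixel-wise segmentation metrics. A minimal NumPy sketch, assuming binary ground-truth and predicted plant masks of equal shape (each containing at least one plant pixel):

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Pixel-wise IoU, precision, recall, and F1 for two boolean masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()   # plant pixels found correctly
    fp = np.logical_and(pred, ~truth).sum()  # background labeled as plant
    fn = np.logical_and(~pred, truth).sum()  # plant pixels missed
    iou = tp / (tp + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return iou, precision, recall, f1
```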
Day | Plant No. | IoU | Precision | Recall | F1 Score | ||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Day 10 | A | 0.952 | 0.997 | 0.955 | 0.975 | 0.87 | 0.37 | 215 | 217 | 0.42 | 0.46 | 0.79 | 0.80 |
B | 0.932 | 0.998 | 0.933 | 0.965 | 0.67 | 4.1 | 195 | 198 | 0.39 | 0.47 | 0.82 | 0.83 | |
C | 0.943 | 0.995 | 0.947 | 0.971 | 0.90 | 1 | 231 | 231 | 0.33 | 0.38 | 0.79 | 0.80 | |
D | 0.919 | 0.994 | 0.925 | 0.958 | 0.74 | 4.9 | 210 | 211 | 0.33 | 0.46 | 0.81 | 0.82 | |
E | 0.939 | 0.996 | 0.943 | 0.968 | 0.43 | 1.44 | 235 | 239 | 0.31 | 0.39 | 0.75 | 0.77 | |
Day 17 | A | 0.977 | 0.992 | 0.982 | 0.987 | 1.75 | 2.65 | 373 | 374 | 0.38 | 0.45 | 0.88 | 0.88 |
B | 0.974 | 0.998 | 0.977 | 0.987 | 0.82 | 0.99 | 375 | 376 | 0.41 | 0.51 | 0.85 | 0.86 | |
C | 0.967 | 0.996 | 0.971 | 0.983 | 1.65 | 3.1 | 414 | 416 | 0.38 | 0.46 | 0.84 | 0.85 | |
D | 0.962 | 0.998 | 0.964 | 0.981 | 1.66 | 2.37 | 379 | 387 | 0.36 | 0.45 | 0.88 | 0.89 | |
E | 0.976 | 0.994 | 0.982 | 0.988 | 1.55 | 2.24 | 421 | 420 | 0.39 | 0.53 | 0.86 | 0.87 | |
Day 24 | A | 0.991 | 0.995 | 0.996 | 0.996 | 0.67 | 4.6 | 632 | 631 | 0.63 | 0.61 | 0.90 | 0.90 |
B | 0.992 | 0.997 | 0.995 | 0.996 | 0.55 | 2.57 | 612 | 611 | 0.62 | 0.62 | 0.93 | 0.94 | |
C | 0.991 | 0.996 | 0.995 | 0.996 | 0.69 | 0.21 | 565 | 565 | 0.62 | 0.64 | 0.94 | 0.94 | |
D | 0.992 | 0.998 | 0.994 | 0.996 | 0.64 | 1.1 | 668 | 667 | 0.59 | 0.57 | 0.91 | 0.91 | |
E | 0.990 | 0.997 | 0.993 | 0.995 | 1.1 | 2.3 | 593 | 594 | 0.63 | 0.62 | 0.92 | 0.93 | |
Day 31 | A | 0.988 | 0.989 | 0.999 | 0.994 | 0.59 | 0.91 | 668 | 666 | 0.53 | 0.52 | 0.89 | 0.89 |
B | 0.989 | 0.991 | 0.998 | 0.995 | 0.98 | 0.12 | 640 | 638 | 0.65 | 0.64 | 0.93 | 0.93 | |
C | 0.990 | 0.991 | 0.999 | 0.995 | 0.28 | 2.23 | 608 | 606 | 0.54 | 0.54 | 0.91 | 0.91 | |
D | 0.991 | 0.993 | 0.998 | 0.996 | 0.38 | 8.32 | 681 | 678 | 0.58 | 0.59 | 0.90 | 0.90 | |
E | 0.988 | 0.991 | 0.997 | 0.994 | 0.58 | 1.06 | 631 | 629 | 0.49 | 0.49 | 0.91 | 0.91 |
Target | Steps | Absolute Distance from “Home” in X-Axis (mm) | Measured Deviation along Y-Axis (mm), Red and Green Lasers: Iteration 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | Mean | STD
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
2 | 1667 | 100.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
3 | 5000 | 300 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
4 | 6500 | 390 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 | 11,000 | 660 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
6 | 15,000 | 900 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0 |
7 | 19,000 | 1140 | 1 | 1 | 1 | 1 | 0.5 | 1 | 0.5 | 1 | 1 | 1 | 0.9 | 0.21 |
8 | 22,000 | 1320 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
9 | 26,000 | 1560 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
10 | 30,000 | 1800 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
11 | 33,000 | 1980 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
12 | 38,000 | 2280 | 1 | 1 | 1 | 0.5 | 1 | 1 | 1 | 1 | 0.5 | 1 | 0.9 | 0.21 |
13 | 43,000 | 2580 | 0 | 0.5 | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0.15 | 0.24 |
14 | 46,000 | 2760 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
15 | 50,100 | 3000 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Evaluation Metric | Result |
---|---|
IoU | |
Precision | |
Recall | |
F1 score | |
D | |
S | |
C |