AI-Assisted Vision for Agricultural Robots
Abstract
1. Introduction
2. Review Framework
3. Operational Classification Analysis
3.1. Weed Detection
Sensor | Crop | Proposed AI Algorithm | Metrics | Ref. |
---|---|---|---|---|
RGB | Peach | Faster-RCNN | PR: 86.41% RE: 92.15% MIoU: 85.45% | [11] |
RGB | Rice | ESNet | PR: 86.18% RE: 86.53% MIoU: 51.78% F1: 86.08% | [10] |
RGB | Red radish, garden cress, and dandelion | Faster-RCNN | mAP: 67–95% (plants), 84–99% (weeds) | [19] |
RGB | Maize | YOLOv3 | AC: 93.43 (maize), 90.9 (weeds) | [12] |
RGB | Soybean | Area feature, Template matching, Saturation threshold, Voting algorithm | Area feature: AC 73.3%; Template matching: AC 68.42%; Saturation threshold: AC 65.22%; Voting algorithm: AC 81.82% | [21] |
RGB | 32 kinds [22] | kNN, SVM, Decision tree, Random forest, CNN | kNN: AC 84.4%, PR 85.2%; SVM: AC 77%, PR 71%; Decision tree: AC 78.8%, PR 79.5%; Random forest: AC 90%, PR 79.5%; CNN: AC 99.5% | [23] |
RGB | Maize, common bean | Mask RCNN | mAP: 0.49 | [15] |
RGB | Cabbage | Haar cascade classifier | AC: 96.3% | [24] |
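Many of the weed-detection pipelines above begin with a vegetation/background segmentation step before classification or detection. As a minimal, illustrative sketch (not drawn from any of the cited works), the following Python/OpenCV snippet computes the Excess Green index and applies Otsu thresholding to obtain a vegetation mask; the input file name is hypothetical.

```python
# Minimal sketch: Excess Green (ExG) vegetation segmentation with Otsu
# thresholding, a classical pre-processing step for crop/weed detection.
import cv2
import numpy as np

def vegetation_mask(bgr_image: np.ndarray) -> np.ndarray:
    """Return a binary (0/255) mask of green vegetation from a BGR image."""
    b, g, r = cv2.split(bgr_image.astype(np.float32) / 255.0)
    exg = 2.0 * g - r - b                        # Excess Green index, range [-2, 2]
    exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening removes isolated noise pixels
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

if __name__ == "__main__":
    image = cv2.imread("field_image.jpg")        # hypothetical input file
    mask = vegetation_mask(image)
    cv2.imwrite("vegetation_mask.png", mask)
```

The resulting mask is typically the input to the deep or classical classifiers listed in the table, which then separate crop from weed pixels or objects.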
3.2. Crop Scouting
Sensor | Crop | Task | Proposed AI Algorithm | Metrics | Ref. |
---|---|---|---|---|---|
Spectral | Soybeans | Image segmentation | Simple linear regression, NDVI-based segmentation | R2: 0.71 (daytime), 0.85 (nighttime); AC: 72.5% (daytime), 73.9% (nighttime) | [25] |
Stereo camera | Chinese cabbage, potato, sesame, radish, and soybean | Crop height measurement | Coordinate transformations of pixels | R2: 0.78–0.84 | [26] |
Stereo, thermal, spectral camera | Grape | Harvesting zone sorting | - | - | [27] |
RGB | Greenhouse tomato | Fruit counting | Faster R-CNN (detection) Centroid based (counting) | AC: 88.6% (occluded objects included) AC: 90.2% (without occluded objects) | [28] |
OptRx | Orchards and vineyards | Canopy thickness | Proprietary | R2: 0.78–0.80 | [29] |
Spectral, thermal | Grape | Water status | PLS | R2: 0.57 (morning), 0.42 (midday) RMSE 0.191 | [30] |
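The NDVI-based segmentation listed for [25] relies on the standard normalized difference of near-infrared and red reflectance. A minimal sketch of that index computation is given below, assuming co-registered red and NIR bands as NumPy arrays; the 0.4 canopy threshold is an illustrative value only, not one reported by the cited study.

```python
# Minimal sketch: NDVI computation and thresholding for canopy/soil segmentation.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), clipped to [-1, 1]."""
    red = red.astype(np.float32)
    nir = nir.astype(np.float32)
    index = (nir - red) / (nir + red + eps)
    return np.clip(index, -1.0, 1.0)

def canopy_mask(red: np.ndarray, nir: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Binary canopy mask; the 0.4 threshold is an illustrative value only."""
    return ndvi(red, nir) > threshold
```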
3.3. Phenotyping
Sensor | Crop | Task | Proposed AI Algorithm | Metrics | Ref. |
---|---|---|---|---|---|
Trinocular stereo camera | Maize, sorghum | Plant height measurements | 3D reconstruction | R2: 0.99 | [31] |
Stereo camera, ToF depth sensor, IR camera | Energy sorghum | Plant height, stem width measurements | - | Absolute measurement error: 15% (plant height), 13% (stem width) | [32] |
RGB-D | Sorghum | Leaf area, length, and width measurements | Structure from Motion (3D reconstruction), SVM (pixel classification), MLP (phenotype classification) | Relative RMSE: 26.15% (Leaf area), 26.67% (Leaf length), 25.15% (Leaf width) | [33] |
Binocular RGB cameras | Sorghum | Plant height, plant width, convex hull volume, and surface area measurements | 3DMST, OpenCV’s Semi-Global Block Matching (SGBM) | Height: STD 0.05 m, CV 4.7% (3DMST); STD 0.04 m, CV 3.8% (SGBM). Width: STD 0.03 m, CV 8.6% (3DMST); STD 0.04 m, CV 9.8% (SGBM). Convex hull volume: STD 0.03 m3, CV 17.8% (3DMST); STD 0.03 m3, CV 10.7% (SGBM). Plant surface area: STD 0.08 m2, CV 8.7% (3DMST); STD 0.11 m2, CV 9.1% (SGBM) | [34] |
RGB and IR | Wheat | - | - | - | [35] |
RGB | Corn | Corn stand counting | Transfer learning, CNN with Softmax layer replaced by SVM | R2: 0.96 | [36] |
RGB-D | Maize | Stem detection and diameter measurement | Faster RCNN, convex hull and plane projection | mAP: 67% R2: 0.72 RMSE 2.95 | [37] |
RGB-D | Maize | Maize stalk detection | CNN | AC: 90% | [38] |
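Several of the phenotyping entries estimate plant height or stem size by converting image pixels and depth values into metric 3D coordinates. The sketch below illustrates this kind of pinhole back-projection under simplifying assumptions (a calibrated, downward-looking depth camera at a known height above the ground); it is not the reconstruction pipeline of any cited work.

```python
# Minimal sketch: back-project a metric depth map into camera coordinates and
# derive a plant-height estimate, assuming a nadir-viewing depth camera.
import numpy as np

def backproject(depth_m: np.ndarray, fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Convert a depth map (meters) to an HxWx3 point cloud in camera coordinates."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)

def plant_height(depth_m: np.ndarray, canopy_mask: np.ndarray,
                 camera_height_m: float, fx: float, fy: float,
                 cx: float, cy: float) -> float:
    """Estimate height as camera height minus distance to the highest canopy point."""
    points = backproject(depth_m, fx, fy, cx, cy)
    canopy_z = points[..., 2][canopy_mask]
    # For a camera pointing straight down, the closest canopy point is the plant top.
    return camera_height_m - float(np.min(canopy_z))
```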
3.4. Disease/Insect/Deficiency Detection
Sensor | Crop | Task | Proposed AI Algorithm | Metrics | Ref. |
---|---|---|---|---|---|
RGB | Greenhouse tomato | Leaf mold, yellow leaf curl virus | SVM, RF, CNN | SVM: AC 98.61%; RF: AC 80%; CNN: AC 80% | [43] |
RGB | Greenhouse tomato | Plant village dataset diseases [44] | AlexNet SqueezeNet | AC: 96% AC: 94% | [40] |
RGB | Cotton, Groundnut | Bacterial blight/magnesium deficiency (cotton) Leaf spot/anthracnose (groundnut) | NN | AC: 83–96% | [42] |
RGB | Cotton | Healthy leaf, healthy cotton, healthy ripening boll, diseased leaf, diseased/damaged cotton, diseased ripening boll, and insect | - | AC: 90% | [48] |
RGB | Coffee | Alternaria, Bacterial Blight, Myrothecium | K-means | AC: 79% | [45] |
RGB, Spectral, Thermal | Olive tree | Xylella fastidiosa | - | R2 < 0.45 | [46] |
RGB-D, Spectral, Thermal | Apple | Rust, Scab | Mask R-CNN (segmentation), VGG16 (classification) | PR: 99.2% RE: 97.5% F1: 98.3% (Healthy); PR: 96% RE: 98% F1: 97% (Rust); PR: 97.2% RE: 97.2% F1: 97.2% (Scab) | [39] |
RGB | - | Pyralidae insects | Otsu segmentation, Hu moments | - | [47] |
RGB | Basil | Bacterial blight | K-means SVM | - | [41] |
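The CNN classifiers in this table (e.g., AlexNet and SqueezeNet in [40]) are typically obtained by transfer learning from ImageNet weights. The snippet below is a minimal sketch of that setup with PyTorch/torchvision, assuming a hypothetical number of disease classes; training code is omitted and the random tensor stands in for a real image batch.

```python
# Minimal sketch: transfer learning with a pre-trained AlexNet for leaf-disease
# classification. NUM_CLASSES is a hypothetical value, not from the cited works.
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 10  # hypothetical number of disease classes

model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, NUM_CLASSES)   # replace the 1000-way ImageNet head

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Training loop omitted; inference on a preprocessed leaf-image tensor:
model.eval()
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))      # stand-in for a real image batch
    predicted_class = int(logits.argmax(dim=1))
```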
3.5. Robotic Spraying
Sensor | Crop | Proposed AI Algorithm | Metrics | Ref. |
---|---|---|---|---|
RGB | Vineyards | Data fusion algorithm | Error standard deviation: x = 0.34, y = 0.81, θ = 0.11, φ = 0.17 | [49] |
Multispectral camera | Grapes (powdery mildew) | Spectral indices | 85–100% (detection accuracy), 65–85% (reduction in pesticide use) | [52] |
Kinect (RGB, IR) | Any | Hough transformations | 19% detection error | [53] |
Stereo camera | greenhouse tomato, vineyards | Sensor fusion | 0.23 m trajectory error | [54] |
RGB camera | Cereals | Image processing, dedicated developed algorithm | 18–97% savings in herbicide | [51] |
High-speed RGB camera | Orchards, vineyards | Image analysis | Airflow up to 10 m/s, distance 300 mm, 150 mm diameter | [50] |
RGB | Any | Image analysis | Pesticide reduction: 45% | [55] |
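Targeted spraying systems reduce pesticide use by opening nozzles only over detected targets. The sketch below shows this decision logic in its simplest form, with a hypothetical nozzle count and coverage threshold; real systems add geometric calibration between camera and boom, latency compensation, and flow control.

```python
# Minimal sketch: map a binary target mask to per-nozzle on/off spray commands.
# Nozzle count and coverage threshold are hypothetical values.
import numpy as np

def nozzle_commands(target_mask: np.ndarray, n_nozzles: int = 4,
                    coverage_threshold: float = 0.05) -> list[bool]:
    """Split the mask into vertical bands, one per nozzle, and open a nozzle
    only if the target (weed/disease) pixel fraction in its band is high enough."""
    bands = np.array_split(target_mask.astype(bool), n_nozzles, axis=1)
    return [band.mean() > coverage_threshold for band in bands]

# Example: spray only where a detector has flagged target pixels.
mask = np.zeros((480, 640), dtype=np.uint8)
mask[:, 500:] = 1                       # pretend a weed patch was detected on the right
print(nozzle_commands(mask))            # [False, False, False, True]
```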
3.6. Robotic Harvesting
Sensor | Crop | Proposed AI Algorithm | Metrics | Ref. |
---|---|---|---|---|
RGB camera | Apples | CNN with improved YOLOv5 | PR: 83.83% RE: 91.48% mAP: 86.75% F1: 87.49% | [63] |
RGB camera | Zanthoxylum pepper | CNN with improved YOLOv5 | mAP: 94.5%, Detection speed (s/pic) 0.012, Detection speed on edge device (s/pic) 0.072, GPU load on edge device 20.11 Detection FPS on edge device 33.23 | [64] |
RGB camera | Strawberries | Mask R-CNN and custom vision strawberry localization method to find the location of the strawberries | PR: 95.78% RE: 95.41%, MIoU: 89.85% average error of visual strawberry localization = ±1.2 mm | [65] |
RGB-D camera | Sweet peppers | Color filter for identifying the pepper, semantic segmentation using deep learning, Canny edge and Hough transformation for stem detection | Under a single-row system assumption, harvest performance averaged over both crop varieties was 61% for the modified crop versus 29% for the unmodified crop; per variety, the modified condition reached 81% (first variety) and 43% (second variety), while the unmodified condition reached 36% and 23%, respectively. | [56] |
RGB camera | Apples | Support vector machine with radial basis function | AC: 77% Average harvesting time 15 s per apple. | [66] |
RGB and stereo camera | Strawberries | Color filtering to detect strawberries and set an ROI for searching for the peduncle in the 3D image | AC: 60%. For fruits with 80% maturity or more, the successful harvesting rate was 41.3% with a suction device and 34.9% without suction. | [58] |
Stereo camera | Tomatoes | Image normalization by red–green difference graying, Otsu segmentation, ellipse fitting; fruit localization via feature extraction and matching with Harris corners and the camera’s intrinsic parameters | AC: 99.3%. Tomato position error less than 10 mm when the distance is under 600 mm, except for some singular points. Picking success rates of 86% and 88% in two rounds of tests. | [62] |
RGB | Tomatoes | HSI color filtering, image binarization, circular fitting | Success rates of 83.9% and 79.4% in the first and second rounds of tests | [61] |
RGB camera | Apples and oranges | Image pre- and post-processing and Yolov3 for detection | MIoU: 89.7% PR: 93.7% RE: 93.3% F1: 92.8% | [67] |
Stereo camera | Tomatoes | Color extraction, Euclidean distance clustering, Z-sorting, Sphere fitting using RANSAC | Successful harvesting rate: 60% | [68] |
RGB camera | Green pepper | Local contrast enhancement, Super-pixels extracted via energy-driven sampling (SEEDS), saliency map construction, morphological operations | AC: 83.6% RE: 81.2% | [69] |
RGB camera | Apples | Adaptive gamma correction, color filtering, total pixel area calculation | Average time reduction ratio: 70%. | [70] |
RGB camera | Broccoli | low-pass filtering, E7*E7 Laws’ texture energy filter, median filtering, morphological operation | PR: 99.5%, AC: 92.4% | [59] |
RGB and ToF camera | Aubergine | For aubergine detection: color transformation, cubic SVM, watershed transformation, point cloud extraction, ellipse fitting; for occlusions: centroid calculation and vector direction calculation | RE: 88.10% PR: 88.35% F1: 87.8% | [57] |
RGB and NIR camera | Apples, avocado, capsicum, mango, rock melon, strawberry, orange, and sweet pepper | Faster R-CNN with late or early fusion. | F1: 0.828–0.948 | [71] |
RGB and Stereo cameras | Strawberries | Color segmentation using HSI color space, thresholds in this color space to get the maturity level, region of interest (ROI) for searching for the peduncle on a predefined threshold area | Successful harvesting rate: 54.9% | [60] |
RGB camera | Strawberries and oranges | Multi-task Cascaded Convolutional Networks using three CNNs, augmented fusion dataset | TPR: 0.98 at a 0.9 threshold | [72] |
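Several of the classical harvesting pipelines above (e.g., [61] and [62]) localize ripe fruit through color filtering followed by geometric fitting. The following sketch illustrates that approach with HSV thresholding and contour fitting; the HSV ranges and minimum area are illustrative values, not parameters from the cited systems.

```python
# Minimal sketch: colour filtering plus contour fitting to localise ripe red
# fruit in a BGR frame. Thresholds are illustrative only.
import cv2
import numpy as np

def detect_red_fruit(bgr_image: np.ndarray, min_area: int = 500):
    """Return (centre, radius) tuples for candidate ripe-fruit regions."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges.
    mask = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 80, 60), (180, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    detections = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue
        (x, y), radius = cv2.minEnclosingCircle(contour)
        detections.append(((int(x), int(y)), int(radius)))
    return detections
```

In a complete harvesting system, each detected centre would then be converted into a 3D grasp target using stereo or depth data, as several of the cited works do.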
3.7. Robotic Vision Navigation
Sensor | Crop | Proposed AI Algorithm | Metrics | Ref. |
---|---|---|---|---|
Multispectral camera | Orchards | Green plane extraction, thresholding | Maximum deviation: 3.5 cm RMSE: 2.13 cm | [82] |
RGB camera | Wheat and sorghum | Kalman filter | RMSE: 28–120 mm | [76] |
RGB camera | Orchards | Gabor filtering, PCA, K-means, Silhouette method, medial axis, fuzzy logic | Maximum trajectory tracking: 14.6 ± 6.8 mm RMSE: 45.3 mm | [73] |
RGB camera | Maize | Background segmentation, binary image, line extraction | RMSE: 78.1 ± 7.5 mm | [77] |
RGB camera | Crop-row fields | Deep CNN | RMSE: 5.8 degrees | [89] |
RGB camera | Sweet, green and snow pea, Chinese lettuce, Cabbage, green pepper, tomato, and tea | Vegetation index, K-means, pixel spatial operations, RANSAC | Max. RMSE of positioning (pixels): 83.9 Max. RMSE angle (degrees): 16.1 | [88] |
Stereo camera | Soya bean Fields | Sobel, sum squared difference | RMSE speed estimation: 0.13 m/s Yaw angle estimation: 0.44 degrees | [91] |
RGB camera | Arable fields | Otsu’s method, SIFT | Average deviation: 3.82 cm (Navigation accuracy) F1: 62.1% (Lane Switching technique) | [74] |
RGB camera | Crop-row fields | Excess Green Index, least-square fitting method | Deviation: 4 cm | [79] |
RGB camera | Maize | Otsu’s method, Canny edge, Hough transformation, linear fitting | Angle difference within ±5° between the extracted and manually set navigation lines | [80] |
RGB camera | Wolfberry orchards | gray scaling, maximum entropy threshold, morphological opening operation, rectangle fitting, least-square method, fuzzy control | Lateral deviation ≤ 6.2 cm, Average lateral deviation: 2.9 cm | [83] |
RGB camera | Crop row fields | Vegetation index-based image segmentation, Otsu’s method, Particle Swarm Optimization, Morphological Operation, Floyd algorithm, linear least-square method | Maximum deviation detection accuracy: Θ(left) 0.49, Θ(middle) 0.4303, Θ(right) 4.2302, Θ(average) 1.5283 | [87] |
RGB camera | Greenhouse cucumber | Gray scaling, image binarization, morphological operations, Hough transformation, least-square method | First experiment, maximum angle deviation: predicted-point Hough transform 0.48°, traditional Hough transform 9.51°, least-square method 15.21°. Second experiment, maximum angle deviation: predicted-point Hough transform 0.46°, traditional Hough transform 1.46°, least-square method 5.28°. | [84] |
RGB camera | Paddy fields | SUSAN corners, Hough transform, fuzzy controller | With zero initial position and angle error: SE of 4.29 degrees and 44.68 mm. With initial position error of −20 mm and angle error of −12 degrees: SE of 8.61 degrees and 45.42 mm. With initial position error of 80 mm and angle error of 5 degrees: SE of 8.85 degrees and 53.56 mm. With initial position error of 40 mm and angle error of 17 degrees: SE of 8.60 degrees and 47.32 mm. | [78] |
RGB camera | Maize | Image segmentation, image denoising, position point extraction, straight-line fitting, extract navigation line | AC: 92% | [75] |
Time-of-Flight camera | Maize and sorghum | Bilateral filtering, RANSAC, Euclidean clustering | MAE: 3.4–3.6 cm. Lateral positioning MAE: 4.2–5.0 cm | [64] |
RGB camera | Sugar beet | Gray scaling, Hough transform | Mean error: 5–198 (±6–108) mm Median error: 22 mm | [85] |
RGB camera | Tea crops | Semantic segmentation, Hough-line transform | Angle bias: 6.2 degrees and 13.9 pixels distance | [81] |
NIR and depth camera | Carrots, onions, and radish | Image segmentation, RANSAC, particle filter | RANSAC: 94.40% Particle filter: 97.71% | [90] |
RGB camera | Potato | Excess green, morphological operation, least-square error method | Mean crop row detection accuracy (CRDA) of DBMR compared with TMGEM and DAGP: TMGEM 0.627, DAGP 0.860, DBMR 0.871 | [86] |
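Most row-guidance methods in this table share a common backbone: segment vegetation, extract candidate row lines, and convert the dominant line into a steering signal. The sketch below illustrates that backbone with a probabilistic Hough transform on a binary vegetation mask (such as the Excess Green mask sketched after the table in Section 3.1); parameter values are illustrative only and not taken from the cited systems.

```python
# Minimal sketch: crop-row line extraction with a probabilistic Hough transform
# and a simple lateral-offset steering signal. Parameters are illustrative.
import cv2
import numpy as np

def detect_row_lines(vegetation_mask: np.ndarray, min_line_length: int = 100):
    """Return line segments (x1, y1, x2, y2) fitted to crop rows in an 8-bit binary mask."""
    edges = cv2.Canny(vegetation_mask, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=min_line_length, maxLineGap=20)
    return [] if lines is None else [tuple(line[0]) for line in lines]

def lateral_offset(lines, image_width: int) -> float:
    """Offset (pixels) of the mean row-line midpoint from the image centre,
    usable as a simple steering signal for a row-following controller."""
    if not lines:
        return 0.0
    mid_x = np.mean([(x1 + x2) / 2.0 for x1, y1, x2, y2 in lines])
    return float(mid_x - image_width / 2.0)
```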
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Fountas, S.; Mylonas, N.; Malounas, I.; Rodias, E.; Santos, C.H.; Pekkeriet, E. Agricultural robotics for field operations. Sensors 2020, 20, 2672. [Google Scholar] [CrossRef] [PubMed]
- Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128. [Google Scholar] [CrossRef]
- Reddy, N.V.; Reddy, A.; Pranavadithya, S.; Kumar, J.J. A critical review on agricultural robots. Int. J. Mech. Eng. Technol. 2016, 7, 183–188. [Google Scholar]
- Alberto-Rodriguez, A.; Neri-Muñoz, M.; Fernández, J.C.R.; Márquez-Vera, M.A.; Velasco, L.E.R.; Díaz-Parra, O.; Hernández-Huerta, E. Review of control on agricultural robot tractors. Int. J. Comb. Optim. Probl. Inform. 2020, 11, 9–20. [Google Scholar]
- Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. A review of autonomous navigation systems in agricultural environments. In Proceedings of the SEAg 2013: Innovative Agricultural Technologies for a Sustainable Future, Barton, Australia, 22–25 September 2013. [Google Scholar]
- Zhao, Y.; Gong, L.; Huang, Y.; Liu, C. A review of key techniques of vision-based control for harvesting robot. Comput. Electron. Agric. 2016, 127, 311–323. [Google Scholar] [CrossRef]
- Fue, K.G.; Porter, W.M.; Barnes, E.M.; Rains, G.C. An extensive review of mobile agricultural robotics for field operations: Focus on cotton harvesting. AgriEngineering 2020, 2, 150–174. [Google Scholar] [CrossRef] [Green Version]
- Defterli, S.G. Review of robotic technology for strawberry production. Appl. Eng. Agric. 2016, 32, 301–318. [Google Scholar]
- Raja, R.; Nguyen, T.T.; Slaughter, D.C.; Fennimore, S.A. Real-time weed-crop classification and localisation technique for robotic weed control in lettuce. Biosyst. Eng. 2020, 192, 257–274. [Google Scholar] [CrossRef]
- Adhikari, S.P.; Yang, H.; Kim, H. Learning semantic graphics using convolutional encoder–decoder network for autonomous weeding in paddy. Front. Plant Sci. 2019, 10, 1404. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Luo, J.; You, Y.; Wang, D.; Sun, X.; Lv, J.; Ma, W.; Zhang, X. Peach tree detection for weeding robot based on Faster-RCNN. In Proceedings of the 2020 ASABE Annual International Virtual Meeting, Virtual, 13–15 July 2020; p. 1. [Google Scholar]
- Quan, L.; Jiang, W.; Li, H.; Li, H.; Wang, Q.; Chen, L. Intelligent intra-row robotic weeding system combining deep learning technology with a targeted weeding mode. Biosyst. Eng. 2022, 216, 13–31. [Google Scholar] [CrossRef]
- Raja, R.; Nguyen, T.T.; Vuong, V.L.; Slaughter, D.C.; Fennimore, S.A. RTD-SEPs: Real-time detection of stem emerging points and classification of crop-weed for robotic weed control in producing tomato. Biosyst. Eng. 2020, 195, 152–171. [Google Scholar] [CrossRef]
- Choi, K.H.; Han, S.K.; Han, S.H.; Park, K.-H.; Kim, K.-S.; Kim, S. Morphology-based guidance line extraction for an autonomous weeding robot in paddy fields. Comput. Electron. Agric. 2015, 113, 266–274. Available online: https://www.sciencedirect.com/science/article/pii/S0168169915000563 (accessed on 17 March 2022). [CrossRef]
- Champ, J.; Mora-Fallas, A.; Goëau, H.; Mata-Montero, E.; Bonnet, P.; Joly, A. Instance segmentation for the fine detection of crop and weed plants by precision agricultural robots. Appl. Plant Sci. 2020, 8, e11373. [Google Scholar] [CrossRef]
- Machleb, J.; Peteinatos, G.G.; Sökefeld, M.; Gerhards, R. Sensor-based intrarow mechanical weed control in sugar beets with motorized finger weeders. Agronomy 2021, 11, 1517. [Google Scholar] [CrossRef]
- Igawa, H.; Tanaka, T.; Kaneko, S.; Tada, T.; Suzuki, S.; Ohmura, I. Base position detection of grape stem considering its displacement for weeding robot in vineyards. In Proceedings of the IECON 2012-38th Annual Conference on IEEE Industrial Electronics Society, Montreal, QC, Canada, 25–28 October 2012; pp. 2567–2572. [Google Scholar]
- Zhang, C.; Huang, X.; Liu, W.; Zhang, Y.; Li, N.; Zhang, J.; Li, W. Information acquisition method for mechanical intra-row weeding robot. Trans. Chin. Soc. Agric. Eng. 2012, 28, 142–146. [Google Scholar]
- Shah, T.M.; Nasika, D.P.B.; Otterpohl, R. Plant and weed identifier robot as an agroecological tool using artificial neural networks for image identification. Agriculture 2021, 11, 222. [Google Scholar] [CrossRef]
- Raja, R.; Nguyen, T.T.; Slaughter, D.C.; Fennimore, S.A. Real-time robotic weed knife control system for tomato and lettuce based on geometric appearance of plant labels. Biosyst. Eng. 2020, 194, 152–164. [Google Scholar] [CrossRef]
- Miao, Z.; Yu, X.; Li, N.; He, C.; Sun, T. Weed Detection Based on the Fusion of Multiple Image Processing Algorithms. In Proceedings of the 2021 40th Chinese Control Conference (CCC), Shanghai, China, 26–28 July 2021; pp. 4217–4222. [Google Scholar]
- Wu, S.G.; Bao, F.S.; Xu, E.Y.; Wang, Y.-X.; Chang, Y.-F.; Xiang, Q.-L. A leaf recognition algorithm for plant classification using probabilistic neural network. In Proceedings of the 2007 IEEE International Symposium on Signal Processing and Information Technology, Giza, Egypt, 15–18 December 2007; pp. 11–16. [Google Scholar]
- Sethia, G.; Guragol, H.K.S.; Sandhya, S.; Shruthi, J.; Rashmi, N. Automated Computer Vision based Weed Removal Bot. In Proceedings of the 2020 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), Bangalore, India, 2–4 July 2020; pp. 1–6. [Google Scholar]
- Vedula, R.; Nanda, A.; Gochhayat, S.S.; Hota, A.; Agarwal, R.; Reddy, S.K.; Mahapatra, S.; Swain, K.K.; Das, S. Computer vision assisted autonomous intra-row weeder. In Proceedings of the 2018 International Conference on Information Technology (ICIT), Bhubaneswar, India, 20–22 December 2018; pp. 79–84. [Google Scholar]
- Yamasaki, Y.; Morie, M.; Noguchi, N. Development of a high-accuracy autonomous sensing system for a field scouting robot. Comput. Electron. Agric. 2022, 193, 106630. [Google Scholar] [CrossRef]
- Kim, W.-S.; Lee, D.-H.; Kim, Y.-J.; Kim, T.; Lee, W.-S.; Choi, C.-H. Stereo-vision-based crop height estimation for agricultural robots. Comput. Electron. Agric. 2021, 181, 105937. [Google Scholar] [CrossRef]
- Rovira-Más, F.; Saiz-Rubio, V.; Cuenca-Cuenca, A. Sensing architecture for terrestrial crop monitoring: Harvesting data as an asset. Sensors 2021, 21, 3114. [Google Scholar] [CrossRef] [PubMed]
- Seo, D.; Cho, B.-H.; Kim, K. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211. [Google Scholar] [CrossRef]
- Vidoni, R.; Gallo, R.; Ristorto, G.; Carabin, G.; Mazzetto, F.; Scalera, L.; Gasparetto, A. ByeLab: An agricultural mobile robot prototype for proximal sensing and precision farming. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition 2017, Tampa, FL, USA, 3–9 November 2017; Volume 58370, p. V04AT05A057. [Google Scholar]
- Fernández-Novales, J.; Saiz-Rubio, V.; Barrio, I.; Rovira-Más, F.; Cuenca-Cuenca, A.; Santos Alves, F.; Valente, J.; Tardaguila, J.; Diago, M.P. Monitoring and mapping vineyard water status using non-invasive technologies by a ground robot. Remote Sens. 2021, 13, 2830. [Google Scholar] [CrossRef]
- Shafiekhani, A.; Kadam, S.; Fritschi, F.B.; DeSouza, G.N. Vinobot and vinoculer: Two robotic platforms for high-throughput field phenotyping. Sensors 2017, 17, 214. [Google Scholar] [CrossRef] [PubMed]
- Young, S.N.; Kayacan, E.; Peschel, J.M. Design and field evaluation of a ground robot for high-throughput phenotyping of energy sorghum. Precis. Agric. 2019, 20, 697–722. [Google Scholar]
- Vijayarangan, S.; Sodhi, P.; Kini, P.; Bourne, J.; Du, S.; Sun, H.; Poczos, B.; Apostolopoulos, D.; Wettergreen, D. High-throughput robotic phenotyping of energy sorghum crops. In Field and Service Robotics; Springer: Cham, Switzerland, 2018; pp. 99–113. [Google Scholar]
- Bao, Y.; Tang, L.; Breitzman, M.W.; Fernandez, M.G.S.; Schnable, P.S. Field-based robotic phenotyping of sorghum plant architecture using stereo vision. J. Field Robot. 2019, 36, 397–415. [Google Scholar] [CrossRef]
- Grimstad, L.; Skattum, K.; Solberg, E.; Loureiro, G.; From, P.J. Thorvald II configuration for wheat phenotyping. In Proceedings of the IROS Workshop on Agri-Food Robotics: Learning from Industry, Vancouver, BC, Canada, 28 September 2017; p. 4. [Google Scholar]
- Kayacan, E.; Zhang, Z.-Z.; Chowdhary, G. Embedded High Precision Control and Corn Stand Counting Algorithms for an Ultra-Compact 3D Printed Field Robot. In Proceedings of the Robotics: Science and Systems, Pittsburgh, PA, USA, 26–30 June 2018; Volume 14, p. 9. [Google Scholar]
- Fan, Z.; Sun, N.; Qiu, Q.; Li, T.; Feng, Q.; Zhao, C. In Situ Measuring Stem Diameters of Maize Crops with a High-Throughput Phenotyping Robot. Remote Sens. 2022, 14, 1030. [Google Scholar] [CrossRef]
- Manish, R.; Lin, Y.-C.; Ravi, R.; Hasheminasab, S.M.; Zhou, T.; Habib, A. Development of a Miniaturized Mobile Mapping System for In-Row, Under-Canopy Phenotyping. Remote Sens. 2021, 13, 276. [Google Scholar] [CrossRef]
- Karpyshev, P.; Ilin, V.; Kalinov, I.; Petrovsky, A.; Tsetserukou, D. Autonomous mobile robot for apple plant disease detection based on cnn and multi-spectral vision system. In Proceedings of the 2021 IEEE/SICE International Symposium on System Integration (SII), Iwaki, Japan, 11–14 January 2021; pp. 157–162. [Google Scholar]
- Durmuş, H.; Güneş, E.O.; Kırcı, M. Disease detection on the leaves of the tomato plants by using deep learning. In Proceedings of the 2017 6th International Conference on Agro-Geoinformatics, Fairfax, VA, USA, 7–10 August 2017; pp. 1–5. [Google Scholar]
- Nooraiyeen, A. Robotic Vehicle for Automated Detection of Leaf Diseases. In Proceedings of the 2020 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), Bangalore, India, 2–4 July 2020; pp. 1–6. [Google Scholar]
- Pilli, S.K.; Nallathambi, B.; George, S.J.; Diwanji, V. eAGROBOT—A robot for early crop disease detection using image processing. In Proceedings of the 2015 2nd International Conference on Electronics and Communication Systems (ICECS), Coimbatore, India, 26–27 February 2015; pp. 1684–1689. [Google Scholar]
- Fernando, S.; Nethmi, R.; Silva, A.; Perera, A.; de Silva, R.; Abeygunawardhana, P.K.W. Intelligent disease detection system for greenhouse with a robotic monitoring system. In Proceedings of the 2020 2nd International Conference on Advancements in Computing (ICAC), Colombo, Sri Lanka, 10–11 December 2020; Volume 1, pp. 204–209. [Google Scholar]
- Hughes, D.; Salathé, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv 2015, arXiv:1511.08060. [Google Scholar]
- Rahul, M.S.P.; Rajesh, M. Image processing based Automatic Plant Disease Detection and Stem Cutting Robot. In Proceedings of the 2020 Third International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 20–22 August 2020; pp. 889–894. [Google Scholar]
- Rey, B.; Aleixos, N.; Cubero, S.; Blasco, J. XF-ROVIM. A field robot to detect olive trees infected by Xylella fastidiosa using proximal sensing. Remote Sens. 2019, 11, 221. [Google Scholar] [CrossRef] [Green Version]
- Hu, Z.; Liu, B.; Zhao, Y. Agricultural robot for intelligent detection of pyralidae insects. In Agricultural Robots-Fundamentals and Applications; IntechOpen: London, UK, 2018. [Google Scholar]
- Doddamani, S.T.; Karadgi, S.; Giriyapur, A.C. Multi-Label Classification of Cotton Plant with Agriculture Mobile Robot. In Data Intelligence and Cognitive Informatics; Springer: Singapore, 2022; pp. 759–772. [Google Scholar]
- Zaidner, G.; Shapiro, A. A novel data fusion algorithm for low-cost localisation and navigation of autonomous vineyard sprayer robots. Biosyst. Eng. 2016, 146, 133–148. [Google Scholar] [CrossRef]
- Malneršič, A.; Dular, M.; Širok, B.; Oberti, R.; Hočevar, M. Close-range air-assisted precision spot-spraying for robotic applications: Aerodynamics and spray coverage analysis. Biosyst. Eng. 2016, 146, 216–226. [Google Scholar] [CrossRef]
- Berge, T.W.; Goldberg, S.; Kaspersen, K.; Netland, J. Towards machine vision based site-specific weed management in cereals. Comput. Electron. Agric. 2012, 81, 79–86. [Google Scholar] [CrossRef]
- Oberti, R.; Marchi, M.; Tirelli, P.; Calcante, A.; Iriti, M.; Tona, E.; Hočevar, M.; Baur, J.; Pfaff, J.; Schütz, C. Selective spraying of grapevines for disease control using a modular agricultural robot. Biosyst. Eng. 2016, 146, 203–215. [Google Scholar] [CrossRef]
- Hejazipoor, H.; Massah, J.; Soryani, M.; Vakilian, K.A.; Chegini, G. An intelligent spraying robot based on plant bulk volume. Comput. Electron. Agric. 2021, 180, 105859. [Google Scholar] [CrossRef]
- Cantelli, L.; Bonaccorso, F.; Longo, D.; Melita, C.D.; Schillaci, G.; Muscato, G. A small versatile electrical robot for autonomous spraying in agriculture. AgriEngineering 2019, 1, 391–402. [Google Scholar] [CrossRef] [Green Version]
- Berenstein, R.; Edan, Y. Automatic adjustable spraying device for site-specific agricultural application. IEEE Trans. Autom. Sci. Eng. 2017, 15, 641–650. [Google Scholar] [CrossRef]
- Arad, B.; Balendonck, J.; Barth, R.; Ben-Shahar, O.; Edan, Y.; Hellström, T.; Hemming, J.; Kurtser, P.; Ringdahl, O.; Tielen, T.; et al. Development of a sweet pepper harvesting robot. J. Field Robot. 2020, 37, 1027–1039. [Google Scholar] [CrossRef]
- SepúLveda, D.; Fernández, R.; Navas, E.; Armada, M.; González-De-Santos, P. Robotic aubergine harvesting using dual-arm manipulation. IEEE Access 2020, 8, 121889–121904. [Google Scholar] [CrossRef]
- Hayashi, S.; Shigematsu, K.; Yamamoto, S.; Kobayashi, K.; Kohno, Y.; Kamata, J.; Kurita, M. Evaluation of a strawberry-harvesting robot in a field test. Biosyst. Eng. 2010, 105, 160–171. [Google Scholar] [CrossRef]
- Blok, P.M.; Barth, R.; van den Berg, W. Machine vision for a selective broccoli harvesting robot. IFAC-PapersOnLine 2016, 49, 66–71. [Google Scholar] [CrossRef]
- Hayashi, S.; Yamamoto, S.; Saito, S.; Ochiai, Y.; Kamata, J.; Kurita, M.; Yamamoto, K. Field operation of a movable strawberry-harvesting robot using a travel platform. Jpn. Agric. Res. Q. JARQ 2014, 48, 307–316. [Google Scholar] [CrossRef] [Green Version]
- Wang, X.; Wu, P.; Feng, Q.; Wang, G. Design and test of tomatoes harvesting robot. J. Agric. Mech. Res. 2016, 4, 94–98. [Google Scholar]
- Lili, W.; Bo, Z.; Jinwei, F.; Xiaoan, H.; Shu, W.; Yashuo, L.; Zhou, Q.; Chongfeng, W. Development of a tomato harvesting robot used in greenhouse. Int. J. Agric. Biol. Eng. 2017, 10, 140–149. [Google Scholar] [CrossRef]
- Yan, B.; Fan, P.; Lei, X.; Liu, Z.; Yang, F. A real-time apple targets detection method for picking robot based on improved YOLOv5. Remote Sens. 2021, 13, 1619. [Google Scholar] [CrossRef]
- Gai, J.; Xiang, L.; Tang, L. Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle. Comput. Electron. Agric. 2021, 188, 106301. [Google Scholar] [CrossRef]
- Yu, Y.; Zhang, K.; Yang, L.; Zhang, D. Fruit detection for strawberry harvesting robot in non-structural environment based on Mask-RCNN. Comput. Electron. Agric. 2019, 163, 104846. [Google Scholar] [CrossRef]
- De-An, Z.; Jidong, L.; Wei, J.; Ying, Z.; Yu, C. Design and control of an apple harvesting robot. Biosyst. Eng. 2011, 110, 112–122. [Google Scholar] [CrossRef]
- Kuznetsova, A.; Maleva, T.; Soloviev, V. Using YOLOv3 algorithm with pre-and post-processing for apple detection in fruit-harvesting robot. Agronomy 2020, 10, 1016. [Google Scholar] [CrossRef]
- Yaguchi, H.; Nagahama, K.; Hasegawa, T.; Inaba, M. Development of an autonomous tomato harvesting robot with rotational plucking gripper. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 652–657. [Google Scholar]
- Ji, W.; Gao, X.; Xu, B.; Chen, G.; Zhao, D. Target recognition method of green pepper harvesting robot based on manifold ranking. Comput. Electron. Agric. 2020, 177, 105663. [Google Scholar] [CrossRef]
- Lv, J.; Wang, Y.; Xu, L.; Gu, Y.; Zou, L.; Yang, B.; Ma, Z. A method to obtain the near-large fruit from apple image in orchard for single-arm apple harvesting robot. Sci. Hortic. 2019, 257, 108758. [Google Scholar] [CrossRef]
- Sa, I.; Ge, Z.; Dayoub, F.; Upcroft, B.; Perez, T.; McCool, C. Deepfruits: A fruit detection system using deep neural networks. Sensors 2016, 16, 1222. [Google Scholar] [CrossRef] [Green Version]
- Zhang, L.; Gui, G.; Khattak, A.M.; Wang, M.; Gao, W.; Jia, J. Multi-task cascaded convolutional networks based intelligent fruit detection for designing automated robot. IEEE Access 2019, 7, 56028–56038. [Google Scholar] [CrossRef]
- Opiyo, S.; Okinda, C.; Zhou, J.; Mwangi, E.; Makange, N. Medial axis-based machine-vision system for orchard robot navigation. Comput. Electron. Agric. 2021, 185, 106153. [Google Scholar] [CrossRef]
- Ahmadi, A.; Halstead, M.; McCool, C. Towards autonomous crop-agnostic visual navigation in arable fields. arXiv 2021, arXiv:2109.11936. [Google Scholar]
- Yang, S.; Mei, S.; Zhang, Y. Detection of maize navigation centerline based on machine vision. IFAC-PapersOnLine 2018, 51, 570–575. [Google Scholar] [CrossRef]
- English, A.; Ross, P.; Ball, D.; Corke, P. Vision based guidance for robot navigation in agriculture. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 1693–1698. [Google Scholar]
- Xue, J.; Zhang, L.; Grift, T.E. Variable field-of-view machine vision based row guidance of an agricultural robot. Comput. Electron. Agric. 2012, 84, 85–91. [Google Scholar] [CrossRef]
- Zhang, Q.; Chen, M.E.S.; Li, B. A visual navigation algorithm for paddy field weeding robot based on image understanding. Comput. Electron. Agric. 2017, 143, 66–78. [Google Scholar] [CrossRef]
- Ahmadi, A.; Nardi, L.; Chebrolu, N.; Stachniss, C. Visual servoing-based navigation for monitoring row-crop fields. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 4920–4926. [Google Scholar]
- Gong, J.; Wang, X.; Zhang, Y.; Lan, Y.; Mostafa, K. Navigation line extraction based on root and stalk composite locating points. Comput. Electr. Eng. 2021, 92, 107115. [Google Scholar] [CrossRef]
- Lin, Y.-K.; Chen, S.-F. Development of navigation system for tea field machine using semantic segmentation. IFAC-PapersOnLine 2019, 52, 108–113. [Google Scholar] [CrossRef]
- Radcliffe, J.; Cox, J.; Bulanon, D.M. Machine vision for orchard navigation. Comput. Ind. 2018, 98, 165–171. [Google Scholar] [CrossRef]
- Ma, Y.; Zhang, W.; Qureshi, W.S.; Gao, C.; Zhang, C.; Li, W. Autonomous navigation for a wolfberry picking robot using visual cues and fuzzy control. Inf. Process. Agric. 2021, 8, 15–26. [Google Scholar] [CrossRef]
- Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation path extraction for greenhouse cucumber-picking robots using the prediction-point Hough transform. Comput. Electron. Agric. 2021, 180, 105911. [Google Scholar] [CrossRef]
- Bakker, T.; Wouters, H.; Van Asselt, K.; Bontsema, J.; Tang, L.; Müller, J.; van Straten, G. A vision based row detection system for sugar beet. Comput. Electron. Agric. 2008, 60, 87–95. [Google Scholar] [CrossRef]
- García-Santillán, I.; Peluffo-Ordoñez, D.; Caranqui, V.; Pusdá, M.; Garrido, F.; Granda, P. Computer vision-based method for automatic detection of crop rows in potato fields. In Proceedings of the International Conference on Information Technology & Systems, Libertad City, Ecuador, 10–12 January 2018; pp. 355–366. [Google Scholar]
- Zhang, X.; Li, X.; Zhang, B.; Zhou, J.; Tian, G.; Xiong, Y.; Gu, B. Automated robust crop-row detection in maize fields based on position clustering algorithm and shortest path method. Comput. Electron. Agric. 2018, 154, 165–175. [Google Scholar] [CrossRef]
- Morio, Y.; Teramoto, K.; Murakami, K. Vision-based furrow line detection for navigating intelligent worker assistance robot. Eng. Agric. Environ. Food 2017, 10, 87–103. [Google Scholar] [CrossRef]
- Bakken, M.; Moore, R.J.D.; From, P. End-to-end learning for autonomous crop row-following. IFAC-PapersOnLine 2019, 52, 102–107. [Google Scholar] [CrossRef]
- Halmetschlager, G.; Prankl, J.; Vincze, M. Probabilistic near infrared and depth based crop line identification. In Proceedings of the Workshop Proceedings of IAS-13 Conference on 2014, Padova, Italy, 18 July 2014; pp. 474–482. [Google Scholar]
- Kise, M.; Zhang, Q. Development of a stereovision sensing system for 3D crop row structure mapping and tractor guidance. Biosyst. Eng. 2008, 101, 191–198. [Google Scholar] [CrossRef]
- Chen, T.; Kornblith, S.; Norouzi, M.; Hinton, G.E. A Simple Framework for Contrastive Learning of Visual Representations. In Proceedings of the 37th International Conference on Machine Learning, Virtual Event, 13–18 July 2020. [Google Scholar]
- Güldenring, R.; Nalpantidis, L. Self-supervised contrastive learning on agricultural images. Comput. Electron. Agric. 2021, 191, 106510. [Google Scholar] [CrossRef]
- Feurer, M.; Klein, A.; Eggensperger, K.; Springenberg, J.T.; Blum, M.; Hutter, F. Efficient and Robust Automated Machine Learning. In Proceedings of the NIPS 2015, Montreal, QC, Canada, 7–12 December 2015. [Google Scholar]
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. arXiv 2021, arXiv:2010.11929. [Google Scholar]
- Danzon-Chambaud, S. PRISMA Checklist for ‘A Systematic Review of Automated Journalism Scholarship: Guidelines and Suggestions for Future Research’. Zenodo, 16 January 2021. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).