Precise Navigation of Small Agricultural Robots in Sensitive Areas with a Smart Plant Camera
Abstract
1. Introduction
2. Methods
2.1. Plant Camera and Imaging
2.2. Cross-Correlation
3. Strategies
3.1. Mask
3.2. Number of Tracking Points and Averaged Image Lines
3.3. Error Correction
3.4. Embedded System
4. Results
| Scene | R² (Gray) | R² (Binary) | RMSE in px (Gray) | RMSE in px (Binary) | Pixel/Row | Normalized RMSE in mm (Gray) | Normalized RMSE in mm (Binary) |
|---|---|---|---|---|---|---|---|
| 1 | 0.3362 | 0.1477 | 1.508 | 2.299 | 64 | 3.77 | 5.75 |
| 2 | 0.9962 | 0.9909 | 0.9015 | 0.9328 | 47 | 3.07 | 3.18 |
| 3 | 0.9984 | 0.9935 | 0.8452 | 1.345 | 64 | 2.11 | 3.36 |
| 4 | 0.9904 | 0.9753 | 0.870 | 1.071 | 59 | 2.36 | 2.90 |
| 5 | 0.9728 | 0.9205 | 1.758 | 1.210 | 77 | 3.65 | 2.51 |
| 6 | 0.9303 | 0.5164 | 2.844 | 4.820 | 80 | 5.69 | 9.64 |
| 7 | 0.9968 | 0.9866 | 0.8468 | 1.029 | 38 | 3.57 | 4.33 |
| 8 | 0.9463 | 0.9744 | 2.726 | 1.957 | 58 | 7.52 | 5.40 |
| 9 | 0.9644 | 0.6476 | 2.365 | 4.021 | 114 | 3.32 | 5.64 |
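The "Normalized RMSE in mm" columns appear to rescale the pixel-domain RMSE by the ground resolution of one crop row. A minimal sketch of this relation, assuming a nominal row spacing of 160 mm (an inference from the tabulated values, not stated in this excerpt):

```python
# Apparent normalization behind the table: RMSE in pixels is scaled to
# millimetres using the number of pixels spanning one crop row.
# ROW_SPACING_MM is an assumption inferred from the tabulated values.
ROW_SPACING_MM = 160.0

def normalized_rmse_mm(rmse_px: float, pixels_per_row: int) -> float:
    """Convert an RMSE given in pixels to millimetres on the ground."""
    return rmse_px * ROW_SPACING_MM / pixels_per_row

# Example: scene 3, gray-value image.
print(round(normalized_rmse_mm(0.8452, 64), 2))  # 2.11, matching the table
```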
[Figure: binary results]
5. Conclusions
- Image lines in the x direction can be condensed by averaging in the y direction.
- The cross-correlation function does not have to move over the entire pixel line.
- The moving mask of the cross-correlation can be reduced to a few periodic replications with a rectangular shape, which reduces both the number of pixels used and the number of multiplications.
- The reduced mask need only move over a distance smaller than one period, so that exactly one maximum peak of the cross-correlation appears in the inspection window.
- Missing tracking points can be interpolated with a simple linear regression function (see the sketch after this list).
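The list above summarizes the tracking strategy in prose. A minimal Python/NumPy sketch of how these steps could fit together is given below; it assumes a known row period in pixels, and the function names, the three-replication default, and the frame-indexed regression for gap filling are illustrative assumptions rather than the authors' embedded implementation.

```python
import numpy as np

def track_row_offset(image: np.ndarray, period_px: int,
                     mask_width_px: int, n_replications: int = 3) -> int:
    """Estimate the lateral offset of periodic crop rows in one image strip.

    Illustrative sketch of the strategy summarized above:
      1. average the image lines in the y direction into one pixel line,
      2. correlate it with a reduced rectangular mask of a few periodic
         replications,
      3. shift the mask over less than one period, so that a single maximum
         remains inside the inspection window.
    """
    # 1. Condense the image lines in the x direction by averaging in y.
    line = image.mean(axis=0)

    # 2. Reduced mask: a few rectangular pulses repeated with the row period,
    #    far shorter than the full pixel line.
    mask = np.zeros(n_replications * period_px)
    for k in range(n_replications):
        mask[k * period_px : k * period_px + mask_width_px] = 1.0

    # 3. Move the mask over less than one period only; with a periodic scene
    #    this leaves exactly one correlation maximum in the window.
    scores = [np.dot(line[s : s + mask.size], mask) for s in range(period_px)]
    return int(np.argmax(scores))  # lateral row offset in pixels

def fill_missing_points(frames: np.ndarray, offsets: np.ndarray,
                        all_frames: np.ndarray) -> np.ndarray:
    """Interpolate missing tracking points with a simple linear regression,
    as proposed in the last item of the list above."""
    slope, intercept = np.polyfit(frames, offsets, deg=1)
    return slope * all_frames + intercept
```

Replacing the full-length correlation with a handful of short dot products is what keeps the per-frame cost low enough for the embedded system described in Section 3.4.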
Acknowledgments
Author Contributions
Conflicts of Interest
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).