Research on Agricultural Autonomous Positioning and Navigation System Based on LIO-SAM and Apriltag Fusion
Abstract
1. Introduction
- A General Framework for Visual-Enhanced LiDAR-Inertial Odometry: This work presents a principled methodology for embedding sparse but high-fidelity visual fiducial markers into a tightly coupled LiDAR-inertial odometry framework. The core contribution is the integration of an Apriltag factor, in place of the original GNSS factor, into the factor graph, which provides global constraints and cumulative-error correction for the robot in GNSS-denied environments. By designing a tightly coupled Apriltag factor within the LIO-SAM backbone, the system achieves more than a simple substitution of GNSS: it is specifically engineered for the intermittent nature of visual observations, leveraging sporadic tag detections to correct long-term drift without relying on continuous GNSS signals. This approach bridges the gap between dense, relative LiDAR measurements and sparse, absolute visual detections, establishing a generalizable paradigm for state estimation in GNSS-denied environments beyond the specific case of Apriltags (a minimal factor-graph sketch follows this list of contributions).
- A Systematic Deployment Framework for Visual Fiducial Markers: A key contribution of this work is to transform tag deployment from an ad hoc process into a systematic, principle-driven practice. Through rigorous experiments, we established a quantitative relationship between tag spacing and positioning accuracy and determined that a spacing of 3 m is an optimal balance point, achieving centimeter-level accuracy (0.06 m RMSE) while preserving deployment efficiency. In addition, this study proposes a systematic deployment strategy: Apriltags are mounted vertically on brackets, and their spatial density can be flexibly adjusted to different positioning-accuracy requirements. This strategy fundamentally reduces the complexity and cost of large-scale deployment, making scalable and reliable deployment feasible in broad agricultural environments (a rough deployment-planning sketch also follows this list).
- Comprehensive Experimental Validation: We provide a thorough quantitative evaluation of the system’s performance. Experiments emulating real-world operating conditions were conducted to rigorously validate not only the superior localization accuracy (centimeter-level precision with dense tag deployment) but also the robustness and reliability of Apriltag detection under varying lighting conditions and viewing angles. This empirical validation offers strong evidence for the reliability and practicality of the proposed approach.
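To make the Apriltag-for-GNSS substitution concrete, the following minimal sketch (Python, using the GTSAM bindings that underlie LIO-SAM’s C++ back end) shows one plausible way to attach an intermittent tag detection to a keyframe as a global pose constraint alongside the LiDAR odometry factors. The symbol names, noise values, and frame conventions are illustrative assumptions, not the authors’ implementation.

```python
# Illustrative sketch only: folding a sparse Apriltag detection into a LIO-SAM-style
# factor graph as a global pose constraint, in place of the GNSS factor.
# Uses the GTSAM Python bindings; symbol names and noise values are assumptions.
import gtsam
import numpy as np
from gtsam.symbol_shorthand import X  # X(i): robot body pose at keyframe i

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()
isam = gtsam.ISAM2()

# Prior on the first keyframe to fix the gauge before any tag is observed.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1] * 6))
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))
initial.insert(X(0), gtsam.Pose3())

# Relative constraint from the LiDAR-inertial front end between consecutive keyframes.
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.01, 0.01, 0.01, 0.05, 0.05, 0.05]))  # [rotation (rad); translation (m)]

def add_odometry(i, prev_estimate, rel_pose):
    """Connect keyframe i to keyframe i-1 with a LiDAR odometry factor."""
    graph.add(gtsam.BetweenFactorPose3(X(i - 1), X(i), rel_pose, odom_noise))
    initial.insert(X(i), prev_estimate.compose(rel_pose))  # initial guess

# Global constraint from a tag observation; this plays the role of the GNSS factor.
tag_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.02, 0.02, 0.02, 0.03, 0.03, 0.03]))

def add_apriltag_factor(i, T_map_tag, T_cam_tag, T_body_cam):
    """Anchor keyframe i in the map frame from a single tag detection.

    T_map_tag  -- surveyed tag pose in the map frame (known from deployment)
    T_cam_tag  -- tag pose in the camera frame estimated by the detector (PnP)
    T_body_cam -- camera-to-body extrinsics of the robot
    """
    # Body pose in the map frame implied by this one detection.
    T_map_body = T_map_tag.compose(T_cam_tag.inverse()).compose(T_body_cam.inverse())
    graph.add(gtsam.PriorFactorPose3(X(i), T_map_body, tag_noise))

def update():
    """Incrementally re-optimize whenever new factors have been queued."""
    global graph, initial
    isam.update(graph, initial)
    graph = gtsam.NonlinearFactorGraph()
    initial = gtsam.Values()
```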
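Similarly, as a rough planning aid rather than part of the paper’s toolchain, the sketch below interpolates the reported spacing-versus-RMSE measurements from the greenhouse scene to pick a tag spacing for a target accuracy and to estimate how many vertically mounted tags one crop row would need. The interpolation scheme, target value, and row length are assumptions.

```python
# Rough deployment-planning sketch based on the reported spacing-versus-RMSE data
# (greenhouse scene). The interpolation and tag-count logic are illustrative assumptions.
import numpy as np

spacing_m = np.array([3.0, 5.0, 7.0, 10.0])       # tag spacings tested in the paper
rmse_m = np.array([0.060, 0.069, 0.084, 0.098])   # corresponding positioning RMSE (m)

def spacing_for_accuracy(target_rmse: float) -> float:
    """Spacing whose (linearly interpolated) RMSE matches the target accuracy."""
    if target_rmse <= rmse_m.min():
        return float(spacing_m.min())   # even the densest tested spacing may be required
    if target_rmse >= rmse_m.max():
        return float(spacing_m.max())
    return float(np.interp(target_rmse, rmse_m, spacing_m))

def tags_per_row(row_length_m: float, spacing: float) -> int:
    """Number of vertically mounted tags needed along one crop row."""
    return int(np.floor(row_length_m / spacing)) + 1

if __name__ == "__main__":
    target = 0.07                                  # e.g., a 7 cm RMSE requirement
    s = spacing_for_accuracy(target)
    print(f"spacing ~{s:.1f} m -> {tags_per_row(40.0, s)} tags for a 40 m row")
```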
2. Materials and Methods
2.1. Mobile Robot Composition
2.2. Construction of the Moving Platform
2.2.1. Environmental Awareness Module
2.2.2. Chassis Module
2.3. Mobile Robot Detection Platform Software Design
2.4. Realization Principle of Autonomous Mapping and Navigation Function of Mobile Robot
2.4.1. Multi-Sensor Calibration and Time Synchronization
2.4.2. Point Cloud Preprocessing
2.4.3. Feature Detection and Matching
- (1) Label detection and decoding
- (2) Homography matrix solution
- (3) Pose estimation (an illustrative detection-to-pose sketch follows this list)
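The three steps listed above can be prototyped end to end as in the minimal sketch below, which assumes the pupil-apriltags Python detector and OpenCV’s planar PnP solver. The tag size, camera intrinsics, and corner ordering are placeholder assumptions, and the paper’s actual detection pipeline may differ.

```python
# Minimal detection-to-pose sketch for the three steps listed above: (1) tag detection
# and decoding, (2) homography/correspondences, (3) pose estimation. Assumes the
# pupil-apriltags detector and OpenCV; not the authors' exact pipeline.
import cv2
import numpy as np
from pupil_apriltags import Detector

TAG_SIZE = 0.16  # printed tag edge length in metres (assumed)
K = np.array([[615.0, 0.0, 320.0],
              [0.0, 615.0, 240.0],
              [0.0, 0.0, 1.0]])  # pinhole intrinsics (assumed calibration)

detector = Detector(families="tag36h11")

def tag_poses_from_image(gray: np.ndarray):
    """Detect Apriltags in a grayscale image and recover each tag's camera-frame pose."""
    # (1) Label detection and decoding: quad extraction plus payload decoding.
    detections = detector.detect(gray)

    # Tag corner coordinates in the tag's own frame (z = 0 plane).
    half = TAG_SIZE / 2.0
    obj_pts = np.array([[-half,  half, 0.0],
                        [ half,  half, 0.0],
                        [ half, -half, 0.0],
                        [-half, -half, 0.0]], dtype=np.float64)

    poses = {}
    for det in detections:
        # (2) Homography: det.homography maps the tag plane to the image; here the
        #     decoded corner correspondences are used directly instead.
        img_pts = det.corners.astype(np.float64)  # corner order assumed to match obj_pts

        # (3) Pose estimation: planar PnP yields rotation (Rodrigues vector) + translation.
        ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, None,
                                      flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            R, _ = cv2.Rodrigues(rvec)
            poses[det.tag_id] = (R, tvec)          # T_cam_tag rotation and translation
    return poses
```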
2.4.4. Factor Graph Construction
- (1) Factor graph model construction
- (2) From GNSS factor to Apriltag factor
- (3) Integration factor graph
2.5. Performance Experiments
2.5.1. Experimental Equipment and Materials
2.5.2. Comparison Test of Map Accuracy in Different Scenarios and Methods
2.5.3. Testing the Relationship Between Tag Distance and Positioning Accuracy
2.5.4. Navigation Accuracy Experiments
2.5.5. Apriltag Stability Experiments
- (1) Occlusion experiments
- (2) Different illumination condition experiments
- (3) Background interference and changing distance experiments
3. Results
3.1. The Results and Analysis of Map Construction with Different Methods
3.2. Experimental Results and Analysis of Tag Distance and Positioning Accuracy
3.3. Navigation Accuracy Test Analysis
3.4. Results and Analysis of Apriltag Stability Experiments
4. Discussion
5. Conclusions
- (1) The average errors of the real-time maps constructed by the Gmapping, LIO-SAM, and April-LIO-SAM algorithms in the greenhouse experimental scene were 0.599 m, 0.316 m, and 0.049 m, and the root mean square errors were 0.630 m, 0.372 m, and 0.057 m, respectively. Under GNSS-denied conditions, the mapping accuracy of the April-LIO-SAM algorithm proposed in this paper is significantly better than that of the other mapping algorithms. The navigation experiments indicate that the average lateral deviation of autonomous navigation at different speeds remains at the centimeter level, with the average heading deviation not exceeding 2.5°.
- (2) Positioning accuracy experiments of the autonomous positioning and navigation system were carried out in the greenhouse and orchard environments. With tag spacings of 3.0 m, 5.0 m, 7.0 m, and 10.0 m, the average deviations in Scene 1 were 0.044 m, 0.057 m, 0.059 m, and 0.067 m in the X direction and 0.052 m, 0.060 m, 0.066 m, and 0.084 m in the Y direction. In Scene 2, the average deviations were 0.036 m, 0.039 m, 0.050 m, and 0.056 m in the X direction and 0.032 m, 0.040 m, 0.049 m, and 0.071 m in the Y direction. The root mean square errors were 0.060 m, 0.069 m, 0.084 m, and 0.098 m in Scene 1 and 0.059 m, 0.064 m, 0.079 m, and 0.082 m in Scene 2 (a generic sketch of how such metrics are computed follows these conclusions). The autonomous positioning and navigation system designed in this paper therefore retains high accuracy in greenhouse and orchard environments, meeting the operational requirements.
- (3) Apriltag stability experiments were carried out under different occlusion and illumination conditions, background interference, and varying distances. The results show that the Apriltag recognition rate remains high when the occlusion ratio is below 30%, and Apriltag positioning errors stay within 3.0 mm under the different interference conditions.
- (4) The autonomous navigation system designed in this study can effectively cope with the challenges of high similarity between rows in the greenhouse, poor GNSS signals, and uneven orchard terrain.
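For reference, the following sketch shows one common way to obtain the mean error, RMSE, and maximum/average/standard-deviation statistics quoted in these conclusions from logged checkpoint deviations. It is a generic post-processing example; the paper’s exact conventions (rounding, error definitions) may differ.

```python
# Generic post-processing sketch for the accuracy metrics quoted in the conclusions
# (mean error, RMSE, max/average/standard deviation). Conventions may differ from the paper's.
import numpy as np

def mean_and_rmse(errors_m):
    """Mean absolute error and root-mean-square error over per-checkpoint errors."""
    e = np.abs(np.asarray(errors_m, dtype=float))
    return e.mean(), float(np.sqrt(np.mean(e ** 2)))

def deviation_stats(deviations):
    """Maximum, average, and standard deviation, as reported in the result tables."""
    d = np.abs(np.asarray(deviations, dtype=float))
    return d.max(), d.mean(), d.std(ddof=0)

if __name__ == "__main__":
    checkpoint_errors_m = [0.05, 0.06, 0.04, 0.07, 0.02]   # placeholder values (m)
    mae, rmse = mean_and_rmse(checkpoint_errors_m)
    print(f"mean error = {mae:.3f} m, RMSE = {rmse:.3f} m")

    heading_dev_deg = [1.9, 2.1, 1.7, 2.3]                 # placeholder values (deg)
    print("heading max/avg/std:", deviation_stats(heading_dev_deg))
```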
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Full Term |
|---|---|
| GNSS | Global Navigation Satellite System |
| LIO-SAM | Tightly Coupled LiDAR Inertial Odometry via Smoothing and Mapping |
| IMU | Inertial Measurement Unit |
| SLAM | Simultaneous Localization and Mapping |
| LiDAR | Light Detection and Ranging |
| UWB | Ultra-Wide Band |
| AMCL | Adaptive Monte Carlo Localization |
| RANSAC | Random Sample Consensus |
| RTK | Real-Time Kinematic |
| RFID | Radio Frequency Identification |
| BLE | Bluetooth Low Energy |
| GPS | Global Positioning System |
Appendix A
| Parameter | Specification |
|---|---|
| Appearance size (length × width × height) | 830 × 620 × 370 mm |
| Weight | 55 kg |
| Speed | 0–2.2 m/s |
| Steering characteristics | Differential drive |
| Drive battery specifications | 24 V, 30 Ah |
| Maximum climbing slope | 25° |
| Remote control communication frequency | 2.4 GHz |
| Remote control distance | 0–150 m |
| Rated self-rotating load | 75 kg |
| Motor rated power | 250 W |
| Encoder accuracy | 0.09° |
| Reducer speed ratio | 29.17 |
| Algorithm | Parameter | Position 1 | Position 2 | Position 3 | Position 4 | Position 5 | Position 6 | Position 7 | RMSE (m) | Average Error (m) |
|---|---|---|---|---|---|---|---|---|---|---|
| Gmapping | Absolute error (m) | 0.732 | 0.607 | 0.542 | 0.823 | 0.496 | 0.521 | 0.472 | 0.630 | 0.599 |
| LIO-SAM | Absolute error (m) | 0.316 | 0.224 | 0.257 | 0.188 | 0.202 | 0.706 | 0.320 | 0.372 | 0.316 |
| April-LIO-SAM | Absolute error (m) | 0.044 | 0.051 | 0.076 | 0.050 | 0.017 | 0.069 | 0.041 | 0.057 | 0.049 |
| Scene | Tag Distance (m) | Avg. X Deviation (m) | Max. X Deviation (m) | Std. X Deviation (m) | Avg. Y Deviation (m) | Max. Y Deviation (m) | Std. Y Deviation (m) | RMSE (m) |
|---|---|---|---|---|---|---|---|---|
| 1 | 3.0 | 0.044 | 0.058 | 0.032 | 0.052 | 0.073 | 0.027 | 0.0603 |
| 1 | 5.0 | 0.057 | 0.098 | 0.048 | 0.060 | 0.072 | 0.038 | 0.0694 |
| 1 | 7.0 | 0.059 | 0.114 | 0.053 | 0.066 | 0.096 | 0.067 | 0.0842 |
| 1 | 10.0 | 0.067 | 0.115 | 0.062 | 0.084 | 0.094 | 0.080 | 0.0983 |
| 2 | 3.0 | 0.036 | 0.051 | 0.029 | 0.032 | 0.046 | 0.030 | 0.0593 |
| 2 | 5.0 | 0.039 | 0.057 | 0.031 | 0.040 | 0.057 | 0.037 | 0.0644 |
| 2 | 7.0 | 0.050 | 0.078 | 0.034 | 0.049 | 0.068 | 0.047 | 0.0793 |
| 2 | 10.0 | 0.056 | 0.082 | 0.040 | 0.071 | 0.102 | 0.068 | 0.0821 |
| Speed (m/s) | Exp. | Max. X Deviation (m) | Avg. X Deviation (m) | Std. X Deviation (m) | Max. Heading Deviation (°) | Avg. Heading Deviation (°) | Std. Heading Deviation (°) |
|---|---|---|---|---|---|---|---|
| 0.2 | 1 | 0.079 | 0.035 | 0.024 | 5.0 | 1.9 | 1.2 |
| 0.2 | 2 | 0.119 | 0.040 | 0.027 | 4.8 | 1.8 | 1.4 |
| 0.2 | 3 | 0.094 | 0.039 | 0.021 | 4.5 | 2.0 | 1.1 |
| 0.3 | 1 | 0.116 | 0.036 | 0.030 | 4.5 | 1.7 | 1.3 |
| 0.3 | 2 | 0.105 | 0.040 | 0.024 | 4.9 | 1.9 | 1.4 |
| 0.3 | 3 | 0.112 | 0.043 | 0.026 | 5.0 | 1.8 | 1.2 |
| 0.4 | 1 | 0.114 | 0.047 | 0.032 | 5.1 | 2.0 | 1.2 |
| 0.4 | 2 | 0.143 | 0.051 | 0.034 | 5.7 | 2.1 | 1.5 |
| 0.4 | 3 | 0.117 | 0.053 | 0.029 | 6.1 | 2.3 | 1.6 |