A Blended Extended Kalman Filter Approach for Enhanced AGV Localization in Centralized Camera-Based Control Systems
Abstract
1. Introduction
2. Materials and Methods
2.1. Preparation and Equipment Used in the Experiment
2.1.1. Preparation Used in the Experiment
2.1.2. Equipment Used in the Experiment
2.1.3. IoT System Architecture Concepts
- Sensor Layer: Consists of the encoder, the IMU, and a CCTV camera that records the experimental area.
- Network (Communication) Layer: Uses Wi-Fi wireless communication with the MQTT protocol for lightweight messaging to exchange sensor data with low latency.
- Processing Layer (Central Server): The central server collects sensor data streams, performs synchronization, integrates the data, and stores the records for analysis.
- Application Layer: Provides visualization and location identification for the experimental area.
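As a concrete illustration of the network layer, a sensor node can serialize each sample into a compact JSON payload before publishing it over MQTT. The field names and device id below are illustrative choices, not taken from the paper; a real node would hand `msg` to an MQTT client library (e.g. paho-mqtt's `publish`):

```python
import json
import time

def make_sensor_message(device_id, x, y, theta):
    """Serialize one odometry sample into a compact JSON payload
    suitable for lightweight MQTT messaging."""
    return json.dumps({
        "id": device_id,
        "t": round(time.time(), 3),  # epoch timestamp, used later for synchronization
        "x": x,
        "y": y,
        "theta": theta,
    })

msg = make_sensor_message("agv01", 1.25, 0.40, 0.05)
payload = json.loads(msg)  # what the central server would decode
```

Keeping the payload small and flat is what makes MQTT suitable for the low-latency exchange described above.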
2.2. Localization Methods
2.2.1. Dead Reckoning Localization
- Distance measurement from encoder: An encoder mounted to the drive motor provides wheel rotation data that is converted to linear velocity and angular velocity using the vehicle kinematics model. However, this method is limited by accumulated errors caused by wheel slip, calibration inaccuracies, and surface irregularities [18,19].
- Direction from IMU: A 9-DOF IMU provides angular velocity and linear acceleration. The gyroscope output is crucial for direction calculations, as accelerometer data is highly sensitive to noise and bias errors that grow over time [20].
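The two bullets above can be sketched as a single dead-reckoning step: encoder ticks give the wheel displacement, the gyroscope gives the heading, and the pose is integrated forward. The 90-pulse/rev resolution is from the equipment table; the wheel radius is an assumed value the paper does not specify:

```python
import math

TICKS_PER_REV = 90    # encoder resolution (Table 1)
WHEEL_RADIUS = 0.03   # m, assumed wheel size for illustration

def dead_reckon_step(x, y, theta, ticks_l, ticks_r, omega_z, dt):
    """One dead-reckoning update: encoder ticks -> wheel displacement,
    IMU gyro rate omega_z (rad/s) -> heading, then integrate the pose."""
    circ = 2 * math.pi * WHEEL_RADIUS
    d_l = ticks_l / TICKS_PER_REV * circ
    d_r = ticks_r / TICKS_PER_REV * circ
    d = 0.5 * (d_l + d_r)              # mean displacement of both wheels
    theta_new = theta + omega_z * dt   # heading from gyro, not wheel difference
    x_new = x + d * math.cos(theta_new)
    y_new = y + d * math.sin(theta_new)
    return x_new, y_new, theta_new

# One full wheel revolution straight ahead over dt = 0.1 s.
pose = dead_reckon_step(0.0, 0.0, 0.0, 90, 90, 0.0, 0.1)
```

Because each step adds the previous step's error, wheel slip or calibration error accumulates, which is exactly the drift the camera measurements are meant to bound.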
2.2.2. Camera Localization
- Feature Detection: The system detects the features of the AGV to be located using standard computer vision algorithms.
- Coordinate Transformation: The AGV’s position in the image plane is converted to its real-world coordinates (world frame) via perspective transformation or a pre-calibrated homography matrix.
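The coordinate-transformation step amounts to applying a 3x3 homography with perspective division. The matrix below is a toy pure-scale example (0.01 m/pixel); a real H comes from calibration, e.g. OpenCV's `findHomography` on known floor points:

```python
def apply_homography(H, u, v):
    """Map an image pixel (u, v) to world coordinates using a 3x3
    homography H (row-major nested lists), with perspective division."""
    xh = H[0][0] * u + H[0][1] * v + H[0][2]
    yh = H[1][0] * u + H[1][1] * v + H[1][2]
    w  = H[2][0] * u + H[2][1] * v + H[2][2]
    return xh / w, yh / w

# Illustrative homography: pure scale, no rotation or perspective.
H = [[0.01, 0.0, 0.0],
     [0.0, 0.01, 0.0],
     [0.0,  0.0, 1.0]]
world = apply_homography(H, 320, 240)  # pixel -> metres in the world frame
```

The division by `w` is what distinguishes a general homography from a plain affine map: it accounts for the camera's perspective when the image plane is not parallel to the floor.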
2.3. Pre-Processing Data Analysis
2.3.1. Bias Correction
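Assuming the constant-bias model implied by the mean-error tables in Section 4.1, bias correction reduces to estimating the mean signed error against ground truth and subtracting it from every sample; a minimal sketch:

```python
def estimate_bias(measured, ground_truth):
    """Systematic bias estimated as the mean signed error vs. ground truth."""
    errs = [m - g for m, g in zip(measured, ground_truth)]
    return sum(errs) / len(errs)

def correct(measured, bias):
    """Subtract the constant bias from every sample."""
    return [m - bias for m in measured]

# Illustrative data: a CAM track biased by roughly +0.05 m on one axis.
gt  = [0.0, 1.0, 2.0, 3.0]
cam = [0.05, 1.04, 2.06, 3.05]
b = estimate_bias(cam, gt)
corrected = correct(cam, b)
```

Note this removes only the systematic component; the random component remains and is what the inverse-variance weights in the next subsection quantify.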

2.3.2. Inverse-Variance Weighting
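Inverse-variance weighting assigns each sensor a weight proportional to the reciprocal of its error variance, normalized to sum to one. Feeding in the X-axis variances reported for dt = 0.1 s in Section 4.2 reproduces the CAM weight (about 0.6487) listed there:

```python
def inverse_variance_weights(variances):
    """Weights proportional to 1/variance, normalized to sum to 1."""
    inv = [1.0 / v for v in variances]
    s = sum(inv)
    return [i / s for i in inv]

def fuse(values, weights):
    """Inverse-variance weighted fusion of corrected measurements."""
    return sum(v * w for v, w in zip(values, weights))

# X-axis variances for dt = 0.1 s from the results tables: CAM, then DR.
w_cam_x, w_dr_x = inverse_variance_weights([0.004374, 0.008076])
```

The lower a sensor's bias-corrected variance, the larger its share of the fused estimate, which is the "higher reliability receives higher weight" principle stated in Section 3.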
2.4. Extended Kalman Filter (EKF)
2.4.1. Normal Series Fusion EKF (Normal EKF)
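As a reference point for the fusion variants that follow, here is a stripped-down predict/update cycle with an identity measurement model, so the Jacobians are trivial and the filter reduces to a linear Kalman filter on a diagonal covariance. The Q, R, and inputs are illustrative values, not the paper's tuned parameters:

```python
def predict(x, P, u, Q):
    """Prediction: shift the state by the dead-reckoning displacement u
    and grow the (diagonal) covariance by the process noise Q."""
    x = [x[0] + u[0], x[1] + u[1]]
    P = [P[0] + Q[0], P[1] + Q[1]]
    return x, P

def update(x, P, z, R):
    """Update with H = I: per-axis gain K = P / (P + R), then correct
    the state toward the measurement z and shrink the covariance."""
    K = [P[0] / (P[0] + R[0]), P[1] / (P[1] + R[1])]
    x = [x[0] + K[0] * (z[0] - x[0]), x[1] + K[1] * (z[1] - x[1])]
    P = [(1 - K[0]) * P[0], (1 - K[1]) * P[1]]
    return x, P

x, P = [0.0, 0.0], [1.0, 1.0]                    # initial state and covariance
x, P = predict(x, P, u=[0.1, 0.0], Q=[0.01, 0.01])
x, P = update(x, P, z=[0.12, 0.01], R=[0.05, 0.05])
```

In the normal series-fusion configuration, the CAM and DR observations would each trigger their own `update` call with their own R; the Blended EKF instead merges them into one measurement first.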
2.4.2. Blended EKF
- w_CAM: the weight assigned to the Camera (CAM) measurements;
- w_DR: the weight assigned to the Dead Reckoning (DR) measurements.
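The paper's exact blending equation is not reproduced here; the sketch below assumes a convex per-axis combination consistent with the CAM/DR weight description above, with the DR weight taken as the complement of the CAM weight:

```python
def blend_measurements(z_cam, z_dr, w_x, w_y):
    """Pre-blend CAM and DR position measurements axis-by-axis before the
    EKF update. w_x and w_y in [0, 1] are the CAM weights; the DR weights
    are their complements (1 - w_x) and (1 - w_y)."""
    return (w_x * z_cam[0] + (1 - w_x) * z_dr[0],
            w_y * z_cam[1] + (1 - w_y) * z_dr[1])

# Illustrative weights: trust CAM more in X, DR more in Y.
z = blend_measurements(z_cam=(1.00, 2.00), z_dr=(1.10, 2.10), w_x=0.8, w_y=0.3)
```

Having separate weights per axis is what lets the optimizer in the next subsection exploit the fact that CAM and DR are not equally reliable in X and Y.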
2.4.3. Parameter Optimization Using Optuna and TPE (Tree-Structured Parzen Estimator) Sampler for the Blended EKF
Search Space
Objective Function
TPE Mechanism
Optimization Procedure
- The TPE sampler proposes candidate values for the blending weights;
- The Blended EKF is executed using these parameters;
- The RMSE is computed and stored;
- TPE updates its probability density models l(x) and g(x);
- A new candidate is selected by maximizing the density ratio l(x)/g(x);
- The process repeats until the maximum number of trials is reached.
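In practice this loop is what Optuna's TPESampler runs internally (`optuna.create_study` with an objective that calls `trial.suggest_float`). As a stdlib-only illustration of the l(x)/g(x) idea — a deliberately simplified stand-in, not Optuna and not the paper's exact procedure — one can model the "good" and "bad" trial sets with single Gaussians:

```python
import random
import statistics

def tpe_sketch(objective, lo, hi, n_startup=10, n_trials=40, gamma=0.25, seed=0):
    """Toy 1-D TPE loop: split past trials at the gamma quantile into good/bad
    sets, fit one Gaussian to each, and pick the candidate maximizing l/g."""
    rng = random.Random(seed)
    trials = []  # (x, objective(x)) history

    def gauss(x, mu, sigma):
        return statistics.NormalDist(mu, max(sigma, 1e-6)).pdf(x)

    for i in range(n_trials):
        if i < n_startup:
            x = rng.uniform(lo, hi)  # random warm-up trials
        else:
            ordered = sorted(trials, key=lambda t: t[1])
            n_good = max(1, int(gamma * len(ordered)))
            good = [t[0] for t in ordered[:n_good]]
            bad = [t[0] for t in ordered[n_good:]]
            mu_l, sd_l = statistics.mean(good), statistics.pstdev(good)
            mu_g, sd_g = statistics.mean(bad), statistics.pstdev(bad)
            cands = [rng.uniform(lo, hi) for _ in range(64)]
            # Maximize the density ratio l(x)/g(x); guard against underflow.
            x = max(cands, key=lambda c: gauss(c, mu_l, sd_l)
                    / max(gauss(c, mu_g, sd_g), 1e-300))
        trials.append((x, objective(x)))
    return min(trials, key=lambda t: t[1])

# Stand-in objective: a quadratic with its minimum at w = 0.7 (in the real
# pipeline, the objective would run the Blended EKF and return its RMSE).
best_x, best_val = tpe_sketch(lambda w: (w - 0.7) ** 2, 0.0, 1.0)
```

Real TPE uses Parzen kernel-density mixtures rather than single Gaussians, but the good/bad split and the l/g ratio are the same mechanism.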
2.5. Sensor Synchronization Between EKF and Camera
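One simple synchronization strategy — shown here as an assumption, not necessarily the paper's exact method — is to match each camera frame to the EKF state with the nearest timestamp:

```python
import bisect

def nearest_sample(timestamps, values, t_query):
    """Return the sample whose timestamp is closest to t_query.
    (Linear interpolation between the two neighbors is a common refinement.)"""
    i = bisect.bisect_left(timestamps, t_query)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t_query < t_query - before else values[i - 1]

# EKF states at 10 Hz; a camera frame arrives at t = 0.23 s,
# so it is paired with the state from t = 0.2 s.
ekf_t = [0.0, 0.1, 0.2, 0.3]
ekf_xy = [(0.0, 0.0), (0.01, 0.0), (0.02, 0.0), (0.03, 0.0)]
matched = nearest_sample(ekf_t, ekf_xy, 0.23)
```

With epoch timestamps attached at the sensor node (as in the MQTT payload sketch of Section 2.1.3), this pairing can be done entirely on the central server.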
3. Experimental Design
- Bias Correction Analysis: The raw data from each sensor type were analyzed against Ground Truth (GT) data, which serves as the "true path", to determine the systematic bias and variance. This step is crucial for reducing the Root Mean Squared Error (RMSE) of the raw data and enhancing its reliability before further use.
- Weighting with Inverse-Variance Weighting: The error-corrected data were then statistically weighted. The principle is that sensors with lower bias variance (which means higher reliability) will receive higher weight in data fusion.
- Data Fusion with EKF: The pre-processed data are fed into the EKF system to estimate the state (position and orientation). Two EKF models were developed and compared: (1) a conventional EKF and (2) a Blended EKF, which fuses the CAM and DR measurements before the update step.
- Straight Section: This movement involves constant speed and no sudden changes of direction to assess the ability to maintain a precise position under normal conditions.
- Curved Section: A dynamic movement that readily causes accumulated drift in the dead-reckoning system, used to assess the system's capability and stability when the direction of motion changes.
4. Results and Discussion
4.1. Analysis of Bias Correction (BC) Results
- Camera Sensor (CAM): The bias error values in the X-axis (Mean Error X) were positive, while those in the Y-axis (Mean Error Y) were negative in both cases (dt = 0.1 s and 0.2 s). However, the camera sensor did not measure the rotation angle, so there was no Mean Error θ.
- Dead Reckoning (DR) Sensor: For DR, the error values in the X-axis (Mean Error X) were negative, while those in the Y-axis (Mean Error Y) were positive, indicating error in both directions.
- Inertial Measurement Unit (IMU) Sensor: The IMU is used to measure changes in angle (θ), so the only error is the Mean Error θ.
- Camera Sensor (CAM):
    - Before Correction (Raw): CAM had RMSE X values of 0.0665 m and 0.0601 m, and RMSE Y values of 0.1002 m and 0.0993 m for dt = 0.1 s and dt = 0.2 s, respectively.
    - After Correction (Corrected): RMSE Y decreased markedly to 0.0859 m and 0.0858 m, while RMSE X decreased slightly to 0.0660 m and 0.0598 m, respectively.
- Dead Reckoning (DR) Sensor:
    - Before Correction (Raw): DR had RMSE X values of 0.0934 m and 0.0898 m, and RMSE Y values of 0.0503 m and 0.0561 m.
    - After Correction (Corrected): Bias correction reduced the RMSE of DR: RMSE X decreased to 0.0897 m and 0.0858 m, while RMSE Y decreased to 0.0272 m and 0.0317 m, respectively.
- Inertial Measurement Unit (IMU) Sensor:
    - Before Correction (Raw): The IMU measures the rotation angle (θ). The RMSE values before correction were approximately 1.1959 rad and 1.2039 rad.
    - After Correction (Corrected): The RMSE values decreased to approximately 1.1524 rad and 1.1497 rad.
- True Trajectory (GT): The black line represents the actual trajectory of the object, which is the reference line.
- CAM (Raw): The dark blue dots represent the CAM path before correction.
- CAM (Bias-Corrected): The orange dots represent the CAM path after correction.
- DR (Raw): The green dots represent the DR path before correction.
- DR (Bias-Corrected): The red dots represent the DR path after correction.
4.2. Analysis of the Results of Sensor Data Fusion Using Inverse-Variance Weighting
4.3. Analysis of Data Integration Results Using EKF
- EKF Normal: the standard EKF model using independent CAM and DR observations (tested on both raw and bias-corrected data).
- EKF Blended: an improved EKF model that uses pre-blended CAM–DR measurements based on user-defined weighting coefficients.
4.3.1. RMSE Comparison of EKF Normal and EKF Blended
4.3.2. Contribution of Optuna to Blending-Weight Selection
- optimization history,
- RMSE contour over the weight search domain,
- distribution of Optuna search points.
- a higher CAM weight in the X-axis (greater CAM contribution), and
- a lower CAM weight in the Y-axis (greater DR contribution).
4.3.3. Trajectory Comparison
4.3.4. Interpretation and Summary
4.4. Trajectory Roughness Index (TRI)
4.5. Discussion and Suggestions
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Ito, S.; Hiratsuka, S.; Ohta, M.; Matsubara, H.; Ogawa, M. Small Imaging Depth LIDAR and DCNN-Based Localization for Automated Guided Vehicle. Sensors 2018, 18, 177.
- Wang, Q.; Wu, J.; Liao, Y.; Huang, B.; Li, H.; Zhou, J. Research on Multi-Sensor Fusion Localization for Forklift AGV Based on Adaptive Weight Extended Kalman Filter. Sensors 2025, 25, 5670.
- Milam, G.; Xie, B.; Liu, R.; Zhu, X.; Park, J.; Kim, G.; Park, C.H. Trainable Quaternion Extended Kalman Filter with Multi-Head Attention for Dead Reckoning in Autonomous Ground Vehicles. Sensors 2022, 22, 7701.
- Fragapane, G.; de Koster, R.; Sgarbossa, F.; Strandhagen, J.O. Planning and Control of Autonomous Mobile Robots for Intralogistics: Literature Review and Research Agenda. Eur. J. Oper. Res. 2021, 294, 405–426.
- Bereszyński, K.; Pelic, M.; Paszkowiak, W.; Pabiszczak, S.; Myszkowski, A.; Walas, K.; Czechmanowski, G.; Węgrzynowski, J.; Bartkowiak, T. Passive Wheels—A New Localization System for Automated Guided Vehicles. Heliyon 2024, 10, e34967.
- Vignarca, D.; Vignati, M.; Arrigoni, S.; Sabbioni, E. Infrastructure-Based Vehicle Localization through Camera Calibration for I2V Communication Warning. Sensors 2023, 23, 7136.
- Zheng, Z.; Lu, Y. Research on AGV Trackless Guidance Technology Based on the Global Vision. Sci. Prog. 2022, 105, 00368504221103766.
- Sousa, R.B.; Sobreira, H.M.; Moreira, A.P. A Systematic Literature Review on Long-Term Localization and Mapping for Mobile Robots. J. Field Robot. 2023, 40, 1245–1322.
- Zeghmi, L.; Amamou, A.; Kelouwani, S.; Boisclair, J.; Agbossou, K. A Kalman–Particle Hybrid Filter for Improved Localization of AGV in Indoor Environment. In Proceedings of the 2nd International Conference on Robotics, Automation and Artificial Intelligence (RAAI), Singapore, 9–11 December 2022; pp. 141–147.
- Xu, H.; Li, Y.; Lu, Y. Research on Indoor AGV Fusion Localization Based on Adaptive Weight EKF Using Multi-Sensor. J. Phys. Conf. Ser. 2023, 2428, 012028.
- Sousa, L.C.; Silva, Y.M.R.; Schettino, V.B.; Santos, T.M.B.; Zachi, A.R.L.; Gouvea, J.A.; Pinto, M.F. Obstacle Avoidance Technique for Mobile Robots at Autonomous Human–Robot Collaborative Warehouse Environments. Sensors 2025, 25, 2387.
- Urrea, C.; Agramonte, R. Kalman Filter: Historical Overview and Review of Its Use in Robotics 60 Years after Its Creation. J. Sens. 2021, 2021, 9674015.
- Grzechca, D.; Ziebinski, A.; Paszek, K.; Hanzel, K.; Giel, A.; Czerny, M.; Becker, A. How Accurate Can UWB and Dead Reckoning Positioning Systems Be? Comparison to SLAM Using the RPLidar System. Sensors 2020, 20, 3761.
- Liu, Y.; Wang, S.; Xie, Y.; Xiong, T.; Wu, M. A Review of Sensing Technologies for Indoor Autonomous Mobile Robots. Sensors 2024, 24, 1222.
- Brooks, A.; Makarenko, A.; Upcroft, B. Gaussian Process Models for Sensor-Centric Robot Localisation. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), Orlando, FL, USA, 15–19 May 2006; pp. 56–61.
- Li, M.; Mourikis, A.I. High-Precision, Consistent EKF-Based Visual–Inertial Odometry. Int. J. Robot. Res. 2013, 32, 690–711.
- Fan, Z.; Zhang, L.; Wang, X.; Shen, Y.; Deng, F. LiDAR, IMU, and Camera Fusion for Simultaneous Localization and Mapping: A Systematic Review. Artif. Intell. Rev. 2025, 58, 174.
- Teng, X.; Shen, Z.; Huang, L.; Li, H.; Li, W. Multi-Sensor Fusion Based Wheeled Robot Research on Indoor Positioning Method. Results Eng. 2024, 22, 102268.
- Yan, Y.; Zhang, B.; Zhou, J.; Zhang, Y.; Liu, X. Real-Time Localization and Mapping Utilizing Multi-Sensor Fusion and Visual–IMU–Wheel Odometry for Agricultural Robots. Agronomy 2022, 12, 1740.
- Hesch, J.A.; Kottas, D.G.; Bowman, S.L.; Roumeliotis, S.I. Camera–IMU-Based Localization: Observability Analysis and Consistency Improvement. Int. J. Robot. Res. 2013, 33, 182–201.
- Niu, X.; Wu, Y.; Kuang, J. Wheel-INS: A Wheel-Mounted MEMS IMU-Based Dead Reckoning System. IEEE Trans. Veh. Technol. 2021, 70, 9814–9825.
- Pawako, S.; Khaewnak, N.; Kosiyanurak, A.; Wanglomklang, T.; Srisertpol, J. Optimizing Multi-Camera PnP Localization via Gaussian Process Regression for Intelligent AGV Navigation. Int. J. Intell. Eng. Syst. 2025, 18, 673–686.
- Perera, L.D.L.; Wijesoma, W.S.; Adams, M.D. The Estimation Theoretic Sensor Bias Correction Problem in Map Aided Localization. Int. J. Robot. Res. 2006, 25, 645–667.
- Khaleghi, B.; Khamis, A.; Karray, F.O.; Razavi, S.N. Multisensor Data Fusion: A Review of the State-of-the-Art. Inf. Fusion 2013, 14, 28–44.
- Chen, Y.; Cui, Q.; Wang, S. Fusion Ranging Method of Monocular Camera and Millimeter-Wave Radar Based on Improved Extended Kalman Filtering. Sensors 2025, 25, 3045.
- Akhlaghi, S.; Zhou, N.; Huang, Z. Adaptive Adjustment of Noise Covariance in Kalman Filter for Dynamic State Estimation. In Proceedings of the 2017 IEEE Power & Energy Society General Meeting, Chicago, IL, USA, 16–20 July 2017; pp. 1–5.
- Akiba, T.; Sano, S.; Yanase, T.; Ohta, T.; Koyama, M. Optuna: A Next-Generation Hyperparameter Optimization Framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; ACM: New York, NY, USA, 2019; pp. 2623–2631.
- Bergstra, J.; Bardenet, R.; Bengio, Y.; Kégl, B. Algorithms for Hyper-Parameter Optimization. In Advances in Neural Information Processing Systems; Shawe-Taylor, J., Zemel, R., Bartlett, P., Pereira, F., Weinberger, K.Q., Eds.; Curran Associates, Inc.: Red Hook, NY, USA, 2011; Volume 24, pp. 1–9. Available online: https://proceedings.neurips.cc/paper_files/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf (accessed on 23 November 2025).
- Snoek, J.; Larochelle, H.; Adams, R.P. Practical Bayesian Optimization of Machine Learning Algorithms. In Advances in Neural Information Processing Systems; Pereira, F., Burges, C.J., Bottou, L., Weinberger, K.Q., Eds.; Curran Associates, Inc.: Red Hook, NY, USA, 2012; Volume 25, pp. 1–9. Available online: https://proceedings.neurips.cc/paper_files/paper/2012/file/05311655a15b75fab86956663e1819cd-Paper.pdf (accessed on 23 November 2025).
- Roy, S.; Petrizze, D.; Dihel, L.; Xue, M.; Dolph, C.; Holbrook, H. Using Trajectory Smoothness Metrics to Identify Drones in Radar Track Data. In Proceedings of the AIAA Aviation 2022 Forum, Chicago, IL, USA, 27 June–1 July 2022; NASA Technical Report 20220006977. Available online: https://ntrs.nasa.gov/citations/20220006977 (accessed on 23 November 2025).
- Desai, S.; Shrivastava, A.; D’Elia, M.; Najm, H.N.; Dingreville, R. Trade-Offs in the Latent Representation of Microstructure Evolution. Acta Mater. 2024, 263, 119514.












| Sensor | Model | Specifications |
|---|---|---|
| Encoder | DC Motor JGB37-520 with Double Magnetic Hall Encoder | AB-phase (quadrature) output for direction and revolution detection; 90 pulses/rev; operating voltage: 3.3–5 V DC. |
| IMU | WitMotion WT901C (9-DOF) | Measures linear acceleration (in g), angular velocity (in °/s), and orientation; sampling rate: 10 Hz. |
| CCTV | TP-Link Tapo C200 | Wireless IP camera; 1080p at 30 fps; H.264 compression; IR LED (850 nm, up to 30 ft); WiFi video streaming. |
| Reference Camera (GT) | OKER HD-869 | Full HD (1920 × 1080) resolution; auto-focus lens; used as the ground-truth reference camera. |
| Sampling Interval dt (s) | Sensor | Mean Error X (m) | Mean Error Y (m) | Mean Error θ (rad) |
|---|---|---|---|---|
| 0.1 | CAM | 0.0084 | −0.0515 | – |
| 0.1 | DR | −0.0258 | 0.0423 | – |
| 0.1 | IMU | – | – | 0.2176 |
| 0.2 | CAM | 0.0058 | −0.0500 | – |
| 0.2 | DR | −0.0263 | 0.0463 | – |
| 0.2 | IMU | – | – | 0.2492 |
| Sampling Interval dt (s) | Sensor | RMSE X Raw (m) | RMSE X BC (m) | RMSE Y Raw (m) | RMSE Y BC (m) | RMSE θ Raw (rad) | RMSE θ BC (rad) |
|---|---|---|---|---|---|---|---|
| 0.1 | CAM | 0.0665 | 0.0660 | 0.1002 | 0.0859 | – | – |
| 0.1 | DR | 0.0934 | 0.0897 | 0.0503 | 0.0272 | – | – |
| 0.1 | IMU | – | – | – | – | 1.1959 | 1.1524 |
| 0.2 | CAM | 0.0601 | 0.0598 | 0.0993 | 0.0858 | – | – |
| 0.2 | DR | 0.0898 | 0.0858 | 0.0561 | 0.0317 | – | – |
| 0.2 | IMU | – | – | – | – | 1.2039 | 1.1497 |
| Data Type | Sensor | Var X (m²) | Var Y (m²) | Var θ (rad²) | Weight X | Weight Y | Weight θ |
|---|---|---|---|---|---|---|---|
| Raw | CAM | 0.004374 | 0.007429 | – | 0.648666 | 0.091047 | – |
| Raw | DR | 0.008076 | 0.000744 | – | 0.351334 | 0.908953 | – |
| Raw | IMU | – | – | 1.320644 | – | – | 1.000000 |
| Bias corrected | CAM | 0.004374 | 0.007429 | – | 0.648666 | 0.091047 | – |
| Bias corrected | DR | 0.008076 | 0.000744 | – | 0.351334 | 0.908953 | – |
| Bias corrected | IMU | – | – | 1.320644 | – | – | 1.000000 |
| Data Type | Sensor | Var X (m²) | Var Y (m²) | Var θ (rad²) | Weight X | Weight Y | Weight θ |
|---|---|---|---|---|---|---|---|
| Raw | CAM | 0.003599 | 0.007408 | – | 0.673066 | 0.119823 | – |
| Raw | DR | 0.007409 | 0.001009 | – | 0.326934 | 0.880177 | – |
| Raw | IMU | – | – | 1.320571 | – | – | 1.000000 |
| Bias corrected | CAM | 0.003599 | 0.007408 | – | 0.673066 | 0.119823 | – |
| Bias corrected | DR | 0.007409 | 0.001009 | – | 0.326934 | 0.880177 | – |
| Bias corrected | IMU | – | – | 1.320571 | – | – | 1.000000 |
| Sampling Interval dt (s) | Fusion Type | RMSE X (m) | RMSE Y (m) |
|---|---|---|---|
| 0.1 | Fused (Raw) | 0.0718 | 0.0447 |
| 0.1 | Fused (Corrected) | 0.0695 | 0.0234 |
| 0.2 | Fused (Raw) | 0.0410 | 0.0414 |
| 0.2 | Fused (Corrected) | 0.0411 | 0.0223 |
| Sampling Interval dt (s) | EKF Model | RMSE X (m) | RMSE Y (m) | RMSE XY (m) |
|---|---|---|---|---|
| 0.1 | Normal (RAW) | 0.055219 | 0.044184 | 0.070721 |
| 0.1 | Normal (Bias Corrected) | 0.053581 | 0.022928 | 0.058280 |
| 0.1 | Blend () | 0.059065 | 0.017334 | 0.061556 |
| 0.1 | Blend () | 0.070484 | 0.017570 | 0.072640 |
| 0.1 | Blend () | 0.052599 | 0.017642 | 0.055479 |
| 0.1 | Blend () | 0.070489 | 0.017659 | 0.072668 |
| 0.1 | Blend () | 0.052560 | 0.017574 | 0.055420 |
| 0.2 | Normal (RAW) | 0.039454 | 0.041266 | 0.057092 |
| 0.2 | Normal (Bias Corrected) | 0.039536 | 0.022023 | 0.045256 |
| 0.2 | Blend () | 0.042736 | 0.032832 | 0.053891 |
| 0.2 | Blend () | 0.056509 | 0.018966 | 0.059606 |
| 0.2 | Blend () | 0.039983 | 0.051770 | 0.065413 |
| 0.2 | Blend () | 0.056462 | 0.051782 | 0.076611 |
| 0.2 | Blend () | 0.039963 | 0.018912 | 0.044212 |
| Sampling Interval dt (s) | EKF Model | TRI |
|---|---|---|
| 0.1 | EKF Normal (RAW) | 0.0405 |
| 0.1 | EKF Normal (BC) | 0.0371 |
| 0.1 | EKF Blend () | 0.0368 |
| 0.1 | EKF Blend () | 0.0266 |
| 0.1 | EKF Blend () | 0.0455 |
| 0.1 | EKF Blend () | 0.0301 |
| 0.1 | EKF Blend () | 0.0428 |
| 0.2 | EKF Normal (RAW) | 0.0235 |
| 0.2 | EKF Normal (BC) | 0.0235 |
| 0.2 | EKF Blend () | 0.0227 |
| 0.2 | EKF Blend () | 0.0219 |
| 0.2 | EKF Blend () | 0.0237 |
| 0.2 | EKF Blend () | 0.0221 |
| 0.2 | EKF Blend () | 0.0236 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Khaewnak, N.; Seangsri, S.; Pawako, S.; Khaengkarn, S.; Srisertpol, J. A Blended Extended Kalman Filter Approach for Enhanced AGV Localization in Centralized Camera-Based Control Systems. Automation 2026, 7, 4. https://doi.org/10.3390/automation7010004

