Computer Vision-Based Optical Odometry Sensors: A Comparative Study of Classical Tracking Methods for Non-Contact Surface Measurement
Abstract
1. Introduction
2. State of the Art
2.1. Phase Correlation
2.2. Template Matching
2.3. Optical Flow
2.4. Surface Texture and Patterning
2.5. Deep-Learning Approaches
2.6. Taxonomy and Comparative Summary
3. Materials and Methods
3.1. Experimental Design Overview
3.2. Test Image Sources
- Macro-scale image: A single image of a randomly textured surface captured using a standard camera setup. This image contains broad, non-repetitive features and moderate lighting variations, simulating typical visual conditions in everyday applications.
- Micro-scale image: A single high-resolution microscopic image of a metallic surface with a random microstructure. The fine-grained texture, combined with inherent sensor noise and illumination inconsistencies, poses a greater challenge to tracking algorithms under high magnification.
3.3. Subpixel Feed Generation
1. Calculating the desired window centre at fractional coordinates (x, y), where x and y may be non-integer values;
2. Sampling the source image at the required subpixel locations using bilinear interpolation;
3. Generating a new image feed of specified dimensions centred on the interpolated coordinates.
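The three steps above can be sketched in plain Python. The function names `bilinear_sample` and `extract_feed` are illustrative, not taken from the paper; a production implementation would typically call `cv2.getRectSubPix` or `scipy.ndimage.map_coordinates` rather than loop over pixels, and coordinates are assumed to lie inside the source image.

```python
def bilinear_sample(img, y, x):
    """Sample a grayscale image (list of rows) at fractional (y, x)
    using bilinear interpolation of the four neighbouring pixels."""
    y0, x0 = int(y), int(x)
    y1 = min(y0 + 1, len(img) - 1)
    x1 = min(x0 + 1, len(img[0]) - 1)
    fy, fx = y - y0, x - x0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def extract_feed(img, cy, cx, h, w):
    """Extract an h x w feed window centred on fractional (cy, cx)."""
    top = cy - (h - 1) / 2
    left = cx - (w - 1) / 2
    return [[bilinear_sample(img, top + r, left + c) for c in range(w)]
            for r in range(h)]
```

Because the window centre may fall between pixels, this is what lets the virtual camera advance along the test trajectory in true subpixel increments.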

3.4. Experimental Parameters
3.4.1. Feed Window Size
3.4.2. Step Size
3.4.3. Controlled Conditions and Reproducibility
3.5. Performance Metrics
- Mean absolute error: Average Euclidean distance between estimated and true positions across all frames;
- Maximum error: Peak deviation magnitude, indicating worst-case performance;
- Standard deviation of error: Measure of tracking consistency and stability;
- Final position error: Accumulated drift after complete path traversal;
- Error evolution: Temporal analysis of how tracking accuracy degrades or stabilises over long sequences.
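The four scalar metrics above can be computed directly from per-frame estimated and ground-truth positions; the error evolution is simply the per-frame error list itself. A minimal sketch follows — the function names are hypothetical, and the population (rather than sample) standard deviation is assumed.

```python
from math import hypot

def tracking_errors(estimated, ground_truth):
    """Per-frame Euclidean error between estimated and true (x, y) positions."""
    return [hypot(ex - tx, ey - ty)
            for (ex, ey), (tx, ty) in zip(estimated, ground_truth)]

def error_summary(estimated, ground_truth):
    """Scalar summary of a tracking run; the raw list is the error evolution."""
    e = tracking_errors(estimated, ground_truth)
    mean = sum(e) / len(e)
    var = sum((v - mean) ** 2 for v in e) / len(e)
    return {
        "mean_abs_error": mean,          # average deviation across all frames
        "max_error": max(e),             # worst-case deviation
        "std_error": var ** 0.5,         # tracking consistency / stability
        "final_position_error": e[-1],   # accumulated drift at path end
    }
```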
3.6. Path Generation
3.7. Implementation Details
- Gaussian blur for noise reduction (fixed kernel size and standard deviation);
- Contrast-limited adaptive histogram equalisation (CLAHE) for illumination normalisation;
- Hanning windowing for phase correlation to reduce edge effects.
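As a toy illustration of the phase-correlation step that the Hanning window supports, the following pure-Python sketch recovers an integer circular shift in 1D with a direct DFT. This is an assumed minimal formulation, not the paper's implementation: real systems would use FFT-based routines such as `cv2.phaseCorrelate` or `numpy.fft`, apply the taper to 2D windows, and refine the peak to subpixel precision.

```python
import cmath
import math

def dft(x):
    """Naive DFT; adequate for toy sizes, real code would use an FFT."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def hanning(N):
    """Hanning taper: multiplying each feed window by this before the
    transform suppresses the edge discontinuities of non-circular shifts."""
    return [0.5 - 0.5 * math.cos(2 * math.pi * n / (N - 1)) for n in range(N)]

def phase_correlation_shift(a, b):
    """Integer circular shift s such that b[n] == a[(n - s) % N]."""
    A, B = dft(a), dft(b)
    R = []
    for ak, bk in zip(A, B):
        c = ak.conjugate() * bk          # cross-power spectrum term
        R.append(c / (abs(c) or 1.0))    # keep only the phase
    r = idft(R)
    return max(range(len(r)), key=lambda n: r[n].real)
```

For an exactly circular shift the normalised cross-power spectrum is a pure phase ramp, so the inverse transform peaks at the shift; the Hanning taper matters precisely when the shift is not circular, as with real image feeds.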
4. Results
4.1. Experimental Procedure
4.1.1. Virtual Camera Setup
4.1.2. Test Trajectory Execution
4.1.3. Performance Assessment
4.1.4. Statistical Error Analysis
4.1.5. Visual Validation
4.2. Parametric Analysis Results
4.2.1. Macro-Texture Surface Performance
4.2.2. Micro-Texture Surface Performance
4.2.3. Comparative Performance Trends
4.2.4. Failure Analysis and Operational Limits
5. Discussion
5.1. Texture-Dependent Algorithm Behaviour
5.2. Fundamental Limitations of Optical Flow
5.3. Window Size Effects and Computational Trade-Offs
5.4. Statistical Robustness and Reliability
5.5. Practical Implementation Framework
For macro-textured surfaces:
- General-purpose applications (8–26% step size): Phase correlation with 400 × 400 pixel windows provides consistent sub-3.5-pixel accuracy;
- Low-speed applications (2–6% step size): Optical flow with 200 × 200 windows offers marginal advantages but carries a high failure risk;
- Precision-calibrated systems: Template matching at 16%, 22%, or 24% step sizes achieves 2.2–2.8-pixel accuracy.

For micro-textured surfaces:
- Reliable tracking (8–30% step size): Phase correlation with 400 × 400 windows is the only consistently viable option (1.9–2.1 pixels);
- Precision requirements at 6% step size: Template matching with 400 × 400 windows can achieve 4.0-pixel accuracy but requires precise motion control;
- Optical flow should be categorically avoided regardless of operating conditions.

Configurations to avoid:
- Phase correlation at 2–6% step sizes on all textures;
- Template matching with 200 × 200 windows on micro-textures;
- Optical flow for any precision application on micro-textures;
- All methods at step sizes exceeding 26%, except phase correlation.
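The operating recommendations above can be condensed into a small lookup. This sketch is our own paraphrase of that guidance — the function name and return strings are hypothetical, and it deliberately prefers the isolated template-matching precision points on macro-textures before falling back to the general-purpose phase-correlation zone.

```python
def recommend_method(texture, step_pct):
    """Suggest a tracking method for a texture class ("macro" or "micro")
    and a step size given as a percentage of the window dimension.
    Returns a method name, or None when no configuration is advised."""
    if texture == "macro":
        if step_pct in (16, 22, 24):
            return "template_matching"   # isolated precision points, 2.2-2.8 px
        if 8 <= step_pct <= 26:
            return "phase_correlation"   # 400 x 400 windows, sub-3.5 px
        if 2 <= step_pct <= 6:
            return "optical_flow"        # 200 x 200 windows, high failure risk
    elif texture == "micro":
        if 8 <= step_pct <= 30:
            return "phase_correlation"   # 400 x 400, only consistently viable
        if step_pct == 6:
            return "template_matching"   # 400 x 400, ~4.0 px, needs tight control
    return None                          # outside any recommended operating zone
```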
5.6. Implications for Sensor Design
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| 2D | Two-dimensional |
| 3D | Three-dimensional |
| AI | Artificial intelligence |
| CLAHE | Contrast-limited adaptive histogram equalisation |
| CNN | Convolutional neural network |
| DIC | Digital image correlation |
| FFT | Fast Fourier transform |
| GPU | Graphics processing unit |
| NCC | Normalised cross-correlation |
| SNR | Signal-to-noise ratio |
References
- Blikharskyy, Y.; Kopiika, N.; Khmil, R.; Selejdak, J.; Blikharskyy, Z. Review of Development and Application of Digital Image Correlation Method for Study of Stress–Strain State of RC Structures. Appl. Sci. 2022, 12, 10157. [Google Scholar] [CrossRef]
- Quan, S.; Liang, X.; Zhu, H.; Hirano, M.; Yamakawa, Y. HiVTac: A High-Speed Vision-Based Tactile Sensor for Precise and Real-Time Force Reconstruction with Fewer Markers. Sensors 2022, 22, 4196. [Google Scholar] [CrossRef]
- Mallya, R.; Uchil, A.K.; Shenoy, S.B.; Rai, S.K.; Shetty, A. Application of Digital Image Correlation in Aerospace Engineering: Structural Health Monitoring of Aircraft Components. Aerosp. Sci. Technol. 2024, 7, 663–675. [Google Scholar] [CrossRef]
- Mylo, M.D.; Poppinga, S. Digital Image Correlation Techniques for Motion Analysis and Biomechanical Characterization of Plants. Front. Plant Sci. 2024, 14, 1335445. [Google Scholar] [CrossRef]
- Dong, Y.; Pan, B. A Review of Speckle Pattern Fabrication and Assessment for Digital Image Correlation. Exp. Mech. 2017, 57, 1161–1181. [Google Scholar] [CrossRef]
- Lentz, J.; Sevil, H.E.; Fries, D. Image Preprocessing to Enhance Phase Correlation of Featureless Images. Sci. Rep. 2025, 15, 10287. [Google Scholar] [CrossRef]
- Nasajpour-Esfahani, N.; Karimi, S.; Nasseri, S.; Borna, H.; Boostani, A.F.; Gao, R.; Huang, W.; Garmestani, H.; Liang, S.Y. Advancements and Applications of Digital Image Correlation to Characterize Residual Stress: A Review. Mater. Charact. 2025, 228, 115416. [Google Scholar] [CrossRef]
- Rasmy, L.; Sebari, I.; Ettarid, M. Automatic Sub-Pixel Co-Registration of Remote Sensing Images Using Phase Correlation and Harris Detector. Remote Sens. 2021, 13, 2314. [Google Scholar] [CrossRef]
- Foroosh, H.; Zerubia, J.; Berthod, M. Extension of Phase Correlation to Subpixel Registration. IEEE Trans. Image Process. 2002, 11, 188–200. [Google Scholar] [CrossRef]
- Wan, X.; Liu, J.G.; Yan, H. The Illumination Robustness of Phase Correlation for Image Alignment. IEEE Trans. Geosci. Remote Sens. 2015, 53, 5746–5759. [Google Scholar] [CrossRef]
- Xiong, B.; Zhang, Q. On Quadratic Surface Fitting for Subpixel Motion Extraction from Video Images. Entropy 2022, 24, 190. [Google Scholar]
- Guizar-Sicairos, M.; Thurman, S.; Fienup, J. Efficient Subpixel Image-Registration Algorithms. Opt. Lett. 2008, 33, 156–158. [Google Scholar] [CrossRef]
- Padfield, D. Masked Object Registration in the Fourier Domain. IEEE Trans. Image Process. 2012, 21, 2709–2718. [Google Scholar] [CrossRef]
- Li, T.; Wang, J.; Yao, K. Subpixel Image Registration Algorithm Based on Pyramid Phase Correlation and Upsampling. Signal Image Video Process. 2022, 16, 1973–1979. [Google Scholar] [CrossRef]
- Sharma, S.; Kulkarni, R. State-Space Modeling Approach for Fringe Pattern Demodulation. Appl. Opt. 2023, 62, 7330–7337. [Google Scholar] [CrossRef]
- Schubert, F.; Mikolajczyk, K. Benchmarking GPU-Based Phase Correlation for Homography-Based Registration of Aerial Imagery. In Computer Analysis of Images and Patterns, Proceedings of the CAIP 2013, York, UK, 27–29 August 2013; Wilson, R., Hancock, E., Bors, A., Smith, W., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2013; pp. 83–90. [Google Scholar]
- Li, H.; Tan, B.; Pandiyan, V.P.; Barathi, V.A.; Sabesan, R.; Schmetterer, L.; Ling, T. Phase-Restoring Subpixel Image Registration: Enhancing Motion Detection Performance in Fourier-Domain Optical Coherence Tomography. J. Phys. D Appl. Phys. 2025, 58, 145102. [Google Scholar] [CrossRef] [PubMed]
- Ojansivu, V.; Rahtu, E. Image Registration Using Blur-Invariant Phase Correlation. IEEE Trans. Pattern Anal. Mach. Intell. 2007, 27, 555–568. [Google Scholar] [CrossRef]
- Chen, Z.; Chen, Q.; Chen, W.; Wang, Y.; Xu, C.; Li, X.; Dou, Q. DPCN++: Differentiable Phase Correlation Network for Versatile Pose Registration. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 14366–14384. [Google Scholar] [CrossRef]
- Annaby, M.; Fouda, Y. Fast Template Matching and Object Detection Techniques Using ϕ-Correlation and Binary Circuits. Multimed. Tools Appl. 2024, 83, 6469–6496. [Google Scholar] [CrossRef]
- Woodford, O.J. Least Squares Normalized Cross Correlation. arXiv 2018, arXiv:1810.04320. [Google Scholar]
- Salaris, M.; Damiani, A.; Putti, E.; Stornaiuolo, L. FPGA-Based Implementation of 2D Normalized Cross-Correlation for Large Scale Signals. In Proceedings of the 2021 IEEE 6th International Forum on Research and Technology for Society and Industry (RTSI), Naples, Italy, 6–9 September 2021; pp. 300–305. [Google Scholar]
- Cheng, J.; Wu, Y.; AbdAlmageed, W.; Natarajan, P. QATM: Quality-Aware Template Matching for Deep Learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 11545–11554. [Google Scholar]
- Gao, B.; Spratling, M.W. Robust Template Matching via Hierarchical Convolutional Features from a Shape Biased CNN. In Proceedings of the International Conference on Image, Vision and Intelligent Systems (ICIVIS 2021), Changsha, China, 15–17 June 2021; Yao, J., Xiao, Y., You, P., Sun, G., Eds.; Lecture Notes in Electrical Engineering. Springer: Singapore, 2022; Volume 813. [Google Scholar]
- Le, M.-T.; Tu, C.-T.; Guo, S.-M.; Lien, J.-J.J. A PCB Alignment System Using RST Template Matching with CUDA on Embedded GPU Board. Sensors 2020, 20, 2736. [Google Scholar] [CrossRef]
- Matthews, I.; Ishikawa, T.; Baker, S. The Template-Update Problem. IEEE Trans. Pattern Anal. Mach. Intell. 2004, 26, 810–815. [Google Scholar] [CrossRef]
- Gräßl, C.; Zinßer, T.; Niemann, H. Illumination-Insensitive Template Matching with Hyperplanes. In Proceedings of the DAGM Symposium, Magdeburg, Germany, 10–12 September 2003; pp. 273–280. [Google Scholar]
- Alfarano, A.; Maiano, L.; Papa, L.; Amerini, I. Estimating Optical Flow: A Comprehensive Review of the State of the Art. Comput. Vis. Image Underst. 2024, 249, 104160. [Google Scholar] [CrossRef]
- Winkler, J.R. Error Analysis and Condition Estimation of the Pyramidal Form of the Lucas–Kanade Method in Optical Flow. Electronics 2024, 13, 812. [Google Scholar] [CrossRef]
- Xu, H.; Yang, J.; Cai, J.; Zhang, J.; Tong, X. High-Resolution Optical Flow from 1D Attention and Correlation. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 10–17 October 2021; pp. 10498–10507. [Google Scholar]
- Xu, H.; Zhang, J.; Cai, J.; Rezatofighi, H.; Tao, D. GMFlow: Learning Optical Flow via Global Matching. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), New Orleans, LA, USA, 19–24 June 2022; pp. 8121–8130. [Google Scholar]
- Shi, X.; Huang, Z.; Li, D.; Zhang, M.; Cheung, K.C.; See, S.; Li, H. Flowformer++: Masked Cost Volume Autoencoding for Pretraining Optical Flow Estimation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 18–22 June 2023; pp. 1599–1610. [Google Scholar]
- Zhou, S.; He, R.; Tan, W.; Yan, B. SAMFlow: Eliminating Any Fragmentation in Optical Flow with Segment Anything Model. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 20–27 February 2024; Volume 38, pp. 7695–7703. [Google Scholar]
- Ishii, I.; Taniguchi, T.; Yamamoto, K.; Takaki, T. 1000-fps Real-Time Optical Flow Detection System. Proc. SPIE 2010, 7538, 181–191. [Google Scholar]
- Baker, S.; Scharstein, D.; Lewis, J.P.; Roth, S.; Black, M.J.; Szeliski, R. A Database and Evaluation Methodology for Optical Flow. Int. J. Comput. Vis. 2011, 92, 1–31. [Google Scholar] [CrossRef]
- Chen, Z.; Shao, X.; Xu, X.; He, X. Optimized Digital Speckle Patterns for Digital Image Correlation by Consideration of Both Accuracy and Efficiency. Appl. Opt. 2018, 57, 884–893. [Google Scholar] [CrossRef] [PubMed]
- Wen, Y.; Wang, J.; Zheng, L.; Chen, S.; An, H.; Li, L.; Long, Y. Method of Generating Speckle Patterns for Digital Image Correlation Based on Modified Conway’s Game of Life. Opt. Express 2024, 32, 11654–11664. [Google Scholar] [CrossRef]
- Hu, X.; Xie, Z.; Liu, F. Assessment of Speckle Pattern Quality in Digital Image Correlation from the Perspective of Mean Bias Error. Measurement 2021, 173, 108618. [Google Scholar] [CrossRef]
- Hu, W.; Sheng, Z.; Yan, K.; Miao, H.; Fu, Y. A New Pattern Quality Assessment Criterion and Defocusing Degree Determination of Laser Speckle Correlation Method. Sensors 2021, 21, 4728. [Google Scholar] [CrossRef]
- Kwon, T.H.; Park, J.; Jeong, H.; Park, K. Assessment of Speckle-Pattern Quality Using Deep-Learning-Based CNN. Exp. Mech. 2023, 63, 163–176. [Google Scholar] [CrossRef]
- Li, Y.; Xue, Y.; Tian, L. Deep Speckle Correlation: A Deep Learning Approach toward Scalable Imaging through Scattering Media. Optica 2018, 5, 1181–1190. [Google Scholar] [CrossRef]
- Guelpa, V.; Laurent, G.J.; Sandoz, P.; Zea, J.G.; Clévy, C. Subpixelic Measurement of Large 1D Displacements: Principle, Processing Algorithms, Performances and Software. Sensors 2014, 14, 5056–5073. [Google Scholar] [CrossRef]
- Dosovitskiy, A.; Fischer, P.; Ilg, E.; Häusser, P.; Hazırbaş, C.; Golkov, V.; van der Smagt, P.; Cremers, D.; Brox, T. FlowNet: Learning Optical Flow with Convolutional Networks. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 7–13 December 2015; pp. 2758–2766. [Google Scholar]
- Sun, D.; Yang, X.; Liu, M.Y.; Kautz, J. PWC-Net: CNNs for Optical Flow Using Pyramid, Warping, and Cost Volume. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–22 June 2018; pp. 8934–8943. [Google Scholar]
- Teed, Z.; Deng, J. RAFT: Recurrent All-Pairs Field Transforms for Optical Flow. In Proceedings of the European Conference on Computer Vision (ECCV), Glasgow, UK, 23–28 August 2020; Springer: Cham, Switzerland, 2020; pp. 402–419. [Google Scholar]
- Yang, R.; Li, Y.; Zeng, D.; Guo, P. Deep DIC: Deep Learning-Based Digital Image Correlation for End-to-End Displacement and Strain Measurement. J. Mater. Process. Technol. 2022, 302, 117474. [Google Scholar] [CrossRef]
- Duan, X.; Xu, H.; Dong, R.; Lin, F.; Huang, J. Digital Image Correlation Based on Convolutional Neural Networks. Opt. Lasers Eng. 2023, 160, 107234. [Google Scholar] [CrossRef]
- Shan, Y.; Zhen, M.; Fill, H.D. Research on Structural Mechanics Stress and Strain Prediction Models Combining Multi-Sensor Image Fusion and Deep Learning. Appl. Sci. 2025, 15, 4067. [Google Scholar] [CrossRef]
- Cheng, X.; Zhou, S.; Xing, T.; Zhu, Y.; Ma, S. Solving Digital Image Correlation with Neural Networks Constrained by Strain–Displacement Relations. Opt. Express 2023, 31, 3865–3880. [Google Scholar] [CrossRef]
- Sjödahl, M. Gradient Correlation Functions in Digital Image Correlation. Appl. Sci. 2019, 9, 2127. [Google Scholar] [CrossRef]
| Approach | Use Cases | Strengths | Weaknesses |
|---|---|---|---|
| Phase correlation | Image registration, microscopy, blur-robust alignment | Illumination/offset invariance; efficient FFT refinement; GPU acceleration | Sensitive to low-texture regions; peak broadening under strong blur |
| Template matching | Visual tracking, PCB/industrial inspection | Simple formulation; scale-adaptive variants; FPGA acceleration | Template drift; poor generalisation under large appearance change |
| Optical flow | Dense motion estimation, robotics, high-speed vision | Dense field; strong CNN/transformer benchmarks; SAM-guided boundary precision | Brightness constancy assumption; weak on low-texture or occluded regions |
| Surface patterning | DIC, precision metrology, experimental mechanics | Optimised speckles; objective quality metrics; CNN assessors | Limited to controlled surfaces; risk of periodic mis-registration |
| Deep learning | FlowNet, RAFT, DeepDIC, hybrid physics-informed models | High accuracy; real-time inference; domain adaptation possible | Require large training data; limited traceability; weaker physical guarantees |
| Parameter | Values | Rationale/Reproducibility |
|---|---|---|
| Feed window size | 200 × 200, 400 × 400 px | Balance between texture coverage and computational efficiency; identical regions used for all methods |
| Step size | 2–30% of window dimension (2% increments) | Scale-invariant metric; ensures comparability across window sizes; fixed trajectory applied to all methods |
| Texture Type | Window Size | Algorithm | Safe Op. Zone (% of Window) |
|---|---|---|---|
| Macro-texture | 200 × 200 | Phase Correlation | ≥14% |
| | | Template Matching | 16%, 22%, 24% (isolated) |
| | | Optical Flow | ≤6% only |
| | 400 × 400 | Phase Correlation | 8–30% |
| | | Template Matching | 16%, 22%, 24% (isolated) |
| | | Optical Flow | ≤6% only |
| Micro-texture | 200 × 200 | Phase Correlation | – (no consistent safe zone) |
| | | Template Matching | ≈6% only |
| | | Optical Flow | – |
| | 400 × 400 | Phase Correlation | ≥8% |
| | | Template Matching | ≈6% only |
| | | Optical Flow | – |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Andrijauskas, I.; Šumanas, M.; Dzedzickis, A.; Tanaś, W.; Bučinskas, V. Computer Vision-Based Optical Odometry Sensors: A Comparative Study of Classical Tracking Methods for Non-Contact Surface Measurement. Sensors 2025, 25, 6051. https://doi.org/10.3390/s25196051