Review

A Comprehensive Review of Optical Metrology and Perception Technologies

Tsinghua Shenzhen International Graduate School, Tsinghua University, Shenzhen 518055, China
*
Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2025, 25(22), 6811; https://doi.org/10.3390/s25226811
Submission received: 22 August 2025 / Revised: 20 October 2025 / Accepted: 1 November 2025 / Published: 7 November 2025
(This article belongs to the Section Optical Sensors)

Abstract

Optical metrology and perception technologies employ light as an information carrier to enable non-contact, high-precision measurement of geometry, dynamics, and material properties. They are widely deployed in industrial and consumer domains, from nanoscale defect inspection in semiconductor manufacturing to environmental perception in autonomous driving and spatial tracking in AR/VR. However, existing reviews often treat individual modalities—such as interferometry, imaging, or spectroscopy—in isolation, overlooking the increasing cross-domain integration in emerging systems. This review proposes a hierarchical taxonomy encompassing four core systems: interferometry, imaging, spectroscopy, and hybrid/advanced methods. It introduces a “theory–application–innovation” framework to unify fundamental principles, application scenarios, and evolutionary trends, revealing synergies across modalities. By mapping technological progress to industrial and societal needs, including AI-driven optimization and quantum-enhanced sensing, this work provides a structured, evolving knowledge base. The framework supports both cross-disciplinary understanding and strategic decision-making, offering researchers and engineers a consolidated reference for navigating the rapidly expanding frontiers of optical metrology and perception.

1. Introduction

Optical metrology and perception technologies employ light waves as carriers of information to enable precise quantification and perception of geometric morphology, dynamic behaviors, and material properties of target objects by measuring light intensity, phase, wavelength, polarization, and other physical parameters. Distinguished by their non-contact nature, high precision, and exceptional sensitivity, these technologies have become indispensable in both industrial and consumer domains. In industrial settings, they support critical processes such as nanoscale defect inspection in semiconductor manufacturing, high-precision positioning for robotic manipulation, and environmental perception for autonomous driving (e.g., LiDAR-based 3D reconstruction). On the consumer side, they underpin applications ranging from spatial interaction tracking in AR/VR to biometric authentication (e.g., facial recognition) and motion capture for immersive entertainment, as shown in Figure 1. As shown in Table 1, optical measurement techniques cover multiple spatial scales, ranging from nanometer-level interferometry to cross-scale computational imaging techniques. With the advancement of smart manufacturing [1,2] and the rise of immersive digital ecosystems, optical sensing is evolving toward greater multidimensionality, higher temporal resolution [3,4], and intelligent adaptability [5].
Despite substantial progress, existing literature tends to focus narrowly on specific technical branches, such as structured-light 3D imaging, interferometric displacement metrology, or spectral inspection, without sufficiently addressing the cross-domain synergies that increasingly define the field. Modern optical measurement systems often integrate multiple modalities [6]: interferometry leverages wave interference for nanometric displacement and surface roughness analysis [7]; imaging techniques span from triangulation-based contour scanning to computational imaging through scattering media; spectroscopy underpins both material identification and dimensional metrology; and hybrid/advanced methods, including fringe projection for deformation analysis, time-of-flight depth sensing, quantum-enhanced interferometry, and metasurface-enabled wavefront control, are pushing performance frontiers. Traditional taxonomies, however, struggle to accommodate such convergence: consider light-field imaging, which blends geometric optics with computational reconstruction, or spectral confocal techniques, which merge spectroscopic and spatial measurement principles.
Fundamentally, the objectives of optical measurement and perception can be distilled into two primary categories: positioning and surface contour characterization. Positioning encompasses precision motion control and accurate determination of spatial posture, while contour characterization involves capturing macroscopic shape, positional deviations, fine-scale profile, roughness, and other surface integrity parameters. These goals form the performance benchmarks by which diverse optical metrology techniques are evaluated, and they provide a common ground for integrating different modalities into coherent measurement strategies.
In this context, a comprehensive and integrative review is both timely and necessary. This work classifies optical metrology into four core systems—interferometry, imaging, spectroscopy, and hybrid or advanced methods—within a “theory–application–innovation” analytical framework. This structure reveals the links between different techniques, traces their evolution, and connects technological progress with industrial and societal needs such as AI-driven optical optimization and quantum-enhanced sensing. The review provides both a technical reference for researchers and a strategic guide for engineers and planners working in the expanding fields of optical metrology and perception.
In this review, the concept of optical perception is introduced as an evolutionary extension of optical metrology. While metrology primarily focuses on achieving quantitative precision in measurement—such as displacement, wavelength, or surface profiling—perception emphasizes the ability of optical systems to interpret, recognize, and respond to complex environmental information. Through the integration of computational algorithms, artificial intelligence, and data-driven modeling, modern optical systems are transitioning from traditional measurement instruments to perception-oriented platforms capable of contextual understanding and adaptive decision-making. In this framework, metrology provides the quantitative foundation, whereas perception represents the cognitive extension of optical measurement.
Scope note: The discussion focuses on industrial and consumer applications (e.g., manufacturing, robotics, automotive), excluding specialized domains such as Raman spectroscopy in materials science or biomedical imaging. While comprehensive coverage is inherently challenging amid rapid innovation, the intent is to offer a scalable framework that can be updated as the field advances.

2. Interferometry-Based Metrology

Optical interferometry-based metrology can be categorized into photodetector (PD)-based methods and CCD-based methods. PD-based interferometric systems rely on photodetectors to decode interference patterns and serve as the cornerstone of high-precision metrology [8]. Such approaches primarily encompass three techniques: laser interferometry, grating interferometry, and optical frequency comb interferometry. This section provides a review of their underlying principles, recent technological advances, and representative applications. Figure 2 shows the common interference-based measurement methods.

2.1. Laser Interferometry

Laser interferometry is a cornerstone of modern precision metrology. By utilizing the interference of coherent laser beams, it provides displacement, angle, and vibration measurements with exceptional resolution, ranging from nanometers down to picometers or even femtometers. Its flexibility in design and adaptability to various modulation techniques have led to numerous implementations tailored to specific application needs. Figure 2a shows the basic construction principle of the Laser Interferometer Gravitational-Wave Observatory (LIGO) system, which is based on laser interferometry technology. This section presents a detailed examination of several core configurations, including homodyne, heterodyne, and superheterodyne systems, along with specialized modalities and their use in advanced metrology.

2.1.1. Homodyne Systems

A homodyne laser interferometer employs a single-frequency laser source and extracts phase information from the intensity variations of the interference signal. Let the reference-arm field be $E_r(t) = A_r e^{i(\omega t + \phi_r)}$ and the measurement-arm field be $E_s(t) = A_s e^{i(\omega t + \phi_s)}$. The resulting superimposed field leads to an intensity of
$I(t) = |E_r + E_s|^2 = A_r^2 + A_s^2 + 2 A_r A_s \cos(\Delta\phi)$
where $\Delta\phi = \phi_s - \phi_r$ is related to the optical path difference $\Delta L$ via $\Delta\phi = \frac{2\pi}{\lambda} \Delta L$. By measuring relative changes in phase, one can determine the corresponding displacement, deformation, vibration, and other quantities of interest.
Homodyne interferometry can achieve sub-nanometer precision, with a theoretical non-ambiguity range of λ / 2 . To extend this range, phase unwrapping techniques are required. A common approach to phase retrieval is to generate two orthogonal signals using polarization optics, from which the phase is calculated via an arctangent operation. The homodyne configuration is advantageous in its structural simplicity, low cost, and ease of integration. However, because the measurement signal contains a DC component, it is susceptible to DC drift induced by external disturbances, which degrades phase estimation accuracy. Furthermore, due to inherent imperfections in optical components and environmental influences, perfect orthogonality between signals is rarely achieved, introducing additional nonlinear errors.
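The quadrature scheme described above can be sketched in a few lines. The following minimal, noiseless simulation (the He-Ne wavelength, motion profile, and idealized quadrature pair are assumed example values, not any specific instrument) recovers phase with an arctangent and unwraps it to extend the measurement beyond the λ/2 non-ambiguity range:

```python
import numpy as np

# Minimal homodyne quadrature demodulation (noiseless sketch; the
# He-Ne wavelength and motion profile are assumed example values).
wavelength = 632.8e-9                          # laser wavelength, m
true_disp = np.linspace(0, 2.5 * wavelength, 1000)
phase = 2 * np.pi * true_disp / wavelength     # Δφ = (2π/λ)ΔL

# Ideal quadrature pair produced by the polarization optics
I_cos = np.cos(phase)
I_sin = np.sin(phase)

# Arctangent phase retrieval, then unwrapping past the λ/2 ambiguity
wrapped = np.arctan2(I_sin, I_cos)
unwrapped = np.unwrap(wrapped)

recovered = unwrapped * wavelength / (2 * np.pi)
print(np.max(np.abs(recovered - true_disp)))   # ~0 for noiseless signals
```

In a real instrument the two signals are neither perfectly orthogonal nor equal in amplitude, which is precisely the source of the nonlinear errors noted above; ellipse-fitting (Heydemann-type) correction is a common remedy.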
To address these limitations, a variety of compensation techniques have been developed. For instance, optical path-folding designs can enhance resolution without increasing system dimensions, while quadrant-based signal acquisition can suppress channel crosstalk. Zhang et al. implemented a homodyne SPM system with real-time phase delay correction, achieving less than 1 nm error in large-range displacement scanning interferometry [9]. In practice, homodyne interferometers have been widely employed in scanning probe microscopy (SPM), nanoscale feedback loops, and angle encoders. Notably, both the Renishaw XL-80 laser system and the API 5D system are based on the homodyne configuration [10,11], and see extensive use in production environments for machine tool calibration and linear axis verification. Further advancements include adaptive gain-based phase compensation algorithms; for example, Keem et al. proposed a real-time gain adjustment method in which the preamplifier gains of a quadrant detector are tuned to effectively eliminate periodic errors caused by imperfections in the polarization beam splitter, achieving experimental errors as low as 0.04 nm [12].

2.1.2. Heterodyne Interferometry

A heterodyne interferometric system employs two optical beams with a small frequency difference $\Delta f = f_1 - f_2$ to generate a time-varying beat signal. Let the two optical fields be $E_1 = A_1 e^{i(\omega_1 t + \phi_1)}$ and $E_2 = A_2 e^{i(\omega_2 t + \phi_2)}$; the resulting superimposed intensity can be expressed as
$I(t) = A_1^2 + A_2^2 + 2 A_1 A_2 \cos(\Delta\omega t + \Delta\phi)$
where $\Delta\omega = 2\pi \Delta f$ and $\Delta\phi = \phi_2 - \phi_1$ carries the information of the optical path difference. By processing the beat signal using a lock-in amplifier or a digital phase demodulator, $\Delta\phi$ can be measured with high accuracy in the low-frequency domain. Compared with the amplitude-modulated signal in homodyne interferometry, the frequency-modulated signal in heterodyne interferometry inherently suppresses DC bias and exhibits greater robustness against environmental disturbances.
Heterodyne interferometric systems are particularly well-suited for long-range displacement measurements, as well as industrial applications requiring high dynamic range, immunity to zero drift, and precision measurements of displacement, vibration, or angular motion. With a stable frequency reference and high signal-to-noise ratio (SNR) detection, heterodyne interferometry often achieves relative measurement resolution from the picometer to sub-nanometer scale. Moreover, the beat frequency processing effectively down-converts the optical phase information into a frequency band more amenable to digital processing, thereby enhancing the stability of phase extraction.
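The lock-in style phase extraction mentioned above can be illustrated with a minimal digital sketch; the sampling rate, beat frequency, and constant measurement phase below are illustrative assumptions, not values from any cited system:

```python
import numpy as np

# Digital lock-in demodulation of a heterodyne beat signal
# (all numerical parameters are illustrative assumptions).
fs = 1e6                     # sampling rate, Hz
f_beat = 50e3                # beat frequency Δf, Hz
t = np.arange(0, 0.01, 1 / fs)
true_phase = 0.8             # measurement phase Δφ, rad

# AC-coupled detector output: the cos(Δω t + Δφ) beat term
signal = np.cos(2 * np.pi * f_beat * t + true_phase)

# Mix with quadrature references and average (acts as a low-pass filter)
i_comp = np.mean(signal * np.cos(2 * np.pi * f_beat * t))   # ∝ cos(Δφ)/2
q_comp = -np.mean(signal * np.sin(2 * np.pi * f_beat * t))  # ∝ sin(Δφ)/2

phase_est = np.arctan2(q_comp, i_comp)
print(phase_est)             # ≈ 0.8 rad
```

Because the phase lives on the beat frequency rather than at DC, slow drifts in detector offset do not corrupt the estimate, which is the robustness advantage noted above.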
For example, Joo et al. demonstrated a high-resolution heterodyne laser interferometer for linear displacement measurement that exhibited no periodic nonlinearity. Their design employed acousto-optic frequency shifting and spatial separation of the signal paths, thereby minimizing mode coupling and crosstalk [13,14]. Additional examples include Leirset’s GHz-range vibration sensor with femtometer sensitivity [15], Hao Yan’s dual-beam heterodyne sensor achieving 1 pm/Hz performance for linear and angular displacement [16], and Dong’s heterodyne system using phase-locked loop demodulation to reach 10-pm resolution in both air and vacuum environments [17]. These advances highlight the adaptability of heterodyne methods in emerging micro/nano-motion sensing applications.

2.1.3. Superheterodyne Interferometry

Optical super-heterodyne interferometry draws inspiration from the super-heterodyne concept in radio technology: two (or more) optical frequencies are mixed, either electronically or optically, to generate an intermediate frequency (IF) that can be directly measured. This enables the detection of phase or frequency differences that are difficult to measure at the original frequency separation (for example, when the frequency difference between two optical waves exceeds the bandwidth of the electronic detector).
For a dual-wavelength/dual-frequency source with wavelengths $\lambda_1$ and $\lambda_2$, if each is mixed with a local oscillator (or a modulated local oscillator), a low-frequency IF can be obtained whose phase contains the phase-difference information of the two beams. Super-heterodyne techniques are often used to generate a "synthetic wavelength" $\Lambda$ to increase the non-ambiguity range. The synthetic wavelength satisfies
$\Lambda = \dfrac{\lambda_1 \lambda_2}{|\lambda_1 - \lambda_2|}$
and provides displacement information over a larger range via the synthetic phase $\Phi_\Lambda$. Implementation methods for super-heterodyne detection include AOM/EO modulation, electronic down-conversion, and optoelectronic mixing of frequency combs or swept sources. In a two-wavelength super-heterodyne system, the use of the synthetic wavelength allows the phase measurement to be converted into a measurement corresponding to a longer "equivalent wavelength," thereby extending the non-ambiguity range while maintaining high phase sensitivity. However, the synthetic wavelength is inherently sensitive to noise, wavelength drift, and system calibration, and the ultimate absolute measurement uncertainty is determined by the relative wavelength accuracy of each monochromatic source and the system phase noise. In practical applications, absolute ranging accuracy from sub-millimeter to sub-micrometer can be achieved, making this approach suitable for scenarios requiring either a large measurement range or absolute distance measurement.
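A short worked example makes the range extension concrete; the two source wavelengths below are assumed, illustrative values:

```python
# Synthetic-wavelength calculation, Λ = λ1·λ2 / |λ1 − λ2|
# (the two source wavelengths are assumed, illustrative values).
lam1 = 1550e-9            # m
lam2 = 1551e-9            # m

synthetic = lam1 * lam2 / abs(lam1 - lam2)
print(synthetic)          # ≈ 2.404e-3 m: the non-ambiguity range grows
                          # from ~λ/2 (≈0.78 µm) to ~Λ/2 (≈1.2 mm)
```

Note how closely spaced wavelengths yield a large Λ but also amplify any wavelength error, which is the noise sensitivity discussed above.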
Le Floch et al. proposed a novel super-heterodyne technique based on the two-wavelength interferometry (TWI) method for long-distance measurement, in which the two frequencies are generated from synchronously scanned optical and radio frequencies, achieving an accuracy of ±50 μm over a range of 1–15 m [18]. Yin et al. combined heterodyne and super-heterodyne interferometers to realize simultaneous multi-wavelength detection and demodulation, achieving an accuracy of 17 μm within a 2 m range in their experiments [19].

2.1.4. Specialty Modalities

Beyond the classical homodyne and heterodyne architectures, several specialized laser interferometry techniques have emerged to serve application-specific requirements:
Fabry–Perot Interferometers (FPI) have shown excellent sensitivity and compactness for refractive index, gas pressure, and temperature sensing. Liu and Qu fabricated an FPI cavity via femtosecond laser-induced water breakdown and arc annealing, achieving high fringe visibility (30 dB), sensitivity (1147.48 nm/RIU), and temperature insensitivity, suitable for liquid refractive index measurements [20]. Xu et al. developed a fiber FPI with a glass microsphere enabling simultaneous gas pressure and temperature sensing via multi-beam interference, offering robust, easy-to-fabricate devices for industrial monitoring [21].
Self-Mixing Interferometry (SMI) uses feedback light re-entering the laser cavity to modulate output intensity in a compact setup. Albert et al. developed a theoretical and experimental framework for long-distance (>10 m) SMI, demonstrating practical vibration and displacement sensing with SNR = 3.6 at 12 m [22]. Zhang et al. introduced a multichannel SMI system employing injection current modulation and frequency division multiplexing, achieving multi-target displacement sensing with <1.4% relative error. These advances extend SMI to distributed sensing and precision manufacturing [23].
Dual-Comb Interferometry (DCI) exploits two stabilized frequency combs for fast, absolute distance and spectral measurements without mechanical scanning. Herman et al. achieved picometer-level displacement precision via carrier-envelope phase tracking [24]. Deng et al. demonstrated a high-coherence dual-comb system enabling simultaneous gas spectroscopy and absolute distance measurement with 0.68 μm precision and 40 μm lateral resolution. DCI excels in speed, spectral resolution, and multifunctionality, ideal for environmental and aerospace metrology [25].
Frequency-Sweeping Interferometry (FSI) encodes distance via tunable laser wavelength sweeping for large-range absolute measurements with simple hardware. Zhang et al. improved FSI accuracy and stability at 4.5 m by integrating reference interferometers and target drift compensation [26]. Coggrave et al. addressed high sampling demands in long-range FSI using silicon photonic adaptive delay lines, enabling chip-scale integration. FSI offers cost-effective, flexible solutions but requires precise wavelength calibration and drift control for high accuracy [27].

2.2. Grating Interferometry

Grating interferometry employs the interference between diffracted beams from periodic structures to achieve displacement measurement with sub-nanometer resolution [28,29,30]. Due to its modular design, compact structure [31,32,33], and high degree-of-freedom (DOF) expandability [34], it has been increasingly adopted in semiconductor lithography [35,36,37], ultra-precision manufacturing [38,39,40], and optical encoder systems [41,42,43,44,45].

2.2.1. Single-DOF and Planar Systems

Initial implementations of grating interferometry were primarily focused on 1-DOF displacement measurement. By combining ±1st-order diffracted beams from a reflective or transmissive grating, phase shifts induced by linear displacement could be accurately resolved [46,47,48,49]. Commercial systems such as Magnescale’s laser encoders reach a resolution of 0.017 nm using small-period gratings and optical subdivision techniques. To enhance measurement stability and alignment tolerance, designs have evolved to include self-collimating incidence and quasi-common-path configurations. Hsieh et al. [50] proposed a quasi-common-path heterodyne system that achieved sub-3 nm resolution under practical operation conditions. Wu et al. [51] developed a Littrow configuration grating interferometer that expands Z-directional range and maintains high contrast fringe quality, making it well-suited for vertical stage calibration. For planar (2D) systems, Kimura et al. [52,53,54] constructed a reflective scale grating encoder capable of simultaneous X–Z detection with sub-nanometer resolution, particularly useful for high-speed precision stages. Yin et al. [55] and Yang et al. [56] designed heterodyne double-spatial systems and fiber-coupled grating interferometers for XY displacement tracking. Their systems achieved in-plane stability of 0.246 nm and out-of-plane resolution of 0.465 nm, even under environmental drift.
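The phase relation underlying these encoders can be illustrated numerically. When the ±1st diffraction orders are recombined, a grating displacement $x$ shifts the interference phase by $\Delta\phi = 4\pi x / p$, i.e., one full fringe per half pitch; the sketch below uses an illustrative pitch and subdivision factor, not the parameters of any commercial encoder:

```python
import numpy as np

# ±1st-order grating interferometer: Δφ = 4πx/p, one fringe per p/2
# (pitch and subdivision factor are illustrative assumptions).
pitch = 1e-6                    # grating period p, m
x = np.linspace(0, 2e-6, 5)     # sample displacements, m

phase = 4 * np.pi * x / pitch   # interference phase, rad
fringes = phase / (2 * np.pi)   # fringe count: 2 per grating period

# With N-fold electronic subdivision, displacement resolution is p/(2N)
N = 4096
resolution = pitch / (2 * N)
print(resolution)               # ≈ 1.22e-10 m per count here
```

The doubled sensitivity relative to the grating period, combined with deep electronic subdivision of each fringe, is what allows commercial encoders to reach the sub-nanometer resolutions quoted above.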

2.2.2. Three-DOF and Six-DOF Systems

As metrology demands increase for wafer stages, air-bearing platforms, and robotic motion systems, grating interferometry has expanded into multi-dimensional configurations [57,58,59]. These systems not only track translational motion (X, Y, Z) but also rotational and angular displacements (pitch, yaw, roll) [60]. Compact 3-DOF systems have been realized using pyramid-prism beam shaping and planar cross-gratings [61]. Wang et al. [62] demonstrated a compact 3D encoder with better than 500 nm accuracy across three axes. Kimura et al. introduced a sub-nanometer resolution three-axis surface encoder using a hybrid reflective planar grating and collimated sensing beam.
For full 6-DOF tracking, Hsieh et al. [63] proposed a Michelson-grating hybrid interferometer integrating grating shearing and heterodyne detection. The system achieved 2 nm displacement resolution and 0.05 μrad angular accuracy. Lin et al. improved the optical diffraction efficiency and SNR of 2D/3D configurations using gold-coated cross gratings with theoretical signal contrast near 100%. Cui et al. [64] reported an integrated zero-dead-zone heterodyne grating interferometer that enables simultaneous 3-DOF atomic-level displacement measurement. Utilizing a dual-frequency laser and a symmetrical optical path, the system eliminates dead-zone error and achieves sub-nanometer resolution (0.25 nm in X/Y, 0.3 nm in Z), high linearity ($10^{-5}$), and 0.8 nm repeatability, as shown in Figure 2b. The design significantly improves precision and environmental stability in multi-axis metrology. These developments are critical for next-generation photolithography systems [65,66] and high-precision robotic metrology [67,68], where six-dimensional feedback is a necessity.

2.2.3. Multi-Optical-Head Architectures

To expand spatial measurement volume and address complex motion control systems, multi-head grating interferometers are implemented. These systems consist of several spatially distributed optical heads referencing a single grating target or surface, enabling comprehensive rigid-body pose reconstruction [69,70,71,72]. One prominent industrial example is ASML’s 6-DOF metrology platform, which employs multiple grating sensors across a wafer stage. The redundancy and spatial diversity of measurements enable high accuracy (sub-micron linear, sub-arcsecond angular) across meter-scale motion envelopes. Recent multi-head implementations also integrate real-time thermal compensation [73], modular readout electronics [74], and parallel FPGA-based signal processing [75], extending their use in precision gantry systems, coordinate measuring machines (CMMs) [76], and space-constrained environments such as in-line inspection arms.
While grating interferometers offer numerous advantages, they are not without challenges [77,78]. Common issues include phase errors from grating non-uniformity [79,80,81], alignment sensitivity [82], thermal expansion of scale substrates, and difficulty in demodulating multi-DOF interference signals with high fidelity [83,84,85]. Emerging solutions focus on improved grating fabrication via nanoimprint lithography or interference lithography [86,87,88,89,90,91,92], error modeling using phase chain compensation, and signal enhancement through machine learning-assisted demodulation algorithms [93]. There is also increasing interest in integrating smart sensors and edge-AI modules directly into encoder heads to support adaptive correction and in-situ diagnostics.
In particular, grating parameters significantly affect the accuracy and resolution of grating interferometers [94,95,96,97,98]. Cheng et al. [99] proposed an atomic lithography technique with which one-dimensional and two-dimensional chromium self-traceable gratings are fabricated. The grating pitch is traced directly to the chromium atomic transition frequency (a natural constant), and the pitch accuracy reaches the picometer level (0.001 nm), which may provide a new foundation for further advances in grating interferometry.

2.3. Optical Frequency Comb-Based Interferometry

Optical frequency combs (OFCs), consisting of a series of equally spaced, phase-coherent optical modes, have revolutionized the field of precision measurement [100,101]. When coupled with photodetector-based interferometric techniques, OFCs enable absolute, high-precision displacement and distance measurements with remarkable resolution and traceability to primary time and length standards. Their intrinsic frequency stability and ultra-broadband spectral characteristics allow for applications that are challenging or inaccessible to traditional single-wavelength interferometers [102,103].
Unlike traditional interferometry, which typically employs a single frequency or a limited set of discrete wavelengths, frequency comb interferometry uses hundreds or thousands of frequency lines simultaneously, each precisely defined and traceable to fundamental atomic standards. The fundamental principle involves interfering comb pulses or comb lines after they have traveled different optical paths. Photodetectors then convert this optical interference into electrical signals containing information about the relative optical delays, which translates directly into absolute displacement or distance measurements [7,104].
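The traceability argument can be made concrete: each comb mode sits at $f_n = n f_{\mathrm{rep}} + f_{\mathrm{ceo}}$, so an optical frequency is fixed entirely by two radio frequencies that can be referenced to an atomic clock. The sketch below uses assumed, typical parameter values:

```python
# Comb-mode frequency f_n = n·f_rep + f_ceo; both f_rep and f_ceo are
# radio frequencies referencable to an atomic clock (values assumed).
f_rep = 100e6             # repetition rate, Hz
f_ceo = 20e6              # carrier-envelope offset frequency, Hz
n = 1_930_000             # mode index

f_n = n * f_rep + f_ceo   # optical frequency of mode n, Hz
c = 299_792_458.0         # speed of light, m/s
wavelength = c / f_n
print(f_n, wavelength)    # ≈ 193 THz, i.e. ≈ 1.55 µm (telecom band)
```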

2.3.1. Absolute Distance Measurement

The narrow linewidth, evenly spaced comb-tooth structure of an optical frequency comb, together with its direct traceability to radio-frequency (RF) or optical standards, makes it an important light source for high-precision absolute distance measurement. In early work, Minoshima and Matsumoto [105] employed a train of femtosecond laser pulses as a time ruler to implement a time-of-flight (TOF)-based absolute ranging method, achieving an accuracy of 3 μm over a range of 240 m, and improving real-time performance via multi-channel synchronization. Subsequently, Coddington et al. [106] proposed the dual-comb interferometry (DCI) method, in which two Er-doped fiber optical frequency combs were employed with a repetition rate difference of approximately 1 kHz, both locked to the same frequency standard. By generating an equivalent low-frequency interference signal through beat detection, mechanical scanning was eliminated, enabling ranging accuracy better than 1 μm over distances from 1 mm to 1 km. Lee et al. [107] combined an optical frequency comb with a tunable narrow-linewidth laser to generate multiple synthetic wavelengths, effectively extending the non-ambiguity range and achieving sub-micrometer accuracy over a measuring range of 100 m.
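Two characteristic scales govern the dual-comb scheme above and are easy to compute: the pulse-train non-ambiguity range $c/(2 f_{\mathrm{rep}})$ and the equivalent-time magnification $f_{\mathrm{rep}}/\Delta f_{\mathrm{rep}}$. The sketch below uses the roughly 1 kHz repetition-rate difference described above; the 100 MHz repetition rate is an assumed, typical value:

```python
# Characteristic scales of dual-comb ranging (f_rep is an assumed,
# typical value; the ~1 kHz Δf_rep follows the setup described above).
c = 299_792_458.0
f_rep = 100e6             # repetition rate of comb 1, Hz
delta_f_rep = 1e3         # repetition-rate difference, Hz

# Successive pulses are c/f_rep apart, so two-way ranging repeats every:
non_ambiguity_range = c / (2 * f_rep)        # ≈ 1.5 m

# Optical delays appear stretched in time by f_rep/Δf_rep, and one full
# interferogram is acquired every 1/Δf_rep seconds (1 kHz update rate).
magnification = f_rep / delta_f_rep
print(non_ambiguity_range, magnification)
```

The ambiguity is resolved in practice by coarse methods such as swapping the comb roles or slightly shifting the repetition rate, which is how sub-micrometer precision is maintained over kilometer-scale distances.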

2.3.2. Dynamic Measurement and High-Speed Profiling

In dynamic measurement and high-speed profilometry, dual-comb interferometry offers a promising approach for non-contact measurements with high sampling rates and high precision. Ideguchi et al. [108] proposed a time-stretch dual-comb interferometry technique, utilizing two optical frequency combs with a stable repetition rate difference to acquire interference signals from the surface of an object in real time. This enabled contour measurement at a rate of 25 kHz, applicable to monitoring high-speed vibrations or dynamic processes. Kobayashi et al. [109] built a dual-comb system capable of position tracking with sub-micrometer resolution for rapidly moving targets at a sampling rate of 500 Hz, suitable for online monitoring in precision manufacturing processes. In addition, Hase et al. [110] combined dual-comb interferometry with optical coherence tomography (OCT), achieving a high-speed tomographic measurement at a 1.2 MHz A-scan rate based on an Er-doped fiber comb at 1.55 μm. This setup simultaneously acquired surface and internal depth information of a structure, providing an effective solution for rapid three-dimensional imaging of complex structures.
Despite their advantages, OFC-based interferometric systems face technical challenges that currently restrict widespread adoption, particularly in industrial settings. Among these challenges, the complexity associated with stabilizing key comb parameters remains significant. Achieving the necessary frequency stability typically requires complex stabilization schemes involving precision electronic control, ultra-stable laser cavities, and optical frequency references, which increases both cost and system complexity.

2.4. CCD-Based Optical Interferometry

CCD-based precision measurement techniques utilize charge-coupled device (CCD) image sensors to directly record and analyze interference fringes, diffraction patterns, or holographic wavefronts, offering substantial advantages in spatial resolution, dynamic measurement capability, and non-contact surface characterization. In contrast to PD-based methods that primarily rely on single-point or limited-channel detection, CCD-based methods inherently provide full-field measurements, enabling simultaneous capture of extensive surface or volumetric information. Prominent examples within this category include white-light Fizeau interferometry and digital holographic microscopy (DHM), each characterized by distinct measurement principles and application contexts. This section comprehensively reviews the principles, recent technological advances, and representative applications of these two critical CCD-based precision measurement methods.

2.4.1. Fizeau Interferometry

In the field of precision optical metrology, the Fizeau interferometer, invented by Hippolyte Fizeau in 1862, has become a cornerstone measurement tool owing to its common-path configuration, simplified optical alignment, and high stability [111,112]. The fundamental principle, as illustrated, involves splitting a light beam emitted from a source to illuminate both a reference surface and the test specimen. The reflected beams interfere, producing fringe patterns from which the surface morphology and minute deformations of the specimen can be precisely extracted through phase analysis.
To overcome mechanical limitations inherent in traditional phase-shifting techniques, Gary Sommargren proposed Wavelength Tuned Phase Shifting Interferometry (WPSI), wherein phase shifts are induced by tuning the laser wavelength [113]. The amount of phase shift is directly related to the number of reflections, optical thickness of the test sample, and frequency offset of the laser source, enabling enhanced precision and stability in measurements of large-aperture and complex optical components. For instance, at the United States National Ignition Facility (NIF), a 600 mm aperture WPSI Fizeau interferometer was employed to measure large laser amplifier plates sized 460 × 810 mm. Utilizing a Littman–Metcalf configuration laser enabling mode-hop-free wavelength tuning combined with cavity locking techniques to suppress vibration-induced errors, the system successfully fulfilled the stringent requirements for high-precision, large-scale optical metrology. This implementation demonstrates the significant potential and broad applicability of WPSI in large-aperture, high-accuracy optical measurements [114,115].
Recent advances have addressed longstanding challenges in Fizeau interferometry related to phase-shift accuracy, imaging errors, noise floor reduction, and measurement range extension. Xu et al. introduced a dual-stage correction method for phase shifters, combining an ultra-high linearity phase shifter with an auto velocity iterative correction, achieving nanometer-level displacement accuracy and sub-0.1% nonlinearity, thus significantly improving phase-shifting interferometry (PSI) precision [116]. Complementing this, Morrow et al. developed an empirical model to correct retrace and system imaging errors inherent in non-null Fizeau measurements of aspheric and curved optics, demonstrating sub-2 nm RMS accuracy for full-aperture, single-shot measurements critical in X-ray mirror fabrication [117].
Noise suppression has seen notable breakthroughs through anisotropic spatial-coherence engineering, as demonstrated by Li et al., who utilized illumination modulation to reduce mid-spatial-frequency noise below the sub-nanometer level, thereby enhancing sensitivity and measurement fidelity [118]. This methodological innovation provides a rapid, full-aperture solution for detecting subtle surface errors on high-precision optics.
Integration of novel beam structures has also advanced the field. Lu et al. explored the use of orbital angular momentum (OAM) beams within Fizeau interferometers. Their common-path OAM interferometric schemes combined with azimuthal phase demodulation enabled stable, compact measurement systems capable of resolving displacements down to tens of picometers without requiring traditional phase-shifting devices, thus offering robust and efficient phase retrieval methods [119,120].
From an instrumentation perspective, Kühnel et al. demonstrated a scanning differential interferometer-based profilometer capable of sub-nanometer 3D topography measurements on freeform optics with steep local slopes (up to 7 mrad), supporting large apertures up to 100 × 100 mm². The system’s precision and repeatability were validated on silicon mirrors, confirming its suitability for advanced optical manufacturing [121].
Furthermore, Da Silva et al. developed a Fizeau interferometry stitching system to characterize large X-ray mirrors with sub-nanometer height errors [122]. By combining multiple overlapping sub-aperture scans using advanced stitching algorithms, the system achieved exceptional spatial resolution and reproducibility, critical for synchrotron and free electron laser applications requiring ultra-precise surface metrology.
The application scope of Fizeau interferometry has been considerably expanded in recent years, particularly in large-area precision manufacturing contexts. For instance, Zygo’s MST series employs an array of synchronized CCD-based interferometric sensors, enabling simultaneous high-resolution measurement of large optical components, such as telescope mirror segments [123]. Similarly, Taylor Hobson developed a large-area CCD-based profiler capable of rapidly inspecting precision-engineered surfaces, achieving nanometric resolutions over extensive surface areas [124].
Fizeau interferometry’s capability to measure complex surfaces with nanometric precision, combined with emerging techniques in noise suppression and phase demodulation, solidifies its position in optical manufacturing, X-ray optics characterization, and freeform surface metrology. Remaining challenges involve further integration of correction algorithms, expansion of measurement ranges, and reduction of system complexity to facilitate wider industrial adoption.

2.4.2. Digital Holographic Interferometry

Digital Holographic Interferometry (DHI) is a non-contact optical method that integrates holographic recording with interferometric measurement. Its principle involves forming interference fringes on the detector plane by the superposition of the object wave and the reference wave, capturing these fringes with a digital sensor (such as a CCD/CMOS), and reconstructing the complex amplitude distribution of the object via numerical algorithms. This allows for the extraction of both amplitude and phase information, enabling quantitative measurements of object morphology, refractive index distribution and dynamic variations [125,126]. Compared with conventional interferometric techniques, DHI can acquire three-dimensional information of a sample in a non-destructive, full-field manner; does not require chemical development; and facilitates real-time or quasi-real-time processing [127]. Reconstruction is typically based on the Fresnel–Kirchhoff diffraction integral via Fourier transform, yet recent advances include the formulation of reconstruction as an optimization problem, the application of Gerchberg–Saxton iterative algorithms, as well as the use of compressive sensing to enhance accuracy [128,129]. These methods are capable of suppressing twin-image noise, improving phase retrieval stability, and performing robustly under low signal-to-noise conditions.
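As a minimal sketch of the numerical reconstruction step, the code below propagates a sampled complex field with the angular-spectrum transfer function, one standard FFT-based route to evaluating the diffraction integral. The grid size, pixel pitch, and wavelength are illustrative assumptions:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a square complex field by distance z with the
    angular-spectrum method.  field: N x N complex array; wavelength,
    dx (pixel pitch) and z share one length unit; the sign of z sets
    the propagation direction."""
    n = field.shape[0]
    f = np.fft.fftfreq(n, d=dx)                       # spatial frequencies
    fx, fy = np.meshgrid(f, f)
    arg = 1.0 / wavelength**2 - fx**2 - fy**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # drop evanescent part
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

# Propagating to the hologram plane and back recovers the object field
# in this numerical model, which is the basis of refocusing in DHI.
rng = np.random.default_rng(0)
u0 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
u1 = angular_spectrum_propagate(u0, 0.633e-6, 5e-6, 0.01)
u2 = angular_spectrum_propagate(u1, 0.633e-6, 5e-6, -0.01)
```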
The configuration of the illumination source is a key factor in enhancing DHI performance. Conventional systems employ single-wavelength coherent sources. To eliminate the 2π phase ambiguity and extend the measurable range, Min et al. [130] proposed a dual-wavelength slightly off-axis digital holographic microscopy method, which was later adapted by Di et al. [131] into a common-path configuration, thereby improving robustness and suitability for biological specimens. In recent years, a novel approach has emerged involving a single broadband source capable of simultaneously emitting numerous narrow laser lines [127], which maintains coherence while improving stability. In multi-wavelength interferometry, ΣΛ synthetic wavelength techniques [132] can reduce the equivalent wavelength to nearly half of a single wavelength, thereby enhancing measurement precision; when combined with the original ΔΛ synthetic wavelength method, a large measurement range can be preserved. However, in practice, such schemes suffer from the conjugate wavefront (twin-image) problem, which can compromise demodulation accuracy when multiplexed holograms are used [133].
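The two synthetic wavelengths reduce to one line each; the 633/635 nm line pair below is illustrative rather than taken from the cited systems:

```python
def synthetic_wavelengths(lam1: float, lam2: float):
    """Difference (delta) and sum (sigma) synthetic wavelengths of a
    dual-wavelength interferometer.  The delta wavelength is much larger
    than either source wavelength and extends the unambiguous range;
    the sigma wavelength falls to roughly half a single wavelength and
    raises the phase sensitivity."""
    delta = lam1 * lam2 / abs(lam1 - lam2)   # ΔΛ = λ1·λ2 / |λ1 − λ2|
    sigma = lam1 * lam2 / (lam1 + lam2)      # ΣΛ = λ1·λ2 / (λ1 + λ2)
    return delta, sigma

d_lam, s_lam = synthetic_wavelengths(633.0, 635.0)  # wavelengths in nm
```

For this pair, ΔΛ is about 201 µm (roughly 317 times either wavelength), while ΣΛ is about 317 nm, i.e., near half a single wavelength, matching the behavior described above.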
In terms of optical layouts and system architecture, researchers are continuously exploring optimizations of off-axis holography, common-path arrangements, and dual-wavelength multiplexing to balance system stability and spatial resolution. For example, by encoding the sampling function with a spatial light modulator (SLM) and combining it with computational holographic reconstruction using a single-pixel detector, object information can be transmitted and recovered through complex scattering media [134].
In data processing and reconstruction algorithms, Asundi and Singh proposed one of the early methods for amplitude and phase analysis in digital dynamic holography [135]. With the advent of artificial intelligence in optical imaging, Shimobaba et al. developed a deep neural network-based approach for three-dimensional particle volume reconstruction in digital holography, significantly improving reconstruction speed and noise robustness [136]. Ren et al. further proposed an end-to-end deep learning reconstruction framework that does not require paired training data, enabling direct prediction of high-quality quantitative phase maps and substantially reducing the computation time of traditional iterative methods [137]. Furthermore, wavelet transforms, such as Cohen–Daubechies 9/7 and 17/11, have been applied for efficient compression of holographic data [138,139], considerably reducing storage and transmission demands while preserving measurement precision, thus paving the way for real-time remote holographic metrology.
Driven by these advances, DHI has expanded its applications across diverse domains. In biomedical imaging, Park et al. provided a comprehensive review of the role of DHI in label-free three-dimensional quantitative phase imaging, highlighting that holographic tomography (HT) can reconstruct the three-dimensional refractive index distribution of living cells and tissues, enabling quantitative analysis of cellular structure and functional changes [140]. Lee et al. employed three-dimensional label-free digital holographic microscopy to monitor cell migration during wound healing in real time, providing a novel tool for cell dynamics studies [141]. In fluid mechanics research, DHI combined with particle image velocimetry (PIV) and particle tracking velocimetry (PTV) enables four-dimensional reconstruction of velocity fields and particle trajectories, offering unique advantages for turbulence analysis and multiphase flow studies [142].
With its non-contact, label-free, full-field, and quantitative three-dimensional measurement capabilities, DHI has progressed from laboratory validation to engineering implementation and biomedical applications. Nevertheless, challenges remain in phase recovery accuracy under high-noise conditions, image quality for complex scattering samples, and computational demands for real-time high-resolution acquisition. Future developments are likely to focus on diversified, broadband, and coherence-controlled illumination sources; rapid, high-precision reconstruction algorithms combining deep learning with physical models; low-data-volume imaging via compressive sensing and hardware optimization; and multimodal integration (e.g., with fluorescence microscopy and photoacoustic microscopy) to obtain comprehensive datasets with structural and functional information. Breakthroughs in these directions will further expand the applicability of DHI to life sciences, materials characterization, and on-site industrial monitoring.

2.5. Summary

Interferometry is one of the most established optical metrology methods, relying on the phase difference generated by the interference of coherent light waves to achieve ultra-high precision displacement, surface, and refractive index measurements. Owing to its sub-nanometer accuracy and excellent sensitivity, it is widely used in fields such as semiconductor manufacturing, micro-optics fabrication, and precision surface metrology. However, interferometric techniques are highly sensitive to environmental disturbances such as vibration and temperature fluctuation, and their measurement range and robustness remain constrained. The development of phase-shifting algorithms and common-path configurations has partially mitigated these issues, but achieving both high precision and stability remains a persistent challenge.
Interferometric metrology can be effectively combined with imaging and spectroscopic methods to overcome its intrinsic limitations. For instance, integrating interferometric phase sensing with imaging modalities enables simultaneous acquisition of topographic and spatial information, while coupling with spectroscopy introduces wavelength-resolved material characterization. The convergence of interferometry with computational imaging and data-driven analysis opens new possibilities for multimodal, self-calibrating, and adaptive optical measurement systems that bridge precision metrology and intelligent perception.

3. Optical Imaging-Based Metrology

The technique of measuring and estimating physical properties at multiple spatial locations to form a two-dimensional or three-dimensional spatial map is referred to as imaging, distinguishing it from sensing, which is primarily focused on single-point measurements. Optical imaging acquires object information through the interaction between light and matter. Based on the properties of light, it can be further categorized into: (1) geometric optical imaging, including laser triangulation, time-of-flight (ToF) imaging, stereo vision, and structured light; (2) computational imaging methods, including compressive sensing imaging; and (3) microscopic imaging methods.

3.1. Geometric Optical Imaging

3.1.1. Laser Triangulation

Laser triangulation imaging is a non-contact measurement technique based on the principles of geometrical optics and triangulation. Figure 3a shows its basic principle. A laser beam is projected onto the surface of the object under inspection, and an imaging system—typically composed of a condensing lens, imaging lens, and CCD/CMOS sensor—detects the positional shift of the reflected laser spot [143], from which the spatial displacement and three-dimensional profile of the surface are calculated [144,145,146]. This approach generally employs a fixed geometric baseline, and its measurement accuracy is highly dependent on parameters such as the laser incidence angle, imaging angle, and system calibration [147]. According to the optical path configuration, laser triangulation can be classified into two types: the direct configuration, in which the optical axis is perpendicular to the measurement direction and the geometrical relationship is straightforward; and the oblique configuration, in which the relative angle between the laser source and the camera is adjusted to enhance spatial resolution and extend the measurement range, albeit with more demanding requirements for calibration and optical distortion correction [148].
The scattered laser spot is imaged onto the detector (e.g., CMOS/CCD), and its displacement on the sensor encodes the target’s out-of-plane movement. For a direct triangulation setup:
Δz = bΔx / (f sin θ + Δx cos θ)
where Δz is the target displacement, Δx is the spot displacement on the detector, b is the baseline distance between the laser and lens, f is the lens focal length, and θ is the triangulation angle.
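A minimal numerical sketch of this relation; the baseline, focal length, and angle are assumed values for illustration:

```python
import math

def triangulation_dz(dx: float, b: float, f: float, theta: float) -> float:
    """Target displacement from detector spot shift in a direct
    laser-triangulation setup: dz = b*dx / (f*sin(theta) + dx*cos(theta)).
    All lengths share one unit; theta is in radians."""
    return b * dx / (f * math.sin(theta) + dx * math.cos(theta))

# Illustrative system: 50 mm baseline, 25 mm lens, 30 degree
# triangulation angle; a 10 um spot shift then maps to ~40 um of
# target motion (all lengths in mm here).
dz = triangulation_dz(dx=0.010, b=50.0, f=25.0, theta=math.radians(30.0))
```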
In recent years, significant progress has been made in error modeling and compensation strategies to address challenges in the measurement of complex surfaces, such as diverse reflective properties, color variations, and surface inclination. Ding et al. [144] proposed a two-stage compensation approach based on the Lambert illumination model, achieving measurement error reductions of 71.5% and 91.9% in the coarse and fine stages, respectively. Li et al. [149] established an optical error compensation model using least squares and functional libraries, reducing the measurement error by approximately 17%, although the method remained limited for surfaces deviating from ideal diffuse reflection. Yu et al. [150] employed the Phong reflection model to account for both specular and diffuse reflection components, enabling more precise compensation for complex reflective surfaces. Hao et al. [147] investigated the influence of rough-surface scattering on displacement measurement accuracy in position-sensitive detector (PSD)-based triangulation systems and proposed optimization strategies for the detection principle. In addition, studies have shown that dual-view triangulation can partially suppress speckle noise during scanning operations, thereby improving measurement robustness [145].
With advances in computational technology, conventional geometric modeling and optical compensation approaches are increasingly combined with intelligent algorithms. For example, convolutional neural networks (CNN) and deep learning-based methods have been introduced for feature extraction and signal fitting [151], allowing for robust performance under varying material properties and surface morphologies with associated intensity fluctuations and noise. Moreover, the application of this technique in on-machine measurement (OMM) has expanded substantially. By integrating OMM with multi-axis machine tool platforms and optimizing measurement path planning [152,153,154], not only has measurement efficiency been enhanced, but coverage and accuracy for large-scale freeform surfaces and complex structural components have also been improved.
Benefiting from advantages such as non-contact operation, high resolution, and rapid response, laser triangulation imaging has found widespread applications in precision manufacturing, optical inspection, and large-scale structural metrology. Nevertheless, its performance remains constrained by the stability of optical components, environmental influences (e.g., vibration and temperature variation), and the optical properties of the measured surfaces [147,149]. Future developments are likely to focus on multi-modal hybrid metrology (combining structured light, interferometry, and other modalities), intelligent error compensation (leveraging AI to optimize modeling and data processing), and customized system architectures for specific application scenarios, with the aim of achieving higher accuracy, broader adaptability, and superior capability for real-time, in-process measurement.

3.1.2. Time-of-Flight Imaging

Time-of-Flight (ToF) imaging is an active three-dimensional (3D) sensing technology that estimates the distance to an object by emitting a light signal with known modulation characteristics and measuring the propagation time of its reflection from the target. This technique does not rely on surface texture features, can operate under low-light or even no-light conditions, and is capable of capturing full-scene depth information within a single frame.
Depending on the measurement principle, ToF imaging can be categorized into direct time-of-flight (dToF) and indirect time-of-flight (iToF) methods. Figure 3b illustrates the difference in principle between iToF and dToF. The dToF approach employs pulsed light and precisely measures the time delay between emission and reception, whereas iToF uses continuously modulated light and derives distance from the measured phase shift. These two techniques differ significantly in light source design, detector architecture, signal processing methods, and performance metrics. However, with recent advancements in semiconductor devices, light source modulation strategies, and computational imaging algorithms, their application scenarios and performance boundaries have been continuously expanded, and a trend toward technological convergence has emerged.
dToF technology determines the distance by directly measuring the round-trip time delay of a light pulse, expressed as
d = cΔt / 2
where c denotes the speed of light and Δt is the round-trip propagation delay of the light signal. A typical hardware architecture includes a pulsed laser source, optical transmitter, optical receiver, high-sensitivity photodetector, and a high-resolution time-to-digital converter (TDC). In recent years, single-photon avalanche diodes (SPADs) have been widely adopted in the receiver front end of dToF systems. When combined with picosecond-pulsed lasers, SPADs significantly improve detection sensitivity and timing resolution, enabling millimeter-level or even micrometer-level ranging precision [155,156].
In high-performance dToF research, a 256 × 192-pixel CMOS receiver using a current-integrating transimpedance amplifier (CI-TIA) as an analog front-end can suppress the direct current component caused by strong background light during pulse detection, achieving sub-centimeter constant error and <0.09% precision over a 240 m ranging distance, while maintaining 30 frames/s imaging capability and offering strong performance in high dynamic range and long-distance scenarios [157]. For medium-range, high-frame-rate applications, combining a two-dimensional mechanical scanning architecture with pseudo-random coding achieves 16 frames/s at 4.5 m range, with 0.2% precision, and suppresses interference levels by more than 15.29 dB [158]. To further reduce power consumption and system complexity, hybrid architectures have been proposed that integrate partial histogram accumulation, adaptive pulse-width adjustment, and nonlinear spatiotemporal coincidence detection (NSCD), enabling sustained 60 m measurement under 60 klux background light with sub-centimeter precision, while compressing per-pixel histogram storage to just 180 bits [159].
Exploration of ultimate dToF precision has demonstrated that, by tuning SPAD bias voltage, laser repetition rate, and integration time, micrometer-level ranging precision can be achieved [156]. Overall, dToF offers high accuracy, excellent sensitivity, and long-range measurement capability, making it particularly suitable for LiDAR, precision mapping, and low-illumination 3D imaging. However, the use of high-speed TDCs, high-peak-power pulsed light sources, and high-speed readout chains contributes to higher costs and power consumption, posing challenges for large-scale miniaturization in consumer electronics.
iToF technology employs a continuous-wave modulated light source, where distance is derived from the measured phase shift ϕ between the emitted and received optical signals:
d = cϕ / (4πf_m)
Here, f_m is the modulation frequency. Based on lock-in detection principles, iToF pixels can be fully integrated in CMOS processes with per-pixel phase measurement capability, thereby capturing two-dimensional phase information with relatively low cost and power. This makes iToF popular for medium- to short-range applications such as consumer electronics, robotic vision, and indoor mapping [155].
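Both ranging equations, and the 2π range ambiguity that limits iToF, can be sketched in a few lines; the delay and modulation frequency below are illustrative:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dtof_distance(dt_s: float) -> float:
    """dToF: distance from the round-trip pulse delay, d = c*dt / 2."""
    return C * dt_s / 2.0

def itof_distance(phase_rad: float, f_mod_hz: float) -> float:
    """iToF: distance from the measured phase shift, d = c*phi / (4*pi*f_m)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

def itof_unambiguous_range(f_mod_hz: float) -> float:
    """The phase wraps at 2*pi, so iToF ranges repeat every c / (2*f_m)."""
    return C / (2.0 * f_mod_hz)

# A 1 ns pulse delay corresponds to ~15 cm of range; a 100 MHz iToF
# modulation wraps around every ~1.5 m, illustrating the trade-off
# between dToF timing hardware and iToF phase ambiguity.
d_pulse = dtof_distance(1e-9)
d_wrap = itof_unambiguous_range(100e6)
```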
Recent studies have advanced iToF accuracy and robustness. In hardware, the proposed eight-window continuous-wave iToF (cw-iToF) method replaces the conventional sinusoidal modulation with a square-wave source and extends the number of integration windows from four to eight. This approach reduces phase nonlinearity error significantly, resulting in a worst-case error of 1.2 mm over 0.75 m range, with more than 90% of the measurement range keeping errors below 1 mm, and achieving 0.35 mm precision with a 100 ms frame time [160]. A parallel-phase-demodulated AMCW scanning sensor can dramatically shorten total integration time while maintaining high demodulation contrast. Using only 30 mW of optical power, it achieves a relative noise of 0.056% at a 1.5 m reference distance, and enables 1920 × 1080 full-HD 3D depth imaging within just 800 ns of integration [161].
To address multipath interference (MPI), a key weakness of iToF, researchers have proposed synthetic dataset generation using an accurate light propagation model, combined with machine learning. A Bayesian-optimized XGBoost model trained on such data reduces MPI mean absolute error (MAE) in coaxial-scanning AMCW LiDAR to the millimeter scale, maintaining an MAE of just 2.8 mm in sharp-corner object scenes [162]. Furthermore, data-driven post-processing for error compensation has been shown to significantly improve ranging accuracy under indoor conditions, outperforming traditional correction approaches without adding optical or hardware complexity [163].
In summary, iToF systems offer excellent hardware integrability, low cost, and high-frame-rate imaging, well suited for high-resolution, medium- to short-range 3D sensing. However, their performance is inherently limited by modulation frequency, phase ambiguity, and MPI sensitivity, making them less stable than dToF in long-range or high-interference scenarios. Nevertheless, recent advances in multi-window demodulation, parallel processing, high-frequency square-wave modulation, and AI-based error compensation point towards promising routes for bridging the performance gap with dToF.

3.1.3. Stereo Vision

Binocular vision is a passive three-dimensional perception technique based on bionic principles. It simulates human binocular disparity using a pair of cameras and applies triangulation to compute depth information. The core of this method lies in reconstructing the three-dimensional structure of the scene by disparity matching between left and right images, without the need for projecting an active light source. As such, it is well-suited for dynamic scenes and complex environments.
In a parallel stereo camera configuration, intrinsic parameters (focal length, distortion coefficients) and extrinsic parameters (rotation matrix R, translation vector T) are obtained through calibration. Feature point matching (e.g., SIFT/SURF) is then performed to generate a disparity map, which is finally used for 3D reconstruction. The principle follows:
Z = BF / d
where Z is the target depth, B is the baseline (distance between the optical centers of the two cameras), F is the focal length, and d is the disparity between corresponding points in the left and right images. Pixel similarity is typically computed using SAD, SSD, or Census Transform, with disparity refinement achieved via guided filtering, dynamic programming, and other approaches. Disparity is then computed by applying Winner-Takes-All (WTA) or global optimization algorithms.
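The disparity-to-depth conversion is a one-liner; the rig parameters below are assumed for illustration:

```python
def stereo_depth(disparity_px: float, baseline_m: float, focal_px: float) -> float:
    """Depth from disparity for a rectified stereo pair: Z = B*F / d,
    with the baseline in metres and focal length/disparity in pixels."""
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive for a finite depth")
    return baseline_m * focal_px / disparity_px

# Illustrative rig: 12 cm baseline, 800 px focal length.
z_near = stereo_depth(32.0, 0.12, 800.0)  # about 3 m
z_far = stereo_depth(16.0, 0.12, 800.0)   # about 6 m
```

Because Z is inversely proportional to d, a fixed sub-pixel matching error produces a depth error that grows roughly quadratically with distance, which is why long-range stereo demands wide baselines or long focal lengths.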
A stereo vision system consists of two cameras and a synchronized triggering device. In industrial scenarios, sub-millimeter accuracy (±0.1 mm) can be achieved; for example, PCB pin-tip 3D reconstruction can reach an accuracy of ±0.05 mm. In virtual reality (VR), conventional binocular vision enables the generation of high-accuracy 3D environment models.
Conventional stereo vision is a mature technology, with standardized algorithms (e.g., BM, SGM) supporting large-scale applications. However, its accuracy depends heavily on calibration, and environmental temperature variations can induce baseline drift, requiring frequent recalibration. Moreover, in low-texture regions—such as walls or specular surfaces—matching errors are likely to occur.
To address the limitations of conventional stereo vision, several variants have emerged, such as mirror-based stereo vision [164] and stereo zoom super-resolution. Mirror-based stereo vision uses a single camera combined with a reflective mirror assembly (e.g., curved mirrors, prisms) to simulate two viewpoints. Through optical design, system volume can be reduced by more than 50%, and the configuration offers resistance to vibration and tolerance to extreme temperatures, making it suitable for confined spaces in UAVs, in-vehicle systems, and dynamic industrial robotic scenarios. However, distortion correction is challenging due to non-linear image distortion caused by mirror surface shape errors, which necessitate complex calibration algorithms. In addition, the optical path alignment is highly sensitive, as assembly deviations in the mirror system directly affect measurement accuracy.
Stereo zoom super-resolution systems dynamically adjust the focal length of cameras and combine this with super-resolution algorithms (e.g., SRCNN, ESRGAN) to enhance the resolution of distant details. Advantages include high-precision long-range measurement, support for wide-angle environmental perception, and local detail magnification, making it suitable for traffic analysis [165]. Disadvantages include high computational complexity—super-resolution algorithms may require over 100 TOPS of processing power, limiting real-time performance—and high hardware cost, as precision zoom lenses are typically 3–5 times more expensive than fixed-focus lenses.
At present, AI-enhanced techniques represent a major development trend in binocular vision. End-to-end stereo matching networks based on convolutional neural networks (CNNs), such as GCNet and PSMNet, have significantly improved matching accuracy in low-texture regions. In parallel, lightweight design and real-time optimization are also key directions. Researchers aim to package stereo systems into portable modules, combining optical flow with stereo matching to address motion blur, thereby extending applications to industrial inspection and medical navigation.

3.1.4. Structured Light 3D Reconstruction

Structured-light-based three-dimensional (3D) reconstruction is an important active optical measurement technique [166,167]. By projecting specifically designed structured gratings or fringe patterns onto the surface of the measured object, and acquiring their deformed images via a camera, the object’s 3D shape can be computationally recovered [168,169,170,171]. Compared with passive stereo vision methods, which rely on natural textures and feature extraction, structured light maintains high robustness and measurement accuracy even under conditions of low texture, poor illumination, and occlusion [172,173]. Consequently, it has been widely applied in industrial surface inspection, facial recognition, and 3D modeling, among other fields [174,175,176,177].
Among the various implementation approaches, fringe-based structured light 3D reconstruction has become the dominant technique owing to its high resolution, high accuracy, and flexible system configuration. Typical methods include Fringe Projection Profilometry (FPP) and Phase Measuring Deflectometry (PMD).
FPP is a phase-encoding-based active optical 3D measurement method, suitable for high-precision measurement of diffuse reflective surfaces [178,179,180]. In FPP, periodic sinusoidal fringe patterns are projected onto the object’s surface under known geometric conditions. The fringes are distorted by the surface geometry, and the deformed patterns are captured by a camera. By decoding the fringe phase, the 3D geometry of the object can be reconstructed [181]. To establish the mapping between the extracted phase and the real-world 3D coordinates, two common calibration models are employed: the phase–height model and the triangulation model. In the phase–height model, multiple reference planes with known heights are used to directly establish a functional relationship between phase and height, making it suitable for relatively flat surface measurements. Common mathematical forms include linear, inverse-linear, and polynomial models [182,183,184,185]. The triangulation model, on the other hand, requires precise calibration of the camera and projector to recover the object’s 3D coordinates using the principles of triangulation. Depending on projection mechanisms, FPP systems can be implemented in various ways. The most widely used approach relies on Digital Light Processing (DLP) projectors, which offer high pattern quality and fast refresh rates, making them a primary choice. However, as DLP projection relies on focusing optics, it inherently suffers from a limited depth of field. To address this limitation, more compact Micro-Electro-Mechanical Systems (MEMS)-based systems have been developed, which scan laser beams to form patterns, allowing large-depth-of-field projection without the need for focusing optics [167,186].
PMD is a technique specifically designed for the 3D reconstruction of specular or highly reflective surfaces. A typical PMD system consists of a liquid crystal display (LCD), a camera, and a computer. The computer generates sinusoidal fringe patterns on the LCD, which are reflected by the object’s specular surface and then captured by the camera. Due to the geometric modulation caused by surface normals, the recorded images encode phase distortions that correspond to local surface slope variations [187,188]. In the reconstruction process, wrapped phases are first extracted from the captured fringe patterns, typically using phase-shifting methods. These phases are then converted into surface gradient data through the use of geometric models and calibration parameters [189]. Since the phase distortion is proportional to the reflected ray deflection angle, PMD yields a gradient field representing the surface slope. To recover a complete 3D surface, the gradient field must be numerically integrated over the 2D image plane, producing the relative height map [190]. To simplify this process, Direct Phase Measuring Deflectometry (DPMD) has been proposed, which employs a dual-LCD and dual-camera setup to simultaneously capture fringes reflected from a reference plane and from the measured specular surface. By comparing the phase difference along the two reflection paths, height variations can be directly derived, thus avoiding the complex numerical integration steps required in conventional PMD, and enabling direct height reconstruction for specular targets [191,192].
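The gradient-integration step can be sketched with a Fourier-domain least-squares integrator (the Frankot–Chellappa approach), one common way to turn a measured slope field into a height map; the synthetic periodic surface below is purely illustrative:

```python
import numpy as np

def integrate_gradients(p, q, dx=1.0):
    """Least-squares integration of a gradient field (p = dZ/dx along
    columns, q = dZ/dy along rows) into a height map Z using the
    Fourier-domain (Frankot-Chellappa) method.  Assumes periodic
    boundaries; the absolute height offset is unrecoverable and set
    to zero."""
    n, m = p.shape
    wx = 2.0 * np.pi * np.fft.fftfreq(m, d=dx)   # column (x) frequencies
    wy = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)   # row (y) frequencies
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                            # avoid 0/0 at DC
    Zf = -1j * (WX * np.fft.fft2(p) + WY * np.fft.fft2(q)) / denom
    Zf[0, 0] = 0.0                               # zero-mean height map
    return np.real(np.fft.ifft2(Zf))

# Self-check on a synthetic periodic surface with analytic gradients.
x = np.arange(64)
X, Y = np.meshgrid(x, x)
Z = np.sin(2 * np.pi * X / 64) + np.cos(2 * np.pi * Y / 64)
p = (2 * np.pi / 64) * np.cos(2 * np.pi * X / 64)
q = -(2 * np.pi / 64) * np.sin(2 * np.pi * Y / 64)
Z_rec = integrate_gradients(p, q)
```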
Despite differences in system architectures, both FPP and PMD operate by projecting or displaying sinusoidal fringe patterns and exploiting the modulation effects imposed by the measured object to recover 3D shape information [193,194]. Once the fringes are distorted by the object’s surface, the phase variations in the patterns directly encode spatial geometric features. In FPP, the extracted phase is directly related to the object’s depth, whereas in PMD, it corresponds to the object’s surface gradients. High-precision phase extraction and phase unwrapping are therefore essential for both techniques. Common wrapped phase retrieval methods include phase-shifting [195], wavelet transform [196], and Fourier transform methods [197]. Based on the dimensional domain of information used in the unwrapping process, phase unwrapping approaches in structured light 3D reconstruction can be broadly categorized into temporal phase unwrapping (TPU) and spatial phase unwrapping (SPU). TPU relies on projecting multiple fringe patterns with different frequencies or coding schemes, and computes the absolute phase for each pixel independently from intensity variations over time. Since TPU does not rely on spatial continuity, it exhibits high robustness when dealing with surface discontinuities, abrupt depth changes, or occlusions [198]. In contrast, SPU depends on the phase relationships between neighboring pixels, progressively removing the 2π discontinuities to recover the true phase. SPU methods can be classified as either local or global approaches [199]. However, noise in SPU can propagate from high-noise areas to low-noise ones, potentially degrading accuracy.
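As a minimal example of wrapped-phase retrieval, the standard four-step phase-shifting formula can be verified on a synthetic fringe sequence (the fringe offset and amplitude are arbitrary):

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Four-step phase shifting: with intensities I_k = A + B*cos(phi + k*pi/2),
    the wrapped phase is phi = atan2(I3 - I1, I0 - I2), valued in (-pi, pi]."""
    return np.arctan2(i3 - i1, i0 - i2)

# Simulate four shifted fringe frames over a known phase ramp.
phi_true = np.linspace(-np.pi + 0.1, np.pi - 0.1, 256)
frames = [100.0 + 50.0 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi_est = wrapped_phase(*frames)
```

Phases outside (−π, π] would wrap, which is exactly what the TPU and SPU strategies described above are designed to undo.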
Traditional fringe-projection structured light systems often suffer from reduced measurement accuracy when dealing with objects having non-uniform surface reflectivity, highly complex geometry [200], or severe occlusion [201]. In recent years, deep learning—demonstrated to be powerful in feature extraction and nonlinear modeling—has been successfully applied in numerous domains [65,202,203,204]. For fringe-projection-based structured light systems, deep learning offers promising solutions to improve measurement accuracy, accelerate reconstruction speed, and enhance robustness [65,202,205,206]. Nevertheless, structured light 3D reconstruction continues to face multiple challenges in practical applications, including: maintaining high accuracy and stability in complex and dynamic environments [207,208]; mitigating measurement errors caused by overexposure or underexposure under high dynamic range conditions [209]; achieving real-time reconstruction under limited depth of field and fast motion; and enhancing computational efficiency without sacrificing accuracy to meet the demands of industrial online inspection and intelligent manufacturing.

3.2. Computational Optical Imaging

Computational Imaging (CI) is an imaging paradigm that integrates optical design with computational post-processing. By jointly optimizing optical hardware and algorithms, CI overcomes the physical limitations of conventional imaging systems. Its core idea is to reconstruct high-dimensional, high-resolution, or attribute-specific images from indirect or incomplete measurements through physical modeling and mathematical optimization [194,210]. CI methods can mitigate optical noise, diffraction limits, or dynamic range constraints, while also simplifying hardware complexity via computational compensation, and further enabling tasks unattainable by traditional imaging (e.g., imaging through scattering media, light-field reconstruction).
Its mathematical model can be expressed as:
y = Φ x + n
where Φ denotes the measurement matrix, and n represents noise.
Computational imaging encompasses a wide range of techniques. Existing surveys categorize them according to encoding/modulation strategies, dimensional expansion, or application-driven classification. As illustrated in Figure 4, there are numerous types of CI applications. This work focuses on industrial and entertainment fields, highlighting three representative subcategories.

3.2.1. Compressed Imaging

Compressed imaging is based on the theory of Compressed Sensing (CS), whose central premise is to exploit the sparsity of a signal in a certain transform domain to achieve image acquisition and reconstruction at sampling rates significantly below the Nyquist rate. Let the original image x ∈ ℝ^N be represented under a sparsifying basis Ψ as x = Ψs, where the coefficient vector s is sparse or approximately sparse. Compressed measurements are obtained via a measurement matrix Φ ∈ ℝ^{M×N} (M ≪ N):
y = Φ x = Φ Ψ s
Image reconstruction is typically accomplished by solving a constrained optimization problem:
min_s ‖s‖_1   subject to   ‖y − ΦΨs‖_2 ≤ ϵ,
where ϵ denotes the noise tolerance. This framework overcomes the limitations imposed by traditional imaging systems on sampling rates and detector requirements.
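A minimal numerical sketch of this recovery problem, assuming for simplicity that the sparsifying basis is the identity (Ψ = I) and using plain iterative soft-thresholding (ISTA) in place of more sophisticated solvers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse signal, identity sparsifying basis (Psi = I) for simplicity
N, M, K = 256, 96, 8
s_true = np.zeros(N)
s_true[rng.choice(N, K, replace=False)] = rng.normal(0, 1, K)

Phi = rng.normal(0, 1 / np.sqrt(M), (M, N))   # random Gaussian measurement matrix
y = Phi @ s_true                               # compressed measurements, M << N

# ISTA for min 0.5*||y - Phi s||_2^2 + lam*||s||_1
lam = 0.01
L = np.linalg.norm(Phi, 2) ** 2                # Lipschitz constant of the gradient
s = np.zeros(N)
for _ in range(10000):
    grad = Phi.T @ (Phi @ s - y)
    z = s - grad / L
    s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

print(np.linalg.norm(s - s_true) / np.linalg.norm(s_true))  # small relative error
```

Even with only M = 96 measurements of an N = 256 sample signal, the sparse coefficients are recovered accurately because the problem is noiseless and the sparsity level K is small relative to M.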
A typical compressive imaging system is the single-pixel camera (SPC), which employs a digital micromirror device (DMD) to spatially encode the incident optical field [211]. For each measurement, a single photodetector records the corresponding linear combination of the encoded light intensity, thereby producing the compressed sampling result. This architecture is particularly advantageous in scenarios where the detected signal intensity is severely attenuated due to scattering or absorption, such as in biomedical imaging [212,213] or long-range three-dimensional imaging [214,215].
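The single-pixel architecture can be illustrated with a fully sampled Hadamard scheme. This is a deliberate simplification of the systems cited above: a real DMD displays 0/1 patterns (so differential measurements are used), and a compressive system would acquire far fewer masks than pixels:

```python
import numpy as np

n = 64                                    # number of pixels (8 x 8 scene)
scene = np.zeros((8, 8))
scene[2:6, 3:5] = 1.0                     # a simple bright rectangle
x = scene.ravel()

# Sylvester-construction Hadamard matrix: each row is one +/-1 mask
H = np.array([[1.0]])
while H.shape[0] < n:
    H = np.block([[H, H], [H, -H]])

y = H @ x                                 # one photodetector reading per mask
x_rec = (H @ y) / n                       # H is symmetric and H @ H.T = n * I
print(np.allclose(x_rec, x))              # True
```

Replacing the full set of n masks with a random subset and solving the sparse-recovery problem above turns this forward model into a compressive single-pixel camera.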
In recent years, extensive research has been conducted on both algorithms and applications of compressive imaging. On the algorithmic side, beyond traditional convex optimization and greedy algorithms [216,217], deep learning-based approaches have become increasingly dominant. Reconstruction methods leveraging deep neural networks, such as the deep residual reconstruction network (DR2-Net) and the bidirectional recurrent neural network architecture (BIRNAT), have demonstrated significant improvements in both reconstruction speed and accuracy compared to conventional approaches [218,219].
On the application side, compressive imaging has found broad adoption in hyperspectral imaging [213,220], holography [221], polarimetry [222], and multi-modality imaging [223]. For instance, in hyperspectral imaging, Hahn et al. proposed a method that integrates compressive sensing (CS) with adaptive direct sampling (ADS), enabling the acquisition of high-quality hyperspectral images with substantially fewer samples than the number of pixels. Their experiments on real datasets showed that even with only ∼40% of the samples, excellent image quality and classification accuracy could still be achieved [220]. In the field of holography, Clemente et al. introduced the concept of single-pixel imaging into digital holography by combining phase-shifting interferometry with a single-pixel detector. Using Hadamard patterns to compressively encode the object’s diffraction field within a Mach–Zehnder interferometer, they employed back-propagation algorithms to reconstruct the complex amplitude of the object and successfully demonstrated phase distribution measurements of ophthalmic lenses [221].
Benefiting from its advantages in hardware simplification, reduced sampling requirements, and multi-dimensional imaging capability, compressive imaging has emerged as a key research direction in modern optical metrology. Future developments are expected to focus on the design of efficient and low-complexity reconstruction algorithms, the integration of deep learning with physics-based models, and the expansion of compressive imaging toward high-dimensional and multi-modal optical measurement tasks.

3.2.2. Light Field Imaging

A light field provides a comprehensive description of the propagation state of light in space, encompassing not only intensity but also directional information. When represented using the Two-Plane Parameterization (2PP) model, a light field can be described by a four-dimensional function L ( u , v , s , t ) , where ( u , v ) denote the coordinates on the camera imaging plane, and ( s , t ) correspond to the intersection points on the object plane, enabling a full capture of both spatial position and propagation direction. This property confers inherent advantages for acquiring three-dimensional structural information, optical wavefront data, and post-capture refocusing with extended depth-of-field.
Microlens array light field cameras introduce a microlens array between the main lens and the image sensor, allowing angular resolution of incident light rays [224,225]. This approach offers a compact structure and full light field capture, making it widely applicable in digital photography and microscale 3D imaging [226,227]. However, the trade-off between spatial and angular resolution, constrained by microlens size and sensor pixel density, remains a primary limitation affecting measurement accuracy.
Camera arrays capture images from multiple spatially distributed viewpoints to reconstruct dense light fields [228,229]. This method can provide high spatial and angular resolution, suitable for large-scale 3D scene reconstruction and dynamic scene capture [230]. Nevertheless, camera arrays face challenges in hardware cost, system calibration, and data synchronization, limiting their adoption in precision metrology.
Coded aperture and computational light field imaging employ spatial light modulators (SLMs), phase masks, or specially designed coded optical elements to embed angular information into captured images, followed by computational inversion to reconstruct the light field [231,232]. Compared with conventional approaches, coded light field imaging can achieve relatively high spatial and angular resolution without significantly increasing hardware complexity and provides flexible imaging modes [233]. Recently, deep learning-based reconstruction methods have further enhanced both reconstruction accuracy and real-time performance of coded light field systems [234].
In optical metrology, light field imaging enables the acquisition of viewpoint variation and depth cues inaccessible to conventional imaging. Combined with algorithms such as refocusing, parallax estimation, and epipolar plane image (EPI) feature extraction, it can achieve high-precision three-dimensional surface reconstruction and profilometry [235,236,237].
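The post-capture refocusing mentioned above reduces, in its simplest form, to shift-and-sum over the angular dimensions. The sketch below uses a synthetic 4D light field and integer-pixel shifts only; the parameterization L[u, v, y, x] and the variable names are illustrative:

```python
import numpy as np

def refocus(lightfield, alpha):
    """Shift-and-sum refocusing of a 4D light field L[u, v, y, x].
    alpha selects the synthetic focal plane; alpha = 0 keeps the captured plane."""
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round(alpha * (u - U // 2)))
            dx = int(round(alpha * (v - V // 2)))
            out += np.roll(lightfield[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)

# One scene point with a disparity of 1 pixel per sub-aperture step
U = V = 5
H = W = 32
lf = np.zeros((U, V, H, W))
for u in range(U):
    for v in range(V):
        lf[u, v, 16 - (u - 2), 16 - (v - 2)] = 1.0

sharp = refocus(lf, alpha=1.0)    # shifts cancel the disparity: point in focus
blurred = refocus(lf, alpha=0.0)  # point smeared over the 5 x 5 views
print(sharp.max(), blurred.max())  # 1.0 vs 0.04
```

Sweeping alpha and measuring per-pixel focus quality is one simple route from a captured light field to a depth map.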
Since its introduction by Ng et al. in 2005, light field imaging has been gradually applied in computational photography [224,238]. Ihrke et al. provided a comprehensive review of 25 years of light field research, highlighting the importance of joint optimization of optical design and computational reconstruction [239]. Hu et al. systematically reviewed the development of light field cameras for metrology, discussing imaging principles, calibration, reconstruction algorithms, and applications in 3D measurement and micro/nanoscale inspection [240]. In fluid mechanics, Shi et al. summarized applications of light field cameras for volumetric flow measurements and turbulent combustion diagnostics, emphasizing their compact structure and capability to acquire full 3D information in a single exposure, thereby addressing limitations of conventional stereo vision and multi-camera array systems [241]. Broxton et al. extended light field microscopy to micrometer-scale 3D imaging of biological specimens [226], while Prevedel et al. combined light field capture with computational reconstruction to record rapid neuronal activity in three dimensions [227].
Despite its advantages in non-contact 3D measurement, freeform surface inspection, and reconstruction of transparent and reflective materials, light field imaging still faces challenges. Spatial and angular resolution limitations constrain high-precision applications, while the large volume of light field data demands high computational efficiency and real-time processing. To address these issues, several strategies have been proposed:
High-resolution and super-resolution reconstruction: To mitigate resolution loss in microlens array cameras caused by effective pixel sparsity, sparse reconstruction and deep learning-based super-resolution techniques have been developed. For example, combining convolutional neural networks (CNNs) with sparse coding allows enhanced spatial sampling while preserving angular information, enabling sub-pixel 3D reconstruction [234,242].
Computational light field and deep learning inversion: Advances in computational optics and deep learning enable embedding light field information in hardware via coded apertures or phase masks, followed by rapid, high-precision reconstruction and depth estimation using deep neural networks, such as EPI convolution networks or NeRF-based implicit representations [233,243].
Dynamic light field capture and online metrology: For real-time 3D inspection of dynamic objects in industrial production, high-speed light field capture systems using multi-view arrays and temporal encoding have been developed, leveraging GPU parallelization and edge computing to achieve high-frequency 3D measurement and defect detection [244,245].
Light field and wavefront sensing integration: Chen et al. proposed a wavefront measurement method based on an optimized light field camera, achieving high-precision wavefront acquisition in a single exposure without complex interferometric modulation, suitable for freeform optical element characterization [246].
Cross-scale and multi-modal fusion metrology: Integration of light field imaging with structured illumination, confocal microscopy, and interferometry enables high-precision 3D measurements across micro-to-nanoscale, improving robustness for complex surfaces and transparent or reflective materials [247,248,249].
In summary, while light field imaging offers significant advantages in optical metrology, challenges remain, including the trade-off between spatial and angular resolution, large data volume and processing demands, and hardware scalability. Future developments in super-resolution reconstruction, intelligent hardware, and computational optics, combined with AI-augmented physics-based reconstruction methods, are expected to expand its applications in micro/nano manufacturing, precision metrology, and biomedical imaging, enhancing both measurement accuracy and system performance.

3.3. Super-Resolution Imaging

In conventional optical microscopy, image resolution is fundamentally constrained by the diffraction limit. When a point source is imaged through an ideal lens, it produces a finite-sized intensity distribution on the image plane, known as the point spread function (PSF). The lateral resolution of a microscope is typically characterized by the full width at half maximum (FWHM) of the PSF, which can be approximated as FWHM ≈ 0.61λ/NA, where λ denotes the wavelength of light and NA represents the numerical aperture of the objective lens. Within the visible spectrum, the use of high-NA oil immersion objectives (e.g., NA = 1.40) yields a conventional optical resolution of approximately 200 nm. However, in fields such as nanostructure characterization and biological imaging, the demand for higher resolution has become increasingly pressing. To overcome the diffraction limit, a variety of super-resolution (SR) imaging techniques have been developed, enabling enhanced lateral and axial resolution beyond that of traditional optical systems [250,251,252].
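For reference, the quoted resolution follows directly from the FWHM expression; the wavelengths below are illustrative choices within the visible band, showing that the ~200 nm figure corresponds to the blue end of the spectrum:

```python
NA = 1.40                               # oil-immersion objective
for wavelength_nm in (450.0, 550.0):
    fwhm = 0.61 * wavelength_nm / NA    # FWHM = 0.61 * lambda / NA
    print(f"{wavelength_nm:.0f} nm -> FWHM {fwhm:.0f} nm")
# 450 nm -> FWHM 196 nm
# 550 nm -> FWHM 240 nm
```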

3.3.1. Near-Field Super-Resolution Imaging

Near-field scanning optical microscopy (NSOM) represents one of the earliest optical techniques to overcome the diffraction limit. Its fundamental principle lies in probing the evanescent field in the immediate vicinity of the sample surface, thereby retrieving high-spatial-frequency information. Conventional optical systems are unable to capture such information; however, by positioning a miniature probe (e.g., a tapered optical fiber or metallic tip) within the evanescent field, the non-radiative light can be converted into detectable radiation via scattering or coupling, enabling nanometer-scale imaging [253,254].
The most common implementations of NSOM include the use of aperture probes based on tapered optical fibers [255], vibrating metallic tips [256], and apertureless approaches such as photon scanning tunneling microscopy (PSTM) [257]. These approaches employ different near-field detection strategies to achieve spatial resolutions far beyond those of conventional optical systems. In particular, aperture-type SNOM systems have achieved lateral resolutions on the order of tens of nanometers or better. Heinzelmann et al. demonstrated detection at incidence angles larger than the critical angle for total internal reflection, thus overcoming conventional optical limits and achieving improved imaging performance [258]. Hecht and co-workers subsequently provided a systematic overview of aperture-based SNOM imaging principles, outlining multiple system configurations and optimization strategies [255]. To further enhance resolution and extend applicability, researchers have developed apertureless probe techniques, including metallic tips exploiting localized field enhancement [259], tetrahedral probes [260], and non-contact scanning systems integrated with interferometric detection. These apertureless SNOM platforms can achieve spatial resolutions down to 10 Å, representing the state of the art in optical microscopy.
In the field of manufacturing metrology, NSOM has been successfully applied to the characterization of sub-20 nm semiconductor structures and to measurements of residual layer thickness in nanoimprint lithography. For instance, Takahashi et al. demonstrated the use of metallic tips exploiting near-field enhancement for precise detection of residual layers at the 10 nm scale [261]. Ohtsu et al. reported a photon scanning tunneling system employing a nanofiber probe, which achieved axial resolutions of only a few nanometers [262].
With advances in probe fabrication, probe–sample distance control, and vibration isolation systems, commercial instruments are now capable of achieving resolutions on the order of 10 nm. Notably, the neaSNOM system developed by Neaspec integrates multiple near-field detection modes and has been widely employed in materials science and semiconductor research [263]. Despite its remarkable spatial resolution, NSOM remains limited by several inherent challenges: (1) the imaging area is constrained by the short interaction range of the near field (typically only tens of nanometers); (2) the imaging speed is restricted by the point-by-point scanning mechanism; and (3) system stability and tip degradation remain persistent engineering concerns. Consequently, NSOM is currently regarded more as a high-precision complementary technique for micro- and nanostructure characterization rather than a replacement for large-area optical metrology tools.

3.3.2. Pupil-Filtering Confocal Super-Resolution Imaging

Aperture-filtering imaging technology originates from the engineered design of the point spread function (PSF) in Fourier optics. It was first proposed by Di Francia in 1952, who pointed out that by controlling the propagation characteristics of incident light in the frequency domain, it is possible to overcome the diffraction limit of conventional optical systems and thereby enhance resolution [264]. Typically, this approach employs specially designed diaphragms or phase plates—such as annular apertures or segmented masks—placed in the Fourier plane of the optical system to modulate the amplitude or phase of the light field. Such modulation compresses the main lobe and suppresses side lobes of the PSF, resulting in improved lateral resolution [265,266].
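The main-lobe narrowing produced by an annular aperture (and its side-lobe penalty) can be seen in a one-dimensional Fourier-optics sketch, where the PSF is modeled as the squared magnitude of the pupil's Fourier transform. All geometry here is a schematic 1D cut, not a specific system from the cited works:

```python
import numpy as np

N = 8192
x = np.linspace(-40, 40, N, endpoint=False)          # pupil-plane coordinate (a.u.)
full = (np.abs(x) <= 1.0).astype(float)              # clear pupil (1D cut)
annular = ((np.abs(x) >= 0.8) & (np.abs(x) <= 1.0)).astype(float)

def psf(pupil):
    """Normalized intensity PSF: |Fourier transform of the pupil|^2."""
    field = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(pupil)))
    I = np.abs(field) ** 2
    return I / I.max()

def main_lobe_fwhm(I):
    """Width (in frequency bins) of the contiguous region around the
    peak where intensity stays above half maximum."""
    c = int(np.argmax(I))
    lo, hi = c, c
    while I[lo - 1] >= 0.5:
        lo -= 1
    while I[hi + 1] >= 0.5:
        hi += 1
    return hi - lo + 1

w_full = main_lobe_fwhm(psf(full))
w_ann = main_lobe_fwhm(psf(annular))
print(w_ann < w_full)   # True: the annulus compresses the PSF main lobe
```

Inspecting the two PSFs also shows the trade-off discussed later in this section: the annular pupil's side lobes carry a much larger fraction of the energy than those of the clear pupil.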
In recent years, aperture-filtering techniques have been increasingly integrated with confocal microscopy. Owing to its superior optical sectioning capability and effective background suppression, the confocal microscope provides an excellent platform for super-resolution imaging. When combined with aperture-filtering structures, it can substantially sharpen image features and further enhance resolution. For example, Zhao et al. proposed a bipolar annular aperture coupled with an absolute-differential confocal detection scheme, which effectively narrowed the PSF main lobe and achieved resolution improvements beyond 0.2 μm [267]. The critical advantage of this approach lies in its simultaneous suppression of side lobes and enhancement of axial contrast, making it highly suitable for precise measurements of fine structures. Similarly, Tang et al. demonstrated that radially polarized illumination combined with a multi-zone phase plate can produce a tighter focal spot and extended depth of focus [266]. This unique combination of tight focusing and long focal depth makes the technique advantageous for applications in three-dimensional structural inspection and deep-tissue imaging. Comparable strategies have also been applied in high-precision domains such as MEMS displacement metrology and curved-surface profilometry [268,269].
On the instrumentation side, Arrasmith et al. developed a MEMS-based handheld confocal microscope incorporating an aperture-modulation module, which enabled in vivo skin imaging and demonstrated the potential of aperture-filtered confocal microscopy for portable medical devices [270].
At the theoretical level, Sun and Liu proposed that phase modulation combined with polarization control can be used to design optical fields that generate ultrafine focal spots while maintaining extended depth of focus, thereby providing theoretical guidance for subsequent aperture-filter design [271].
Overall, aperture-filtered confocal imaging provides an effective route to overcoming the resolution limitations of conventional confocal microscopy by manipulating light propagation in the spatial frequency domain. Nevertheless, several challenges remain, including the inherent trade-off between main-lobe compression and side-lobe enhancement, the increased complexity of the optical setup, and significant light throughput loss, which can ultimately degrade the signal-to-noise ratio in imaging.

3.3.3. Structured Illumination Microscopy

Structured illumination microscopy (SIM) is among the most widely applied super-resolution imaging techniques, particularly demonstrating outstanding performance in live-cell imaging and micro/nanofabrication inspection. The fundamental principle of SIM lies in projecting a predefined spatially modulated illumination pattern (e.g., sinusoidal fringes, gratings, or lattices) onto the specimen. This process induces Moiré interference, which mixes high-frequency sample information—originally beyond the optical system’s passband—into the lower-frequency domain, thereby enabling its retrieval during subsequent image reconstruction [272,273].
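The frequency down-mixing at the heart of SIM can be demonstrated in one dimension: multiplying a sample pattern whose spatial frequency lies outside a hypothetical passband by a sinusoidal illumination produces a Moiré component at the difference frequency, which falls inside the passband. The frequencies below are arbitrary illustrative values:

```python
import numpy as np

N = 512
x = np.arange(N) / N
f_s, f_i = 180.0, 160.0           # sample and illumination frequencies (cycles/FOV)
sample = 1 + np.cos(2 * np.pi * f_s * x)
illum = 1 + np.cos(2 * np.pi * f_i * x)

moire = sample * illum            # detected image: product of sample and illumination
spec = np.abs(np.fft.rfft(moire)) / N

# Strongest component below an assumed passband cutoff of 100 cycles/FOV
peak = int(np.argmax(spec[1:100])) + 1
print(peak)                       # 20 = f_s - f_i: high-frequency detail mixed down
```

Reconstruction then amounts to computationally shifting such components back to their true frequencies, which requires several phase-shifted and rotated illumination patterns, as discussed below.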
Compared with point-scanning-based super-resolution methods such as Stimulated Emission Depletion Microscopy (STED) and Photoactivated Localization Microscopy (PALM)/Stochastic Optical Reconstruction Microscopy (STORM), SIM offers advantages including faster imaging speed, reduced phototoxicity, and suitability for thick specimens and live-cell observations. Moreover, since the excitation intensity required for SIM is significantly lower than that of STED, it is particularly advantageous for long-term imaging and the recording of dynamic processes. In the field of materials science and micro/nanomanufacturing, SIM has also been employed for high-precision detection of nanoscale targets such as surface structures, subwavelength defects, and lithography residues. For instance, Takahashi et al. utilized infrared standing-wave structured illumination to achieve 100 nm resolution in the detection of nanoscale defects on silicon wafer surfaces [274].
To further enhance resolution, researchers have developed advanced approaches such as saturated structured illumination microscopy (SSIM) and nonlinear SIM. These methods exploit the saturation response or nonlinear modulation of fluorophores to extend the system’s frequency support beyond the range accessible to conventional SIM, theoretically pushing the resolution below 50 nm. For example, the SSIM model proposed by Gustafsson has been experimentally demonstrated to surpass the spatial resolution of linear SIM by more than twofold [275]. More recently, deep learning and data-driven strategies have been integrated into SIM systems to reduce the number of raw frames required for reconstruction, improve imaging quality under low signal-to-noise conditions, and accelerate acquisition speed. Such approaches not only simplify experimental implementation but also enhance system robustness against noise and misalignment [273,276].
Despite its success, the practical deployment of SIM in engineering applications still faces several challenges. The illumination patterns typically rely on interference fringes or spatial light modulators (SLMs), requiring high optical stability and precise system calibration. In addition, the reconstruction process involves Fourier-domain processing and multi-frame fusion, which are not inherently fault-tolerant. Furthermore, high-resolution reconstructions often require multiple phase-shifted and angularly rotated images, resulting in relatively long acquisition times that limit the applicability to high-throughput inspection. Consequently, current research has focused on developing novel SIM variants that enable high speed, reduced frame acquisition, and stronger resilience to disturbances. Approaches such as adaptive fringe illumination, compressed-sensing-based reconstruction, and AI-assisted image recovery are being actively explored to meet the demands of industrial inspection and real-time monitoring [276].

3.3.4. Micro-Object-Based SR Imaging

Since the concept of the Photonic Nanojet (PNJ) was first introduced by Chen et al. in 2004, this technique has rapidly advanced in the field of super-resolution imaging [277]. A PNJ refers to a highly localized, high-intensity, non-diffracting light beam formed on the shadow side of a micron-scale transparent dielectric microparticle (e.g., a glass microsphere) under incident illumination. Unlike conventional near-field optical methods, PNJ-based imaging does not rely on probes or nanoscopic apertures; instead, it exploits the capability of microstructures to manipulate the optical field, enabling high-resolution focusing directly in the far-field region. This approach combines ease of system implementation with high imaging precision. The focusing performance of PNJs is influenced by parameters such as the refractive index and size of the microsphere, as well as the surrounding substrate medium, all of which can be optimized through engineering [278,279].
Simulations based on Mie scattering theory and angular spectrum propagation theory indicate that PNJs can achieve focal spots with full width at half maximum (FWHM) below 0.5 λ , while maintaining high focal intensity and strong energy localization [278,279]. Ferrand et al. experimentally visualized the spatial distribution and focusing characteristics of PNJs, confirming their practical potential [280]. Wang et al. developed a white-light nanoscopy system using 3 μm-diameter silica microspheres, achieving approximately 50 nm lateral resolution under far-field conditions, representing a notable demonstration of PNJ-based imaging in practice [281]. Furthermore, researchers have extended the approach at the material and structural level; for example, Lee et al. proposed using self-assembled microsphere arrays to realize near-field focusing and magnification, further enhancing imaging resolution and field of view [282].
In practical applications, PNJ technology has been widely employed for biological sample imaging, subsurface defect inspection, nanolithography, and surface-enhanced Raman spectroscopy (SERS) [252]. Its far-field nature facilitates integration into existing optical systems, offering high portability and commercial potential. Compared with conventional super-resolution methods, microstructure-based imaging provides advantages such as minimal requirement for complex nanofabrication, low cost, and flexible system assembly, making it particularly suitable for integration with standard microscopy platforms [252]. Nevertheless, this approach also has certain limitations. The imaging performance is highly sensitive to the contact state between the microsphere and the sample, favoring predominantly two-dimensional specimens, and precise positioning and manipulation of the microspheres still rely on mechanical methods, limiting high-precision automation.

3.4. Summary

Imaging-based optical metrology techniques rely on the analysis of intensity or phase information in captured images to reconstruct geometric and spatial characteristics of objects. Typical methods include confocal microscopy, digital holography, and structured-light 3D measurement. These techniques offer high flexibility, wide field of view, and non-contact measurement capabilities, making them well suited for surface profiling, biological imaging, and industrial inspection. Compared to interferometric approaches, imaging-based methods are more robust against environmental perturbations but generally exhibit lower precision and depend heavily on image quality and computational reconstruction algorithms.
Integrating imaging-based metrology with interferometry and spectroscopy can substantially enhance its measurement capability. For example, interferometry-assisted imaging can provide quantitative phase information to complement intensity-based image data, while spectroscopic integration can reveal compositional and material-related parameters. Moreover, embedding AI-driven image reconstruction and data fusion models can transform imaging-based metrology into a perception-oriented framework, enabling real-time, multi-parameter sensing for dynamic and complex environments.

4. Spectroscopy-Based Metrology

Spectroscopy-based metrology encompasses a range of optical measurement techniques that extract structural, dimensional, or compositional information by analyzing the wavelength-dependent interaction of light with matter. Representative approaches include absorption and transmission spectroscopy [283], reflection spectroscopy, as well as Raman and fluorescence spectroscopy [284,285,286]. These representative technologies and their corresponding technical characteristics are shown in Table 2. Each method has found specific niches in scientific research and industrial metrology; however, limitations such as low spatial resolution, sensitivity to environmental disturbances, or restrictions on sample material and geometry often constrain their applicability in high-precision, non-contact measurement scenarios.
However, for the measurement of geometrical parameters and surface reconstruction of micro–nano structures, conventional techniques still face limitations in terms of spatial resolution, measurement speed, and adaptability to complex surfaces. Against this backdrop, confocal techniques—including chromatic confocal microscopy [287] and laser confocal microscopy [288]—have emerged as powerful tools, offering high axial resolution and non-contact measurement capabilities. At the same time, scatterometry, a diffraction-based optical metrology technique, has attracted significant attention owing to its advantages in large-area inspection and rapid parameter inversion. This chapter provides a systematic review of these two approaches, highlighting their fundamental principles, recent advances, and application prospects.

4.1. Confocal Optical Metrology

4.1.1. Chromatic Confocal Technology

Chromatic confocal technology (CCT) originates from the confocal optical systems developed by Winston et al. in the 1940s–1950s [289,290], and evolved into a distinct branch with the incorporation of chromatic dispersion principles. Its fundamental concept is to use a dispersive objective to focus different wavelengths at distinct axial positions along the optical axis, thus encoding spatial position into wavelength. By analyzing the focal wavelength of the reflected light from the target surface and referring to a pre-calibrated wavelength–position curve, surface position can be determined with high precision. The basic principle of this method is shown in Figure 5a. It requires no axial scanning, is insensitive to source fluctuations and ambient light, and has become an international standard for surface profilometry. With technological advancements, commercial sensors from companies such as Stil (France), Precitec (Germany), Micro-Epsilon (Germany), ThinkFocus (China), and LightE-Technology (China) now achieve nanometer-scale resolution and linear accuracy, suitable for tilted surfaces and high-frequency measurements.
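The wavelength-to-height decoding step can be sketched as follows. Both the calibration curve (a linear dispersion) and the spectral response (a Gaussian confocal peak) are synthetic stand-ins for a real sensor's characteristics, with sub-sample peak localization by three-point parabolic interpolation:

```python
import numpy as np

# Pre-calibrated wavelength -> axial position curve (illustrative values,
# assuming linear dispersion over the measurement range)
calib_wl = np.linspace(450, 700, 26)            # nm
calib_z = np.linspace(0.0, 300.0, 26)           # um

def height_from_spectrum(wavelengths, intensities):
    """Estimate surface height: locate the focal (peak) wavelength with a
    three-point parabolic fit, then look it up in the calibration curve."""
    i = int(np.argmax(intensities))
    y0, y1, y2 = intensities[i - 1], intensities[i], intensities[i + 1]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)   # sub-sample peak offset
    wl_peak = wavelengths[i] + delta * (wavelengths[1] - wavelengths[0])
    return np.interp(wl_peak, calib_wl, calib_z)

# Simulated confocal response: Gaussian peak at the in-focus wavelength
wl = np.linspace(450, 700, 501)
true_wl = 561.3
signal = np.exp(-0.5 * ((wl - true_wl) / 8.0) ** 2)
print(height_from_spectrum(wl, signal))          # ~133.6 um on this calibration
```

In a real instrument the calibration curve is measured rather than assumed linear, and the peak-finding step must also be robust to the secondary reflections exploited for thickness measurement below.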
In applications, chromatic confocal sensors can rapidly acquire surface profiles through lateral scanning, enabling three-dimensional measurement of complex surfaces such as coins and aspheric lenses [291,292,293,294,295], and the obtained profile data can be further used for surface roughness evaluation [296]. With multi-sensor cooperation, multi-dimensional shape reconstruction can be achieved [297,298,299]. This technology has also been integrated into laser processing systems for inline surface inspection [300], gap monitoring in high-speed rotating machinery [301], microsphere position detection in optical traps [302], deformation monitoring of nuclear fuel in radiation environments [303], membrane deformation measurement in filtration [304], dynamic monitoring of vibrating tuning forks [305], and atomic force microscope scanning analysis [306], demonstrating strong adaptability to various objects and environments.
For thickness measurement, chromatic confocal methods include single-sensor step-height measurement, dual-sensor configurations, and multi-interface reflection analysis, applicable to transparent films, tempered glass, and lenses [307,308,309,310,311]. In particular, the full-spectrum fitting method based on thin-film interference theory has achieved nanometer-level thickness precision with micrometer-scale lateral resolution [312]. Overall, chromatic confocal technology combines high precision, strong adaptability, simple integration, and high efficiency, making it a vital tool in surface profile and thickness measurement with broad prospects in industrial inspection and scientific research.
Building upon the diverse applications outlined above, significant research has been devoted to advancing the key components and processing techniques of chromatic confocal systems, aiming to improve axial resolution, expand measurement range, enhance environmental robustness, and enable high-speed multi-point detection. This section reviews recent progress in four critical areas: broadband light sources, dispersive objectives, conjugate pinholes, and spectral detection with signal processing.
Broad spectrum light source
Spectral confocal technology relies on broadband light sources to achieve axial position encoding through wavelength dispersion; thus, the continuity and stability of the light source directly determine the measurement resolution and stability. Early implementations primarily employed incandescent lamps, which were subsequently replaced by halogen lamps, xenon lamps, and light-emitting diodes (LEDs). Among them, white LEDs, with a moderate spectral range (380–760 nm), high optical energy utilization, and ease of integration, have been widely adopted in the development of spectral confocal sensors [313,314]. In contrast, although halogen and xenon lamps can also generate broadband white light, they suffer from significant intensity fluctuations in the visible region, leading to a non-uniform signal-to-noise ratio (SNR) across the full measurement range. Moreover, their long preheating time and short operational lifetime further limit practical applications.
Based on optical nonlinear effects, pulsed lasers can be modulated to generate broadband supercontinuum (SC) light sources with high brightness and stability, which have emerged as a research hotspot in recent years. As shown in Figure 6a, Shi et al. [315] employed photonic crystal fibers (PCFs) to produce a supercontinuum spanning 350–1750 nm, which, when applied to spectral confocal systems, significantly enhanced illumination efficiency and SNR. Similarly, Minoni et al. [316] reported a 488–1064 nm SC source based on microstructured optical fibers (MOFs), featuring smooth spectral continuity and stable intensity, enabling a displacement measurement repeatability of 0.36%. Liu et al. [317], Johnson et al. [318], and Matsukuma et al. [105] further demonstrated SC sources covering 400–2400 nm for spectral confocal imaging, achieving a broader dispersion range and thereby extending the measurable displacement range.
In addition, Chen et al. [319] utilized a mode-locked femtosecond laser operating in the 1.46–1.64 μm infrared band for spectral confocal displacement measurements, achieving an axial resolution of 30 nm with excellent system stability. However, the dispersion range was limited to 40 μm due to the relatively narrow spectral bandwidth. Overall, both SC sources and mode-locked femtosecond lasers exhibit advantages in spectral stability and measurement performance. Nevertheless, their reliance on high-power lasers results in complex structures, high costs, and potential safety concerns, restricting their application mainly to laboratory environments. By contrast, simple, cost-effective, and stable white LEDs and halogen sources remain the mainstream choice in commercial products.
Dispersive objective lens
The dispersive objective lens is a key component of spectral confocal systems, as it directly determines critical performance parameters such as the measurement range, allowable surface tilt, and light collection efficiency. Current designs can be broadly categorized into refractive optical elements and diffractive optical elements (DOEs).
For refractive optics, Molesini et al. [320] employed a plano-convex lens to introduce dispersion after collimating white light and subsequently focused it with a microscope objective. However, since dispersion was already introduced at the collimation stage and a single lens could not simultaneously optimize both chromatic and other aberrations, the resulting spot quality was limited. Shi et al. [315] and Zhang et al. [321] adopted a pair of thin convex lenses to achieve dispersion after collimation, but the improvement remained modest. Li et al. [322], Niu et al. [323], Liu et al. [324,325], and Shao et al. [326] designed integrated dispersive objectives composed of multiple lenses using Zemax, thereby achieving dispersion while simultaneously optimizing aberrations. Wang et al. [327] and Yang et al. [328] extended the dispersion range to 30 mm using a four-stage cascaded lens group. However, the excessive number of cemented doublets increased system complexity, while random reflections from multiple lens surfaces reduced the overall signal-to-noise ratio (SNR).
For diffractive optics, Dobson et al. [329], Garzón et al. [330], Rayer et al. [331], and Liu et al. [332] employed Fresnel-type DOE lenses to generate dispersion and systematically investigated the influence of diffraction efficiency on the wavelength–position response and resolution. Hillenbrand et al. [333,334] compared the imaging performance of pure DOEs with DOE–lens hybrid objectives, while Jin et al. [335] also proposed hybrid structures. Pruss et al. [336], Fleischle et al. [337], Ruprecht et al. [338], Park et al. [339], and Luecke et al. [340] fabricated DOE lenses with millimeter-scale outer diameters (Figure 6b) for inner-diameter inspection of micro-holes. Hillenbrand et al. [334] further introduced a segmented DOE design that enabled simultaneous displacement measurements at three lateral positions, thereby improving multi-point detection efficiency. Nevertheless, additional reflections and energy loss caused by multiple diffraction orders and double-pass diffraction inherently limit the return efficiency and SNR of DOE-based systems compared with their refractive counterparts.
In summary, refractive optical components remain the mainstream solution in both research and commercialization due to their superior design flexibility, stray-light suppression, and higher return efficiency. By contrast, DOEs offer unique advantages in compactness and array integration, but efficiency and accuracy constraints remain significant bottlenecks to broader adoption.
Conjugate aperture
The conjugate pinhole is the core structure of confocal technology, enabling axial resolution and suppression of out-of-focus light. Its aperture size directly influences both the system resolution and the signal-to-noise ratio (SNR) of the reflected signal. Ruprecht et al. [341] analyzed the impact of aperture size on the performance of spectral confocal sensors, highlighting the trade-off between resolution and light collection efficiency.
To improve measurement efficiency, array-based pinhole designs have been proposed. Tiziani et al. [342,343] and Ang et al. [344] employed a Nipkow disk positioned between the beamsplitter and dispersive objective, enabling the same element to function as both the source and detection pinhole, thereby achieving rapid profilometric scanning. Hwang et al. [345] utilized a rotating disk with pinholes for optical frequency modulation, which significantly enhanced the contrast of three-dimensional imaging. Hillenbrand et al. [346], Chanbai et al. [347], and Hu et al. [348] applied pinhole arrays to split broadband illumination and performed spectral detection through conjugated pinhole arrays, enabling simultaneous measurement of multipoint surface profiles. Cui et al. [349] replaced physical pinholes with two liquid crystal display (LCD) matrices, allowing computer-controlled positioning of measurement points and enhancing the adaptability of the system to complex surface morphologies.
Beyond traditional diaphragm-based pinholes, multimode fibers are also widely used as substitutes for conjugate pinholes. With core diameters ranging from tens to hundreds of micrometers, multimode fibers simplify the optical configuration, reduce alignment complexity, and offer the advantages of low transmission loss and high stability. Luo et al. [350], Chen et al. [319], Bai et al. [314], and Chen et al. [351] introduced multimode fibers or multimode fiber couplers into spectral confocal systems, which not only preserved the confocal effect but also facilitated modular design and long-distance signal transmission.
In summary, pinhole arrays enable rapid multipoint imaging and improved measurement efficiency, but their resolution is limited by pinhole spacing and scattering interference. Multimode fibers, in contrast, are well suited for high-precision single-point detection. Although less efficient, they offer unique advantages in modularity and long-distance measurement.
Spectral detection and signal processing
In spectral confocal systems, the axial displacement is inferred from the focal wavelength of the reflected light, making spectral detection and signal processing critical components. The conventional approach involves dispersing polychromatic light into monochromatic components using dispersive elements such as prisms or gratings, with photodetectors recording the intensity of each wavelength. Molesini et al. [320] and Shi et al. [352] employed triangular prisms for spectral analysis, while Shi et al. [315] also utilized a diffraction grating. Taphanel et al. [353,354] adopted a color camera for coarse spectral measurements. The commercial spectrometers used by Minoni et al. [316] and Luo et al. [350] similarly relied on prisms or gratings as their core dispersive components.
To enhance detection speed, Kim et al. [355] exploited the relationship between wavelength and filter transmittance, calibrating axial position directly from transmittance. This method required only two photomultiplier tubes and avoided temporal integration, achieving measurement rates up to hundreds of MHz (as illustrated in Figure 6c). Chen et al. [356] used dual-color CCDs to acquire differential signals corresponding to positions instead of explicit wavelengths, thereby improving detection efficiency. However, both methods are limited to specific wavelength bands and constrained by relatively low spectral resolution, restricting further improvements in displacement accuracy.
To address the effects of source spectrum inconsistency and spectral jitter, various normalization schemes for reflected spectra have been proposed. For example, Luo et al. [350] directly adopted the source spectrum as a reference, Minoni et al. [316] used the reflected spectrum obtained without the dispersive objective as reference, Yu et al. [357] applied focal wavelength compensation for surfaces of different colors, and Bai et al. [358] introduced a pre-scanning self-referencing method, which improved the universality of the focal wavelength–position response curve (as shown in Figure 6d).
In terms of peak-search algorithms, Molesini et al. [320] adopted parabolic fitting, Shi et al. [352] directly used the maximum intensity, and Tan et al. [359] applied sinc² curve fitting. Ruprecht et al. [360] and Deng et al. [361] implemented centroid-based methods, which offer high computational efficiency. Niu et al. [362] and Luo et al. [350] compared centroid, Gaussian fitting, thresholded centroid, and thresholded Gaussian fitting approaches, concluding that Gaussian fitting achieves the highest accuracy while centroid methods provide the best efficiency. Chen et al. [351] and Lu et al. [363] further proposed modified differential fitting and mean-shift algorithms, achieving accuracy comparable to Gaussian fitting with significantly improved computational efficiency.
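Two of these strategies can be compared on a simulated axial response; the Gaussian response shape, noise level, and threshold values below are assumed for illustration, and the Gaussian fit is realized as a non-iterative log-parabolic fit rather than nonlinear regression:

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(500.0, 600.0, 201)                      # wavelength axis (nm)
true_peak = 547.3                                        # assumed focal wavelength
signal = np.exp(-0.5 * ((wl - true_peak) / 6.0) ** 2)    # idealized axial response
spectrum = signal + rng.normal(0.0, 0.005, wl.size)      # add detector noise

# 1) Thresholded centroid: computationally cheap and robust to background.
mask = spectrum > 0.3 * spectrum.max()
centroid = float(np.sum(wl[mask] * spectrum[mask]) / np.sum(spectrum[mask]))

# 2) Gaussian fit via log-parabola: the logarithm of a Gaussian is a
#    parabola, so a quadratic fit to ln(I) over the upper part of the
#    peak yields the focal wavelength as the parabola's vertex.
win = spectrum > 0.5 * spectrum.max()
c2, c1, _ = np.polyfit(wl[win], np.log(spectrum[win]), 2)
gauss_peak = float(-c1 / (2.0 * c2))
```

Both estimators localize the peak to a small fraction of the spectrometer's pixel pitch, which is what makes sub-sample displacement resolution possible.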
Overall, spectral detection and signal processing are progressing toward higher speed, broader applicability, and greater intelligence. In the future, integration with machine learning techniques may enable adaptive peak-searching and high-precision displacement reconstruction under complex operating conditions.

4.1.2. Confocal Laser Scanning Microscopy

Laser Confocal Microscopy (LCM), also referred to as Confocal Laser Scanning Microscopy (CLSM), is a high-resolution three-dimensional imaging and measurement technique based on optical focusing and spatial filtering. Compared with conventional optical microscopy (Figure 7), its core advantage lies in the use of a confocal pinhole to effectively suppress out-of-focus light, thereby enabling the acquisition of high-contrast, high-resolution images and three-dimensional reconstructions [364]. In a CLSM system, a point light source (typically a laser) illuminates the sample surface. The reflected or transmitted light is collected by the objective lens and directed toward a pinhole placed in front of the detector. The pinhole only permits light from the focal plane to pass through while blocking scattered light from out-of-focus regions. As a result, the system produces optical sectioning images with superior spatial resolution. By sequential point scanning and signal reconstruction, three-dimensional imaging and surface topography measurement of the sample can be achieved.
Since its introduction, CLSM has evolved into a powerful imaging modality with broad applications across biology, materials science, nanotechnology, and clinical diagnostics. The early foundation of CLSM was established by Sheppard and Wilson, who demonstrated that spatial filtering with a pinhole could effectively suppress out-of-focus light and achieve optical sectioning with sub-micron resolution, far surpassing conventional wide-field microscopy [365]. Pawley and collaborators further refined laser scanning configurations, enabling accurate three-dimensional reconstructions in biological tissues [366].
In biological imaging, fluorescence-based CLSM marked a significant breakthrough when Tsien and coworkers combined fluorescent dyes with confocal detection, providing real-time visualization of live-cell dynamics with high spatial resolution [367]. Later, Denk et al. pioneered the integration of multiphoton excitation with CLSM, which greatly improved penetration depth in thick tissues while reducing scattering and phototoxicity [368]. Computational strategies such as deconvolution and adaptive optics were subsequently incorporated, allowing aberration correction and further enhancing image sharpness [369].
In materials science, Schubert et al. demonstrated that CLSM could probe thin-film morphology and characterize surface roughness with depth selectivity, establishing its value in semiconductor and coating analysis [370]. More recent progress in nanophotonics has enabled CLSM to be combined with plasmonic substrates for surface-enhanced chemical sensing and nanomaterial characterization, thereby extending its applications to nanoscale metrology [371].
In the clinical domain, CLSM has been successfully translated into dermatological diagnostics. Rajadhyaksha et al. developed handheld CLSM devices for in vivo skin imaging, achieving cellular-level resolution for non-invasive detection of basal cell carcinoma and melanoma [372]. Clinical studies have demonstrated diagnostic sensitivities and specificities exceeding 90%, confirming its utility as a non-invasive adjunct to histopathology in skin cancer management [373,374].
Overall, CLSM has progressed from a fundamental optical innovation to a versatile and multidisciplinary imaging platform. With continuing advances in laser sources, detection schemes, and computational integration, CLSM is expected to further expand its role in nanometrology, biomedical imaging, and precision clinical diagnostics.
Differential Confocal Microscopy
Differential confocal microscopy (DCM) is an optical technique based on the confocal principle, enabling high-precision axial localization and surface profilometry [375]. By employing a differential detection mechanism, DCM reconstructs the axial response characteristics of conventional confocal microscopy, with its core lying in the construction of a dual-channel differential optical path. In its basic configuration, two detectors are symmetrically placed before and after the confocal plane to capture the reflected intensity from the sample. When the sample is exactly at the confocal plane, the two detectors receive equal intensities; as the sample moves axially, the detector responses change in opposite directions. By calculating the difference (or normalized difference) between the two signals, a response curve with monotonicity and a zero-crossing feature is obtained, enabling nanometer-scale axial localization accuracy [376].
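The differential detection principle can be illustrated with a Gaussian approximation of the confocal axial response; the detector offset and response width below are arbitrary assumptions, not values from the cited systems:

```python
import numpy as np

def axial_response(z_um, offset_um, fwhm_um=2.0):
    """Gaussian approximation of the confocal axial response; the peak
    shifts when the detector pinhole sits off the confocal plane."""
    sigma = fwhm_um / 2.3548
    return np.exp(-0.5 * ((z_um - offset_um) / sigma) ** 2)

z = np.linspace(-1.0, 1.0, 2001)      # sample defocus (um)
d = 0.5                               # detector axial offset (um), assumed
i1 = axial_response(z, +d)            # detector placed beyond the focal plane
i2 = axial_response(z, -d)            # detector placed before the focal plane

# Normalized differential signal: monotonic, with a zero crossing exactly
# at the confocal plane, where it is also nearly linear in defocus.
s = (i1 - i2) / (i1 + i2)
zero_z = float(z[np.argmin(np.abs(s))])   # -> ~0.0, the confocal plane
```

For this symmetric Gaussian model the normalized difference reduces analytically to tanh(z·d/σ²), which makes the monotonicity and zero-crossing behavior explicit.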
Compared with conventional confocal imaging, DCM not only significantly improves axial resolution but also demonstrates strong noise immunity and a wide linear measurement range. It has thus been widely applied in microstructure profile reconstruction, thickness evaluation, and precision displacement control for advanced metrology. In 2000, Wang et al. first introduced the concept of DCM, achieving an axial resolution of 2 nm (with a measurement range of 100 μm). This established the fundamental differential detection structure, offering both absolute measurement and focus-tracking capability [377]. Over the subsequent two decades, researchers have continuously refined DCM techniques: Sun et al. [378] developed axial high-resolution DCM (AHDCM), in which sample position is determined through energy curves from three optical paths; Yun et al. [379,380] introduced D-shaped pupil DCM (DDCM), achieving 5 nm axial resolution at a 3.1 mm working distance with a threefold increase in imaging speed; Zou et al. [381] combined D-shaped pupils with radially birefringent filters to improve lateral resolution; Wang et al. [382] applied radially polarized illumination and pupil filtering for high-precision measurements.
In practice, DCM faces two major challenges: (i) test surfaces are often non-ideal planes [383]; and (ii) when employed as a profilometric probe, a trade-off exists between accuracy and scanning efficiency [384]. Although high-NA objectives can enhance axial resolution [385], sloped surfaces reduce optical energy collection, lowering SNR and resolution while introducing aberrations and artifacts. Mauch et al. [386] confirmed focal shift phenomena when using microscope objectives in freeform surface measurements. To address this, Wang et al. [387] proposed a slope measurement system and algorithm based on dual cylindrical lenses, enabling simultaneous acquisition of both spatial position and slope at the sample surface. For slope-related errors, Cacace et al. [388] employed position-sensitive detectors (PSDs) in 2009, but did not account for increased localization error and reduced resolution at large slopes. More recently, Wang et al. [376] introduced a Pearson correlation coefficient (PCC) compensation strategy based on peak clustering, which was applied to a DCM probe system. By increasing the sampling interval, this method enhanced scanning efficiency while maintaining preset accuracy, offered greater tolerance to localization errors, and compensated for the effects of non-normal incidence, thereby improving 3D measurement accuracy and system robustness.
With its superior axial resolution and structural flexibility, DCM offers unique advantages in high-precision surface metrology. Through the integration of novel structural designs, adaptive scanning strategies, and advanced signal processing algorithms, this technique continues to overcome the limitations of traditional confocal approaches, making it increasingly applicable to diverse micro- and nanometrology scenarios. Looking ahead, the combination of DCM with artificial intelligence-based image processing, active feedback control, and multimodal integration technologies is expected to further enhance imaging speed, system intelligence, and adaptability, establishing DCM as a key component of next-generation hybrid optical metrology systems.
Overall, research progress in these areas has significantly expanded the capabilities of chromatic confocal technology, enabling higher resolution, larger ranges, faster acquisition rates, and broader application adaptability, thereby reinforcing its role as a critical tool in precision metrology and industrial inspection. Despite significant progress, the technology still faces three core challenges: designing dispersive optics with ultra-long measurement ranges (>10 mm), achieving dynamic measurement stability for highly reflective and multilayer transparent materials, and unifying precision across scales from nanometers to millimeters. Future development efforts are focused on the following areas: (1) integration of intelligent algorithms; (2) multimodal technology integration; (3) miniaturization and industrialization.

4.2. Optical Scatterometry

Optical scatterometry is a non-destructive metrology technique based on the interaction between light and periodic nanostructures. Its fundamental principle involves analyzing the intensity, polarization state, or phase variations of diffracted light to invert the geometric parameters of the nanostructures, such as critical dimension (CD), height (H), and sidewall angle (SWA). The Rigorous Coupled-Wave Analysis (RCWA) method, proposed by Moharam and Gaylord in the early 1980s [389], provides a solid mathematical foundation for this technique. RCWA solves Maxwell’s equations to establish a quantitative mapping between the structural geometric parameters and the optical diffraction response. Owing to its high throughput, non-contact nature, and low cost, optical scatterometry has become one of the core online metrology tools in semiconductor manufacturing.
Essentially, optical scatterometry is a model-based metrology (MBM) approach [390], and its measurement process represents a typical inverse problem. Its success relies on two critical technical aspects: first, accurate forward modeling of optical properties and efficient solution methods; second, robust algorithms for the inversion of geometric feature parameters. Therefore, advancing effective optical modeling and simulation methods for nanostructures, as well as developing fast and robust inversion algorithms for geometric parameters, has become a research focus in recent years to enhance measurement accuracy and expand the applicability of optical scatterometry.
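One common solution strategy for this inverse problem is library search: a signature library is precomputed over a grid of geometric parameters, and the measurement is matched by nearest neighbor in signature space. The sketch below illustrates the idea; `toy_forward` is a hypothetical smooth function standing in for an RCWA solver, and all parameter ranges are invented for illustration:

```python
import numpy as np

def toy_forward(cd_nm, h_nm, wl):
    """Hypothetical forward model mapping (CD, height) to a spectrum.
    A real scatterometer generates this library with RCWA instead."""
    return np.cos(wl / cd_nm) * np.exp(-h_nm / wl)

wl = np.linspace(300.0, 800.0, 64)                 # probe wavelengths (nm)
cds = np.linspace(40.0, 60.0, 21)                  # candidate CDs (nm)
hs = np.linspace(80.0, 120.0, 21)                  # candidate heights (nm)
library = {(cd, h): toy_forward(cd, h, wl) for cd in cds for h in hs}

def invert(measured):
    """Return the library parameters minimizing the spectral residual."""
    return min(library, key=lambda p: np.linalg.norm(library[p] - measured))

measured = toy_forward(50.0, 100.0, wl)            # noise-free toy measurement
cd_est, h_est = invert(measured)                   # recovers (50.0, 100.0)
```

Production systems refine this brute-force search with interpolation, regression, or machine-learning surrogates to meet in-line throughput requirements.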
Compared with CD-SEM and CD-AFM, optical scatterometry based on spectroscopic ellipsometry offers advantages such as high measurement speed, low cost, non-destructive operation, and ease of online integration. Beyond its widespread use in characterizing the optical properties and film thickness of thin-film materials [391,392], spectroscopic ellipsometry began to be applied around the year 2000 for critical dimension (CD) measurements of subwavelength nanostructures [393,394,395,396].
Spectroscopic ellipsometry (SE) measures the change in polarization of reflected light, expressed by the amplitude ratio (Ψ) and phase difference (Δ) between the p- and s-polarized components:

$$\tan\Psi \, e^{i\Delta} = \frac{r_p}{r_s}$$

Here, $r_p$ and $r_s$ denote the complex reflection coefficients for p-polarized and s-polarized light, respectively. From this relationship, one can invert the material's optical constants (refractive index $n$ and extinction coefficient $k$) or the thickness of thin films.
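As a worked example of this relationship, the sketch below evaluates the forward single-film model (ambient/film/substrate) from Fresnel coefficients and extracts Ψ and Δ. The indices and thickness are illustrative round numbers, and the substrate is treated as non-absorbing for simplicity; real SE analysis fits dispersion models over a whole spectrum:

```python
import cmath, math

def fresnel(n_i, n_t, cos_i, cos_t):
    """Fresnel amplitude reflection coefficients for p and s polarization."""
    rp = (n_t * cos_i - n_i * cos_t) / (n_t * cos_i + n_i * cos_t)
    rs = (n_i * cos_i - n_t * cos_t) / (n_i * cos_i + n_t * cos_t)
    return rp, rs

def psi_delta(n0, n1, n2, d_nm, wl_nm, aoi_deg):
    """Forward SE model of a single film (ambient n0 / film n1 / substrate
    n2); returns (Psi, Delta) in degrees. A sketch only: absorbing
    substrates would take a complex n2."""
    cos0 = math.cos(math.radians(aoi_deg))
    sin0 = math.sin(math.radians(aoi_deg))
    cos1 = cmath.sqrt(1 - (n0 * sin0 / n1) ** 2)      # Snell's law, film
    cos2 = cmath.sqrt(1 - (n0 * sin0 / n2) ** 2)      # Snell's law, substrate
    r01p, r01s = fresnel(n0, n1, cos0, cos1)
    r12p, r12s = fresnel(n1, n2, cos1, cos2)
    beta = 2 * math.pi * d_nm * n1 * cos1 / wl_nm     # film phase thickness
    ph = cmath.exp(-2j * beta)
    rp = (r01p + r12p * ph) / (1 + r01p * r12p * ph)  # film-stack reflection
    rs = (r01s + r12s * ph) / (1 + r01s * r12s * ph)
    rho = rp / rs                                      # = tan(Psi) * exp(i*Delta)
    return math.degrees(math.atan(abs(rho))), math.degrees(cmath.phase(rho))

# e.g. 100 nm of SiO2 (n ~ 1.46) on Si (n ~ 3.87) in air, 633 nm, 70 deg AOI:
psi, delta = psi_delta(1.0, 1.46, 3.87, 100.0, 633.0, 70.0)
```

Thickness or optical constants are then recovered by regressing this forward model against the measured (Ψ, Δ) spectra, which is the inverse step the surrounding text describes.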
Spectroscopic ellipsometry (SE) is widely employed for thin-film thickness measurements in solar cells and microelectronic devices. It has also been applied in early optical critical dimension (OCD) measurements of nanogratings, although its accuracy was limited by the insufficient dimensionality of the measured parameters.
SE exhibits extremely high sensitivity to thin-film thickness, with measurement precision reaching the angstrom (Å) scale. For example, Fujitsu Laboratories integrated SE into an atomic layer deposition (ALD) system to monitor film growth rates in real time, achieving control precision of ±0.1 nm [397]. SE is well suited for rapid characterization of isotropic thin films, such as SiO2 and photoresists. However, as it measures only the two parameters Ψ and Δ, its sensitivity to nanostructures with complex cross-sectional morphologies (e.g., gratings with slanted sidewalls) is limited. In addition, the measurement spot size is relatively large (typically >100 μm), making it difficult to resolve structural variations in small regions.

4.2.1. Mueller Matrix Ellipsometer

To overcome the limited information content of conventional spectroscopic ellipsometry (SE), Mueller matrix ellipsometry (MME) measures the complete 4 × 4 Mueller matrix, providing 16 independent parameters that fully characterize the sample’s polarization response, including anisotropic effects and depolarization phenomena [392]. Its implementation relies on modulating multiple incident polarization states and analyzing the corresponding scattered light. Since its inception, generalized ellipsometers have been widely applied in the characterization of anisotropic materials and in the measurement of geometric parameters of nanostructures [398,399].
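For a non-depolarizing sample, the 4 × 4 Mueller matrix follows from the 2 × 2 Jones matrix via the standard transformation M = A (J ⊗ J*) A⁻¹, which the sketch below implements; genuinely depolarizing samples, by contrast, require the full 16-element measurement and cannot be reduced to a Jones matrix:

```python
import numpy as np

# Standard transformation matrix between the Jones and Stokes-Mueller bases.
A = np.array([[1, 0, 0, 1],
              [1, 0, 0, -1],
              [0, 1, 1, 0],
              [0, 1j, -1j, 0]], dtype=complex)

def jones_to_mueller(J):
    """Mueller matrix of a non-depolarizing element from its 2x2 Jones
    matrix, via M = A (J kron J*) A^-1."""
    M = A @ np.kron(J, J.conj()) @ np.linalg.inv(A)
    return M.real   # imaginary parts vanish up to rounding

# Ideal horizontal linear polarizer:
J_pol = np.array([[1, 0], [0, 0]], dtype=complex)
M_pol = jones_to_mueller(J_pol)
# -> 0.5 * [[1,1,0,0],[1,1,0,0],[0,0,0,0],[0,0,0,0]]
```

Working in the Mueller basis is what lets MME capture anisotropy and depolarization that the two scalar parameters of conventional SE cannot express.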
At the Interface and Thin Film Physics Laboratory of CNRS-LPICM, Novikova and colleagues have been investigating the use of Mueller matrix ellipsometry for grating critical dimension (CD) and overlay error measurements since 2005 [400]. They highlighted that MME provides richer information than conventional spectroscopic ellipsometry and emphasized the importance of varying the azimuthal angle to achieve conical diffraction [401]. In 2007, Schuster et al. at the University of Stuttgart proposed introducing azimuthal scanning (ϕ-scan) into conventional spectroscopic ellipsometry [402]. By recording the Mueller matrix as a function of azimuth at a fixed wavelength, simulations showed enhanced sensitivity to two-dimensional asymmetric hole arrays compared with wavelength scanning (λ-scan) or incident angle scanning (θ-scan), particularly for matrix elements m32 and m43. More recently, Nanometrics reported that MME exhibits high sensitivity to both direction and magnitude of grating overlay errors and asymmetry, whereas conventional SE is largely insensitive [403]. They also suggested that analyzing the relationship between specific matrix elements and structural features could enable rapid measurements.
Collins proposed a dual-rotating compensator configuration in 1999 [404], in which two compensators rotate synchronously at a frequency ratio of 5:3 to efficiently acquire the full Mueller matrix. This design covers a broad spectral range of 193–1000 nm and achieves calibration accuracy better than 0.005°. Liu et al. extended this design for generalized ellipsometers, implementing a dual-rotating compensator architecture with three degrees of freedom—wavelength, incident angle, and azimuth—enabling the acquisition of all 16 parameters in a single measurement. The resulting Mueller matrix elements showed threefold higher sensitivity to line-edge roughness (LER) than conventional SE, making this approach suitable for critical dimension measurements at sub-10 nm nodes [405]. Garcia-Caurel et al. confirmed that MME can accurately analyze optical anisotropy in complex samples such as liquid crystals and crystals [406].

4.2.2. Imaging Ellipsometer

Mueller Matrix Imaging Ellipsometry (MMIE) combines microscopic imaging with Mueller matrix ellipsometry (MME) to effectively overcome the spatial resolution and efficiency limitations of conventional scatterometry. In 2015, Liu et al. at Huazhong University of Science and Technology integrated MME with microscopic imaging, developing a Mueller matrix imaging ellipsometer. Compared with traditional spectroscopic ellipsometers, this system provides richer information, higher sensitivity, and the ability to vary the azimuthal angle to achieve conical diffraction [401], enabling large-area, rapid, and accurate measurements of nanoscale structural parameters. The system employs dual-rotating compensators, offering broad spectral measurement capability while simplifying system calibration and data processing. This approach not only retains the advantages of conventional MME but also incorporates the high spatial resolution of microscopic imaging, covering a spectral range of 190–1000 nm. In 2016, the group completed the development of China’s first high-precision, broadband Mueller matrix imaging ellipsometer, with the ellipsometric imaging configuration illustrated in Figure 8 [407,408].
In 2018, Hanyang University in South Korea proposed a dual-reflection ellipsometric imaging system [409], which generates differential images by comparing the polarization state changes between a reference and a test sample. This approach eliminates the need for zeroing, achieves measurement rates up to the camera’s maximum frame rate, reduces human error, and does not require uniform illumination, allowing real-time detection of surface defects and contamination. In 2019, Tohoku University in Japan developed a three-step phase-shifting ellipsometric imaging technique [410,411], combining a quarter-wave plate (QWP) and a rotating linear polarizer (RLP) to implement phase shifts. Images are captured at three equally spaced azimuth angles during constant-speed rotation, enabling nanoscale material thickness measurements.
Also in 2019, Huazhong University of Science and Technology developed a vertically oriented, liquid-crystal-based Mueller matrix imaging ellipsometer [412], addressing the limited depth of field and narrow field of view associated with oblique imaging, and achieving high-resolution, wide-field measurements suitable for nanoscale thin-film geometric characterization. In 2018, Chosun University in South Korea proposed a large-area spectroscopic imaging ellipsometer [413,414], integrating a broadband light source with an imaging spectrometer (400–800 nm) and expanding the beam to 30 mm before low-magnification lens imaging. This system captures spectral–spatial intensity maps of polarization variations, enabling large-area thin-film thickness profiling with lateral resolution down to 4 μm, thus advancing ellipsometric imaging toward wide-field, broadband applications.

4.3. Summary

Spectroscopy-based metrology determines physical or chemical properties by analyzing the interaction between light and matter across different wavelengths. Techniques such as absorption, reflection, fluorescence, and Raman spectroscopy are widely used for material identification, thin-film measurement, and microstructure characterization. The method provides excellent sensitivity to material composition and enables non-destructive, label-free analysis. However, its spatial resolution and measurement speed are typically lower than those of imaging or interferometric approaches, and the complexity of spectral data processing often limits its real-time performance.
Combining spectroscopy with imaging or interferometric modalities allows for the simultaneous capture of both structural and compositional information, enhancing the overall diagnostic capability of optical systems. For instance, integrating spectroscopy with interferometry can improve phase stability through wavelength tuning, while coupling with imaging enables spatially resolved spectral mapping. The integration of broadband light sources, computational spectroscopy, and machine-learning-driven spectral analysis represents a promising direction toward intelligent hybrid metrology systems capable of high-precision, multi-dimensional perception.

5. Hybrid & Frontier Metrology

The emergence of hybrid and frontier metrology signifies a paradigm shift from precision-oriented measurement to perception-oriented sensing. By incorporating computational intelligence, multimodal fusion, and real-time decision-making capabilities, these systems bridge the gap between traditional metrology and perceptual understanding, forming the core of next-generation optical technologies.

5.1. Hyperspectral Imaging Metrology

Hyperspectral imaging (HSI) integrates imaging and spectroscopic measurement into a single system. By sampling the target’s reflectance or radiance at high spectral resolution across tens to hundreds of contiguous narrow bands (typically spanning the visible to near-infrared range of 0.4–2.5 μm), it enables the identification of subtle material characteristics. The basic principle is to disperse light by wavelength using prisms, gratings, or tunable filters, and then employ a two-dimensional detector to acquire spectral and spatial information simultaneously. System architectures mainly include push-broom, step-scan, snapshot, and point-scanning modes, among which the push-broom configuration is widely applied in remote sensing and industrial domains due to its high spectral resolution and broad coverage [415].
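The push-broom geometry described above can be sketched numerically: each detector readout records one spatial line dispersed over wavelength, and scanning supplies the second spatial axis, yielding a three-dimensional datacube. The scene model and dimensions below are purely illustrative.

```python
import numpy as np

# Toy push-broom acquisition: each frame is one spatial line x spectral bands;
# stacking successive frames along the scan direction builds the datacube.
N_LINES, N_PIXELS, N_BANDS = 64, 128, 100  # assumed scan geometry

def simulate_line_frame(line_idx: int) -> np.ndarray:
    """One detector readout: (across-track pixels, wavelength bands), toy reflectance."""
    rng = np.random.default_rng(line_idx)
    return rng.random((N_PIXELS, N_BANDS))

def acquire_cube(n_lines: int) -> np.ndarray:
    """Stack line frames into an (along-track, across-track, band) hyperspectral cube."""
    return np.stack([simulate_line_frame(i) for i in range(n_lines)], axis=0)

cube = acquire_cube(N_LINES)
print(cube.shape)  # (64, 128, 100): two spatial axes plus one spectral axis
```

Any pixel `cube[i, j, :]` is then a full spectrum, which is what makes per-pixel material identification possible.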
In recent years, rapid advances in optical device fabrication and data processing algorithms have driven significant breakthroughs in HSI with respect to spatial resolution, imaging speed, and intelligent data processing. On one hand, compact HSI systems based on distributed gratings and micro/nano-optical components have emerged, greatly improving miniaturization and portability to meet the needs of applications such as industrial in-line inspection and handheld measurement [416,417,418]. For example, compact hyperspectral cameras based on on-chip thin-film filters and quantum dot materials (e.g., Specim IQ) have been deployed in industrial online inspection and handheld terminal devices [419]. On the other hand, novel HSI methods that integrate computational imaging with compressed sensing theory have become a research frontier: by combining optical encoding with reconstruction algorithms, these approaches dramatically reduce data acquisition volume and transmission load, enabling efficient, real-time hyperspectral data acquisition [420]. Furthermore, deep learning-driven spectral feature extraction and object recognition techniques are continuously advancing, significantly enhancing HSI’s detection and classification capabilities in complex scenarios. Convolutional neural networks (CNNs), transformer models, and other architectures have been widely applied to hyperspectral image feature extraction, denoising, and super-resolution reconstruction, demonstrating outstanding performance in applications such as surface defect detection in precision manufacturing and quality assessment of semiconductor materials [421,422].
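The compressed-sensing idea mentioned above can be sketched in one dimension: a random sensing matrix stands in for the optical encoding, and orthogonal matching pursuit (a standard greedy sparse-reconstruction algorithm, used here in place of the more elaborate solvers in the literature) recovers a sparse signal from fewer measurements than unknowns. All dimensions and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 48, 64, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random "optical encoding" matrix
x_true = np.zeros(n)
x_true[[5, 20, 41]] = [1.5, -2.0, 1.0]         # sparse scene/spectrum (k nonzeros)
y = A @ x_true                                 # compressed measurements, m < n

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick atoms, refit by least squares."""
    support, residual = [], y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

x_hat = omp(A, y, k)
print(np.linalg.norm(x_hat - x_true))  # ~0: exact recovery from 48 of 64 samples
```

The same principle, applied per pixel or per coded snapshot, is what lets coded-aperture HSI systems trade acquisition volume for reconstruction computation.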
In agriculture and food inspection, HSI has been widely used for crop growth monitoring, early-stage pest and disease diagnosis, and quality grading of agricultural products. By mounting hyperspectral sensors on UAVs, precision agricultural management can be carried out at the field scale, improving the accuracy of crop classification and disease detection. In the food industry, HSI has been applied to egg freshness prediction, as well as classification and inspection of agricultural products such as pine nuts and coffee beans; its non-destructive, full-field measurement capability has made it an important tool in modern food quality control [423]. In industrial non-destructive testing and intelligent manufacturing, HSI has shown great potential for detecting adhesive residue, surface cracks, and latent defects in carbon fiber composite production, effectively supporting the establishment of automated inspection and quality control systems [424]. In the field of medical diagnosis, with the establishment of hyperspectral brain imaging databases and advances in real-time data annotation techniques, HSI is increasingly being applied in intraoperative assistance for brain tumor surgery, enabling real-time tumor boundary recognition and visual guidance based on spectral differences between tissues [425]. Moreover, with the growing demand for global environmental monitoring, hyperspectral remote-sensing satellites such as Germany’s EnMAP and India’s Pixxel commercial satellite projects have expanded HSI applications into broader domains including terrestrial ecosystem monitoring, mineral resource exploration, and atmospheric environment observation, pushing the technology from laboratories into diverse industrial fields [426,427]. 
Waste recycling and intelligent sorting are also emerging application areas, where HSI’s high sensitivity to the spectral characteristics of different plastics and solid waste materials is reshaping the smart recycling industry chain, providing strong support for green manufacturing and sustainable development [428].
Despite these resolution advantages, HSI still faces challenges such as data redundancy, high cost, and a lack of standardization. Future technological developments will focus on sensor miniaturization, deep integration with AI, multi-modal sensing, and standardization efforts. With advances in MEMS technology and quantum dot materials, HSI is expected to further penetrate consumer electronics and smart devices, while breakthroughs in quantum sensing may enable single-molecule-level spectral detection, opening new fields for nanoscale material analysis [429,430].
Overall, hyperspectral imaging technology is evolving toward the “three-high” direction of high-spectral, high-spatial, and high-temporal resolution, and is progressively integrating with multi-modal detection methods such as Raman spectroscopy, photoacoustic imaging, and infrared thermography. Its application boundaries in precision manufacturing, the life sciences, resource and environmental monitoring, and intelligent manufacturing are continually expanding. In the future, with the deep integration of edge computing and artificial intelligence into hyperspectral data processing, HSI will achieve an integrated closed-loop workflow from high-speed data acquisition to real-time intelligent analysis, facilitating the widespread adoption of optical metrology methods in next-generation advanced manufacturing systems.
From an integration perspective, hyperspectral imaging serves as a natural bridge between imaging-based and spectroscopy-based metrology. Its capability to capture both spatial and spectral dimensions enables multimodal fusion with complementary techniques such as Raman spectroscopy, interferometry, and infrared thermography. The main challenges include balancing spectral and spatial resolution, managing large data volumes, and maintaining cross-modal calibration. Future developments will emphasize miniaturized sensors, AI-assisted data reconstruction, and edge-computing-based fusion pipelines that enable real-time analysis and interpretation. Through such integration, hyperspectral imaging will evolve from a stand-alone modality toward a core component of intelligent, perception-oriented optical metrology systems.

5.2. Optical Vortex-Based Metrology

Optical vortices, as light beams carrying orbital angular momentum (OAM), have attracted considerable attention in recent years within the field of high-precision optical metrology. Compared with conventional Gaussian beams, they possess more intricate phase structures and exhibit unique advantages in applications such as angular displacement sensing, micro-force measurement, and phase imaging. First introduced by Allen et al., optical vortices are characterized by a helical phase term expressed as exp(ilϕ), where l denotes the topological charge, representing the quantum number of the orbital angular momentum carried by the beam [431]. This helical phase distribution results in a hollow dark core at the beam center, endowing the light field with both spatial tunability and exceptional sensitivity to minute angular and displacement variations, thereby establishing it as a powerful tool in contemporary optical metrology.
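The helical phase and the resulting dark core can be reproduced on a grid. The sketch below assumes a simple Laguerre–Gaussian-like envelope, r^|l|·exp(−r²), multiplied by the helical term exp(ilϕ); the exact on-axis zero of intensity is the hollow core the text describes.

```python
import numpy as np

l = 2                                   # topological charge (illustrative)
x = np.linspace(-3, 3, 201)
X, Y = np.meshgrid(x, x)
r, phi = np.hypot(X, Y), np.arctan2(Y, X)

# Assumed LG-like amplitude envelope times the helical phase exp(i*l*phi)
field = (r ** abs(l)) * np.exp(-r**2) * np.exp(1j * l * phi)
intensity = np.abs(field) ** 2

center = intensity[100, 100]            # grid midpoint is the beam axis (r = 0)
print(center, intensity.max())          # on-axis intensity is exactly zero
```

Scanning l changes both the core size and the 2πl phase winding per revolution, which is what higher-order OAM sensing schemes exploit.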
From a fundamental perspective, the metrological capability of optical vortices primarily arises from their distinctive response to rotationally symmetric perturbations. Because vortex beams with different topological charges are strictly orthogonal, measurements can be performed by detecting variations in interference fringes, OAM mode transitions, or wavefront distortions and inferring the rotation, deformation, or displacement of the target. For example, when an optical vortex propagates through a slightly rotated object or phase plate, its phase distribution undergoes a characteristic distortion that can be precisely captured through interferometric techniques, enabling high-accuracy angular sensing [432,433]. In optical trapping systems, the orbital angular momentum of vortex beams can impart nanoscale torques for cellular manipulation [434]. Furthermore, in multimode OAM systems, vortex beams of different orders are exploited for parallel encoding, thereby extending the channel capacity of optical communication systems [435].
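The orthogonality underpinning OAM-based sensing and demultiplexing can be verified numerically: azimuthal modes exp(ilϕ) sampled on a uniform grid have zero mutual overlap for distinct charges, and projecting a superposed field onto each mode recovers its weights, a minimal form of the mode decomposition used in OAM detection. Values below are illustrative.

```python
import numpy as np

phi = np.linspace(0, 2 * np.pi, 1024, endpoint=False)

def mode(l):
    """Azimuthal OAM basis mode exp(i*l*phi) on a uniform angular grid."""
    return np.exp(1j * l * phi)

def overlap(l1, l2):
    """Normalized magnitude of the inner product between two modes."""
    return np.abs(np.vdot(mode(l1), mode(l2))) / len(phi)

print(overlap(3, 3), overlap(3, 5))   # 1.0 for identical charges, ~0 otherwise

# OAM "spectrum" of a superposition: projections recover the mode weights
field = 0.8 * mode(1) + 0.6 * mode(-2)
spectrum = {l: np.abs(np.vdot(mode(l), field)) / len(phi) for l in range(-4, 5)}
```

Interferometric or learning-based OAM detectors can be viewed as hardware or data-driven implementations of exactly this projection.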
In terms of implementation, optical vortex metrology has evolved along multiple technological pathways. On the generation side, diverse methods have been developed, including spatial light modulators (SLMs), spiral phase plates, metasurfaces, and integrated optical devices, all capable of producing OAM beams of varying orders [436,437,438,439]. On the detection side, various schemes based on interferometry, Fourier mode decomposition, and machine learning have been introduced, leading to significant improvements in the accuracy of OAM state recognition and retrieval [440,441,442]. Vortex-based interferometers have also emerged as a research hotspot. For instance, OAM interferometers modified from Mach–Zehnder configurations have demonstrated measurement precision at the nanometer scale for linear displacement and at the picoradian scale for angular displacement [443]. In surface imaging, the helical phase of vortex beams can serve as a wavefront probe, enhancing both the sensitivity and resolution of surface profile and roughness characterization [444,445].
Despite their promising prospects, several challenges remain in the practical deployment of optical vortex metrology. The generation and control of OAM beams demand stringent experimental conditions and high-precision components, with devices such as SLMs and phase plates being costly and system architectures often complex. Moreover, vortex beams are susceptible to turbulence and system-induced errors during propagation, which can compromise stability. In addition, the demultiplexing and recognition of higher-order OAM modes still rely on sophisticated algorithms and multidimensional detection architectures, limiting real-time performance and portability. Nevertheless, owing to their non-contact nature, high sensitivity, and capacity to encode high-dimensional information, optical vortex metrology has become an essential complement to existing optical measurement techniques. It holds significant potential for applications in precision manufacturing, biomedical imaging, and emerging quantum sensing technologies [446].
Integrating vortex-based metrology with other optical modalities—such as interferometry and holography—can provide complementary phase, angular, and structural information, extending its applicability in precision manufacturing and biomedical imaging. However, challenges remain in maintaining OAM stability under turbulence, simplifying high-order mode control, and achieving real-time demultiplexing for dynamic measurements. Future research may focus on adaptive wavefront correction, learning-based OAM state recognition, and integrated photonic devices for compact, vibration-tolerant implementations. These advances will enhance robustness and scalability, supporting the deployment of vortex-assisted hybrid metrology in complex industrial environments.

5.3. AI-Assisted Optical Metrology

In recent years, artificial intelligence (AI) has demonstrated disruptive potential in the field of optical metrology, offering new pathways toward achieving higher accuracy, greater efficiency, and enhanced adaptability. Traditional optical metrology techniques have largely relied on rigorous physical modeling, such as Rigorous Coupled-Wave Analysis (RCWA), the Finite-Difference Time-Domain method (FDTD), or phase retrieval algorithms. While these approaches are powerful, they often involve heavy computational loads and exhibit high sensitivity to noise. By contrast, AI methods based on machine learning (ML) and deep learning (DL) provide data-driven alternatives that directly map optical signals to target metrological parameters, thereby substantially reducing computational costs and improving robustness under experimental uncertainties [447,448,449].
One of the most significant applications of AI in optical metrology lies in solving inverse problems, namely reconstructing structural or material parameters from measured optical responses. Neural networks trained on simulated or experimental datasets enable rapid extraction of nanoscale parameters, thereby overcoming the bottlenecks of traditional iterative optimization techniques. For instance, convolutional neural networks (CNNs) have been employed in scatterometry for estimating critical dimensions and sidewall angles, as well as in nanoscale tomographic modeling, demonstrating superior accuracy and noise tolerance compared with conventional regression methods [450,451,452,453]. Furthermore, physics-informed neural networks (PINNs), which embed Maxwell’s equations into the learning process, have shown enhanced generalization capabilities beyond the training domain [454,455].
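The inverse problem itself can be made concrete with a deliberately minimal baseline: simulate a library of spectra over a thickness grid and recover an unknown thickness by nearest-neighbor matching. This library-search step is the classical procedure that trained networks aim to replace with a direct, fast mapping; the interference model and refractive index below are toy assumptions, not a rigorous RCWA/FDTD forward model.

```python
import numpy as np

wavelengths = np.linspace(400e-9, 800e-9, 200)   # metres (assumed band)
n_film = 1.46                                    # assumed refractive index

def reflectance(d):
    """Toy interference model: fringes set by the optical path 2*pi*n*d/lambda."""
    return np.cos(2 * np.pi * n_film * d / wavelengths) ** 2

# Precomputed library of simulated spectra over a 0.5 nm thickness grid
thicknesses = np.linspace(100e-9, 500e-9, 801)
library = np.stack([reflectance(d) for d in thicknesses])

def invert(measured):
    """Return the library thickness whose spectrum best matches the measurement."""
    errors = np.linalg.norm(library - measured, axis=1)
    return thicknesses[int(np.argmin(errors))]

d_true = 237e-9
noisy = reflectance(d_true) + 0.01 * np.random.default_rng(0).standard_normal(200)
d_est = invert(noisy)
print(abs(d_est - d_true))  # error in metres; within a grid step or two here
```

A trained regressor amortizes this search: after learning the spectrum-to-thickness mapping offline, inference becomes a single forward pass instead of a sweep over the library.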
Another rapidly advancing direction is AI-enabled image-based optical metrology, particularly in microscopy, interferometry, and holography [456,457,458]. Deep learning models have been widely applied to phase retrieval, holographic reconstruction denoising, and super-resolution imaging, effectively overcoming the diffraction limit. In digital holographic microscopy, AI-driven phase retrieval methods eliminate the need for complex iterative algorithms, enabling real-time imaging of biological specimens and nanostructures [459]. Similarly, in interferometric profilometry, generative adversarial networks (GANs) have been utilized for speckle noise suppression and high-dynamic-range reconstruction, significantly improving measurement accuracy under challenging conditions [460,461].
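For context, the iterative baseline that learned phase-retrieval models aim to replace can be shown in one dimension: a Gerchberg–Saxton loop alternately imposes known magnitudes in the object and Fourier planes. The signal and both magnitude constraints below are assumed for illustration; the error-reduction behavior is the property of interest.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 256
true_phase = np.cumsum(rng.standard_normal(N)) * 0.05   # smooth-ish toy phase
obj_mag = np.ones(N)                                    # known object-plane magnitude
f_true = obj_mag * np.exp(1j * true_phase)
fourier_mag = np.abs(np.fft.fft(f_true))                # "measured" Fourier magnitude

# Gerchberg-Saxton: start from a random phase guess, alternate constraints
g = obj_mag * np.exp(1j * rng.uniform(-np.pi, np.pi, N))
errors = []
for _ in range(200):
    G = np.fft.fft(g)
    errors.append(np.linalg.norm(np.abs(G) - fourier_mag))
    G = fourier_mag * np.exp(1j * np.angle(G))          # impose Fourier magnitude
    g = np.fft.ifft(G)
    g = obj_mag * np.exp(1j * np.angle(g))              # impose object magnitude

print(errors[0], errors[-1])  # constraint error shrinks over the iterations
```

Deep-learning approaches replace this loop with a single learned inverse mapping, which is where the reported real-time speedups come from.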
AI has also facilitated the development of adaptive and autonomous optical metrology systems. Reinforcement learning (RL) has been introduced to optimize measurement configurations, such as incident angle, polarization state, wavelength selection, and sensor alignment, enabling systems to autonomously adapt to varying sample properties [462,463,464]. This adaptive paradigm not only enhances efficiency but also broadens the applicability of optical metrology in complex industrial environments, particularly in scenarios characterized by high sample diversity and stringent speed requirements.
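The configuration-selection idea can be sketched as a multi-armed bandit, a simple special case of the reinforcement-learning formulation: an epsilon-greedy agent learns which candidate wavelength setting yields the best signal quality. The settings, reward model, and noise level are all hypothetical.

```python
import random

random.seed(1)
settings = [405, 532, 633, 780]                               # candidate wavelengths, nm
true_quality = {405: 0.55, 532: 0.70, 633: 0.90, 780: 0.60}   # unknown to the agent

def measure(w):
    """Simulated signal-quality readout with Gaussian noise (assumed sensor model)."""
    return true_quality[w] + random.gauss(0.0, 0.05)

counts = {w: 0 for w in settings}
values = {w: 0.0 for w in settings}            # running mean reward per setting

for step in range(2000):
    if random.random() < 0.1:
        w = random.choice(settings)                      # explore
    else:
        w = max(settings, key=lambda s: values[s])       # exploit current best
    r = measure(w)
    counts[w] += 1
    values[w] += (r - values[w]) / counts[w]             # incremental mean update

best = max(settings, key=lambda s: values[s])
print(best)  # the agent settles on the 633 nm setting
```

Full RL extends this to stateful settings (sample type, alignment, polarization) where the best configuration changes with context.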
Nevertheless, the application of AI in optical metrology faces several challenges. Its “black-box” nature has raised concerns, particularly in high-stakes applications such as semiconductor manufacturing and biomedical diagnostics, where model interpretability and reliability remain critical. Moreover, the dependence on large, representative datasets limits its scalability, especially in cases where experimental data acquisition is costly. To address these issues, hybrid approaches combining physical modeling with AI—such as physics-guided deep learning—have emerged as effective strategies, ensuring both predictive accuracy and physical interpretability [465].
In summary, AI-assisted optical metrology is driving a paradigm shift from model-driven to data-driven methodologies. Current studies have already demonstrated notable advantages in terms of speed, precision, and adaptability. Future research is expected to focus on developing interpretable AI frameworks, enhancing cross-domain transferability, and establishing standardized validation benchmarks. As AI technologies continue to mature, they are poised to become the central driving force behind next-generation optical metrology systems, meeting the stringent demands of advanced manufacturing, materials science, and biomedical imaging.
Beyond standalone applications, AI also plays a central role in connecting heterogeneous metrology modalities into unified, perception-oriented frameworks. Key challenges lie in ensuring cross-domain data consistency, model interpretability, and generalization under unseen conditions. Physics-informed learning and multimodal training strategies—integrating interferometric phase, spectral, and imaging data—offer promising routes toward robust fusion models. Furthermore, embedding AI algorithms into optical hardware platforms such as FPGA or neuromorphic processors can enable real-time adaptive metrology. These developments represent a crucial step toward autonomous optical measurement systems capable of continuous optimization and intelligent decision-making.

6. Conclusions and Perspective

Optical metrology and sensing constitute a critical field that has already played, and will continue to play, a pivotal role in science and technology. Given its increasing impact, it is necessary to provide a comprehensive overview—from fundamental principles to advanced techniques and future perspectives. This work systematically reviews multimodal approaches in optical metrology and sensing, and proposes a four-category framework consisting of interferometry, imaging, spectroscopy, and hybrid & frontier technologies. For each category, a three-dimensional analytical model—principle, scenario adaptation, and latest developments—is constructed to map the technological evolution and interdisciplinary connections. The review covers applications such as industrial inspection (e.g., nanoscale defect detection in semiconductors) and consumer electronics (e.g., AR/VR spatial tracking), with the aim of providing guidance for technology selection.
Looking ahead, optical metrology exhibits several development trends.
(1) Hardware innovation aims to realize compact, stable, and high-performance systems within physical and environmental limits. While progress in miniaturized light sources, EUV optics, and metasurface components has been remarkable, challenges remain in scaling integration without compromising stability or yield. Future work will focus on CMOS–MEMS on-chip integration, high-uniformity EUV gratings, noise-resilient quantum emitters, and reconfigurable nanophotonic devices to achieve scalable and reliable hardware foundations.
(2) Fusion of fundamental and emerging paradigms targets unified multimodal frameworks that combine interferometric, imaging, spectroscopic, and quantum modalities for multidimensional sensing. Core challenges include cross-domain calibration, temporal synchronization, and the interpretability of AI-driven fusion models. Promising directions include hyperspectral imaging with joint AI modeling, quantum-enhanced sensing, frequency-comb-based references, and wide-FOV AR/VR systems. Establishing unified standards and physically interpretable fusion models will be crucial for dependable integration.
(3) Signal processing and intelligent analysis seek to transform optical data into actionable insights through real-time, automated interpretation. Despite significant advances in AI-assisted inspection, adaptive multi-sensor fusion, and all-optical processing, issues of data reliability, interpretability, and resource efficiency persist. Future developments will emphasize physics-informed learning, edge–cloud collaboration, and co-design of optical–electronic architectures to ensure accurate, traceable, and energy-efficient measurement.
In summary, the future of optical metrology lies in the convergence of precision, intelligence, and integration. Progress in low-noise quantum sources, stable EUV systems, and intelligent data architectures will drive the transition from measurement-oriented systems to perception-oriented optical metrology capable of understanding and adapting to complex environments.

Author Contributions

Conceptualization and investigation, S.S., F.Z., Z.L., L.L. and X.L.; writing—original draft, S.S., F.Z., Z.L. and L.L.; writing—review and editing, S.S. and F.Z.; supervision, X.L.; project administration, X.L.; funding acquisition, X.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by National Natural Science Foundation of China General Project, Grant Number 62575157; Shenzhen Fundamental Research Funding Supported by Shenzhen Science and Technology Program, Grant Number JCYJ20241202125343058; Shenzhen Stable Supporting Program Supported by Shenzhen Science and Technology Program, Grant Number WDZC20231124201906001; National Natural Science Foundation of China General Project, Grant Number 62275142; Shenzhen Science and Technology Program (General Program), Grant Number JCYJ20240813112003005.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors would like to thank all laboratory members and collaborators who have contributed to the research efforts and discussions that supported the development of this work. During the preparation of this manuscript, the authors used OpenAI’s ChatGPT (GPT-5, 2025 version) for language polishing and improvement of clarity. The authors have carefully reviewed, revised, and verified all AI-generated content and take full responsibility for the final version of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zolfaghari, A.; Chen, T.; Allen, Y.Y. Additive manufacturing of precision optics at micro and nanoscale. Int. J. Extrem. Manuf. 2019, 1, 012005. [Google Scholar] [CrossRef]
  2. Qin, X.; Zhong, B.; Xu, H.; Jackman, J.A.; Xu, K.; Cho, N.J.; Lou, Z.; Wang, L. Manufacturing high-performance flexible sensors via advanced patterning techniques. Int. J. Extrem. Manuf. 2025, 7, 032003. [Google Scholar] [CrossRef]
  3. Xue, B.; Yan, H.; Liu, Z.; Yan, Y.; Geng, Y. Achieving tip-based down-milling forming of nanograting structures with variable heights through precise control of nano revolving trajectories. Int. J. Extrem. Manuf. 2025, 7, 055101. [Google Scholar] [CrossRef]
  4. Liu, X.; Huang, R.; Yu, Z.; Peng, K.; Pu, H. A high-accuracy capacitive absolute time-grating linear displacement sensor based on a multi-stage composite method. IEEE Sens. J. 2021, 21, 8969–8978. [Google Scholar] [CrossRef]
  5. Yuan, Z.; Zhou, S.; Hong, C.; Xiao, Z.; Zhang, Z.; Chen, X.; Zeng, L.; Wu, J.; Wang, Y.; Li, X. Piezo-actuated smart mechatronic systems for extreme scenarios. Int. J. Extrem. Manuf. 2024, 7, 022003. [Google Scholar] [CrossRef]
  6. Yoshizawa, T. Handbook of Optical Metrology: Principles and Applications; CRC Press: Boca Raton, FL, USA, 2009. [Google Scholar]
  7. Gao, W.; Kim, S.; Bosse, H.; Minoshima, K. Dimensional metrology based on ultrashort pulse laser and optical frequency comb. CIRP Ann. 2025, 74, 993–1018. [Google Scholar] [CrossRef]
  8. Xia, H.; Fei, Y.; Wang, Z. Research on the planar nanoscale displacement measurement system of 2-D grating. J. Hefei Univ. Technol. (Nat. Sci. Ed.) 2007, 30, 529–532. [Google Scholar]
  9. Zhang, S.; Yan, L.; Chen, B.; Xu, Z.; Xie, J. Real-time phase delay compensation of PGC demodulation in sinusoidal phase-modulation interferometer for nanometer displacement measurement. Opt. Express 2017, 25, 472–485. [Google Scholar] [CrossRef]
  10. Renishaw plc. XL-80 Laser Interferometer System. Available online: https://www.renishaw.com/en/xl-80-laser-system–8268 (accessed on 11 August 2025).
  11. Automated Precision Inc. XD Laser 5 D Laser Interferometer System. Available online: https://www.apimetrology.com/xd-laser-2020/ (accessed on 11 August 2025).
  12. Keem, T.; Gonda, S.; Misumi, I.; Huang, Q.; Kurosawa, T. Simple, real-time method for removing the cyclic error of a homodyne interferometer with a quadrature detector system. Appl. Opt. 2005, 44, 3492–3498. [Google Scholar] [CrossRef]
  13. Joo, K.N.; Ellis, J.D.; Spronck, J.W.; van Kan, P.J.; Schmidt, R.H.M. Simple heterodyne laser interferometer with subnanometer periodic errors. Opt. Lett. 2009, 34, 386–388. [Google Scholar] [CrossRef]
  14. Joo, K.N.; Ellis, J.D.; Buice, E.S.; Spronck, J.W.; Schmidt, R.H.M. High resolution heterodyne interferometer without detectable periodic nonlinearity. Opt. Express 2010, 18, 1159–1165. [Google Scholar] [CrossRef] [PubMed]
  15. Leirset, E.; Engan, H.E.; Aksnes, A. Heterodyne interferometer for absolute amplitude vibration measurements with femtometer sensitivity. Opt. Express 2013, 21, 19900–19921. [Google Scholar] [CrossRef]
  16. Yan, H.; Duan, H.Z.; Li, L.T.; Liang, Y.R.; Luo, J.; Yeh, H.C. A dual-heterodyne laser interferometer for simultaneous measurement of linear and angular displacements. Rev. Sci. Instrum. 2015, 86, 123102. [Google Scholar] [CrossRef] [PubMed]
  17. Dong Nguyen, T.; Higuchi, M.; Tung Vu, T.; Wei, D.; Aketagawa, M. 10-pm-order mechanical displacement measurements using heterodyne interferometry. Appl. Opt. 2020, 59, 8478–8485. [Google Scholar] [CrossRef]
  18. Le Floch, S.; Salvadé, Y.; Droz, N.; Mitouassiwou, R.; Favre, P. Superheterodyne configuration for two-wavelength interferometry applied to absolute distance measurement. Appl. Opt. 2010, 49, 714–717. [Google Scholar] [CrossRef]
  19. Yin, Z.; Li, F.; Sun, Y.; Zou, Y.; Wang, Y.; Yang, H.; Hu, P.; Fu, H.; Tan, J. High synchronization absolute distance measurement using a heterodyne and superheterodyne combined interferometer. Chin. Opt. Lett. 2024, 22, 011204. [Google Scholar] [CrossRef]
  20. Liu, Y.; Qu, S. Optical fiber Fabry–Perot interferometer cavity fabricated by femtosecond laser-induced water breakdown for refractive index sensing. Appl. Opt. 2014, 53, 469–474. [Google Scholar] [CrossRef]
  21. Xu, B.; Liu, Y.; Wang, D.; Li, J. Fiber Fabry–Pérot interferometer for measurement of gas pressure and temperature. J. Light. Technol. 2016, 34, 4920–4925. [Google Scholar] [CrossRef]
  22. Albert, A.; Donati, S.; Lee, S.L. Self-mixing interferometry on long distance: Theory and experimental validation. IEEE Trans. Instrum. Meas. 2024, 73, 1009808. [Google Scholar] [CrossRef]
  23. Zhang, M.; Li, J.; Xie, Z.; Xia, W.; Guo, D.; Wang, M. Laser self-mixing interferometry for dynamic displacement sensing of multiple targets. Opt. Eng. 2024, 63, 016105. [Google Scholar] [CrossRef]
  24. Herman, D.I.; Walsh, M.; Genest, J. Mode-resolved optical frequency comb fixed point localization via dual-comb interferometry. Opt. Lett. 2024, 49, 7098–7101. [Google Scholar] [CrossRef]
  25. Deng, Z.; Liu, Y.; Zhu, Z.; Luo, D.; Gu, C.; Zuo, Z.; Xie, G.; Li, W. Achieving precise spectral analysis and imaging simultaneously with a mode-resolved dual-comb interferometer. Sensors 2021, 21, 3166. [Google Scholar] [CrossRef] [PubMed]
  26. Zhang, S.; Jin, H.; Zhang, L.; Yan, L. Absolute distance measurement using frequency sweeping interferometry with large swept range and target drift compensation. Meas. Sci. Technol. 2023, 34, 085014. [Google Scholar] [CrossRef]
  27. Coggrave, C.; Ruiz, P.; Huntley, J.; Nolan, C.; Gribble, A.; Du, H.; Banakar, M.; Yan, X.; Tran, D.; Littlejohns, C. Adaptive delay lines implemented on a photonics chip for extended-range, high-speed absolute distance measurement. In Proceedings of the Emerging Applications in Silicon Photonics III, Birmingham, UK, 6–9 December 2022; SPIE: Bellingham, WA, USA, 2023; Volume 12334, pp. 22–27. [Google Scholar]
  28. Lawall, J. Interferometry for accurate displacement metrology. Opt. Photonics News 2004, 15, 40–45. [Google Scholar] [CrossRef]
  29. Zeng, Z.; Qu, X.; Tan, Y.; Tan, R.; Zhang, S. High-accuracy self-mixing interferometer based on single high-order orthogonally polarized feedback effects. Opt. Express 2015, 23, 16977–16983. [Google Scholar] [CrossRef] [PubMed]
  30. Yu, H.; Chen, X.; Liu, C.; Cai, G.; Wang, W. A survey on the grating based optical position encoder. Opt. Laser Technol. 2021, 143, 107352. [Google Scholar] [CrossRef]
  31. Li, X.; Shi, Y.; Wang, P.; Ni, K.; Zhou, Q.; Wang, X. A Compact Design of Optical Scheme for a Two-Probe Absolute Surface Encoders. In Proceedings of the Tenth International Symposium on Precision Engineering Measurements and Instrumentation, Kunming, China, 8–10 August 2018; p. 155. [Google Scholar] [CrossRef]
  32. Li, X.; Xiao, X.; Lu, H.; Ni, K.; Zhou, Q.; Wang, X. Design and Testing of a Compact Optical Prism Module for Multi-Degree-of Freedom Grating Interferometry Application. In Proceedings of the Tenth International Symposium on Precision Engineering Measurements and Instrumentation, Kunming, China, 8–10 August 2018; p. 165. [Google Scholar] [CrossRef]
  33. Liao, B.; Wang, S.; Lin, J.; Dou, Y.; Wang, X.; Li, X. A Research on Compact Short-Distance Grating Interferometer Based on Ridge Prism. In Proceedings of the 2021 International Conference on Optical Instruments and Technology: Optoelectronic Measurement Technology and Systems, Online Only, China, 8–10 April 2022; p. 95. [Google Scholar] [CrossRef]
  34. Luo, L.; Gao, L.; Wang, S.; Deng, F.; Wu, Y.; Li, X. An Ultra-Precision Error Estimation for a Multi-Axes Grating Encoder Using Quadrant Photodetectors. In Proceedings of the Optical Metrology and Inspection for Industrial Applications IX, Online Only, China, 5–11 December 2022; p. 12. [Google Scholar] [CrossRef]
  35. Zhu, J.; Wang, G.; Xue, G.; Zhou, Q.; Li, X. Heterodyne Three-Degree-of-Freedom Grating Interferometer for Ultra-Precision Positioning of Lithography Machine. In Proceedings of the 2021 International Conference on Optical Instruments and Technology: Optoelectronic Measurement Technology and Systems, Online Only, China, 8–10 April 2022; p. 113. [Google Scholar] [CrossRef]
  36. Zhu, J.; Wang, S.; Li, X. Ultraprecision Grating Positioning Technology for Wafer Stage of Lithography Machine. Laser Optoelectron. Prog. 2022, 59, 0922019. (In Chinese) [Google Scholar]
  37. Li, Y.; Yang, L.; Wang, X.; Shan, S.; Deng, F.; He, Z.; Liu, Z.; Li, X. Overlay Metrology for Lithography Machine. Laser Optoelectron. Prog. 2022, 59, 0922023. (In Chinese) [Google Scholar]
  38. Zhong, Z.; Li, J.; Lu, T.; Li, X. High dynamic wavefront stability control for high-uniformity periodic microstructure fabrication. Precis. Eng. 2025, 93, 216–223. [Google Scholar] [CrossRef]
  39. Shao, C.; Li, X. Technologies for Fabricating Large-Size Diffraction Gratings. Sensors 2025, 25, 1990. [Google Scholar] [CrossRef]
  40. Gao, X.; Li, J.; Zhong, Z.; Li, X. Global alignment reference strategy for laser interference lithography pattern arrays. Microsyst. Nanoeng. 2025, 11, 41. [Google Scholar] [CrossRef] [PubMed]
  41. Cui, C.; Li, X.; Wang, X. Grating interferometer: The dominant positioning strategy in atomic and close-to-atomic scale manufacturing. J. Manuf. Syst. 2025, 82, 1227–1251. [Google Scholar]
  42. Huang, G.; Cui, C.; Lei, X.; Li, Q.; Yan, S.; Li, X.; Wang, G. A Review of Optical Interferometry for High-Precision Length Measurement. Micromachines 2024, 16, 6. [Google Scholar] [CrossRef] [PubMed]
  43. Luo, L.; Shan, S.; Li, X. A review: Laser interference lithography for diffraction gratings and their applications in encoders and spectrometers. Sensors 2024, 24, 6617. [Google Scholar] [CrossRef] [PubMed]
  44. Zhao, M.; Yuan, Y.; Luo, L.; Li, X. A Review: Absolute Linear Encoder Measurement Technology. Sensors 2025, 25, 5997. [Google Scholar] [CrossRef]
  45. Luo, L.; Zhao, M.; Li, X. A Review: Grating Encoder Technologies for Multi-Degree-of-Freedom Spatial Measurement. Sensors 2025, 25, 6071. [Google Scholar]
  46. Shi, Y.; Zhou, Q.; Li, X.; Ni, K.; Wang, X. Design and Testing of a Linear Encoder Capable of Measuring Absolute Distance. Sens. Actuators A Phys. 2020, 308, 111935. [Google Scholar] [CrossRef]
  47. Li, X.; Xiao, X.; Ni, K.; Zhou, Q.; Wang, H.; Wang, X. A Precise Reference Position Detection Method for Linear Encoders by Using a Coherence Function Algorithm. In Proceedings of the SPIE/COS Photonics Asia, Beijing, China, 12–14 October 2016; p. 100231D. [Google Scholar] [CrossRef]
  48. Lin, J.; Guan, J.; Wen, F.; Tan, J. High-resolution and wide range displacement measurement based on planar grating. Opt. Commun. 2017, 404, 132–138. [Google Scholar] [CrossRef]
  49. Ye, G.; Liu, H.; Wang, Y.; Lei, B.; Shi, Y.; Yin, L.; Lu, B. Ratiometric-linearization-based high-precision electronic interpolator for sinusoidal optical encoders. IEEE Trans. Ind. Electron. 2018, 65, 8224–8231. [Google Scholar] [CrossRef]
  50. Hsieh, H.; Lee, J.; Wu, W.; Chen, J.; Deturche, R.; Lerondel, G. Quasi-common-optical-path heterodyne grating interferometer for displacement measurement. Meas. Sci. Technol. 2010, 21, 115304. [Google Scholar] [CrossRef]
  51. Wu, C.C.; Hsu, C.C.; Lee, J.Y.; Chen, Y.Z. Heterodyne common-path grating interferometer with Littrow configuration. Opt. Express 2013, 21, 13322–13332. [Google Scholar] [CrossRef]
  52. Kimura, A.; Gao, W.; Arai, Y.; Lijiang, Z. Design and construction of a two-degree-of-freedom linear encoder for nanometric measurement of stage position and straightness. Precis. Eng. 2010, 34, 145–155. [Google Scholar] [CrossRef]
  53. Kimura, A.; Gao, W.; Lijiang, Z. Position and out-of-straightness measurement of a precision linear air-bearing stage by using a two-degree-of-freedom linear encoder. Meas. Sci. Technol. 2010, 21, 054005. [Google Scholar] [CrossRef]
  54. Kimura, A.; Hosono, K.; Kim, W.; Shimizu, Y.; Gao, W.; Zeng, L. A two-degree-of-freedom linear encoder with a mosaic scale grating. Int. J. Nanomanuf. 2011, 7, 73–91. [Google Scholar] [CrossRef]
  55. Yin, Y.; Liu, Z.; Jiang, S.; Wang, W.; Yu, H.; Jiri, G.; Hao, Q.; Li, W. High-precision 2D grating displacement measurement system based on double-spatial heterodyne optical path interleaving. Opt. Lasers Eng. 2022, 158, 107167. [Google Scholar] [CrossRef]
  56. Yang, H.; Yin, Z.; Yang, R.; Hu, P.; Li, J.; Tan, J. Design for a highly stable laser source based on the error model of high-speed high-resolution heterodyne interferometers. Sensors 2020, 20, 1083. [Google Scholar] [CrossRef]
  57. Li, X.; Gao, W.; Muto, H.; Shimizu, Y.; Ito, S.; Dian, S. A Six-Degree-of-Freedom Surface Encoder for Precision Positioning of a Planar Motion Stage. Precis. Eng. 2013, 37, 771–781. [Google Scholar] [CrossRef]
  58. Gao, Z.; Hu, J.; Zhu, Y.; Duan, G. A new 6-degree-of-freedom measurement method of X-Y stages based on additional information. Precis. Eng. 2013, 37, 606–620. [Google Scholar] [CrossRef]
  59. Zhang, W.; Yang, Y.; Zhang, Y.; Wu, P.; Xiong, X.; Du, H. A rotational angle detection method for reflective gratings based on interferometry. Laser Optoelectron. Prog. 2019, 56, 110501. (In Chinese) [Google Scholar] [CrossRef]
  60. Wang, S.; Zhu, J.; Shi, N.; Luo, L.; Wen, Y.; Li, X. Modeling and Test of an Absolute Four-Degree-of-Freedom (DOF) Grating Encoder. In Proceedings of the Optical Metrology and Inspection for Industrial Applications IX, Online Only, China, 5–11 December 2022; p. 15. [Google Scholar] [CrossRef]
  61. Zhu, J.; Wang, G.; Wang, S.; Li, X. A Reflective-Type Heterodyne Grating Interferometer for Three-Degree-of-Freedom Subnanometer Measurement. IEEE Trans. Instrum. Meas. 2022, 71, 7007509. [Google Scholar] [CrossRef]
  62. Wang, S.; Liao, B.; Shi, N.; Li, X. A Compact and High-Precision Three-Degree-of-Freedom Grating Encoder Based on a Quadrangular Frustum Pyramid Prism. Sensors 2023, 23, 4022. [Google Scholar] [CrossRef]
  63. Hsieh, H.L.; Pan, S.W. Development of a grating-based interferometer for six-degree-of-freedom displacement and angle measurements. Opt. Express 2015, 23, 2451–2465. [Google Scholar] [CrossRef]
  64. Cui, C.; Gao, L.; Zhao, P.; Yang, M.; Liu, L.; Ma, Y.; Huang, G.; Wang, S.; Luo, L.; Li, X. Towards multi-dimensional atomic-level measurement: Integrated heterodyne grating interferometer with zero dead-zone. Light Adv. Manuf. 2025, 6, 40. [Google Scholar]
  65. Wang, S.; Luo, L.; Li, X. Design and parameter optimization of zero position code considering diffraction based on deep learning generative adversarial networks. Nanomanuf. Metrol. 2024, 7, 2. [Google Scholar] [CrossRef]
  66. Yu, K.; Zhu, J.; Yuan, W.; Zhou, Q.; Xue, G.; Wu, G.; Wang, X.; Li, X. Two-Channel Six Degrees of Freedom Grating-Encoder for Precision-Positioning of Sub-Components in Synthetic-Aperture Optics. Opt. Express 2021, 29, 21113. [Google Scholar] [CrossRef] [PubMed]
  67. Wang, G.; Gao, L.; Huang, G.; Lei, X.; Cui, C.; Wang, S.; Yang, M.; Zhu, J.; Yan, S.; Li, X. A Wavelength-Stabilized and Quasi-Common-Path Heterodyne Grating Interferometer With Sub-Nanometer Precision. IEEE Trans. Instrum. Meas. 2024, 73, 7002509. [Google Scholar] [CrossRef]
  68. Ni, K.; Wang, H.; Li, X.; Wang, X.; Xiao, X.; Zhou, Q. Measurement Uncertainty Evaluation of the Three Degree of Freedom Surface Encoder. In Proceedings of the SPIE/COS Photonics Asia, Beijing, China, 12–14 October 2016; p. 100230Z. [Google Scholar] [CrossRef]
  69. Ye, W.; Zhang, M.; Zhu, Y.; Wang, L.; Hu, J.; Li, X.; Hu, C. Ultraprecision real-time displacements calculation algorithm for the grating interferometer system. Sensors 2019, 19, 2409. [Google Scholar] [CrossRef]
  70. Kang, H.J.; Chun, B.J.; Jang, Y.S.; Kim, Y.J.; Kim, S.W. Real-time compensation of the refractive index of air in distance measurement. Opt. Express 2015, 23, 26377–26385. [Google Scholar] [CrossRef]
  71. Liu, H.; Xiang, H.; Chen, J.; Yang, R. Measurement and compensation of machine tool geometry error based on Abbe principle. Int. J. Adv. Manuf. Technol. 2018, 98, 2769–2774. [Google Scholar] [CrossRef]
  72. Chen, G.; Zhang, L.; Wang, X.; Wang, C.; Xiang, H.; Tong, G.; Zhao, D. Modeling method of CNC tooling volumetric error under consideration of Abbé error. Int. J. Adv. Manuf. Technol. 2022, 119, 7875–7887. [Google Scholar] [CrossRef]
  73. Li, X.; Su, X.; Zhou, Q.; Ni, K.; Wang, X. A Real-Time Distance Measurement Data Processing Platform for Multi-Axis Grating Interferometry Type Optical Encoders. In Proceedings of the Tenth International Symposium on Precision Engineering Measurements and Instrumentation, Kunming, China, 8–10 August 2018; p. 170. [Google Scholar] [CrossRef]
  74. Shimizu, Y.; Ito, T.; Li, X.; Kim, W.; Gao, W. Design and Testing of a Four-Probe Optical Sensor Head for Three-Axis Surface Encoder with a Mosaic Scale Grating. Meas. Sci. Technol. 2014, 25, 094002. [Google Scholar] [CrossRef]
  75. Han, Y.; Ni, K.; Li, X.; Wu, G.; Yu, K.; Zhou, Q.; Wang, X. An fpga platform for next-generation grating encoders. Sensors 2020, 20, 2266. [Google Scholar] [CrossRef] [PubMed]
  76. Li, X.; Shimizu, Y.; Ito, T.; Cai, Y.; Ito, S.; Gao, W. Measurement of Six-Degree-of-Freedom Planar Motions by Using a Multiprobe Surface Encoder. Opt. Eng. 2014, 53, 122405. [Google Scholar] [CrossRef]
  77. Shimizu, Y.; Chen, L.C.; Kim, D.W.; Chen, X.; Li, X.; Matsukuma, H. An Insight on Optical Metrology in Manufacturing. Meas. Sci. Technol. 2020, 32, 042003. [Google Scholar] [CrossRef]
  78. Wang, S.; Luo, L.; Zhu, J.; Shi, N.; Li, X. An Ultra-Precision Absolute-Type Multi-Degree-of-Freedom Grating Encoder. Sensors 2022, 22, 9047. [Google Scholar] [CrossRef] [PubMed]
  79. Li, X.; Wang, H.; Ni, K.; Zhou, Q.; Mao, X.; Zeng, L.; Wang, X.; Xiao, X. Two-Probe Optical Encoder for Absolute Positioning of Precision Stages by Using an Improved Scale Grating. Opt. Express 2016, 24, 21378. [Google Scholar] [CrossRef]
  80. Li, X.; Shi, Y.; Xiao, X.; Zhou, Q.; Wu, G.; Lu, H.; Ni, K. Design and Testing of a Compact Optical Prism Module for Multi-Degree-of-Freedom Grating Interferometry Application. Appl. Sci. 2018, 8, 2495. [Google Scholar] [CrossRef]
  81. Matsukuma, H.; Ishizuka, R.; Furuta, M.; Li, X.; Shimizu, Y.; Gao, W. Reduction in Cross-Talk Errors in a Six-Degree-of-Freedom Surface Encoder. Nanomanuf. Metrol. 2019, 2, 111–123. [Google Scholar] [CrossRef]
  82. Shi, Y.; Ni, K.; Li, X.; Zhou, Q.; Wang, X. Highly Accurate, Absolute Optical Encoder Using a Hybrid-Positioning Method. Opt. Lett. 2019, 44, 5258. [Google Scholar] [CrossRef]
  83. Wang, S.; Luo, L.; Gao, L.; Ma, R.; Wang, X.; Li, X. Long Binary Coding Design for Absolute Positioning Using Genetic Algorithm. In Proceedings of the Optical Metrology and Inspection for Industrial Applications X, Beijing, China, 15–16 October 2023; p. 4. [Google Scholar] [CrossRef]
  84. Zhu, J.; Yu, K.; Xue, G.; Zhou, Q.; Wang, X.; Li, X. An Improved Signal Filtering Strategy Based on EMD Algorithm for Ultrahigh Precision Grating Encoder. In Proceedings of the Real-Time Photonic Measurements, Data Management, and Processing VI, Nantong, China, 10–12 October 2021; p. 51. [Google Scholar] [CrossRef]
  85. Zhu, J.; Yu, K.; Xue, G.; Shi, N.; Zhou, Q.; Wang, X.; Li, X. A Simplified Two-Phase Differential Decoding Algorithm for High Precision Grating Encoder. In Proceedings of the Optical Metrology and Inspection for Industrial Applications VIII, Nantong, China, 10–19 October 2021; p. 14. [Google Scholar] [CrossRef]
  86. Li, J.; Wang, S.; Li, X. Cross-scale structures fabrication via hybrid lithography for nanolevel positioning. Microsyst. Nanoeng. 2025, 11, 163. [Google Scholar] [CrossRef]
  87. Li, X.; Zhou, Q.; Zhu, X.; Lu, H.; Yang, L.; Ma, D.; Sun, J.; Ni, K.; Wang, X. Holographic Fabrication of an Arrayed One-Axis Scale Grating for a Two-Probe Optical Linear Encoder. Opt. Express 2017, 25, 16028. [Google Scholar] [CrossRef]
  88. Xue, G.; Lu, H.; Li, X.; Zhou, Q.; Wu, G.; Wang, X.; Zhai, Q.; Ni, K. Patterning Nanoscale Crossed Grating with High Uniformity by Using Two-Axis Lloyd’s Mirrors Based Interference Lithography. Opt. Express 2020, 28, 2179. [Google Scholar] [CrossRef]
  89. Li, X. A Two-Axis Lloyd’s Mirror Interferometer for Fabrication of Two-Dimensional Diffraction Gratings. CIRP Ann. 2014, 63, 461–464. [Google Scholar] [CrossRef]
  90. Li, X.; Lu, H.; Zhou, Q.; Wu, G.; Ni, K.; Wang, X. An Orthogonal Type Two-Axis Lloyd’s Mirror for Holographic Fabrication of Two-Dimensional Planar Scale Gratings with Large Area. Appl. Sci. 2018, 8, 2283. [Google Scholar] [CrossRef]
  91. Xue, G.; Zhai, Q.; Lu, H.; Zhou, Q.; Ni, K.; Lin, L.; Wang, X.; Li, X. Polarized holographic lithography system for high-uniformity microscale patterning with periodic tunability. Microsyst. Nanoeng. 2021, 7, 31. [Google Scholar] [CrossRef]
  92. Fan, Y.; Wang, C.; Sun, J.; Peng, X.; Tian, H.; Li, X.; Chen, X.; Chen, X.; Shao, J. Electric-driven flexible-roller nanoimprint lithography on the stress-sensitive warped wafer. Int. J. Extrem. Manuf. 2023, 5, 035101. [Google Scholar]
  93. Wang, Y.; Zhao, F.; Luo, L.; Li, X. A Review on Recent Advances in Signal Processing in Interferometry. Sensors 2025, 25, 5013. [Google Scholar] [CrossRef] [PubMed]
  94. Li, W.; Wang, X.; Bayanheshig; Liu, Z.; Wang, W.; Jiang, S.; Li, Y.; Li, S.; Zhang, W.; Jiang, Y.; et al. Controlling the wavefront aberration of a large-aperture and high-precision holographic diffraction grating. Light Sci. Appl. 2025, 14, 112. [Google Scholar] [CrossRef]
  95. Chang, D.; Sun, Y.; Wang, J.; Yin, Z.; Hu, P.; Tan, J. Multiple-beam grating interferometry and its general Airy formulae. Opt. Lasers Eng. 2023, 164, 107534. [Google Scholar] [CrossRef]
  96. Li, M.; Xiang, X.; Zhou, C.; Wei, C.; Jia, W.; Xiang, C.; Lu, Y.; Zhu, S. Two-dimensional grating fabrication based on ultra-precision laser direct writing system. Acta Opt. Sin. 2019, 39, 0905001. [Google Scholar]
  97. Chai, Y.; Li, F.; Wang, J.; Karvinen, P.; Kuittinen, M.; Kang, G. Enhanced sensing performance from trapezoidal metallic gratings fabricated by laser interference lithography. Opt. Lett. 2022, 47, 1009–1012. [Google Scholar] [CrossRef]
  98. Cunbao, L.; Shuhua, Y.; Fusheng, Y.; Zhiguang, D. Synthetical modeling and experimental study of fabrication and assembly errors of two-dimensional gratings. Infrared Laser Eng. 2016, 45, 1217005. [Google Scholar] [CrossRef]
  99. Deng, X.; Li, T.; Cheng, X. Self-traceable grating reference material and application. Opt. Precis. Eng. 2022, 30, 2608–2625. [Google Scholar] [CrossRef]
  100. Maksymov, I.S.; Huy Nguyen, B.Q.; Pototsky, A.; Suslov, S. Acoustic, phononic, Brillouin light scattering and Faraday wave-based frequency combs: Physical foundations and applications. Sensors 2022, 22, 3921. [Google Scholar] [CrossRef]
  101. Wang, G.; Tan, L.; Yan, S. Real-time and meter-scale absolute distance measurement by frequency-comb-referenced multi-wavelength interferometry. Sensors 2018, 18, 500. [Google Scholar] [CrossRef] [PubMed]
  102. Coluccelli, N.; Cassinerio, M.; Redding, B.; Cao, H.; Laporta, P.; Galzerano, G. The optical frequency comb fibre spectrometer. Nat. Commun. 2016, 7, 12995. [Google Scholar] [CrossRef]
  103. Fortier, T.; Baumann, E. 20 years of developments in optical frequency comb technology and applications. Commun. Phys. 2019, 2, 153. [Google Scholar] [CrossRef]
  104. Yu, H.; Ni, K.; Zhou, Q.; Li, X.; Wang, X.; Wu, G. Digital Error Correction of Dual-Comb Interferometer without External Optical Referencing Information. Opt. Express 2019, 27, 29425. [Google Scholar] [CrossRef] [PubMed]
  105. Matsukuma, H.; Sato, R.; Shimizu, Y.; Gao, W. Measurement range expansion of chromatic confocal probe with supercontinuum light source. Int. J. Autom. Technol. 2021, 15, 529–536. [Google Scholar] [CrossRef]
  106. Coddington, I.; Swann, W.C.; Newbury, N.R. Coherent multiheterodyne spectroscopy using stabilized optical frequency combs. Phys. Rev. Lett. 2008, 100, 013902. [Google Scholar] [CrossRef]
  107. Kubota, T.; Nara, M.; Yoshino, T. Interferometer for measuring displacement and distance. Opt. Lett. 1987, 12, 310–312. [Google Scholar] [CrossRef]
  108. Jang, Y.S.; Kim, S.W. Compensation of the refractive index of air in laser interferometer for distance measurement: A review. Int. J. Precis. Eng. Manuf. 2017, 18, 1881–1890. [Google Scholar] [CrossRef]
  109. Kajima, M.; Minoshima, K. Optical zooming interferometer for subnanometer positioning using an optical frequency comb. Appl. Opt. 2010, 49, 5844–5850. [Google Scholar] [CrossRef]
  110. Hofer, M.; Ober, M.; Haberl, F.; Fermann, M. Characterization of ultrashort pulse formation in passively mode-locked fiber lasers. IEEE J. Quantum Electron. 1992, 28, 720–728. [Google Scholar] [CrossRef]
  111. Fizeau, M. Sur les hypothèses relatives à l’éther lumineux, et sur une expérience qui paraît démontrer que le mouvement des corps change la vitesse avec laquelle la lumière se propage dans leur intérieur. SPIE Milest. Ser. 1991, 28, 445–449. [Google Scholar]
  112. Schober, C.; Beisswanger, R.; Gronle, A.; Pruss, C.; Osten, W. Tilted Wave Fizeau Interferometer for flexible and robust asphere and freeform testing. Light Adv. Manuf. 2022, 3, 687–698. [Google Scholar] [CrossRef]
  113. Sommargren, G.E. Interferometric Wavefront Measurement. U.S. Patent 4,594,003, 10 June 1986. [Google Scholar]
  114. Aikens, D.M.; Roussel, A.; Bray, M. Derivation of preliminary specifications for transmitted wavefront and surface roughness for large optics used in inertial confinement fusion. In Proceedings of the Solid State Lasers for Application to Inertial Confinement Fusion (ICF), Monterey, CA, USA, 30 May–2 June 1995; SPIE: Bellingham, WA, USA, 1995; Volume 2633, pp. 350–360. [Google Scholar]
  115. Deck, L.L.; Soobitsky, J.A. Phase-shifting via wavelength tuning in very large aperture interferometers. In Proceedings of the Optical Manufacturing and Testing III, Denver, CO, USA, 20–23 July 1999; SPIE: Bellingham, WA, USA, 1999; Volume 3782, pp. 432–442. [Google Scholar]
  116. Xu, T.; Wang, Z.; Jia, Z.; Chen, J.; Feng, Z. A dual-stage correction approach for high-precision phase-shifter in Fizeau interferometers. Opt. Lasers Eng. 2024, 178, 108205. [Google Scholar] [CrossRef]
  117. Morrow, K.; da Silva, M.B.; Alcock, S. Correcting retrace and system imaging errors to achieve nanometer accuracy in full aperture, single-shot Fizeau interferometry. Opt. Express 2023, 31, 27654–27666. [Google Scholar] [CrossRef] [PubMed]
  118. Li, X.; Peng, S.; Xu, Z.; Bai, J.; Wu, L.; Shen, Y.; Liu, D. Breaking the mid-spatial-frequency noise floor to sub-nanometer in Fizeau interferometry via anisotropic spatial-coherence engineering. Opt. Lett. 2025, 50, 4410–4413. [Google Scholar] [CrossRef] [PubMed]
  119. Lu, H.; Huang, X.; Guo, C.; Xu, J.; Xu, J.; Hao, H.; Zhao, H.; Tang, W.; Wang, P.; Li, H. Orbital-angular-momentum beams-based Fizeau interferometer using the advanced azimuthal-phase-demodulation method. Appl. Phys. Lett. 2022, 121, 241102. [Google Scholar] [CrossRef]
  120. Lu, H.; Guo, C.; Huang, X.; Zhao, H.; Tang, W. Orbital angular momentum-based Fizeau interferometer measurement system. In Proceedings of the Optical Design and Testing XII, Online Only, 5–11 December 2022; SPIE: Bellingham, WA, USA, 2022; Volume 12315, pp. 342–346. [Google Scholar]
  121. Kühnel, M.; Langlotz, E.; Rahneberg, I.; Dontsov, D.; Probst, J.; Krist, T.; Braig, C.; Erko, A. Interferometrical profilometer for high precision 3D measurements of free-form optics topography with large local slopes. In Proceedings of the Eighth European Seminar on Precision Optics Manufacturing, Teisnach, Germany, 13–14 April 2021; SPIE: Bellingham, WA, USA, 2021; Volume 11853, pp. 51–58. [Google Scholar]
  122. Da Silva, M.B.; Alcock, S.G.; Nistea, I.T.; Sawhney, K. A Fizeau interferometry stitching system to characterize X-ray mirrors with sub-nanometre errors. Opt. Lasers Eng. 2023, 161, 107192. [Google Scholar] [CrossRef]
  123. Zygo Corporation. MST Series Optical Profilers; Zygo Corporation, 2023. Product Brochure. Available online: https://www.zygo.com/products/metrology-systems/laser-interferometers/verifire-mst (accessed on 11 August 2025).
  124. Taylor Hobson Ltd. Talysurf PGI Optics Surface Profiling System; Taylor Hobson Ltd., 2023. Product Datasheet. Available online: https://www.taylor-hobson.com/products/surface-analysis/talysurf-pgi-optics (accessed on 11 August 2025).
  125. Kreis, T. Handbook of Holographic Interferometry: Optical and Digital Methods; John Wiley & Sons: Hoboken, NJ, USA, 2006. [Google Scholar]
  126. Stetson, K.A. The discovery of holographic interferometry, its development and applications. Light Adv. Manuf. 2022, 3, 349–357. [Google Scholar] [CrossRef]
  127. Osten, W.; Faridian, A.; Gao, P.; Körner, K.; Naik, D.; Pedrini, G.; Singh, A.K.; Takeda, M.; Wilke, M. Recent advances in digital holography. Appl. Opt. 2014, 53, G44–G63. [Google Scholar] [CrossRef]
  128. Gao, Y.; Cao, L. Generalized optimization framework for pixel super-resolution imaging in digital holography. Opt. Express 2021, 29, 28805–28823. [Google Scholar] [CrossRef] [PubMed]
  129. Singh, M.; Khare, K. Single-shot full resolution region-of-interest (ROI) reconstruction in image plane digital holographic microscopy. J. Mod. Opt. 2018, 65, 1127–1134. [Google Scholar] [CrossRef]
  130. Min, J.; Yao, B.; Gao, P.; Guo, R.; Ma, B.; Zheng, J.; Lei, M.; Yan, S.; Dan, D.; Duan, T.; et al. Dual-wavelength slightly off-axis digital holographic microscopy. Appl. Opt. 2012, 51, 191–196. [Google Scholar] [CrossRef]
  131. Di, J.; Song, Y.; Xi, T.; Zhang, J.; Li, Y.; Ma, C.; Wang, K.; Zhao, J. Dual-wavelength common-path digital holographic microscopy for quantitative phase imaging of biological cells. Opt. Eng. 2017, 56, 111712. [Google Scholar] [CrossRef]
  132. Morimoto, Y.; Matui, T.; Fujigaki, M.; Kawagishi, N. Subnanometer displacement measurement by averaging of phase difference in windowed digital holographic interferometry. Opt. Eng. 2007, 46, 025603. [Google Scholar] [CrossRef]
  133. Zaidel’, A.N. Application of holographic interferometry for plasma diagnostics. Usp. Fiz. Nauk 1986, 149, 105–138. [Google Scholar]
  134. Martínez-León, L.; Clemente, P.; Mori, Y.; Climent, V.; Lancis, J.; Tajahuerce, E. Single-pixel digital holography with phase-encoded illumination. Opt. Express 2017, 25, 4975–4984. [Google Scholar] [CrossRef]
  135. Asundi, A.; Singh, V.R. Amplitude and phase analysis in digital dynamic holography. Opt. Lett. 2006, 31, 2420–2422. [Google Scholar] [CrossRef] [PubMed]
  136. Shimobaba, T.; Takahashi, T.; Yamamoto, Y.; Endo, Y.; Shiraki, A.; Nishitsuji, T.; Hoshikawa, N.; Kakue, T.; Ito, T. Digital holographic particle volume reconstruction using a deep neural network. Appl. Opt. 2019, 58, 1900–1906. [Google Scholar] [CrossRef]
  137. Ren, Z.; Xu, Z.; Lam, E.Y. End-to-end deep learning framework for digital holographic reconstruction. Adv. Photonics 2019, 1, 016004. [Google Scholar] [CrossRef]
  138. Schretter, C.; Blinder, D.; Bettens, S.; Ottevaere, H.; Schelkens, P. Regularized non-convex image reconstruction in digital holographic microscopy. Opt. Express 2017, 25, 16491–16508. [Google Scholar] [CrossRef]
  139. Bettens, S.; Yan, H.; Blinder, D.; Ottevaere, H.; Schretter, C.; Schelkens, P. Studies on the sparsifying operator in compressive digital holography. Opt. Express 2017, 25, 18656–18676. [Google Scholar] [CrossRef] [PubMed]
  140. Park, Y.; Depeursinge, C.; Popescu, G. Quantitative phase imaging in biomedicine. Nat. Photonics 2018, 12, 578–589. [Google Scholar] [CrossRef]
  141. Lee, A.J.; Hugonnet, H.; Park, W.; Park, Y. Three-dimensional label-free imaging and quantification of migrating cells during wound healing. Biomed. Opt. Express 2020, 11, 6812–6824. [Google Scholar] [CrossRef]
  142. Sun, B.; Ahmed, A.; Atkinson, C.; Soria, J. A novel 4D digital holographic PIV/PTV (4D-DHPIV/PTV) methodology using iterative predictive inverse reconstruction. Meas. Sci. Technol. 2020, 31, 104002. [Google Scholar] [CrossRef]
  143. Li, J.; Zhou, Q.; Li, X.; Chen, R.; Ni, K. An Improved Low-Noise Processing Methodology Combined with PCL for Industry Inspection Based on Laser Line Scanner. Sensors 2019, 19, 3398. [Google Scholar] [CrossRef]
  144. Ding, D.; Zhao, Z.; Zhang, X.; Fu, Y.; Xu, J. Evaluation and compensation of laser-based on-machine measurement for inclined and curved profiles. Measurement 2020, 151, 107236. [Google Scholar] [CrossRef]
  145. Ibaraki, S.; Kitagawa, Y.; Kimura, Y.; Nishikawa, S. On the limitation of dual-view triangulation in reducing the measurement error induced by the speckle noise in scanning operations. Int. J. Adv. Manuf. Technol. 2017, 88, 731–737. [Google Scholar] [CrossRef]
  146. Chen, R.; Li, Y.; Xue, G.; Tao, Y.; Li, X. Laser Triangulation Measurement System with Scheimpflug Calibration Based on the Monte Carlo Optimization Strategy. Opt. Express 2022, 30, 25290. [Google Scholar] [CrossRef] [PubMed]
  147. Hao, C.; Jigui, Z.; Bin, X. Impact of rough surface scattering characteristics to measurement accuracy of laser displacement sensor based on position sensitive detector. Chin. J. Lasers 2013, 40, 0808003. [Google Scholar] [CrossRef]
  148. Wang, L.; Feng, Q.; Li, J. Tilt error analysis for laser triangulation sensor based on ZEMAX. In Proceedings of the Optical Sensing and Imaging Technologies and Applications, Beijing, China, 22–24 May 2018; SPIE: Bellingham, WA, USA, 2018; Volume 10846, pp. 161–168. [Google Scholar]
  149. Li, S.; Yang, Y.; Jia, X.; Chen, M. The impact and compensation of tilt factors upon the surface measurement error. Optik 2016, 127, 7367–7373. [Google Scholar] [CrossRef]
  150. Yu, P. The Analysis of Error and Study on Improvement Measures in the Detection of Displacement by Laser Triangulation; Changchun University of Technology: Changchun, China, 2013. [Google Scholar]
  151. Wei, J.; He, Y.; Wang, F.; He, Y.; Rong, X.; Chen, M.; Wang, Y.; Yue, H.; Liu, J. Convolutional neural network assisted infrared imaging technology: An enhanced online processing state monitoring method for laser powder bed fusion. Infrared Phys. Technol. 2023, 131, 104661. [Google Scholar] [CrossRef]
  152. Tsagaris, A.; Mansour, G. Path planning optimization for mechatronic systems with the use of genetic algorithm and ant colony. IOP Conf. Ser. Mater. Sci. Eng. 2019, 564, 012051. [Google Scholar] [CrossRef]
  153. Xie, G.; Du, X.; Li, S.; Yang, J.; Hei, X.; Wen, T. An efficient and global interactive optimization methodology for path planning with multiple routing constraints. ISA Trans. 2022, 121, 206–216. [Google Scholar] [CrossRef]
  154. Hui, D.; Li, D.; Wang, B.; Li, Y.; Ding, J.; Zhang, L.; Qiao, D. A MEMS grating modulator with a tunable sinusoidal grating for large-scale extendable apertures. Microsyst. Nanoeng. 2025, 11, 39. [Google Scholar] [CrossRef]
  155. Piron, F.; Morrison, D.; Yuce, M.R.; Redouté, J.M. A review of single-photon avalanche diode time-of-flight imaging sensor arrays. IEEE Sens. J. 2021, 21, 12654–12666. [Google Scholar] [CrossRef]
  156. Hsu, T.H.; Liu, C.H.; Lin, T.C.; Sang, T.H.; Tsai, C.M.; Lin, G.; Lin, S.D. High-precision pulsed laser ranging using CMOS single-photon avalanche diodes. Opt. Laser Technol. 2024, 176, 110921. [Google Scholar] [CrossRef]
  157. Zou, C.; Ou, Y.; Zhu, Y.; Martins, R.P.; Chan, C.H.; Zhang, M. A 256 × 192-Pixel Direct Time-of-Flight LiDAR Receiver With a Current-Integrating-Based AFE Supporting 240-m-Range Imaging. IEEE J. Solid-State Circuits 2024, 59, 3525–3537. [Google Scholar] [CrossRef]
  158. Dabidian, S.; Jami, S.T.; Kavehvash, Z.; Fotowat-Ahmady, A. Direct time-of-flight (d-ToF) pulsed LiDAR sensor with simultaneous noise and interference suppression. IEEE Sens. J. 2024, 24, 27578–27586. [Google Scholar] [CrossRef]
  159. Li, D.; Li, P.; Hu, J.; Wang, X.; Ma, R.; Zhu, Z. A SPAD-Based Hybrid Time-of-Flight Image Sensor With PWM and Non-Linear Spatiotemporal Coincidence. IEEE Trans. Circuits Syst. I Regul. Pap. 2025, 72, 5610–5619. [Google Scholar] [CrossRef]
  160. Piron, F.; Pierre, H.; Redouté, J.M. An 8-Windows Continuous-Wave Indirect Time-of-Flight Method for High-Frequency SPAD-Based 3-D Imagers in 0.18 μm CMOS. IEEE Sens. J. 2024, 24, 20495–20503. [Google Scholar] [CrossRef]
  161. Lee, S.H.; Kwon, W.H.; Lim, Y.S.; Park, Y.H. Highly precise AMCW time-of-flight scanning sensor based on parallel-phase demodulation. Measurement 2022, 203, 111860. [Google Scholar] [CrossRef]
  162. Lee, S.H.; Lim, Y.S.; Kwon, W.H.; Park, Y.H. Multipath Interference Suppression of Amplitude-Modulated Continuous Wave Coaxial-Scanning LiDAR Using Model-Based Synthetic Data Learning. IEEE Sens. J. 2023, 23, 23822–23835. [Google Scholar] [CrossRef]
  163. Lee, S.H.; Kwon, W.H.; Lim, Y.S.; Park, Y.H. Distance measurement error compensation using machine learning for laser scanning AMCW time-of-flight sensor. In Proceedings of the MOEMS and Miniaturized Systems XXI, San Francisco, CA, USA, 22–27 January 2022; SPIE: Bellingham, WA, USA, 2022; Volume 12013, pp. 9–14. [Google Scholar]
  164. Li, X.; Li, W.; Yin, X.; Ma, X.; Yuan, X.; Zhao, J. Camera-mirror binocular vision-based method for evaluating the performance of industrial robots. IEEE Trans. Instrum. Meas. 2021, 70, 5019214. [Google Scholar] [CrossRef]
  165. Zhang, Y.; Zhao, Z.; Zhao, C. Vehicle High Beam Light Detection Based on Zoom Binocular Camera and Light Sensor Array. In Proceedings of the 2021 6th International Conference on Control, Robotics and Cybernetics (CRC), Shanghai, China, 9–11 October 2021; pp. 176–180. [Google Scholar]
  166. Zhang, Z.; Wang, H.; Li, Y.; Li, Z.; Gui, W.; Wang, X.; Zhang, C.; Liang, X.; Li, X. Fringe-Based Structured-Light 3D Reconstruction: Principles, Projection Technologies, and Deep Learning Integration. Sensors 2025, 25, 6296. [Google Scholar] [CrossRef] [PubMed]
  167. Han, M.; Lei, F.; Shi, W.; Lu, S.; Li, X. Uniaxial MEMS-Based 3D Reconstruction Using Pixel Refinement. Opt. Express 2023, 31, 536. [Google Scholar] [CrossRef]
  168. Li, Y.; Li, Z.; Zhang, C.; Han, M.; Lei, F.; Liang, X.; Wang, X.; Gui, W.; Li, X. Deep Learning-Driven One-Shot Dual-View 3-D Reconstruction for Dual-Projector System. IEEE Trans. Instrum. Meas. 2024, 73, 5021314. [Google Scholar] [CrossRef]
  169. Li, Y.; Li, Z.; Liang, X.; Huang, H.; Qian, X.; Feng, F.; Zhang, C.; Wang, X.; Gui, W.; Li, X. Global Phase Accuracy Enhancement of Structured Light System Calibration and 3D Reconstruction by Overcoming Inevitable Unsatisfactory Intensity Modulation. Measurement 2024, 236, 114952. [Google Scholar] [CrossRef]
  170. Zhou, Q.; Qiao, X.; Ni, K.; Li, X.; Wang, X. Depth Detection in Interactive Projection System Based on One-Shot Black-and-White Stripe Pattern. Opt. Express 2017, 25, 5341. [Google Scholar] [CrossRef]
  171. Han, M.; Xing, Y.; Wang, X.; Li, X. Projection Superimposition for the Generation of High-Resolution Digital Grating. Opt. Lett. 2024, 49, 4473. [Google Scholar] [CrossRef] [PubMed]
  172. Zheng, T.X.; Huang, S.; Li, Y.F.; Feng, M. Key techniques for vision based 3D reconstruction: A review. Acta Autom. Sin. 2020, 46, 631–652. [Google Scholar]
  173. Li, Y.; Chen, W.; Li, Z.; Zhang, C.; Wang, X.; Gui, W.; Gao, W.; Liang, X.; Li, X. SL3D-BF: A Real-World Structured Light 3D Dataset with Background-to-Foreground Enhancement. IEEE Trans. Circuits Syst. Video Technol. 2025, 35, 9850–9864. [Google Scholar] [CrossRef]
  174. Geng, J. Structured-light 3D surface imaging: A tutorial. Adv. Opt. Photonics 2011, 3, 128–160. [Google Scholar] [CrossRef]
  175. Zong, Y.; Duan, M.; Yu, C.; Li, J. Robust phase unwrapping algorithm for noisy and segmented phase measurements. Opt. Express 2021, 29, 24466–24485. [Google Scholar] [CrossRef]
  176. Zhu, Z.; Li, M.; Xie, Y.; Zhou, F.; Liu, Y.; Wang, W. The optimal projection intensities determination strategy for robust strip-edge detection in adaptive fringe pattern measurement. Optik 2022, 257, 168771. [Google Scholar] [CrossRef]
  177. Wang, H.; Lu, Z.; Huang, Z.; Li, Y.; Zhang, C.; Qian, X.; Wang, X.; Gui, W.; Liang, X.; Li, X. A High-Accuracy and Reliable End-to-End Phase Calculation Network and Its Demonstration in High Dynamic Range 3D Reconstruction. Nanomanuf. Metrol. 2025, 8, 5. [Google Scholar] [CrossRef]
  178. Han, M.; Shi, W.; Lu, S.; Lei, F.; Li, Y.; Wang, X.; Li, X. Internal–External Layered Phase Shifting for Phase Retrieval. IEEE Trans. Instrum. Meas. 2024, 73, 4501013. [Google Scholar] [CrossRef]
  179. Han, M.; Jiang, H.; Lei, F.; Xing, Y.; Wang, X.; Li, X. Modeling window smoothing effect hidden in fringe projection profilometry. Measurement 2025, 242, 115852. [Google Scholar] [CrossRef]
  180. Li, Z.; Chen, W.; Liu, C.; Lu, S.; Qian, X.; Wang, X.; Zou, Y.; Li, X. An efficient exposure fusion method for 3D measurement with high-reflective objects. In Proceedings of the Optoelectronic Imaging and Multimedia Technology XI, Nantong, China, 13–15 October 2024; SPIE: Bellingham, WA, USA, 2024; Volume 13239, pp. 348–356. [Google Scholar]
  181. Ri, S.; Takimoto, T.; Xia, P.; Wang, Q.; Tsuda, H.; Ogihara, S. Accurate phase analysis of interferometric fringes by the spatiotemporal phase-shifting method. J. Opt. 2020, 22, 105703. [Google Scholar] [CrossRef]
  182. Zhang, S.; Yau, S.T. High-resolution, real-time 3D absolute coordinate measurement based on a phase-shifting method. Opt. Express 2006, 14, 2644–2649. [Google Scholar] [CrossRef] [PubMed]
  183. Zhou, W.S.; Su, X.Y. A direct mapping algorithm for phase-measuring profilometry. J. Mod. Opt. 1994, 41, 89–94. [Google Scholar] [CrossRef]
  184. Huang, L.; Chua, P.S.; Asundi, A. Least-squares calibration method for fringe projection profilometry considering camera lens distortion. Appl. Opt. 2010, 49, 1539–1548. [Google Scholar] [CrossRef]
  185. Zhang, Z.; Ma, H.; Zhang, S.; Guo, T.; Towers, C.E.; Towers, D.P. Simple calibration of a phase-based 3D imaging system based on uneven fringe projection. Opt. Lett. 2011, 36, 627–629. [Google Scholar] [CrossRef]
  186. Hu, X.; Wang, G.; Zhang, Y.; Yang, H.; Zhang, S. Large depth-of-field 3D shape measurement using an electrically tunable lens. Opt. Express 2019, 27, 29697–29709. [Google Scholar] [CrossRef]
  187. Cheng, N.J.; Su, W.H. Phase-shifting projected fringe profilometry using binary-encoded patterns. Photonics 2021, 8, 362. [Google Scholar] [CrossRef]
  188. Xie, X.; Tian, X.; Shou, Z.; Zeng, Q.; Wang, G.; Huang, Q.; Qin, M.; Gao, X. Deep learning phase-unwrapping method based on adaptive noise evaluation. Appl. Opt. 2022, 61, 6861–6870. [Google Scholar] [CrossRef]
  189. Yu, J.; Da, F. Absolute phase unwrapping for objects with large depth range. IEEE Trans. Instrum. Meas. 2023, 72, 5013310. [Google Scholar] [CrossRef]
  190. Yue, M.; Wang, J.; Zhang, J.; Zhang, Y.; Tang, Y.; Feng, X. Color crosstalk correction for synchronous measurement of full-field temperature and deformation. Opt. Lasers Eng. 2022, 150, 106878. [Google Scholar] [CrossRef]
  191. Li, Z.; Gao, N.; Meng, Z.; Zhang, Z.; Gao, F.; Jiang, X. Aided imaging phase measuring deflectometry based on concave focusing mirror. Photonics 2023, 10, 519. [Google Scholar] [CrossRef]
  192. Wang, Y.; Xu, Y.; Zhang, Z.; Gao, F.; Jiang, X. 3D measurement of structured specular surfaces using stereo direct phase measurement deflectometry. Machines 2021, 9, 170. [Google Scholar] [CrossRef]
  193. Lei, F.; Ma, R.; Li, X. Use of Phase-Angle Model for Full-Field 3D Reconstruction under Efficient Local Calibration. Sensors 2024, 24, 2581. [Google Scholar] [CrossRef]
  194. Lei, F.; Han, M.; Jiang, H.; Wang, X.; Li, X. A Phase-Angle Inspired Calibration Strategy Based on MEMS Projector for 3D Reconstruction with Markedly Reduced Calibration Images and Parameters. Opt. Lasers Eng. 2024, 176, 108078. [Google Scholar] [CrossRef]
  195. Srinivasan, V.; Liu, H.C.; Halioua, M. Automated phase-measuring profilometry of 3-D diffuse objects. Appl. Opt. 1984, 23, 3105–3108. [Google Scholar] [CrossRef]
  196. Zhong, J.; Weng, J. Phase retrieval of optical fringe patterns from the ridge of a wavelet transform. Opt. Lett. 2005, 30, 2560–2562. [Google Scholar] [CrossRef] [PubMed]
  197. Su, X.; Chen, W. Fourier transform profilometry: A review. Opt. Lasers Eng. 2001, 35, 263–284. [Google Scholar] [CrossRef]
  198. Saldner, H.O.; Huntley, J.M. Temporal phase unwrapping: Application to surface profiling of discontinuous objects. Appl. Opt. 1997, 36, 2770–2775. [Google Scholar] [CrossRef] [PubMed]
  199. Zhong, J.; Zhang, Y. Absolute phase-measurement technique based on number theory in multifrequency grating projection profilometry. Appl. Opt. 2001, 40, 492–500. [Google Scholar] [CrossRef]
  200. Liu, C.; Zhang, C.; Liang, X.; Han, Z.; Li, Y.; Yang, C.; Gui, W.; Gao, W.; Wang, X.; Li, X. Attention Mono-depth: Attention-enhanced transformer for monocular depth estimation of volatile kiln burden surface. IEEE Trans. Circuits Syst. Video Technol. 2024, 35, 1686–1699. [Google Scholar] [CrossRef]
201. Han, M.; Zhang, C.; Zhang, Z.; Li, X. Review of MEMS vibration-mirror-based 3D reconstruction of structured light. Opt. Precis. Eng. 2025, 33, 1065–1090. [Google Scholar] [CrossRef]
  202. Sato, R.; Li, X.; Fischer, A.; Chen, L.C.; Chen, C.; Shimomura, R.; Gao, W. Signal processing and artificial intelligence for dual-detection confocal probes. Int. J. Precis. Eng. Manuf. 2024, 25, 199–223. [Google Scholar] [CrossRef]
  203. Li, C.; Yan, H.; Qian, X.; Zhu, S.; Zhu, P.; Liao, C.; Tian, H.; Li, X.; Wang, X.; Li, X. A domain adaptation YOLOv5 model for industrial defect inspection. Measurement 2023, 213, 112725. [Google Scholar] [CrossRef]
  204. Wang, H.; Zhang, Z.; Ma, R.; Zhang, C.; Liang, X.; Li, X. Correction of grating patterns for high dynamic range 3D measurement based on deep learning. In Proceedings of the Optoelectronic Imaging and Multimedia Technology XI, Nantong, China, 13–15 October 2024; SPIE: Bellingham, WA, USA, 2024; Volume 13239, pp. 317–325. [Google Scholar]
  205. Li, K.; Zhang, Z.; Lin, J.; Sato, R.; Matsukuma, H.; Gao, W. Angle measurement based on second harmonic generation using artificial neural network. Nanomanuf. Metrol. 2023, 6, 28. [Google Scholar] [CrossRef]
  206. Yin, W.; Che, Y.; Li, X.; Li, M.; Hu, Y.; Feng, S.; Lam, E.Y.; Chen, Q.; Zuo, C. Physics-informed deep learning for fringe pattern analysis. Opto-Electron. Adv. 2024, 7, 230034. [Google Scholar] [CrossRef]
  207. Han, M.; Kan, J.; Yang, G.; Li, X. Robust Ellipsoid Fitting Using Combination of Axial and Sampson Distances. IEEE Trans. Instrum. Meas. 2023, 72, 2526714. [Google Scholar] [CrossRef]
  208. Li, C.; Pan, X.; Zhu, P.; Zhu, S.; Liao, C.; Tian, H.; Qian, X.; Li, X.; Wang, X.; Li, X. Style Adaptation module: Enhancing detector robustness to inter-manufacturer variability in surface defect detection. Comput. Ind. 2024, 157, 104084. [Google Scholar] [CrossRef]
  209. Wang, H.; Zhang, C.; Qian, X.; Wang, X.; Gui, W.; Gao, W.; Liang, X.; Li, X. HDRSL Net for Accurate High Dynamic Range Imaging-based Structured Light 3D Reconstruction. IEEE Trans. Image Process. 2025, 34, 5486–5499. [Google Scholar] [CrossRef] [PubMed]
  210. Wang, H.; He, X.; Zhang, C.; Liang, X.; Zhu, P.; Wang, X.; Gui, W.; Li, X.; Qian, X. Accelerating surface defect detection using normal data with an attention-guided feature distillation reconstruction network. Measurement 2025, 246, 116702. [Google Scholar] [CrossRef]
  211. Duarte, M.F.; Davenport, M.A.; Takhar, D.; Laska, J.N.; Sun, T.; Kelly, K.F.; Baraniuk, R.G. Single-pixel imaging via compressive sampling. IEEE Signal Process. Mag. 2008, 25, 83–91. [Google Scholar] [CrossRef]
  212. Haldar, J.P.; Hernando, D.; Liang, Z.P. Compressed-sensing MRI with random encoding. IEEE Trans. Med. Imaging 2010, 30, 893–903. [Google Scholar] [CrossRef] [PubMed]
  213. Studer, V.; Bobin, J.; Chahid, M.; Mousavi, H.S.; Candes, E.; Dahan, M. Compressive fluorescence microscopy for biological and hyperspectral imaging. Proc. Natl. Acad. Sci. USA 2012, 109, E1679–E1687. [Google Scholar] [CrossRef] [PubMed]
  214. Yu, W.K.; Liu, X.F.; Yao, X.R.; Wang, C.; Zhai, Y.; Zhai, G.J. Complementary compressive imaging for the telescopic system. Sci. Rep. 2014, 4, 5834. [Google Scholar] [CrossRef] [PubMed]
  215. Gong, W.; Zhao, C.; Yu, H.; Chen, M.; Xu, W.; Han, S. Three-dimensional ghost imaging lidar via sparsity constraint. Sci. Rep. 2016, 6, 26133. [Google Scholar] [CrossRef]
  216. Candès, E.J.; Romberg, J.; Tao, T. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inf. Theory 2006, 52, 489–509. [Google Scholar] [CrossRef]
  217. Tropp, J.A.; Gilbert, A.C. Signal recovery from random measurements via orthogonal matching pursuit. IEEE Trans. Inf. Theory 2007, 53, 4655–4666. [Google Scholar] [CrossRef]
  218. Yao, H.; Dai, F.; Zhang, S.; Zhang, Y.; Tian, Q.; Xu, C. Dr2-net: Deep residual reconstruction network for image compressive sensing. Neurocomputing 2019, 359, 483–493. [Google Scholar] [CrossRef]
  219. Cheng, Z.; Lu, R.; Wang, Z.; Zhang, H.; Chen, B.; Meng, Z.; Yuan, X. BIRNAT: Bidirectional recurrent neural networks with adversarial training for video snapshot compressive imaging. In Proceedings of the European Conference on Computer Vision, Glasgow, UK, 23–28 August 2020; Springer: Cham, Switzerland, 2020; pp. 258–275. [Google Scholar]
  220. Hahn, J.; Debes, C.; Leigsnering, M.; Zoubir, A.M. Compressive sensing and adaptive direct sampling in hyperspectral imaging. Digit. Signal Process. 2014, 26, 113–126. [Google Scholar] [CrossRef]
  221. Clemente, P.; Durán, V.; Tajahuerce, E.; Andrés, P.; Climent, V.; Lancis, J. Compressive holography with a single-pixel detector. Opt. Lett. 2013, 38, 2524–2527. [Google Scholar] [CrossRef]
  222. Durán, V.; Clemente, P.; Fernández-Alonso, M.; Tajahuerce, E.; Lancis, J. Single-pixel polarimetric imaging. Opt. Lett. 2012, 37, 824–826. [Google Scholar] [CrossRef]
  223. Zhang, Z.; Liu, S.; Peng, J.; Yao, M.; Zheng, G.; Zhong, J. Simultaneous spatial, spectral, and 3D compressive imaging via efficient Fourier single-pixel measurements. Optica 2018, 5, 315–319. [Google Scholar] [CrossRef]
224. Ng, R.; Levoy, M.; Brédif, M.; Duval, G.; Horowitz, M.; Hanrahan, P. Light Field Photography with a Hand-Held Plenoptic Camera; Tech. Rep. CSTR 2005-02; Stanford University: Stanford, CA, USA, 2005. [Google Scholar]
  225. Dansereau, D.G.; Pizarro, O.; Williams, S.B. Decoding, calibration and rectification for lenselet-based plenoptic cameras. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 1027–1034. [Google Scholar]
  226. Broxton, M.; Grosenick, L.; Yang, S.; Cohen, N.; Andalman, A.; Deisseroth, K.; Levoy, M. Wave optics theory and 3-D deconvolution for the light field microscope. Opt. Express 2013, 21, 25418–25439. [Google Scholar] [CrossRef]
  227. Prevedel, R.; Yoon, Y.G.; Hoffmann, M.; Pak, N.; Wetzstein, G.; Kato, S.; Schrödel, T.; Raskar, R.; Zimmer, M.; Boyden, E.S.; et al. Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy. Nat. Methods 2014, 11, 727–730. [Google Scholar] [CrossRef] [PubMed]
  228. Gortler, S.J.; Grzeszczuk, R.; Szeliski, R.; Cohen, M.F. The lumigraph. In Seminal Graphics Papers: Pushing the Boundaries, Volume 2; Association for Computing Machinery: New York, NY, USA, 2023; pp. 453–464. [Google Scholar]
  229. Wilburn, B.; Joshi, N.; Vaish, V.; Talvala, E.V.; Antunez, E.; Barth, A.; Adams, A.; Horowitz, M.; Levoy, M. High performance imaging using large camera arrays. ACM Trans. Graph. (TOG) 2005, 24, 765–776. [Google Scholar] [CrossRef]
  230. Wang, T.C.; Zhu, J.Y.; Kalantari, N.K.; Efros, A.A.; Ramamoorthi, R. Light field video capture using a learning-based hybrid imaging system. ACM Trans. Graph. (TOG) 2017, 36, 133. [Google Scholar] [CrossRef]
  231. Veeraraghavan, A.; Raskar, R.; Agrawal, A.; Mohan, A.; Tumblin, J. Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing. ACM Trans. Graph. 2007, 26, 69. [Google Scholar] [CrossRef]
  232. Marwah, K.; Wetzstein, G.; Bando, Y.; Raskar, R. Compressive light field photography using overcomplete dictionaries and optimized projections. ACM Trans. Graph. (TOG) 2013, 32, 46. [Google Scholar] [CrossRef]
  233. Mildenhall, B.; Srinivasan, P.P.; Tancik, M.; Barron, J.T.; Ramamoorthi, R.; Ng, R. Nerf: Representing scenes as neural radiance fields for view synthesis. Commun. ACM 2021, 65, 99–106. [Google Scholar] [CrossRef]
  234. Yoon, Y.; Jeon, H.G.; Yoo, D.; Lee, J.Y.; Kweon, I.S. Light-field image super-resolution using convolutional neural network. IEEE Signal Process. Lett. 2017, 24, 848–852. [Google Scholar] [CrossRef]
  235. Xiao, Z.; Shi, J.; Jiang, X.; Guillemot, C. Axial refocusing precision model with light fields. Signal Process. Image Commun. 2022, 106, 116721. [Google Scholar] [CrossRef]
  236. Wanner, S.; Goldluecke, B. Variational light field analysis for disparity estimation and super-resolution. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 36, 606–619. [Google Scholar] [CrossRef]
  237. Si, L.; Zhu, H.; Wang, Q. Epipolar Plane Image Rectification and Flat Surface Detection in Light Field. J. Electr. Comput. Eng. 2017, 2017, 6142795. [Google Scholar] [CrossRef]
  238. Ng, R. Digital Light Field Photography; Stanford University: Stanford, CA, USA, 2006. [Google Scholar]
  239. Ihrke, I.; Restrepo, J.; Mignard-Debise, L. Principles of light field imaging: Briefly revisiting 25 years of research. IEEE Signal Process. Mag. 2016, 33, 59–69. [Google Scholar] [CrossRef]
  240. Hu, X.; Li, Z.; Miao, L.; Fang, F.; Jiang, Z.; Zhang, X. Measurement technologies of light field camera: An overview. Sensors 2023, 23, 6812. [Google Scholar] [CrossRef]
  241. Shi, S.; New, T. Development and Application of Light-Field Cameras in Fluid Measurements; Springer: Cham, Switzerland, 2023. [Google Scholar]
  242. Zhang, H.; Zhou, W.; Lin, L.; Lumsdaine, A. Cascade residual learning based adaptive feature aggregation for light field super-resolution. Pattern Recognit. 2025, 165, 111616. [Google Scholar] [CrossRef]
  243. Sakai, K.; Takahashi, K.; Fujii, T.; Nagahara, H. Acquiring dynamic light fields through coded aperture camera. In Proceedings of the European Conference on Computer Vision, Glasgow, UK, 23–28 August 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 368–385. [Google Scholar]
  244. Yang, F.; Yan, W.; Tian, P.; Li, F.; Peng, F. Dynamic three-dimensional shape measurement based on light field imaging. In Proceedings of the 9th International Symposium on Advanced Optical Manufacturing and Testing Technologies: Meta-Surface-Wave and Planar Optics, Chengdu, China, 26–29 June 2018; SPIE: Bellingham, WA, USA, 2019; Volume 10841, pp. 90–96. [Google Scholar]
  245. Ma, H.; Qian, Z.; Mu, T.; Shi, S. Fast and accurate 3D measurement based on light-field camera and deep learning. Sensors 2019, 19, 4399. [Google Scholar] [CrossRef]
  246. Chen, X.; Xie, W.; Ma, H.; Chu, J.; Qi, B.; Ren, G.; Sun, X.; Chen, F. Wavefront measurement method based on improved light field camera. Results Phys. 2020, 17, 103007. [Google Scholar] [CrossRef]
247. Zhou, P.; Zhang, Y.; Yu, Y.; Cai, W.; Zhou, G. 3D shape measurement based on structured light field imaging. Math. Biosci. Eng. 2020, 17, 654–668. [Google Scholar] [CrossRef] [PubMed]
  248. Vogt, N. Volumetric imaging with confocal light field microscopy. Nat. Methods 2020, 17, 956. [Google Scholar] [CrossRef]
  249. Kim, Y.; Jeong, J.; Jang, J.; Kim, M.W.; Park, Y. Polarization holographic microscopy for extracting spatio-temporally resolved Jones matrix. Opt. Express 2012, 20, 9948–9955. [Google Scholar] [CrossRef]
  250. Huang, B.; Bates, M.; Zhuang, X. Super-resolution fluorescence microscopy. Annu. Rev. Biochem. 2009, 78, 993–1016. [Google Scholar] [CrossRef]
  251. Schermelleh, L.; Ferrand, A.; Huser, T.; Eggeling, C.; Sauer, M.; Biehlmaier, O.; Drummen, G.P. Super-resolution microscopy demystified. Nat. Cell Biol. 2019, 21, 72–84. [Google Scholar] [CrossRef]
  252. Huszka, G.; Gijs, M.A. Super-resolution optical imaging: A comparison. Micro Nano Eng. 2019, 2, 7–28. [Google Scholar] [CrossRef]
  253. Harootunian, A.; Betzig, E.; Isaacson, M.; Lewis, A. Super-resolution fluorescence near-field scanning optical microscopy. Appl. Phys. Lett. 1986, 49, 674–676. [Google Scholar] [CrossRef]
  254. Betzig, E.; Isaacson, M.; Lewis, A. Collection mode near-field scanning optical microscopy. Appl. Phys. Lett. 1987, 51, 2088–2090. [Google Scholar] [CrossRef]
  255. Hecht, B.; Sick, B.; Wild, U.P.; Deckert, V.; Zenobi, R.; Martin, O.J.; Pohl, D.W. Scanning near-field optical microscopy with aperture probes: Fundamentals and applications. J. Chem. Phys. 2000, 112, 7761–7774. [Google Scholar] [CrossRef]
  256. Gleyzes, P.; Boccara, A.; Bachelot, R. Near field optical microscopy using a metallic vibrating tip. Ultramicroscopy 1995, 57, 318–322. [Google Scholar] [CrossRef]
  257. Reddick, R.; Warmack, R.; Ferrell, T. New form of scanning optical microscopy. Phys. Rev. B 1989, 39, 767. [Google Scholar] [CrossRef] [PubMed]
  258. Heinzelmann, H.; Hecht, B.; Novotny, L.; Pohl, D. Forbidden light scanning near-field optical microscopy. J. Microsc. 1995, 177, 115–118. [Google Scholar] [CrossRef]
  259. Zenhausern, F.; Martin, Y.; Wickramasinghe, H. Scanning interferometric apertureless microscopy: Optical imaging at 10 angstrom resolution. Science 1995, 269, 1083–1085. [Google Scholar] [CrossRef]
  260. Fischer, U.C.; Koglin, J.; Fuchs, H. The tetrahedral tip as a probe for scanning near-field optical microscopy at 30 nm resolution. J. Microsc. 1994, 176, 231–237. [Google Scholar] [CrossRef]
  261. Takahashi, S.; Ikeda, Y.; Takamasu, K. Study on nano thickness inspection for residual layer of nanoimprint lithography using near-field optical enhancement of metal tip. CIRP Ann. 2013, 62, 527–530. [Google Scholar] [CrossRef]
  262. Ohtsu, M. Progress of high-resolution photon scanning tunneling microscopy due to a nanometric fiber probe. J. Light. Technol. 1995, 13, 1200–1221. [Google Scholar] [CrossRef]
263. Quantum Design GmbH. neaSNOM Scattering-Type Scanning Near-Field Optical Microscope. 2021. Available online: https://www.qd-china.com/zh/pro/detail/3/1912091147703 (accessed on 8 August 2025).
264. Di Francia, G.T. Super-gain antennas and optical resolving power. Il Nuovo Cimento 1952, 9, 426–438. [Google Scholar] [CrossRef]
  265. Okazaki, S. High resolution optical lithography or high throughput electron beam lithography: The technical struggle from the micro to the nano-fabrication evolution. Microelectron. Eng. 2015, 133, 23–35. [Google Scholar] [CrossRef]
  266. Tang, F.; Wang, Y.; Qiu, L.; Zhao, W.; Sun, Y. Super-resolution radially polarized-light pupil-filtering confocal sensing technology. Appl. Opt. 2014, 53, 7407–7414. [Google Scholar] [CrossRef] [PubMed]
  267. Zhao, W.; Tan, J.; Qiu, L. Bipolar absolute differential confocal approach to higher spatial resolution. Opt. Express 2004, 12, 5013–5021. [Google Scholar] [CrossRef]
  268. Li, Z.; Herrmann, K.; Pohlenz, F. Lateral scanning confocal microscopy for the determination of in-plane displacements of microelectromechanical systems devices. Opt. Lett. 2007, 32, 1743–1745. [Google Scholar] [CrossRef]
  269. Aguilar, J.F.; Lera, M.; Sheppard, C.J. Imaging of spheres and surface profiling by confocal microscopy. Appl. Opt. 2000, 39, 4621–4628. [Google Scholar] [CrossRef]
  270. Arrasmith, C.L.; Dickensheets, D.L.; Mahadevan-Jansen, A. MEMS-based handheld confocal microscope for in-vivo skin imaging. Opt. Express 2010, 18, 3805–3819. [Google Scholar] [CrossRef]
  271. Sun, C.C.; Liu, C.K. Ultrasmall focusing spot with a long depth of focus based on polarization and phase modulation. Opt. Lett. 2003, 28, 99–101. [Google Scholar] [CrossRef]
  272. Heintzmann, R.; Huser, T. Super-resolution structured illumination microscopy. Chem. Rev. 2017, 117, 13890–13908. [Google Scholar] [CrossRef]
  273. Wu, Y.; Shroff, H. Faster, sharper, and deeper: Structured illumination microscopy for biological imaging. Nat. Methods 2018, 15, 1011–1019. [Google Scholar] [CrossRef] [PubMed]
  274. Takahashi, S.; Kudo, R.; Usuki, S.; Takamasu, K. Super resolution optical measurements of nanodefects on Si wafer surface using infrared standing evanescent wave. CIRP Ann. 2011, 60, 523–526. [Google Scholar] [CrossRef]
  275. Gustafsson, M.G. Nonlinear structured-illumination microscopy: Wide-field fluorescence imaging with theoretically unlimited resolution. Proc. Natl. Acad. Sci. USA 2005, 102, 13081–13086. [Google Scholar] [CrossRef] [PubMed]
  276. Habuchi, S. Super-resolution molecular and functional imaging of nanoscale architectures in life and materials science. Front. Bioeng. Biotechnol. 2014, 2, 20. [Google Scholar] [CrossRef]
  277. Chen, Z.; Taflove, A.; Backman, V. Photonic nanojet enhancement of backscattering of light by nanoparticles: A potential novel visible-light ultramicroscopy technique. Opt. Express 2004, 12, 1214–1220. [Google Scholar] [CrossRef]
  278. Li, X.; Chen, Z.; Taflove, A.; Backman, V. Optical analysis of nanoparticles via enhanced backscattering facilitated by 3-D photonic nanojets. Opt. Express 2005, 13, 526–533. [Google Scholar] [CrossRef]
  279. Itagi, A.; Challener, W. Optics of photonic nanojets. J. Opt. Soc. Am. A 2005, 22, 2847–2858. [Google Scholar] [CrossRef]
  280. Ferrand, P.; Wenger, J.; Devilez, A.; Pianta, M.; Stout, B.; Bonod, N.; Popov, E.; Rigneault, H. Direct imaging of photonic nanojets. Opt. Express 2008, 16, 6930–6940. [Google Scholar] [CrossRef] [PubMed]
  281. Wang, Z.; Guo, W.; Li, L.; Luk’Yanchuk, B.; Khan, A.; Liu, Z.; Chen, Z.; Hong, M. Optical virtual imaging at 50 nm lateral resolution with a white-light nanoscope. Nat. Commun. 2011, 2, 218. [Google Scholar] [CrossRef]
  282. Lee, J.Y.; Hong, B.H.; Kim, W.Y.; Min, S.K.; Kim, Y.; Jouravlev, M.V.; Bose, R.; Kim, K.S.; Hwang, I.C.; Kaufman, L.J.; et al. Near-field focusing and magnification through self-assembled nanoscale spherical lenses. Nature 2009, 460, 498–501. [Google Scholar] [CrossRef]
  283. Berera, R.; van Grondelle, R.; Kennis, J.T. Ultrafast transient absorption spectroscopy: Principles and application to photosynthetic systems. Photosynth. Res. 2009, 101, 105–118. [Google Scholar] [CrossRef]
  284. Zacharioudaki, D.E.; Fitilis, I.; Kotti, M. Review of fluorescence spectroscopy in environmental quality applications. Molecules 2022, 27, 4801. [Google Scholar] [CrossRef]
  285. Yi, J.; You, E.M.; Hu, R.; Wu, D.Y.; Liu, G.K.; Yang, Z.L.; Zhang, H.; Gu, Y.; Wang, Y.H.; Wang, X.; et al. Surface-enhanced Raman spectroscopy: A half-century historical perspective. Chem. Soc. Rev. 2025, 54, 1453–1551. [Google Scholar] [CrossRef] [PubMed]
  286. Fathy, A.; Sabry, Y.M.; Nazeer, S.; Bourouina, T.; Khalil, D.A. On-chip parallel Fourier transform spectrometer for broadband selective infrared spectral sensing. Microsystems Nanoeng. 2020, 6, 10. [Google Scholar] [CrossRef]
  287. Bamji, C.; Godbaz, J.; Oh, M.; Mehta, S.; Payne, A.; Ortiz, S.; Nagaraja, S.; Perry, T.; Thompson, B. A review of indirect time-of-flight technologies. IEEE Trans. Electron Devices 2022, 69, 2779–2793. [Google Scholar] [CrossRef]
  288. Restelli, F.; Pollo, B.; Vetrano, I.G.; Cabras, S.; Broggi, M.; Schiariti, M.; Falco, J.; de Laurentis, C.; Raccuia, G.; Ferroli, P.; et al. Confocal laser microscopy in neurosurgery: State of the art of actual clinical applications. J. Clin. Med. 2021, 10, 2035. [Google Scholar] [CrossRef] [PubMed]
  289. Zhang, Y.; Yu, Q.; Shang, W.; Wang, C.; Liu, T.; Wang, Y.; Cheng, F. Chromatic confocal measurement system and its experimental study based on inclined illumination. Chin. Opt. 2022, 15, 514–524. [Google Scholar] [CrossRef]
  290. Wang, J.; Chen, F.; Liu, B.; Gan, Y.; Liu, G. White LED-based spectrum confocal displacement sensor. China Meas. Test 2017, 43, 69–73. [Google Scholar]
  291. Mullan, F.; Bartlett, D.; Austin, R.S. Measurement uncertainty associated with chromatic confocal profilometry for 3D surface texture characterization of natural human enamel. Dent. Mater. 2017, 33, e273–e281. [Google Scholar] [CrossRef] [PubMed]
  292. Li, J.; Ma, R.; Bai, J. High-precision chromatic confocal technologies: A review. Micromachines 2024, 15, 1224. [Google Scholar] [CrossRef]
  293. Wu, J.; Yao, J.; Ren, A.; Ju, Z.; Lin, R.; Wang, Z.; Zhu, W.; Ju, B.; Tsai, D.P. Meta-Dispersive 3D Chromatic Confocal Measurement. Adv. Sci. 2025, 12, e08774. [Google Scholar] [CrossRef]
  294. Zhou, R.; Shen, D.; Huang, P.; Kong, L.; Zhu, Z. Chromatic confocal sensor-based sub-aperture scanning and stitching for the measurement of microstructured optical surfaces. Opt. Express 2021, 29, 33512–33526. [Google Scholar] [CrossRef]
  295. Nadim, E.H.; Hichem, N.; Nabil, A.; Mohamed, D.; Olivier, G. Comparison of tactile and chromatic confocal measurements of aspherical lenses for form metrology. Int. J. Precis. Eng. Manuf. 2014, 15, 821–829. [Google Scholar] [CrossRef]
  296. Rishikesan, V.; Samuel, G. Evaluation of surface profile parameters of a machined surface using confocal displacement sensor. Procedia Mater. Sci. 2014, 5, 1385–1391. [Google Scholar] [CrossRef]
  297. Nouira, H.; El-Hayek, N.; Yuan, X.; Anwer, N. Characterization of the main error sources of chromatic confocal probes for dimensional measurement. Meas. Sci. Technol. 2014, 25, 044011. [Google Scholar] [CrossRef]
  298. Bai, J.; Wang, Y.; Wang, X.; Zhou, Q.; Ni, K.; Li, X. Three-probe error separation with chromatic confocal sensors for roundness measurement. Nanomanuf. Metrol. 2021, 4, 247–255. [Google Scholar] [CrossRef]
299. Lan, H.; Lei, D.; Qian, L.; Xia, H.; Sun, S. A non-contact system for measurement of rotating error based on confocal chromatic displacement sensor. Manuf. Technol. Mach. Tool 2017, 2017, 141–145. [Google Scholar]
  300. Zakrzewski, A.; Koruba, P.; Ćwikła, M.; Reiner, J. The determination of the measurement beam source in a chromatic confocal displacement sensor integrated with an optical laser head. Opt. Laser Technol. 2022, 153, 108268. [Google Scholar] [CrossRef]
  301. Bi, C.; Li, D.; Fang, J.; Zhang, B. Application of chromatic confocal displacement sensor in measurement of tip clearance. In Proceedings of the Optical Measurement Technology and Instrumentation, Beijing, China, 9–11 May 2016; SPIE: Bellingham, WA, USA, 2016; Volume 10155, pp. 455–462. [Google Scholar]
302. Ueda, S.-I.; Michihata, M.; Hayashi, T.; Takaya, Y. Wide-range axial position measurement for jumping behavior of optically trapped microsphere near surface using chromatic confocal sensor. Int. J. Optomechatronics 2015, 9, 131–140. [Google Scholar] [CrossRef]
  303. Agoyan, M.; Fourneau, G.; Cheymol, G.; Ladaci, A.; Maskrot, H.; Destouches, C.; Fourmentel, D.; Girard, S.; Boukenter, A. Toward confocal chromatic sensing in nuclear reactors: In situ optical refractive index measurements of bulk glass. IEEE Trans. Nucl. Sci. 2022, 69, 722–730. [Google Scholar] [CrossRef]
  304. Dai, W.; Liu, Y.; Su, F.; Wang, W. Chromatic confocal imaging based mechanical test platform for micro porous membrane. In Proceedings of the 2016 13th IEEE International Conference on Solid-State and Integrated Circuit Technology (ICSICT), Hangzhou, China, 25–28 October 2016; pp. 266–268. [Google Scholar]
  305. Berkovic, G.; Zilberman, S.; Shafir, E.; Cohen-Sabban, J. Vibrometry using a chromatic confocal sensor. In Proceedings of the 11th International Conference on Vibration Measurements by Laser and Noncontact Techniques-Aivela 2014: Advances and Applications, Ancona, Italy, 25–27 June 2014; Volume 1600, pp. 439–444. [Google Scholar]
  306. Yang, W.; Liu, X.; Lu, W.; Guo, X. Influence of probe dynamic characteristics on the scanning speed for white light interference based AFM. Precis. Eng. 2018, 51, 348–352. [Google Scholar] [CrossRef]
  307. Wang, J.; Liu, T.; Tang, X.; Hu, J.; Wang, X.; Li, G.; He, T.; Yang, S. Fiber-coupled chromatic confocal 3D measurement system and comparative study of spectral data processing algorithms. Acta Photonica Sin. 2021, 50, 1112001. [Google Scholar]
  308. Zhang, N.; Xu, X.; Wu, J.; Liu, Y.; Li, L. Study on Thickness Measurement System of Transparent Materials Based on Chromatic Confocal. J. Chang. Univ. Sci. Technol. (Nat. Sci. Ed.) 2013, 36, 1–6. (In Chinese) [Google Scholar]
  309. Yu, Q.; Zhang, K.; Cui, C.; Zhou, R.; Cheng, F.; Ye, R.; Zhang, Y. Method of thickness measurement for transparent specimens with chromatic confocal microscopy. Appl. Opt. 2018, 57, 9722–9728. [Google Scholar] [CrossRef]
  310. Chen, Y.C.; Dong, S.P.; Wang, C.C.; Kuo, S.H.; Wang, W.C.; Tai, H.M. Using chromatic confocal apparatus for in situ rolling thickness measurement in hot embossing process. In Proceedings of the Instrumentation, Metrology, and Standards for Nanomanufacturing IV, San Diego, CA, USA, 2–4 August 2010; SPIE: Bellingham, WA, USA, 2010; Volume 7767, pp. 128–134. [Google Scholar]
  311. Niese, S.; Quodbach, J. Application of a chromatic confocal measurement system as new approach for in-line wet film thickness determination in continuous oral film manufacturing processes. Int. J. Pharm. 2018, 551, 203–211. [Google Scholar] [CrossRef] [PubMed]
  312. Bai, J.; Li, J.; Wang, X.; Zhou, Q.; Ni, K.; Li, X. A new method to measure spectral reflectance and film thickness using a modified chromatic confocal sensor. Opt. Lasers Eng. 2022, 154, 107019. [Google Scholar] [CrossRef]
  313. Li, S.; Song, B.; Peterson, T.; Hsu, J.; Liang, R. MicroLED chromatic confocal microscope. Opt. Lett. 2021, 46, 2722–2725. [Google Scholar] [CrossRef]
  314. Bai, J.; Li, X.; Wang, X.; Zhou, Q.; Ni, K. Chromatic confocal displacement sensor with optimized dispersion probe and modified centroid peak extraction algorithm. Sensors 2019, 19, 3592. [Google Scholar] [CrossRef]
  315. Shi, K.; Li, P.; Yin, S.; Liu, Z. Chromatic confocal microscopy using supercontinuum light. Opt. Express 2004, 12, 2096–2101. [Google Scholar] [CrossRef]
  316. Minoni, U.; Manili, G.; Bettoni, S.; Varrenti, E.; Modotto, D.; De Angelis, C. Chromatic confocal setup for displacement measurement using a supercontinuum light source. Opt. Laser Technol. 2013, 49, 91–94. [Google Scholar] [CrossRef]
  317. Liu, H.; Wang, B.; Wang, R.; Wang, M.; Yu, D.; Wang, W. Photopolymer-based coaxial holographic lens for spectral confocal displacement and morphology measurement. Opt. Lett. 2019, 44, 3554–3557. [Google Scholar] [CrossRef]
  318. Reyes, J.G.; Meneses, J.; Plata, A.; Tribillon, G.M.; Gharbi, T. Axial resolution of a chromatic dispersion confocal microscopy. In Proceedings of the 5th Iberoamerican Meeting on Optics and 8th Latin American Meeting on Optics, Lasers, and Their Applications, Porlamar, Venezuela, 3–8 October 2004; SPIE: Bellingham, WA, USA, 2004; Volume 5622, pp. 766–771. [Google Scholar]
  319. Chen, X.; Nakamura, T.; Shimizu, Y.; Chen, C.; Chen, Y.L.; Matsukuma, H.; Gao, W. A chromatic confocal probe with a mode-locked femtosecond laser source. Opt. Laser Technol. 2018, 103, 359–366. [Google Scholar] [CrossRef]
  320. Molesini, G.; Pedrini, G.; Poggi, P.; Quercioli, F. Focus-wavelength encoded optical profilometer. Opt. Commun. 1984, 49, 229–233. [Google Scholar] [CrossRef]
  321. Zhang, Y.; Yu, Q.; Shang, W.; Wang, C.; Liu, T.; Wang, Y.; Cheng, F. Design and Experimental Study of an Oblique-Illumination Color Confocal Measurement System. Chin. Opt. 2022, 15, 514–524. (In Chinese) [Google Scholar]
  322. Li, Y.; Fan Sr, J.; Wang, J.; Wang, C.; Fan, H. Design research of chromatic lens in chromatic confocal point sensors. In Proceedings of the Sixth International Conference on Optical and Photonic Engineering (icOPEN 2018), Shanghai, China, 8–11 May 2018; SPIE: Bellingham, WA, USA, 2018; Volume 10827, pp. 73–76. [Google Scholar]
  323. Niu, C.; Li, X.; Lang, X. Design and Performance Optimization of Spectral Confocal Lens Assembly. J. Beijing Inf. Sci. Technol. Univ. 2013, 28, 42–45. (In Chinese) [Google Scholar]
  324. Liu, Q.; Yang, W.; Yuan, D.; Wang, Y. Design of Linearly Dispersive Objective for Spectral Confocal Microscopy. Opt. Precis. Eng. 2013, 21, 2473–2479. (In Chinese) [Google Scholar]
  325. Liu, Q.; Wang, Y.; Yang, W.; Yuan, D. Spectral Confocal Measurement Technique with Linear Dispersion Design. High Power Laser Part. Beams 2014, 26, 58–63. (In Chinese) [Google Scholar]
  326. Shao, T.; Guo, W.; Xi, Y.; Liu, Z.; Yang, K.; Xia, M. Design and Performance Evaluation of a Large-Range Spectral Confocal Displacement Sensor. Chin. J. Lasers 2022, 49, 1804002. (In Chinese) [Google Scholar]
327. Wang, A.S.; Xie, B.; Liu, Z.W. Design of measurement system of 3D surface profile based on chromatic confocal technology. In Proceedings of the 2017 International Conference on Optical Instruments and Technology: Optical Systems and Modern Optoelectronic Instruments, Beijing, China, 28–30 October 2017; SPIE: Bellingham, WA, USA, 2018; Volume 10616, pp. 342–347. [Google Scholar]
  328. Yang, R.; Yun, Y.; Xie, B.; Wang, A.; Liu, Z.; Liu, C. Design of a Super-Dispersive Linear Objective Lens for Spectral Confocal 3D Profilometer. High Power Laser Part. Beams 2018, 30, 9–13. (In Chinese) [Google Scholar]
329. Dobson, S.L.; Sun, P.C.; Fainman, Y. Diffractive lenses for chromatic confocal imaging. Appl. Opt. 1997, 36, 4744–4748. [Google Scholar] [CrossRef] [PubMed]
  330. Garzón, J.; Duque, D.; Alean, A.; Toledo, M.; Meneses, J.; Gharbi, T. Diffractive elements performance in chromatic confocal microscopy. J. Phys. Conf. Ser. 2011, 274, 012069. [Google Scholar] [CrossRef]
  331. Rayer, M.; Mansfield, D. Chromatic confocal microscopy using staircase diffractive surface. Appl. Opt. 2014, 53, 5123–5130. [Google Scholar] [CrossRef] [PubMed]
  332. Liu, T.; Wang, J.; Liu, Q.; Hu, J.; Wang, Z.; Wan, C.; Yang, S. Chromatic confocal measurement method using a phase Fresnel zone plate. Opt. Express 2022, 30, 2390–2401. [Google Scholar] [CrossRef]
  333. Hillenbrand, M.; Lorenz, L.; Kleindienst, R.; Grewe, A.; Sinzinger, S. Spectrally multiplexed chromatic confocal multipoint sensing. Opt. Lett. 2013, 38, 4694–4697. [Google Scholar] [CrossRef]
  334. Hillenbrand, M.; Mitschunas, B.; Wenzel, C.; Grewe, A.; Ma, X.; Feßer, P.; Bichra, M.; Sinzinger, S. Hybrid hyperchromats for chromatic confocal sensor systems. Adv. Opt. Technol. 2012, 1, 187–194. [Google Scholar] [CrossRef]
  335. Jin, B.; Deng, W.; Niu, C.; Li, X. Design of dispersive lens group for chromatic confocal measuring system. Opt. Tech. 2012, 38, 660–664. [Google Scholar]
  336. Pruss, C.; Ruprecht, A.; Korner, K.; Osten, W.; Lucke, P. Diffractive elements for chromatic confocal sensors. DGaO Proc. 2005, 2–3. Available online: https://www.dgao-proceedings.de/download/106/106_a1.pdf (accessed on 11 August 2025).
  337. Fleischle, D.; Lyda, W.; Schaal, F.; Osten, W. Chromatic confocal sensor for in-process measurement during lathing. In Proceedings of the 10th International Symposium of Measurement Technology and Intelligent Instruments, Daejeon, Republic of Korea, 29 June–2 July 2011; Volume 29. [Google Scholar]
  338. Ruprecht, A.K.; Pruss, C.; Tiziani, H.J.; Osten, W.; Lucke, P.; Last, A.; Mohr, J.; Lehmann, P. Confocal micro-optical distance sensor: Principle and design. In Proceedings of the Optical Measurement Systems for Industrial Inspection IV, Munich, Germany, 13–17 June 2005; SPIE: Bellingham, WA, USA, 2005; Volume 5856, pp. 128–135. [Google Scholar]
  339. Park, H.M.; Kwon, U.; Joo, K.N. Vision chromatic confocal sensor based on a geometrical phase lens. Appl. Opt. 2021, 60, 2898–2901. [Google Scholar] [CrossRef] [PubMed]
  340. Luecke, P.; Last, A.; Mohr, J.; Ruprecht, A.; Osten, W.; Tiziani, H.J.; Lehmann, P. Confocal micro-optical distance sensor for precision metrology. In Proceedings of the Optical Sensing, Strasbourg, France, 26–30 April 2004; SPIE: Bellingham, WA, USA, 2004; Volume 5459, pp. 180–184. [Google Scholar]
  341. Ruprecht, A.; Wiesendanger, T.; Tiziani, H. Chromatic confocal microscopy with a finite pinhole size. Opt. Lett. 2004, 29, 2130–2132. [Google Scholar] [CrossRef]
  342. Tiziani, H.; Achi, R.; Krämer, R. Chromatic confocal microscopy with microlenses. J. Mod. Opt. 1996, 43, 155–163. [Google Scholar] [CrossRef]
  343. Tiziani, H.J.; Uhde, H.M. Three-dimensional image sensing by chromatic confocal microscopy. Appl. Opt. 1994, 33, 1838–1843. [Google Scholar] [CrossRef]
344. Ang, K.T.; Fang, Z.P.; Tay, A. Note: Development of high speed confocal 3D profilometer. Rev. Sci. Instrum. 2014, 85, 116103. [Google Scholar] [CrossRef]
  345. Hwang, J.; Kim, S.; Heo, J.; Lee, D.; Ryu, S.; Joo, C. Frequency-and spectrally-encoded confocal microscopy. Opt. Express 2015, 23, 5809–5821. [Google Scholar] [CrossRef]
  346. Hillenbrand, M.; Weiss, R.; Endrödy, C.; Grewe, A.; Hoffmann, M.; Sinzinger, S. Chromatic confocal matrix sensor with actuated pinhole arrays. Appl. Opt. 2015, 54, 4927–4936. [Google Scholar] [CrossRef]
  347. Chanbai, S.; Wiora, G.; Weber, M.; Roth, H. A novel confocal line scanning sensor. In Proceedings of the Scanning Microscopy 2009, Monterey, CA, USA, 4–7 May 2009; SPIE: Bellingham, WA, USA, 2009; Volume 7378, pp. 351–358. [Google Scholar]
348. Hu, H.; Mei, S.; Fan, L.; Wang, H. A line-scanning chromatic confocal sensor for three-dimensional profile measurement on highly reflective materials. Rev. Sci. Instrum. 2021, 92, 053707. [Google Scholar] [CrossRef]
  349. Cui, Q.; Liang, R. Chromatic confocal microscopy using liquid crystal display panels. Appl. Opt. 2019, 58, 2085–2090. [Google Scholar] [CrossRef] [PubMed]
  350. Luo, D.; Kuang, C.; Liu, X. Fiber-based chromatic confocal microscope with Gaussian fitting method. Opt. Laser Technol. 2012, 44, 788–793. [Google Scholar] [CrossRef]
  351. Chen, C.; Yang, W.; Wang, J.; Lu, W.; Liu, X.; Jiang, X. Accurate and efficient height extraction in chromatic confocal microscopy using corrected fitting of the differential signal. Precis. Eng. 2019, 56, 447–454. [Google Scholar] [CrossRef]
  352. Shi, K.; Li, P.; Yin, S.; Liu, Z. Surface profile measurement using chromatic confocal microscopy. In Proceedings of the Two-and Three-Dimensional Vision Systems for Inspection, Control, and Metrology II, Philadelphia, PA, USA, 26–27 October 2004; SPIE: Bellingham, WA, USA, 2004; Volume 5606, pp. 124–131. [Google Scholar]
  353. Ruprecht, A.K.; Koerner, K.; Wiesendanger, T.F.; Tiziani, H.J.; Osten, W. Chromatic confocal detection for high-speed microtopography measurements. In Proceedings of the Three-Dimensional Image Capture and Applications VI, San Jose, CA, USA, 19–20 January 2004; SPIE: Bellingham, WA, USA, 2004; Volume 5302, pp. 53–60. [Google Scholar]
  354. Taphanel, M.; Hovestreydt, B.; Beyerer, J. Speed-up chromatic sensors by optimized optical filters. In Proceedings of the Optical Measurement Systems for Industrial Inspection VIII, Munich, Germany, 13–17 May 2013; SPIE: Bellingham, WA, USA, 2013; Volume 8788, pp. 190–199. [Google Scholar]
  355. Kim, T.; Kim, S.H.; Do, D.; Yoo, H.; Gweon, D. Chromatic confocal microscopy with a novel wavelength detection method using transmittance. Opt. Express 2013, 21, 6286–6294. [Google Scholar] [CrossRef]
  356. Chen, L.C.; Nguyen, D.T.; Chang, Y.W. Precise optical surface profilometry using innovative chromatic differential confocal microscopy. Opt. Lett. 2016, 41, 5660–5663. [Google Scholar] [CrossRef]
  357. Yu, Q.; Zhang, K.; Zhou, R.; Cui, C.; Cheng, F.; Fu, S.; Ye, R. Calibration of a chromatic confocal microscope for measuring a colored specimen. IEEE Photonics J. 2018, 10, 6901109. [Google Scholar] [CrossRef]
  358. Bai, J.; Li, X.; Wang, X.; Wang, J.; Ni, K.; Zhou, Q. Self-reference dispersion correction for chromatic confocal displacement measurement. Opt. Lasers Eng. 2021, 140, 106540. [Google Scholar] [CrossRef]
  359. Tan, J.; Liu, C.; Liu, J.; Wang, H. Sinc2 fitting for height extraction in confocal scanning. Meas. Sci. Technol. 2015, 27, 025006. [Google Scholar] [CrossRef]
  360. Ruprecht, A.; Wiesendanger, T.; Tiziani, H. Signal evaluation for high-speed confocal measurements. Appl. Opt. 2002, 41, 7410–7415. [Google Scholar] [CrossRef] [PubMed]
  361. Deng, W.; Niu, C.; Lv, N.; Gao, X. Research on chromatic confocal technology for displacement measurement. In Proceedings of the Fourth International Seminar on Modern Cutting and Measurement Engineering, Beijing, China, 10–12 December 2010; SPIE: Bellingham, WA, USA, 2011; Volume 7997, pp. 457–462. [Google Scholar]
362. Niu, C.H.; Lv, Y. Chromatic confocal displacement measurement based on correlation algorithm. Appl. Mech. Mater. 2014, 446, 909–914. [Google Scholar] [CrossRef]
  363. Lu, W.; Chen, C.; Zhu, H.; Wang, J.; Leach, R.; Liu, X.; Wang, J.; Jiang, X. Fast and accurate mean-shift vector based wavelength extraction for chromatic confocal microscopy. Meas. Sci. Technol. 2019, 30, 115104. [Google Scholar] [CrossRef]
  364. Rey-Barroso, L.; Peña-Gutiérrez, S.; Yáñez, C.; Burgos-Fernández, F.J.; Vilaseca, M.; Royo, S. Optical technologies for the improvement of skin cancer diagnosis: A review. Sensors 2021, 21, 252. [Google Scholar] [CrossRef]
  365. Nishiyabu, R.; Shimizu, A. Boronic acid as an efficient anchor group for surface modification of solid polyvinyl alcohol. Chem. Commun. 2016, 52, 9765–9768. [Google Scholar] [CrossRef] [PubMed]
  366. Forbes, A.; Haverkamp, R.G.; Robertson, T.; Bryant, J.; Bearsley, S. Studies of the microstructure of polymer-modified bitumen emulsions using confocal laser scanning microscopy. J. Microsc. 2001, 204, 252–257. [Google Scholar] [CrossRef]
  367. Werner, J.G.; Deveney, B.T.; Nawar, S.; Weitz, D.A. Dynamic microcapsules with rapid and reversible permeability switching. Adv. Funct. Mater. 2018, 28, 1803385. [Google Scholar] [CrossRef]
  368. Chen, X.; Zeng, Z.; Wang, H.; Xi, P. Three-dimensional multimodal sub-diffraction imaging with spinning-disk confocal microscopy using blinking/fluctuating probes. Nano Res. 2015, 8, 2251–2260. [Google Scholar] [CrossRef]
  369. Teodori, L.; Crupi, A.; Costa, A.; Diaspro, A.; Melzer, S.; Tarnok, A. Three-dimensional imaging technologies: A priority for the advancement of tissue engineering and a challenge for the imaging community. J. Biophotonics 2017, 10, 24–45. [Google Scholar] [CrossRef]
  370. Mou, C.L.; Wang, W.; Li, Z.L.; Ju, X.J.; Xie, R.; Deng, N.N.; Wei, J.; Liu, Z.; Chu, L.Y. Trojan-horse-like stimuli-responsive microcapsules. Adv. Sci. 2018, 5, 1700960. [Google Scholar] [CrossRef]
  371. Yang, P.; Li, Q.; Wang, S.; Zhuang, W.; Zhou, J.; Zhu, S.; Wu, J.; Ying, H. Application of a humidity-mediated method to remove residual solvent from crystal lattice. Food Chem. 2019, 294, 123–129. [Google Scholar] [CrossRef] [PubMed]
  372. Alali, S.; Vitkin, A. Polarized light imaging in biomedicine: Emerging Mueller matrix methodologies for bulk tissue assessment. J. Biomed. Opt. 2015, 20, 061104. [Google Scholar] [CrossRef] [PubMed]
  373. Zaffar, M.; Pradhan, A. Assessment of anisotropy of collagen structures through spatial frequencies of Mueller matrix images for cervical pre-cancer detection. Appl. Opt. 2020, 59, 1237–1248. [Google Scholar] [CrossRef]
  374. Novikova, T.; Pierangelo, A.; Manhas, S.; Benali, A.; Validire, P.; Gayet, B.; De Martino, A. The origins of polarimetric image contrast between healthy and cancerous human colon tissue. Appl. Phys. Lett. 2013, 102, 241103. [Google Scholar] [CrossRef]
  375. Zhao, W.; Tang, L.; Yang, S.; Qiu, L. Research on high-precision large-aperture laser differential confocal-interferometric optical element multi-parameter measurement method. Light Adv. Manuf. 2025, 5, 553–566. [Google Scholar] [CrossRef]
  376. Wang, Z.; Wang, T.; Yang, Y.; Mi, X.; Wang, J. Differential Confocal Optical Probes with Optimized Detection Efficiency and Pearson Correlation Coefficient Strategy Based on the Peak-Clustering Algorithm. Micromachines 2023, 14, 1163. [Google Scholar] [CrossRef]
  377. Wang, F.; Tan, J.; Zhao, W. Optical probe using differential confocal technique for surface profile. In Proceedings of the Process Control and Inspection for Industry, Beijing, China, 8–10 November 2000; SPIE: Bellingham, WA, USA, 2000; Volume 4222, pp. 194–197. [Google Scholar]
  378. Sun, Y.; Zhao, W.; Qiu, L.; Li, R.; Wang, Y. Axial high-resolution differential confocal microscopy. Meas. Sci. Technol. 2019, 30, 125402. [Google Scholar] [CrossRef]
  379. Si, K.; Gong, W.; Sheppard, C.J. Three-dimensional coherent transfer function for a confocal microscope with two D-shaped pupils. Appl. Opt. 2009, 48, 810–817. [Google Scholar] [CrossRef]
  380. Wang, Y.; Qiu, L.; Zhao, X.; Zhao, W. Divided-aperture differential confocal fast-imaging microscopy. Meas. Sci. Technol. 2017, 28, 035401. [Google Scholar] [CrossRef]
  381. Zou, L.M.; Li, X.; Zhang, H.J.; Ding, X.M. Improvement of lateral resolution property of differential confocal system using radial birefringent pupil filter. In Proceedings of the Fifth International Symposium on Instrumentation Science and Technology, Shenyang, China, 15–18 September 2008; SPIE: Bellingham, WA, USA, 2009; Volume 7133, pp. 941–951. [Google Scholar]
  382. Wang, Y.; Qiu, L.; Zhao, W. High precision radially-polarized-light pupil-filtering differential confocal measurement. Opt. Laser Technol. 2016, 82, 87–93. [Google Scholar] [CrossRef]
  383. Wu, G.W.; Chen, L.C. Precise 3-D microscopic profilometry using diffractive image microscopy and artificial neural network in single-exposure manner. Opt. Lasers Eng. 2021, 147, 106732. [Google Scholar] [CrossRef]
  384. Sheng, Z.; Wang, Y.; Zhao, W.; Qiu, L.; Sun, Y. Laser differential fitting confocal microscopy with high imaging efficiency. Appl. Opt. 2016, 55, 6903–6909. [Google Scholar] [CrossRef]
  385. Zhao, W.; Jiang, Q.; Qiu, L.; Liu, D. Dual-axes differential confocal microscopy with high axial resolution and long working distance. Opt. Commun. 2011, 284, 15–19. [Google Scholar] [CrossRef]
  386. Mauch, F.; Lyda, W.; Gronle, M.; Osten, W. Improved signal model for confocal sensors accounting for object depending artifacts. Opt. Express 2012, 20, 19936–19945. [Google Scholar] [CrossRef]
  387. Wang, Z.; Wang, T.; Yang, Y.; Yang, Y.; Mi, X.; Wang, J. Precise Two-Dimensional Tilt Measurement Sensor with Double-Cylindrical Mirror Structure and Modified Mean-Shift Algorithm for a Confocal Microscopy System. Sensors 2022, 22, 6794. [Google Scholar] [CrossRef]
  388. Cacace, L. An Optical Distance Sensor: Tilt Robust Differential Confocal Measurement with mm Range and nm Uncertainty; Technische Universiteit Eindhoven: Eindhoven, The Netherlands, 2009. [Google Scholar]
  389. Moharam, M.; Gaylord, T.K. Rigorous coupled-wave analysis of planar-grating diffraction. J. Opt. Soc. Am. 1981, 71, 811–818. [Google Scholar] [CrossRef]
  390. Weidner, P.; Kasic, A.; Hingst, T.; Ehlers, C.; Philipp, S.; Marschner, T.; Moert, M. Model-free and model-based methods for dimensional metrology during the lifetime of a product. In Proceedings of the Ninth International Symposium on Laser Metrology, Singapore, 30 June–2 July 2008; SPIE: Bellingham, WA, USA, 2008; Volume 7155, pp. 319–326. [Google Scholar]
  391. Azzam, R.M.; Bashara, N.M.; Ballard, S.S. Ellipsometry and polarized light. Opt. Acta Int. J. Opt. 1978, 25, 270–271. [Google Scholar] [CrossRef]
  392. Fujiwara, H. Spectroscopic Ellipsometry: Principles and Applications; John Wiley & Sons: Hoboken, NJ, USA, 2007. [Google Scholar]
  393. Minhas, B.K.; Coulombe, S.A.; Naqvi, S.S.H.; McNeil, J.R. Ellipsometric scatterometry for the metrology of sub-0.1-μm-linewidth structures. Appl. Opt. 1998, 37, 5112–5115. [Google Scholar] [CrossRef] [PubMed]
394. Coulombe, S.A.; Minhas, B.K.; Raymond, C.J.; Naqvi, S.S.H.; McNeil, J.R. Scatterometry measurement of sub-0.1 μm linewidth gratings. J. Vac. Sci. Technol. B Microelectron. Nanometer Struct. Process. Meas. Phenom. 1998, 16, 80–87. [Google Scholar]
395. Huang, H.T.; Kong, W.; Terry, F.L., Jr. Normal-incidence spectroscopic ellipsometry for critical dimension monitoring. Appl. Phys. Lett. 2001, 78, 3983–3985. [Google Scholar] [CrossRef]
  396. Niu, X.; Jakatdar, N.; Bao, J.; Spanos, C.J. Specular spectroscopic scatterometry. IEEE Trans. Semicond. Manuf. 2001, 14, 97–111. [Google Scholar] [CrossRef]
  397. Tompkins, H.G.; McGahan, W.A. Spectroscopic Ellipsometry and Reflectometry: A User’s Guide; Wiley: Hoboken, NJ, USA, 1999. [Google Scholar]
  398. Losurdo, M.; Bergmair, M.; Bruno, G.; Cattelan, D.; Cobet, C.; De Martino, A.; Fleischer, K.; Dohcevic-Mitrovic, Z.; Esser, N.; Galliet, M.; et al. Spectroscopic ellipsometry and polarimetry for materials and systems analysis at the nanometer scale: State-of-the-art, potential, and perspectives. J. Nanopart. Res. 2009, 11, 1521–1554. [Google Scholar] [CrossRef]
  399. Novikova, T.; Martino, A.D.; Bulkin, P.; Nguyen, Q.; Drévillon, B.; Popov, V.; Chumakov, A. Metrology of replicated diffractive optics with Mueller polarimetry in conical diffraction. Opt. Express 2007, 15, 2033–2046. [Google Scholar] [CrossRef]
  400. Novikova, T.; De Martino, A.; Ossikovski, R.; Drevillon, B. Metrological applications of Mueller polarimetry in conical diffraction for overlay characterization in microelectronics. Eur. Phys. J. Appl. Phys. 2005, 31, 63–69. [Google Scholar] [CrossRef]
  401. Novikova, T.; De Martino, A.; Hatit, S.B.; Drévillon, B. Application of Mueller polarimetry in conical diffraction for critical dimension measurements in microelectronics. Appl. Opt. 2006, 45, 3688–3697. [Google Scholar] [CrossRef]
  402. Schuster, T.; Rafler, S.; Osten, W.; Reinig, P.; Hingst, T. Scatterometry from crossed grating structures in different configurations. In Proceedings of the Modeling Aspects in Optical Metrology, Munich, Germany, 17–21 June 2007; SPIE: Bellingham, WA, USA, 2007; Volume 6617, pp. 343–351. [Google Scholar]
  403. Kim, Y.N.; Paek, J.S.; Rabello, S.; Lee, S.; Hu, J.; Liu, Z.; Hao, Y.; McGahan, W. Device based in-chip critical dimension and overlay metrology. Opt. Express 2009, 17, 21336–21343. [Google Scholar] [CrossRef] [PubMed]
  404. Collins, R.; Koh, J. Dual rotating-compensator multichannel ellipsometer: Instrument design for real-time Mueller matrix spectroscopy of surfaces and films. J. Opt. Soc. Am. A 1999, 16, 1997–2006. [Google Scholar] [CrossRef]
  405. Liu, S.; Chen, X.; Zhang, C. Development of a broadband Mueller matrix ellipsometer as a powerful tool for nanostructure metrology. Thin Solid Films 2015, 584, 176–185. [Google Scholar] [CrossRef]
  406. Garcia-Caurel, E.; De Martino, A.; Gaston, J.P.; Yan, L. Application of spectroscopic ellipsometry and Mueller ellipsometry to optical characterization. Appl. Spectrosc. 2013, 67, 1–21. [Google Scholar] [CrossRef] [PubMed]
  407. Chen, X.; Du, W.; Yuan, K.; Chen, J.; Jiang, H.; Zhang, C.; Liu, S. Development of a spectroscopic Mueller matrix imaging ellipsometer for nanostructure metrology. Rev. Sci. Instrum. 2016, 87, 053707. [Google Scholar] [CrossRef]
  408. Chen, X.-G.; Yuan, K.; Du, W.-C.; Chen, J.; Jing, H.; Zhang, C.-W.; Liu, S.-Y. Large-scale nanostructure metrology using Mueller matrix imaging ellipsometry. Acta Phys. Sin. 2016, 65, 070703. [Google Scholar] [CrossRef]
  409. Park, S.; Kim, E.; Kim, J.; An, I. Comparison null imaging ellipsometry using polarization rotator. Jpn. J. Appl. Phys. 2018, 57, 052501. [Google Scholar] [CrossRef]
  410. Torres, J.F.; Komiya, A.; Shoji, E.; Okajima, J.; Maruyama, S. Development of phase-shifting interferometry for measurement of isothermal diffusion coefficients in binary solutions. Opt. Lasers Eng. 2012, 50, 1287–1296. [Google Scholar] [CrossRef]
  411. Shoji, E.; Komiya, A.; Okajima, J.; Kubo, M.; Tsukada, T. Three-step phase-shifting imaging ellipsometry to measure nanofilm thickness profiles. Opt. Lasers Eng. 2019, 112, 145–150. [Google Scholar] [CrossRef]
  412. Chen, X.; Yuan, K.; Du, W.; Chen, J.; Jiang, H.; Zhang, C.; Liu, S. Large-Area Measurement of Geometrical Parameters of Nanostructures Using Mueller Matrix Imaging Ellipsometry. Acta Phys. Sin. 2016, 65, 070703. (In Chinese) [Google Scholar] [CrossRef]
  413. Kim, D.H.; Yun, Y.H.; Joo, K.N. LASIE: Large area spectroscopic imaging ellipsometry for characterizing multi-layered film structures. Int. J. Precis. Eng. Manuf. 2018, 19, 1125–1132. [Google Scholar] [CrossRef]
  414. Seo, Y.B.; Yun, Y.H.; Joo, K.N. 3D multi-layered film thickness profile measurements based on photometric type imaging ellipsometry. Int. J. Precis. Eng. Manuf. 2016, 17, 989–993. [Google Scholar] [CrossRef]
  415. Wu, J.; Zhang, Y.; Hu, P.; Wu, Y. A review of the application of hyperspectral imaging technology in agricultural crop economics. Coatings 2024, 14, 1285. [Google Scholar] [CrossRef]
  416. Meng, H.; Gao, Y.; Wang, X.; Li, X.; Wang, L.; Zhao, X.; Sun, B. Quantum dot-enabled infrared hyperspectral imaging with single-pixel detection. Light Sci. Appl. 2024, 13, 121. [Google Scholar] [CrossRef]
  417. Yang, Z.; Albrow-Owen, T.; Cui, H.; Alexander-Webber, J.; Gu, F.; Wang, X.; Wu, T.C.; Zhuge, M.; Williams, C.; Wang, P.; et al. Single-nanowire spectrometers. Science 2019, 365, 1017–1020. [Google Scholar] [CrossRef] [PubMed]
  418. Wen, J.; Gao, H.; Shi, W.; Feng, S.; Hao, L.; Liu, Y.; Xu, L.; Shao, Y.; Zhang, Y.; Shen, W.; et al. On-chip Real-time Hyperspectral Imager with Full CMOS Resolution Enabled by Massively Parallel Neural Network. arXiv 2024, arXiv:2404.09500. [Google Scholar]
419. Specim, Spectral Imaging Ltd. Specim IQ Portable VNIR Hyperspectral Imaging Camera; product page (spectral range 400–1000 nm, spectral resolution approximately 7 nm, 512 × 512 pixels; portable handheld device). Available online: https://www.specim.com/iq/tech-specs/ (accessed on 5 August 2025).
  420. Han, X.H.; Wang, J.; Jiang, H. Recent Advancements in Hyperspectral Image Reconstruction from a Compressive Measurement. Sensors 2025, 25, 3286. [Google Scholar] [CrossRef]
  421. Zhang, G.; Zhao, S.; Li, W.; Du, Q.; Ran, Q.; Tao, R. HTD-Net: A deep convolutional neural network for target detection in hyperspectral imagery. Remote Sens. 2020, 12, 1489. [Google Scholar] [CrossRef]
  422. Li, Y.; Hu, J.; Zhao, X.; Xie, W.; Li, J. Hyperspectral image super-resolution using deep convolutional neural network. Neurocomputing 2017, 266, 29–41. [Google Scholar] [CrossRef]
  423. Cheng, M.F.; Mukundan, A.; Karmakar, R.; Valappil, M.A.E.; Jouhar, J.; Wang, H.C. Modern Trends and Recent Applications of Hyperspectral Imaging: A Review. Technologies 2025, 13, 170. [Google Scholar] [CrossRef]
  424. Yan, Y.; Ren, J.; Zhao, H.; Windmill, J.F.; Ijomah, W.; De Wit, J.; Von Freeden, J. Non-destructive testing of composite fiber materials with hyperspectral imaging—Evaluative studies in the EU H2020 FibreEUse project. IEEE Trans. Instrum. Meas. 2022, 71, 6002213. [Google Scholar] [CrossRef]
  425. Fabelo, H.; Ortega, S.; Szolna, A.; Bulters, D.; Piñeiro, J.F.; Kabwama, S.; JO’Shanahan, A.; Bulstrode, H.; Bisshopp, S.; Kiran, B.R.; et al. In-vivo hyperspectral human brain image database for brain cancer detection. IEEE Access 2019, 7, 39098–39116. [Google Scholar] [CrossRef]
  426. Chabrillat, S.; Foerster, S.; Segl, K.; Beamish, A.; Brell, M.; Asadzadeh, S.; Milewski, R.; Ward, K.J.; Brosinsky, A.; Koch, K.; et al. The EnMAP spaceborne imaging spectroscopy mission: Initial scientific results two years after launch. Remote Sens. Environ. 2024, 315, 114379. [Google Scholar] [CrossRef]
  427. EO Portal. Pixxel Satellite Mission. Available online: https://www.eoportal.org/satellite-missions/pixxel (accessed on 5 August 2025).
  428. Bhargava, A.; Sachdeva, A.; Sharma, K.; Alsharif, M.H.; Uthansakul, P.; Uthansakul, M. Hyperspectral imaging and its applications: A review. Heliyon 2024, 10, e33208. [Google Scholar] [CrossRef] [PubMed]
  429. Díaz, M.; Guerra, R.; Horstrand, P.; Martel, E.; López, S.; López, J.F.; Sarmiento, R. Real-time hyperspectral image compression onto embedded GPUs. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 2, 2792–2809. [Google Scholar] [CrossRef]
  430. Eckstein, B.A.; Arlen, R. IEEE PROJECT 4001–Standards for Characterization and Calibration of Hyperspectral Imaging Devices. Proc. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, XLIV-M-3-2021, 43–47. [Google Scholar] [CrossRef]
  431. Allen, L.; Beijersbergen, M.W.; Spreeuw, R.; Woerdman, J. Orbital angular momentum of light and the transformation of Laguerre-Gaussian laser modes. Phys. Rev. A 1992, 45, 8185. [Google Scholar] [CrossRef]
  432. Lavery, M.P.; Speirits, F.C.; Barnett, S.M.; Padgett, M.J. Detection of a spinning object using light’s orbital angular momentum. Science 2013, 341, 537–540. [Google Scholar] [CrossRef] [PubMed]
  433. Luo, Z.; Liu, R.; Wu, L.; Ran, J.; Wang, Y.; Qin, S.; Lv, C.; Bai, Z.; Xu, X.; Wang, Y.; et al. Fiber angular displacement sensor utilizing orbital angular momentum beam interference. Opt. Lett. 2025, 50, 2759–2762. [Google Scholar] [CrossRef]
  434. Jeffries, G.D.; Edgar, J.S.; Zhao, Y.; Shelby, J.P.; Fong, C.; Chiu, D.T. Using polarization-shaped optical vortex traps for single-cell nanosurgery. Nano Lett. 2007, 7, 415–420. [Google Scholar] [CrossRef]
  435. Pan, Y.; Ou, J.; Chi, H. High-dimensional signal encoding and decoding method based on multi-ring perfect vortex beam. Opt. Express 2024, 32, 30800–30812. [Google Scholar] [CrossRef] [PubMed]
  436. Ma, H.; Li, X.; Tai, Y.; Li, H.; Wang, J.; Tang, M.; Tang, J.; Wang, Y.; Nie, Z. Generation of circular optical vortex array. Ann. Der Phys. 2017, 529, 1700285. [Google Scholar] [CrossRef]
  437. Chen, Y.; Xia, K.Y.; Shen, W.G.; Gao, J.; Yan, Z.Q.; Jiao, Z.Q.; Dou, J.P.; Tang, H.; Lu, Y.Q.; Jin, X.M. Vector vortex beam emitter embedded in a photonic chip. Phys. Rev. Lett. 2020, 124, 153601. [Google Scholar] [CrossRef]
  438. Du, J.; Quan, Z.; Li, K.; Wang, J. Optical vortex array: Generation and applications. Chin. Opt. Lett. 2024, 22, 020011. [Google Scholar] [CrossRef]
  439. Su, R.; Leach, R. Physics-based virtual coherence scanning interferometer for surface measurement. Light Adv. Manuf. 2021, 2, 120–135. [Google Scholar] [CrossRef]
  440. Giordani, T.; Suprano, A.; Polino, E.; Acanfora, F.; Innocenti, L.; Ferraro, A.; Paternostro, M.; Spagnolo, N.; Sciarrino, F. Machine learning-based classification of vector vortex beams. Phys. Rev. Lett. 2020, 124, 160401. [Google Scholar] [CrossRef]
  441. Panthong, P.; Srisuphaphon, S.; Pattanaporkratana, A.; Chiangga, S.; Deachapunya, S. A study of optical vortices inside the Talbot interferometer. arXiv 2015, arXiv:1510.00272. [Google Scholar] [CrossRef]
  442. Zhang, Z.; Zhao, P.; Liao, J.; Dai, M.; Zhou, J.; Xiao, W.; Chen, H. Vortex decomposition and reconfiguration via transformation optics. Phys. Rev. Appl. 2024, 21, 054006. [Google Scholar] [CrossRef]
  443. Gao, H.; Yang, D.; Hu, X.; He, W.; Yang, Z.; Liu, Z. High-precision micro-displacement measurement in a modified reversal shearing interferometer using vortex beams. Opt. Commun. 2023, 537, 129454. [Google Scholar] [CrossRef]
  444. Serrano-Trujillo, A.; Anderson, M.E. Surface profilometry using vortex beams generated with a spatial light modulator. Opt. Commun. 2018, 427, 557–562. [Google Scholar] [CrossRef]
  445. Passos, M.; Lemos, M.; Almeida, S.; Balthazar, W.; Da Silva, L.; Huguenin, J. Speckle patterns produced by an optical vortex and its application to surface roughness measurements. Appl. Opt. 2017, 56, 330–335. [Google Scholar] [CrossRef]
  446. Yang, Y.; Ren, Y.X.; Rosales-Guzmán, C. Optical Vortices: Fundamentals and Applications; IOP Publishing: Bristol, UK, 2024. [Google Scholar]
  447. Zuo, C.; Qian, J.; Feng, S.; Yin, W.; Li, Y.; Fan, P.; Han, J.; Qian, K.; Chen, Q. Deep learning in optical metrology: A review. Light Sci. Appl. 2022, 11, 39. [Google Scholar] [CrossRef]
  448. Wang, K.; Song, L.; Wang, C.; Ren, Z.; Zhao, G.; Dou, J.; Di, J.; Barbastathis, G.; Zhou, R.; Zhao, J.; et al. On the use of deep learning for phase recovery. Light Sci. Appl. 2024, 13, 4. [Google Scholar] [CrossRef] [PubMed]
  449. Liu, H.; Yan, N.; Shao, B.; Yuan, S.; Zhang, X. Deep learning in fringe projection: A review. Neurocomputing 2024, 581, 127493. [Google Scholar] [CrossRef]
  450. Wang, H.; Hu, J.; Wei, S.; Qu, Y. One-shot TSOM with a multi-task learning model for simultaneous dimension measurement and defect inspection. Opt. Lasers Eng. 2024, 181, 108345. [Google Scholar] [CrossRef]
  451. Houben, T.; Huisman, T.; Pisarenco, M.; Van Der Sommen, F.; de With, P. Training procedure for scanning electron microscope 3D surface reconstruction using unsupervised domain adaptation with simulated data. J. Micro/Nanopatterning Mater. Metrol. 2023, 22, 031208. [Google Scholar] [CrossRef]
  452. Liu, M. Scattering approaches. In Advances in Optical Surface Texture Metrology; IOP Publishing: Bristol, UK, 2020; pp. 1–6. [Google Scholar]
  453. Saba, A.; Gigli, C.; Ayoub, A.B.; Psaltis, D. Physics-informed neural networks for diffraction tomography. Adv. Photonics 2022, 4, 066001. [Google Scholar] [CrossRef]
  454. Gigli, C.; Saba, A.; Ayoub, A.B.; Psaltis, D. Predicting nonlinear optical scattering with physics-driven neural networks. Apl Photonics 2023, 8, 026105. [Google Scholar] [CrossRef]
  455. Medvedev, V.; Erdmann, A.; Rosskopf, A. Physics-informed deep learning for 3D modeling of light diffraction from optical metasurfaces. Opt. Express 2025, 33, 1371–1384. [Google Scholar] [CrossRef]
  456. Park, S.; Kim, Y.; Moon, I. Automated phase unwrapping in digital holography with deep learning. Biomed. Opt. Express 2021, 12, 7064–7081. [Google Scholar] [CrossRef]
  457. Fang, L.; Monroe, F.; Novak, S.W.; Kirk, L.; Schiavon, C.R.; Yu, S.B.; Zhang, T.; Wu, M.; Kastner, K.; Latif, A.A.; et al. Deep learning-based point-scanning super-resolution imaging. Nat. Methods 2021, 18, 406–416. [Google Scholar] [CrossRef] [PubMed]
  458. Chen, R.; Tang, X.; Zhao, Y.; Shen, Z.; Zhang, M.; Shen, Y.; Li, T.; Chung, C.H.Y.; Zhang, L.; Wang, J.; et al. Single-frame deep-learning super-resolution microscopy for intracellular dynamics imaging. Nat. Commun. 2023, 14, 2854. [Google Scholar] [CrossRef] [PubMed]
  459. White, J.; Wang, S.; Eschen, W.; Rothhardt, J. Real-time phase-retrieval and wavefront sensing enabled by an artificial neural network. Opt. Express 2021, 29, 9283–9293. [Google Scholar] [CrossRef]
  460. Fang, Q.; Xia, H.; Song, Q.; Zhang, M.; Guo, R.; Montresor, S.; Picart, P. Speckle denoising based on deep learning via a conditional generative adversarial network in digital holographic interferometry. Opt. Express 2022, 30, 20666–20683. [Google Scholar] [CrossRef]
  461. Yu, H.; Fang, Q.; Song, Q.; Montresor, S.; Picart, P.; Xia, H. Unsupervised speckle denoising in digital holographic interferometry based on 4-f optical simulation integrated cycle-consistent generative adversarial network. Appl. Opt. 2024, 63, 3557–3569. [Google Scholar] [CrossRef]
  462. Durech, E.; Newberry, W.; Franke, J.; Sarunic, M.V. Wavefront sensor-less adaptive optics using deep reinforcement learning. Biomed. Opt. Express 2021, 12, 5423–5438. [Google Scholar] [CrossRef]
  463. Landman, R.; Haffert, S.Y.; Radhakrishnan, V.M.; Keller, C.U. Self-optimizing adaptive optics control with reinforcement learning for high-contrast imaging. J. Astron. Telesc. Instruments Syst. 2021, 7, 039002. [Google Scholar] [CrossRef]
  464. Roos-Hoefgeest, S.; Roos-Hoefgeest, M.; Álvarez, I.; González, R.C. Reinforcement Learning Approach to Optimizing Profilometric Sensor Trajectories for Surface Inspection. Sensors 2025, 25, 2271. [Google Scholar] [CrossRef]
  465. Wong, J.C.; Chiu, P.H.; Ooi, C.C.; Da, M.H. Robustness of physics-informed neural networks to noise in sensor data. arXiv 2022, arXiv:2211.12042. [Google Scholar] [CrossRef]
Figure 1. Overview of application scenarios and technology classification of optical metrology and perception.
Figure 2. Interference-based measurement methods: (a) LIGO laser interferometer; (b) grating interferometer; (c) optical frequency comb; (d) Fizeau interferometer.
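As a minimal illustration of the interferometric principle behind Figure 2, the sketch below (an assumption for illustration, not code from any cited system; the function name is ours) converts an unwrapped interference phase into a displacement for a double-pass Michelson-type geometry, where a mirror motion d produces a phase shift of 4πd/λ:

```python
import math

def displacement_from_phase(unwrapped_phase_rad: float, wavelength_m: float) -> float:
    """Displacement (m) from unwrapped phase in a double-pass interferometer.

    The measurement arm is traversed twice, so one full fringe (2*pi of
    phase) corresponds to lambda/2 of mirror travel.
    """
    return unwrapped_phase_rad * wavelength_m / (4.0 * math.pi)
```

For a 633 nm He-Ne source, one fringe of unwrapped phase therefore maps to 316.5 nm of travel, which is why fringe counting alone already gives sub-micrometer resolution before any phase interpolation.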
Figure 3. Geometric optical imaging-based metrology: (a) the direct type of laser triangulation; (b) principle of TOF distance measurement; (c) binocular stereo vision; (d) structured light.
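The binocular stereo geometry of Figure 3c reduces, for a rectified camera pair, to the similar-triangles relation z = f·B/d. A minimal sketch (illustrative names and values; not from any cited implementation) is:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (m) of one matched point from a rectified binocular stereo rig.

    z = f * B / d, where f is the focal length in pixels, B the baseline
    between the two optical centers, and d the horizontal disparity in pixels.
    """
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Because depth varies inversely with disparity, range accuracy degrades quadratically with distance — the practical reason stereo rigs in Table 1-class applications favor wide baselines for far targets.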
Figure 4. Representative applications of computational imaging (CI): (a) comparison of conventional and computational optical imaging architecture; (b) CI systems employing multiple single-photon detectors for 3D profiling; (c) single-disperser coded aperture snapshot spectral imaging (CASSI); (d) compressed ultrafast photography (CUP).
Figure 5. Chromatic confocal technology and its applications: (a) CCT principle; (b) chromatic confocal sensor for contour measurement; (c) chromatic confocal sensor for workpiece thickness measurement.
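As Figure 5a illustrates, chromatic confocal technology encodes depth in the wavelength of the best-focused spectral component, so height recovery amounts to inverting a wavelength-to-depth calibration curve. A minimal decoding sketch, assuming a hypothetical three-point calibration table:

```python
# Chromatic confocal depth decoding sketch: dispersive optics map wavelength
# to focal depth; the peak wavelength of the returned spectrum is inverted
# through a calibration table. The calibration points below are hypothetical.

CALIB = [(450.0, 0.0), (550.0, 150.0), (650.0, 300.0)]  # (wavelength nm, depth um)

def depth_from_peak(wavelength_nm: float) -> float:
    """Piecewise-linear interpolation of the wavelength -> depth calibration."""
    for (w0, d0), (w1, d1) in zip(CALIB, CALIB[1:]):
        if w0 <= wavelength_nm <= w1:
            return d0 + (d1 - d0) * (wavelength_nm - w0) / (w1 - w0)
    raise ValueError("peak wavelength outside calibrated range")

print(depth_from_peak(500.0))  # 75.0 um
```

Real sensors fit the spectral peak (e.g., by centroid or Gaussian fitting) before this lookup; the normalization strategies discussed with Figure 6 address exactly that peak-extraction step.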
Figure 6. Research progress of chromatic confocal technology: (a) light source of chromatic confocal system: PCF device; (b) DOE-based chromatic confocal system; (c) detection methods of chromatic confocal system; (d) normalization strategies of chromatic confocal system: pre-scanned self-reference spectrum.
Figure 7. Comparison between (a) conventional optical microscopy and (b) confocal laser scanning microscopy.
Figure 8. General schemes of ellipsometers: (a) a standard ellipsometer; (b) an imaging–conoscopic Mueller polarimeter; (c) a three-step phase-shift imaging ellipsometer.
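The ellipsometer configurations in Figure 8 all ultimately recover the ellipsometric angles Ψ and Δ from the complex reflectance ratio ρ = r_p/r_s = tan(Ψ)·e^{iΔ}. A minimal sketch of that inversion, with hypothetical reflection coefficients:

```python
# Ellipsometric parameter sketch: given complex p- and s-polarized reflection
# coefficients (the values below are hypothetical), recover Psi and Delta from
# rho = r_p / r_s = tan(Psi) * exp(i * Delta).
import cmath
import math

def psi_delta(r_p: complex, r_s: complex) -> tuple[float, float]:
    """Return (Psi, Delta) in degrees from complex reflection coefficients."""
    rho = r_p / r_s
    psi = math.degrees(math.atan(abs(rho)))    # tan(Psi) = |rho|
    delta = math.degrees(cmath.phase(rho))     # Delta = arg(rho)
    return psi, delta

psi, delta = psi_delta(0.5 + 0.5j, 1.0 + 0.0j)
print(psi, delta)  # ~35.26, ~45.0
```

Film thickness and optical constants then follow by fitting a layer model to (Ψ, Δ) over wavelength or angle, which is where the instrument designs in Figure 8 differ.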
Table 1. Classification of optical measurement techniques across multiple spatial scales.
| Scale | Accuracy | Measurement Range | Technique | Typical Application |
|---|---|---|---|---|
| Nanometer | ±0.1 nm | 0.1–1 mm | White-light interferometry | Wafer surface topography inspection |
| Nanometer | ±1 nm | 0.01–0.5 mm | Fizeau interferometry | Optical component surface metrology |
| Micrometer | 0.2 μm | 0.1–5 mm | Laser confocal microscopy | Reverse engineering of precision parts |
| Micrometer | 0.1 μm | 1–10 mm | Spectral confocal microscopy | Thickness measurement of transparent materials |
| Micrometer | ±0.5 μm | 0.05–2 mm | Structured light (sinusoidal fringe) | High-reflectivity surface inspection |
| Millimeter | ±5 μm | 50–500 mm | Structured light (speckle encoding) | 3D reconstruction of automotive bodies |
| Millimeter | ±0.1 mm | 3–7 cm | Laser triangulation | Dimensional inspection of industrial parts |
| Centimeter | ±3 mm | 0.1–300 m | LiDAR (dToF) | Large-scale 3D industrial mapping |
| Centimeter | ±1 cm | 1–10 m | Stereo vision | Kinect motion capture |
| Decimeter | ±5 cm | 1–50 m | iToF phase ranging | VR gesture interaction |
| Meter | ±0.1% | 1–100 m | FMCW coherent ranging | Autonomous driving obstacle detection |
| Cross-scale | Pixel level | 0.1–100 m | Computational imaging | Imaging through scattering media |
| Cross-scale | Photon level | 1–1000 m | Quantum imaging | Single-photon night vision systems |
| Special scale | ±10 nm | 1–100 μm | Microscopic imaging | 3D reconstruction of biological cells |
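One practical reading of Table 1 is as a shortlisting aid: filter techniques by the accuracy a task requires and the working distance it involves. An illustrative selector over a few of the table's rows (the bounds are transcribed from the table; the helper function itself is hypothetical):

```python
# Illustrative technique shortlisting over a subset of Table 1.
# Each tuple mirrors a table row: (name, accuracy in m, range min/max in m).

TECHNIQUES = [
    ("White-light interferometry", 0.1e-9, 0.1e-3, 1e-3),
    ("Laser confocal microscopy", 0.2e-6, 0.1e-3, 5e-3),
    ("Laser triangulation", 0.1e-3, 0.03, 0.07),
    ("LiDAR (dToF)", 3e-3, 0.1, 300.0),
]

def candidates(required_accuracy_m: float, target_distance_m: float) -> list[str]:
    """Techniques at least as accurate as required whose range covers the target."""
    return [
        name for name, acc, lo, hi in TECHNIQUES
        if acc <= required_accuracy_m and lo <= target_distance_m <= hi
    ]

print(candidates(5e-3, 50.0))  # ['LiDAR (dToF)']
```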
Table 2. Common Spectral Detection Techniques.
| Category | Typical Methods | Measurement Principle | Characteristics and Limitations |
|---|---|---|---|
| Absorption/Transmission | UV–Vis, NIR spectroscopy | Measures light absorption or transmission at specific wavelengths | Offers fast data acquisition and simple setup; however, spatial resolution is limited for material composition analysis |
| Reflection Spectroscopy | White-light reflectance, RIFS | Analyzes wavelength-dependent surface reflection properties | Provides surface sensitivity but limited morphological discrimination; requires careful calibration for quantitative use |
| Raman Spectroscopy | Raman microscopy, confocal imaging | Detects inelastic scattering frequency shifts induced by molecular vibrations | Enables chemical mapping with high specificity, but the inherently weak Raman signals demand long acquisition times |
| Fluorescence Spectroscopy | Time-resolved micro-fluorescence | Detects excitation–emission spectra from fluorescent markers | Achieves high sensitivity and selectivity; however, fluorophore tagging is often required and may alter sample properties |
| Dispersive Focus Profiling | Chromatic confocal technology (CCT) | Encodes depth information via wavelength-dependent focal shifts | Provides non-contact, tilt-tolerant 3D profiling for complex or transparent surfaces |
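For the Raman row of Table 2, the measured quantity is the inelastic frequency shift, conventionally reported in wavenumbers: Δν̃ = 1/λ_exc − 1/λ_scat, converted from nm to cm⁻¹. A minimal conversion sketch with illustrative wavelengths:

```python
# Raman shift sketch for Table 2: Stokes shift in wavenumbers,
# delta_nu = (1/lambda_excitation - 1/lambda_scattered), converted from
# nm^-1 to cm^-1 via the factor 1e7. Wavelengths below are illustrative.

def raman_shift_cm1(excitation_nm: float, scattered_nm: float) -> float:
    """Raman shift in cm^-1 from excitation and scattered wavelengths in nm."""
    return (1.0 / excitation_nm - 1.0 / scattered_nm) * 1e7

# 532 nm excitation scattered to 563.5 nm -> ~1051 cm^-1
print(round(raman_shift_cm1(532.0, 563.5), 1))
```

Positive values correspond to Stokes scattering (scattered light red-shifted from the excitation line); anti-Stokes lines give negative values with this sign convention.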
Shan, S.; Zhao, F.; Li, Z.; Luo, L.; Li, X. A Comprehensive Review of Optical Metrology and Perception Technologies. Sensors 2025, 25, 6811. https://doi.org/10.3390/s25226811