
Search Results (28)

Search Parameters:
Keywords = polarization pixelated camera

17 pages, 7946 KiB  
Article
Optical Camera Characterization for Feature-Based Navigation in Lunar Orbit
by Pierluigi Federici, Antonio Genova, Simone Andolfo, Martina Ciambellini, Riccardo Teodori and Tommaso Torrini
Aerospace 2025, 12(5), 374; https://doi.org/10.3390/aerospace12050374 - 26 Apr 2025
Viewed by 503
Abstract
Accurate localization is a key requirement for deep-space exploration, enabling spacecraft operations with limited ground support. Upcoming commercial and scientific missions to the Moon are designed to extensively use optical measurements during low-altitude orbital phases, descent and landing, and high-risk operations, due to the versatility and suitability of these data for onboard processing. Navigation frameworks based on optical data analysis have been developed to support semi- or fully-autonomous onboard systems, enabling precise relative localization. To achieve high-accuracy navigation, optical data have been combined with complementary measurements using sensor fusion techniques. Absolute localization is further supported by integrating onboard maps of cataloged surface features, enabling position estimation in an inertial reference frame. This study presents a navigation framework for optical image processing aimed at supporting the autonomous operations of lunar orbiters. The primary objective is a comprehensive characterization of the navigation camera’s properties and performance to ensure orbit determination uncertainties remain below 1% of the spacecraft altitude. In addition to an analysis of measurement noise, which accounts for both hardware and software contributions and is evaluated across multiple levels consistent with prior literature, this study emphasizes the impact of process noise on orbit determination accuracy. The mismodeling of orbital dynamics significantly degrades orbit estimation performance, even in scenarios involving high-performing navigation cameras. To evaluate the trade-off between measurement and process noise, representing the relative accuracy of the navigation camera and the onboard orbit propagator, numerical simulations were carried out in a synthetic lunar environment using a near-polar, low-altitude orbital configuration. 
Under nominal conditions, the optical measurement noise was set to 2.5 px, corresponding to a ground resolution of approximately 160 m based on the focal length, pixel pitch, and altitude of the modeled camera. With a conservative process noise model, position errors of about 200 m are observed in both transverse and normal directions. The results demonstrate the estimation framework’s robustness to modeling uncertainties, adaptability to varying measurement conditions, and potential to support increased onboard autonomy for small spacecraft in deep-space missions.
(This article belongs to the Special Issue Planetary Exploration)
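As a sanity check on the abstract’s numbers, the 2.5 px noise level maps to a ground distance through simple pinhole geometry. A minimal sketch; the pixel pitch, focal length, and altitude below are illustrative assumptions chosen to reproduce the ~160 m figure, not values taken from the paper:

```python
def ground_resolution_m(noise_px, pixel_pitch_m, focal_length_m, altitude_m):
    """Project a pixel-level measurement error to a ground distance
    using the pinhole camera model."""
    ifov_rad = pixel_pitch_m / focal_length_m  # angular extent of one pixel
    return noise_px * ifov_rad * altitude_m

# Assumed parameters: 5.5 um pitch, 8.6 mm focal length, 100 km altitude.
error_m = ground_resolution_m(2.5, 5.5e-6, 8.6e-3, 100e3)  # ~160 m
```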

21 pages, 4607 KiB  
Article
Polarization-Modulated Optical Homodyne for Time-of-Flight Imaging with Standard CMOS Sensors
by Ayaka Ebisu, Takahito Aoto and Tsuyoshi Takatani
Sensors 2025, 25(6), 1886; https://doi.org/10.3390/s25061886 - 18 Mar 2025
Viewed by 499
Abstract
Indirect time-of-flight (iToF) imaging is a widely applied technique to obtain a depth image from the phase difference of amplitude-modulated signals between emitted light and reflected light. The phase difference is computed via electrical correlation on a conventional iToF sensor. However, iToF sensors face a trade-off between spatial resolution and light collection efficiency because it is hard to downsize the circuit of the electrical correlation in a pixel. Thus, we propose a novel iToF depth imaging system based on polarization-modulated optical homodyne detection with a standard CMOS sensor. A resonant photoelastic modulator is employed to modulate the polarization state, enabling optical correlation through interaction with an analyzer. The homodyne detection enhances noise resistance and sensitivity in the phase difference estimation. Furthermore, the use of a polarization camera allows the number of measurements to be reduced. We first validate the successful estimation of the phase difference in two setups, one with an avalanche photodiode and one with a CMOS sensor. The experimental results show accurate depth estimation even under challenging conditions such as a low signal-to-noise ratio, temporal intensity variations, and speckle noise. The proposed system enables high-resolution iToF depth imaging using readily available image sensors.
(This article belongs to the Special Issue Recent Advances in CMOS Image Sensor)
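In this paper the correlation is performed optically (modulator plus analyzer) rather than in-pixel, but the phase-to-depth arithmetic is the standard four-bucket iToF estimator. A hedged sketch, with assumed function names and an arbitrary modulation frequency:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth_m(i0, i90, i180, i270, f_mod_hz):
    """Four-bucket iToF: correlation samples at 0/90/180/270 degree delays
    yield the modulation phase, which maps linearly to depth."""
    phase = math.atan2(i90 - i270, i0 - i180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod_hz)
```

For example, with a 10 MHz modulation a quarter-cycle phase shift corresponds to roughly 3.75 m of depth.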

23 pages, 6487 KiB  
Article
Synchronous Atmospheric Correction of Wide-Swath and Wide-Field Remote Sensing Image from HJ-2A/B Satellite
by Honglian Huang, Yuxuan Wang, Xiao Liu, Rufang Ti, Xiaobing Sun, Zhenhai Liu, Xuefeng Lei, Jun Lin and Lanlan Fan
Remote Sens. 2025, 17(5), 884; https://doi.org/10.3390/rs17050884 - 1 Mar 2025
Viewed by 1081
Abstract
The Chinese HuanjingJianzai-2 (HJ-2) A/B satellites are equipped with advanced sensors, including a Multispectral Camera (MSC) and a Polarized Scanning Atmospheric Corrector (PSAC). To address the challenges of atmospheric correction (AC) for the MSC’s wide-swath, wide-field images, this study proposes a pixel-by-pixel method incorporating Bidirectional Reflectance Distribution Function (BRDF) effects. The approach uses synchronous atmospheric parameters from the PSAC, an atmospheric correction lookup table, and a semi-empirical BRDF model to produce surface reflectance (SR) products through radiative, adjacency effect, and BRDF corrections. The corrected images showed significant improvements in clarity and contrast compared to pre-correction images, with minimum increases of 55.91% and 35.63%, respectively. Validation experiments in Dunhuang and Hefei, China, demonstrated high consistency between the corrected SR and ground-truth data, with maximum deviations below 0.03. For surface types not covered by ground measurements, comparisons with Sentinel-2 SR products yielded maximum deviations below 0.04. These results highlight the effectiveness of the proposed method in improving image quality and accuracy, providing reliable data support for applications such as disaster monitoring, water resource management, and crop monitoring.
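The pixel-by-pixel correction described above rests on the standard single-layer atmospheric model, in which top-of-atmosphere reflectance equals path reflectance plus a transmitted, multiply-reflected surface term. A minimal sketch of that inversion (BRDF and adjacency corrections omitted; variable names are assumptions, with all inputs supplied per pixel from the PSAC-driven lookup table):

```python
def surface_reflectance(rho_toa, rho_path, t_down, t_up, s_albedo):
    """Invert the single-layer AC equation
    rho_toa = rho_path + t_down*t_up*rho_s / (1 - s_albedo*rho_s)
    for the surface reflectance rho_s."""
    y = (rho_toa - rho_path) / (t_down * t_up)
    return y / (1 + s_albedo * y)
```

A quick round trip through the forward model confirms the algebra: a surface reflectance of 0.2 under path reflectance 0.05, transmittances 0.8 and 0.9, and spherical albedo 0.1 produces a TOA reflectance near 0.197, which the inversion maps back to 0.2.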

17 pages, 4378 KiB  
Article
Snapshot Imaging of Stokes Vector Polarization Speckle in Turbid Optical Phantoms and In Vivo Tissues
by Daniel C. Louie, Carla Kulcsar, Héctor A. Contreras-Sánchez, W. Jeffrey Zabel, Tim K. Lee and Alex Vitkin
Photonics 2025, 12(1), 59; https://doi.org/10.3390/photonics12010059 - 11 Jan 2025
Viewed by 1192
Abstract
Significance: We present a system to measure and analyze the complete polarization state distribution of speckle patterns generated from in vivo tissue. Accurate measurement of polarization speckle requires both precise spatial registration and rapid polarization state acquisition. A unique measurement system must be designed to achieve accurate images of polarization speckle patterns for detailed investigation of the scattering properties of biological tissues in vivo. Aim and approach: This system features a polarization state analyzer with no moving parts. Two pixel-polarizer cameras allow for the instantaneous acquisition of the spatial Stokes vector distribution of polarization speckle patterns. System design and calibration methods are presented, and representative images from measurements on liquid phantoms (microsphere suspensions) and in vivo healthy and tumor murine models are demonstrated and discussed. Results and Conclusions: Quantitative measurements of polarization speckle from microsphere suspensions with controlled scattering coefficients demonstrate differences in speckle contrast, speckle size, and the degree of polarization. Measurements on in vivo murine skin and xenograft tumor tissue demonstrate the ability of the system to acquire snapshot polarization speckle images in living systems. The developed system can thus rapidly and accurately acquire polarization speckle images from different media in dynamic conditions such as in vivo tissue. This capability opens the potential for future detailed investigation of polarization speckle for in vivo biomedical applications.
(This article belongs to the Special Issue New Shining Spots in Biomedical Photonics)
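Each pixel-polarizer camera reports four linear analyzer channels per superpixel, from which the linear Stokes components, degree of linear polarization, and speckle contrast follow directly. A sketch of these standard relations (the paper’s complete Stokes measurement, including circular polarization, additionally requires its second camera):

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from the four micro-polarizer channels."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

def dolp(s0, s1, s2):
    """Degree of linear polarization per pixel."""
    return np.sqrt(s1**2 + s2**2) / s0

def speckle_contrast(intensity):
    """Speckle contrast C = sigma / mean of an intensity image."""
    return np.std(intensity) / np.mean(intensity)
```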

15 pages, 7201 KiB  
Article
Using Light Polarization to Identify Fiber Orientation in Carbon Fiber Components: Metrological Analysis
by Luciano Chiominto, Giulio D’Emilia and Emanuela Natale
Sensors 2024, 24(17), 5685; https://doi.org/10.3390/s24175685 - 31 Aug 2024
Viewed by 1511
Abstract
In this work, a method for measuring tow angles in carbon fiber components, based on the use of a polarized camera, is analyzed from a metrological point of view. Carbon fibers alter the direction of the reflected light’s electric field, so that at each point on the surface of a composite piece, the angle of polarization of the reflected light matches the fiber orientation. A statistical analysis of the angle of linear polarization (AoLP) in each pixel of each examined area makes it possible to evaluate the average winding angle. An evaluation of the measurement uncertainty of the method on a cylinder produced by a filament winding process is carried out, and the results indicate that the method is adequate for studying the distribution of angles across the surface of the piece in order to optimize the process.
(This article belongs to the Special Issue Advanced Sensing Technology in Structural Health Monitoring)
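The per-pixel polarization angle and its average over an area follow the usual axial-statistics treatment. The abstract does not spell out how the averaging is done, so the doubled-angle circular mean below is an assumed (though standard) choice for angles with a 180° period:

```python
import numpy as np

def aolp_deg(s1, s2):
    """Angle of linear polarization per pixel, in [0, 180) degrees."""
    return np.degrees(0.5 * np.arctan2(s2, s1)) % 180.0

def mean_fiber_angle_deg(angles_deg):
    """Average of axial (period-180) angles via the doubled-angle
    circular mean, so 179 deg and 1 deg average to 0 deg, not 90 deg."""
    doubled = np.radians(2.0 * np.asarray(angles_deg, dtype=float))
    mean = 0.5 * np.arctan2(np.sin(doubled).mean(), np.cos(doubled).mean())
    return np.degrees(mean) % 180.0
```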

12 pages, 2854 KiB  
Article
Compact Single-Shot Dual-Wavelength Interferometry for Large Object Measurement with Rough Surfaces
by Yizhang Yan, Suhas P. Veetil, Pengfei Zhu, Feng Gao, Yan Kong, Xiaoliang He, Aihui Sun, Zhilong Jiang and Cheng Liu
Photonics 2024, 11(6), 518; https://doi.org/10.3390/photonics11060518 - 28 May 2024
Cited by 2 | Viewed by 1545
Abstract
Single-shot dual-wavelength interferometry offers a promising avenue for surface profile measurement of dynamic objects. However, current techniques employing pixel multiplexing or color cameras encounter challenges such as complex optical alignment, limited measurement range, and difficulty in measuring rough surfaces. To address these issues, this study presents a novel approach to single-shot dual-wavelength interferometry. By utilizing separated polarization illumination and detection, along with a monochromatic polarization camera and two slightly different wavelengths, this method enables the simultaneous recording of two frames of separated interferometric patterns. This approach facilitates straightforward optical alignment, expands measurement ranges, accelerates data acquisition, and simplifies data processing for dual-wavelength interferometry. Consequently, it enables online shape measurement of large dynamic samples with rough surfaces.
(This article belongs to the Special Issue Recent Advances in 3D Optical Measurement)
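The expanded measurement range of dual-wavelength interferometry comes from the beat (synthetic) wavelength of the two closely spaced sources. A minimal sketch; the 633/632 nm pair in the example is illustrative, not the paper’s actual wavelengths:

```python
import math

def synthetic_wavelength(lam1, lam2):
    """Beat wavelength formed by two closely spaced wavelengths."""
    return lam1 * lam2 / abs(lam1 - lam2)

def height_from_phase(delta_phi, lam_synth):
    """Height from the synthetic-wavelength phase difference in
    reflection geometry (one 2*pi fringe spans lam_synth / 2)."""
    return delta_phi * lam_synth / (4 * math.pi)
```

For example, 633 nm and 632 nm beat to a synthetic wavelength of about 0.4 mm, extending the unambiguous range by roughly three orders of magnitude over either single wavelength.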

20 pages, 6947 KiB  
Article
Fusion of Multimodal Imaging and 3D Digitization Using Photogrammetry
by Roland Ramm, Pedro de Dios Cruz, Stefan Heist, Peter Kühmstedt and Gunther Notni
Sensors 2024, 24(7), 2290; https://doi.org/10.3390/s24072290 - 3 Apr 2024
Cited by 3 | Viewed by 2373
Abstract
Multimodal sensors capture and integrate diverse characteristics of a scene to maximize information gain. In optics, this may involve capturing intensity in specific spectra or polarization states to determine factors such as material properties or an individual’s health conditions. Combining multimodal camera data with shape data from 3D sensors is a challenging issue. Multimodal cameras, e.g., hyperspectral cameras, or cameras outside the visible light spectrum, e.g., thermal cameras, fall far short of state-of-the-art photo cameras in resolution and image quality. In this article, a new method is demonstrated to superimpose multimodal image data onto a 3D model created by multi-view photogrammetry. While a high-resolution photo camera captures a set of images from varying view angles to reconstruct a detailed 3D model of the scene, low-resolution multimodal camera(s) simultaneously record the scene. All cameras are pre-calibrated and rigidly mounted on a rig, i.e., their imaging properties and relative positions are known. The method was realized in a laboratory setup consisting of a professional photo camera, a thermal camera, and a 12-channel multispectral camera. In our experiments, an accuracy better than one pixel was achieved for the data fusion using multimodal superimposition. Finally, application examples of multimodal 3D digitization are demonstrated, and further steps to system realization are discussed.
(This article belongs to the Special Issue Multi-Modal Image Processing Methods, Systems, and Applications)

12 pages, 10570 KiB  
Article
Parallel Phase-Shifting Digital Holographic Phase Imaging of Micro-Optical Elements with a Polarization Camera
by Bingcai Liu, Xinmeng Fang, Ailing Tian, Siqi Wang, Ruixuan Zhang, Hongjun Wang and Xueliang Zhu
Photonics 2023, 10(12), 1291; https://doi.org/10.3390/photonics10121291 - 23 Nov 2023
Cited by 6 | Viewed by 2422
Abstract
In this paper, we propose a measurement method for micro-optical elements based on parallel phase-shifting digital holographic phase imaging. This method records four phase-shifting holograms with a phase difference of π/2 in a single shot and corrects the pixel mismatch error of the polarization camera using a bilinear interpolation algorithm, thereby producing high-resolution four-step phase-shifting holograms. The real phase information of the measured object is then reconstructed through a four-step phase-shifting algorithm. The reproduced image eliminates the interference of zero-order and conjugate images, overcoming the limitation that traditional phase-shifting digital holography cannot perform measurements in real time. A simulation analysis showed that the relative error of this measurement method could reach 0.0051%. In an experimental measurement of a microlens array, the accurate surface topography of the object was reconstructed. Multiple measurements yielded a mean absolute error and a mean relative error for the vertical height of the microlens array down to 5.9500 nm and 0.0461%, respectively.
(This article belongs to the Special Issue Photodetector Materials and Optoelectronic Devices)
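The two ingredients named in the abstract — bilinear correction of the polarizer mosaic’s pixel mismatch and four-step phase recovery — can be sketched as follows. The array layout (0°/45°/90°/135° channels at fixed offsets in a 2×2 mosaic) and all function names are assumptions, not the paper’s implementation:

```python
import numpy as np

def demosaic_channel(raw, row_off, col_off):
    """Pull one analyzer channel (at offset row_off, col_off in the 2x2
    polarizer mosaic) and bilinearly interpolate it back to full sensor
    resolution, compensating the spatial mismatch between channels."""
    h, w = raw.shape
    sr = np.arange(row_off, h, 2)           # rows holding this channel
    sc = np.arange(col_off, w, 2)           # columns holding this channel
    sub = raw[row_off::2, col_off::2].astype(float)
    # separable linear interpolation: columns first, then rows
    tmp = np.array([np.interp(np.arange(w), sc, row) for row in sub])
    return np.array([np.interp(np.arange(h), sr, col) for col in tmp.T]).T

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase-shifting with frames at 0, pi/2, pi, 3pi/2 shifts."""
    return np.arctan2(i4 - i2, i1 - i3)
```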

22 pages, 10296 KiB  
Article
Joint Calibration of a Multimodal Sensor System for Autonomous Vehicles
by Jon Muhovič and Janez Perš
Sensors 2023, 23(12), 5676; https://doi.org/10.3390/s23125676 - 17 Jun 2023
Cited by 4 | Viewed by 2808
Abstract
Multimodal sensor systems require precise calibration if they are to be used in the field. Due to the difficulty of obtaining the corresponding features from different modalities, the calibration of such systems is an open problem. We present a systematic approach for calibrating a set of cameras with different modalities (RGB, thermal, polarization, and dual-spectrum near infrared) with regard to a LiDAR sensor using a planar calibration target. Firstly, a method for calibrating a single camera with regard to the LiDAR sensor is proposed. The method is usable with any modality, as long as the calibration pattern is detected. A methodology for establishing a parallax-aware pixel mapping between different camera modalities is then presented. Such a mapping can then be used to transfer annotations, features, and results between highly differing camera modalities to facilitate feature extraction and deep detection and segmentation methods.
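A parallax-aware mapping differs from a plain homography in that it uses per-pixel depth. Given the calibrated intrinsics and extrinsics the paper establishes, the core projection chain might look like the following hypothetical sketch:

```python
import numpy as np

def map_pixel(u, v, depth_m, K_a, K_b, R_ab, t_ab):
    """Map pixel (u, v) seen by camera A, with metric depth (e.g. from
    LiDAR), into camera B: back-project, rigidly transform, reproject."""
    p_a = depth_m * (np.linalg.inv(K_a) @ np.array([u, v, 1.0]))
    p_b = R_ab @ p_a + np.asarray(t_ab, dtype=float)
    uvw = K_b @ p_b
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

Because the resulting shift depends on depth, two cameras with a physical baseline cannot be aligned by a single image-to-image warp; the depth supplies the missing parallax term.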

14 pages, 5214 KiB  
Article
High-Resolution, Broad-Range Detection Setup for Polarimetric Optical Fiber Sensors
by Paweł Wierzba
Appl. Sci. 2023, 13(8), 4849; https://doi.org/10.3390/app13084849 - 12 Apr 2023
Viewed by 1689
Abstract
A common-path polarization interferometer using a Wollaston prism and an area detector for the measurement of retardation or optical path difference is presented. Employing a moderate-resolution 1280 by 1024 pixel monochrome camera, it offers a measurement range of approximately 780 radians at 830 nm and 1350 radians at 515 nm while maintaining a high measurement resolution. Retardation introduced by a zero-order waveplate or a Soleil–Babinet compensator was measured to evaluate the performance of the interferometer. Based on the presented measurement results, the resolution of the measurement is estimated to be better than 0.002 rad.
(This article belongs to the Section Optics and Lasers)

16 pages, 4053 KiB  
Article
Real-Time Ellipsometric Surface Plasmon Resonance Sensor Using Polarization Camera May Provide the Ultimate Detection Limit
by Nipun Vashistha, Marwan J. Abuleil, Anand M. Shrivastav, Aabha Bajaj and Ibrahim Abdulhalim
Biosensors 2023, 13(2), 173; https://doi.org/10.3390/bios13020173 - 22 Jan 2023
Cited by 9 | Viewed by 3956
Abstract
Ellipsometric Surface Plasmon Resonance (SPR) sensors are known for their relatively simple optical configuration compared to interferometric and optical heterodyne phase interrogation techniques. However, most previously explored ellipsometric SPR sensors based on intensity measurements are limited in real-time applications because phase or polarization shifts are acquired serially. Here we present an ellipsometric SPR sensor based on a Kretschmann–Raether (KR) diverging beam configuration and a pixelated microgrid polarization camera. The proposed methodology has the advantage of real-time, higher-precision sensing. The short-term stability of the measurement using the ellipsometric parameters tanψ and cos(Δ) is found to be superior to direct SPR or intensity measurements, particularly with fluctuating sources such as laser diodes. Refractive index and dynamic change measurements in real time are presented, together with Bovine Serum Albumin (BSA)–anti-BSA antibody binding, to demonstrate the potential of the developed sensor for biological sensing applications, with a resolution of sub-nM and down to pM with additional optimization. The analysis shows that this approach may provide the ultimate detection limit for SPR sensors.
(This article belongs to the Special Issue Advanced Surface Plasmon Resonance Sensor and Its Application)
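With a microgrid polarization camera, tanψ and cos(Δ) fall out of the four analyzer channels in a single snapshot. A sketch assuming the p field lies along the 0° analyzer, the s field along 90°, and equal incident p/s amplitudes — the paper’s exact optical configuration may differ:

```python
import math

def ellipsometric_params(i0, i45, i90, i135):
    """tan(psi) = |E_p|/|E_s| from the 0/90 deg channels; cos(Delta)
    from the p-s interference term seen by the 45/135 deg channels."""
    tan_psi = math.sqrt(i0 / i90)
    cos_delta = (i45 - i135) / (2.0 * math.sqrt(i0 * i90))
    return tan_psi, cos_delta
```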

11 pages, 2416 KiB  
Article
Pixelated Micropolarizer Array Based on Carbon Nanotube Films
by Hui Zhang, Yanji Yi, Yibin Wang, Huwang Hou, Ting Meng, Peng Zhang and Yang Zhao
Nanomaterials 2023, 13(3), 391; https://doi.org/10.3390/nano13030391 - 18 Jan 2023
Cited by 1 | Viewed by 1902
Abstract
A micropolarizer array (MPA) that can be integrated into a scientific camera is proposed as a real-time polarimeter capable of extracting the polarization parameters. The MPA is based on highly aligned carbon nanotube (CNT) films, motivated by their characteristic anisotropy and selectivity for light propagation over a wide spectral range. The MPA contains a dual-tier CNT pixel plane with 0° and 45° orientations. The thickness of the dual-tier structure of the CNT-based MPA is limited to less than 2 μm with a pixel size of 7.45 μm × 7.45 μm. The degree of polarization of the CNT-MPA reached 93% at a 632 nm wavelength. The specific designs in structure and the semiconductor fabrication procedures are described. Compared with customary MPAs, the CNT-based MPA holds great potential for decreasing the cross-talk risk associated with lower film thickness and can be extended to a wide spectral range.
(This article belongs to the Special Issue 2D Materials for Advanced Sensors: Fabrication and Applications)

17 pages, 16219 KiB  
Article
Development of an Analog Gauge Reading Solution Based on Computer Vision and Deep Learning for an IoT Application
by João Peixoto, João Sousa, Ricardo Carvalho, Gonçalo Santos, Joaquim Mendes, Ricardo Cardoso and Ana Reis
Telecom 2022, 3(4), 564-580; https://doi.org/10.3390/telecom3040032 - 14 Oct 2022
Cited by 10 | Viewed by 6327
Abstract
In many industries, analog gauges are monitored manually, thus posing problems, especially in large facilities where gauges are often placed in hard-to-access or dangerous locations. This work proposes a solution based on a microcontroller (ESP32-CAM) and a camera (OV2640 with a 65° FOV lens) to capture a gauge image and send it to a local computer, where it is processed and the results are presented in a dashboard accessible through the web. This was achieved by first applying a Convolutional Neural Network (CNN) to detect the gauge with the CenterNet HourGlass104 model. After locating the dial, it is segmented using the circle Hough transform, followed by a polar transformation to determine the pointer angle using the pixel projection. In the end, the indicated value is determined using the angle method. The dataset used was composed of 204 gauge images split into train and test sets using a 70:30 ratio. Due to the small size of the dataset, a diverse set of data augmentations was applied to obtain high accuracy and a well-generalized gauge detection model. Additionally, the experimental results demonstrated adequate robustness and accuracy for industrial environments, achieving an average relative error of 0.95%.
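The final "angle method" step in the pipeline above is simply a linear map from pointer angle to scale value. A minimal sketch with hypothetical calibration angles (the paper does not specify its gauge's scale geometry):

```python
def gauge_value(theta_deg, theta_min_deg, theta_max_deg, v_min, v_max):
    """Angle method: linearly interpolate the pointer angle between the
    calibrated start and end angles of the gauge scale."""
    frac = (theta_deg - theta_min_deg) / (theta_max_deg - theta_min_deg)
    return v_min + frac * (v_max - v_min)

# e.g. a 0-10 bar gauge whose scale runs from 45 deg to 315 deg
reading = gauge_value(135.0, 45.0, 315.0, 0.0, 10.0)  # one third of full scale
```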

20 pages, 14751 KiB  
Article
Unifying Obstacle Detection, Recognition, and Fusion Based on the Polarization Color Stereo Camera and LiDAR for the ADAS
by Ningbo Long, Han Yan, Liqiang Wang, Haifeng Li and Qing Yang
Sensors 2022, 22(7), 2453; https://doi.org/10.3390/s22072453 - 23 Mar 2022
Cited by 14 | Viewed by 5522
Abstract
The perception module plays an important role in vehicles equipped with advanced driver-assistance systems (ADAS). This paper presents a multi-sensor data fusion system based on a polarization color stereo camera and a forward-looking light detection and ranging (LiDAR) sensor, which achieves multiple-target detection, recognition, and data fusion. The You Only Look Once v4 (YOLOv4) network is utilized for object detection and recognition on the color images. The depth images are obtained from the rectified left and right images based on the principle of epipolar constraints, and obstacles are then detected from the depth images using the MeanShift algorithm. The pixel-level polarization images are extracted from the raw polarization-grey images, enabling successful detection of water hazards. The PointPillars network is employed to detect objects from the point cloud. The calibration and synchronization between the sensors are accomplished. The experimental results show that the data fusion enriches the detection results, provides high-dimensional perceptual information, and extends the effective detection range. Meanwhile, the detection results are stable under diverse range and illumination conditions.
(This article belongs to the Section Optical Sensors)

10 pages, 6024 KiB  
Communication
A Simple Phase-Sensitive Surface Plasmon Resonance Sensor Based on Simultaneous Polarization Measurement Strategy
by Meng-Chi Li, Kai-Ren Chen, Chien-Cheng Kuo, Yu-Xen Lin and Li-Chen Su
Sensors 2021, 21(22), 7615; https://doi.org/10.3390/s21227615 - 16 Nov 2021
Cited by 15 | Viewed by 3095
Abstract
The SPR phenomenon results in an abrupt change in the optical phase, such that one can measure the phase shift of the reflected light as a sensing parameter. Moreover, many studies have demonstrated that the phase changes more acutely than the intensity, leading to a higher sensitivity to refractive index change. However, the optical phase cannot currently be measured directly because of its high frequency; therefore, investigators usually have to use complicated techniques to extract the phase information. In this study, we propose a simple and effective strategy for measuring the SPR phase shift based on phase-shift interferometry. In this system, the polarization-dependent interference signals are recorded simultaneously by a pixelated polarization camera in a single snapshot. Subsequently, the phase information can be effortlessly acquired by a phase extraction algorithm. Experimentally, the proposed phase-sensitive SPR sensor was successfully applied to the detection of small molecules of glyphosate, the most frequently used herbicide worldwide. Additionally, the sensor exhibited a detection limit of 15 ng/mL (0.015 ppm). Given its simplicity and effectiveness, we believe that our phase-sensitive SPR system offers a promising approach for acquiring phase signals.
(This article belongs to the Section Biosensors)
