Article

Polarization-Dependent Metasurface Enables Near-Infrared Dual-Modal Single-Pixel Sensing

1 MIIT Key Laboratory of Complex-Field Intelligent Sensing, Beijing Institute of Technology, Beijing 100081, China
2 Advanced Research Institute of Multidisciplinary Science & School of Information and Electronics, Beijing Institute of Technology, Beijing 100081, China
3 Beijing Key Laboratory for Precision Optoelectronic Measurement Instrument and Technology, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Nanomaterials 2023, 13(9), 1542; https://doi.org/10.3390/nano13091542
Submission received: 28 March 2023 / Revised: 22 April 2023 / Accepted: 27 April 2023 / Published: 4 May 2023
(This article belongs to the Special Issue Metasurfaces for Photonic Devices: Theory and Applications)

Abstract

Infrared single-pixel sensing with the two most representative modes, bright-field imaging and edge-enhanced imaging, has great application potential in biomedical diagnosis and defect inspection. Building a multifunctional, miniature optical computing device for infrared single-pixel sensing is therefore extremely attractive. Here, we propose and validate a dual-modal device based on a well-designed metasurface, which enables near-infrared bright-field and edge-enhanced single-pixel imaging. The two modes are switched by changing the polarization of the incident beam. Simulations validate that our device achieves high-fidelity dual-modal single-pixel sensing at 0.9 μm with a degree of noise robustness. We also investigate the generalization of our metasurface-based device and validate that different illumination patterns can be applied to it. Moreover, the output images of our device can be efficiently utilized for biomedical image segmentation. We envision that this novel device may open new vistas in dual-modal infrared single-pixel sensing.

1. Introduction

Benefiting from the superior detection efficiency and low cost of the single-pixel method [1], infrared single-pixel sensing is emerging as an enabling technology of great technical and scientific interest [2,3,4]. Instead of infrared array sensors, which suffer from large dark current, low resolution, and high manufacturing cost, it uses a spatial light modulator (SLM) to compress and couple spatial information. It has great potential in various applications, such as defect inspection, biomedical diagnosis [5,6], and remote sensing [7,8]. In these envisioned scenarios, fast and reliable multifunctional sensing plays a vital role, especially when different tasks must be handled.
Conventional strategies for such multifunctional sensing mainly rely on digital-domain computation with integrated circuits, at the expense of high power consumption, low speed, and limited capacity [9]. As an alternative, optical analog computing tailors the field of the incident beam by placing appropriate optical elements in the optical system; it modulates image information without capacity limits and allows parallel operation [10]. However, conventional optical analog computing inevitably increases the complexity and bulk of optical systems, especially when several different sensing tasks need to be performed [11]. Developing a multifunctional and miniature optical analog computing device for infrared single-pixel sensing is therefore extremely attractive.
Metasurfaces, specially designed two-dimensional optical elements, have become a research hotspot. The subwavelength structures of a metasurface interact with the incident electromagnetic field and can exert flexible, large-scale modulation on the amplitude, phase, polarization, and other characteristics of the optical field within a subwavelength thickness [12,13]. This abrupt change in optical parameters breaks the dependence of traditional optical elements on the propagation path. Metasurfaces thus have great potential for diverse wavefront modulations without changing the components of the optical system [14,15,16,17,18,19,20]. Meta-designed sensors have shown promise with extremely high sensitivity [21,22]. Moreover, a single metasurface can replace several traditional optical elements, further miniaturizing optical systems. These capabilities have pushed metasurfaces into optical computing research [10,23,24] for practical tasks such as optical logic operations [25], optical differentiation [26], and optical neural networks [27]. Among these tasks, optical differentiation based on computing metasurfaces is mainly used for image edge enhancement; the resulting edge-enhanced images, which contain textures and morphologies, can be used for further sensing tasks. Remarkably, such metasurfaces have been validated for different imaging modes, including switchable edge-enhanced and bright-field imaging [11,28] and tunable edge-enhanced imaging driven by an electric power source [29,30]. In another context, applying metasurfaces in the single-pixel imaging domain has attracted extensive attention, and various metasurfaces have been studied for their fascinating roles in conventional single-pixel systems [5,31,32,33]. For instance, a metasurface hosting phase-only and helicity-dependent holograms has been proposed to serve as a switchable and secret ghost-imaging target.
That work built the first bridge between metasurface holograms and single-pixel imaging [5]. A metasurface can also serve as a prototype SLM for high-frame-rate single-pixel imaging or simplify the single-pixel imaging system [31,32], providing new applications for single-pixel imaging. More interestingly, a novel optical encryption scheme has been proposed that combines metasurface holograms with single-pixel imaging technology [33]. A metasurface for dual-modal single-pixel sensing would be valuable but has not yet received sufficient attention. The development of these metasurface-based devices nevertheless offers great promise for our research.
Here, we report a dual-modal metasurface-based device for near-infrared single-pixel sensing, which enables bright-field and edge-enhanced imaging on demand. The dual-modal operation is achieved by a designed metasurface that functions as different spatial filters simply by rotating the polarizer in the device. Specifically, when the polarization of the incident beam is y-linear polarization (YLP), the output mode is high-fidelity bright-field imaging; when the polarization is x-linear polarization (XLP), the output mode is high-fidelity edge-enhanced imaging. Additionally, the device is suitable for various illumination patterns, and the same illumination pattern can be used in both modes. Results show that the device achieves high-fidelity dual-modal single-pixel sensing at 0.9 μm with a degree of noise robustness. Moreover, the output images of our device can be efficiently utilized for further computer vision tasks, such as biomedical image segmentation.

2. Principle

2.1. Principle of the Device

Figure 1 shows the framework of the proposed device for near-infrared dual-modal single-pixel sensing. The device consists of a 4f imaging system embedded with a polarization-dependent metasurface and classical single-pixel imaging optical elements, and can efficiently obtain bright-field or edge-enhanced images. The two functions are switched simply by rotating the polarizer. Specifically, when the polarization of the incident beam is YLP and the SLM projects Fourier basis patterns in sequence, the output is a reconstructed high-fidelity bright-field image. Similarly, when the polarization of the incident beam is XLP and the SLM projects the same illumination patterns, the output is a reconstructed high-fidelity edge-enhanced image. In addition, the device is suitable for various illumination patterns, and the images it produces can be used for further computer vision tasks, as we show later.
To explain the feasibility of our proposed device, we analyze the whole process and the relevant principles in detail. In this system, the target scene is carried by an XLP or YLP incident beam $E_{\mathrm{in}}(x, y)$, selected by the rotatable polarizer, where $x$ and $y$ are the coordinates in the input or output image plane. Spatial filtering is then achieved by optical computing, contributed by the 4f imaging system embedded with the polarization-dependent metasurface. The computed field distribution $E_M$ can be written as:

$$E_M(x, y) = \mathcal{F}^{-1}\left\{ H(f_x, f_y) \cdot \mathcal{F}\left\{ E_{\mathrm{in}}(x, y) \right\} \right\}, \tag{1}$$

where $\mathcal{F}$ denotes the 2D spatial Fourier transform, $\mathcal{F}^{-1}$ the 2D inverse spatial Fourier transform, and $H(f_x, f_y)$ is the optical spectrum transfer function contributed by our metasurface; $f_x$ and $f_y$ are the $u$- and $v$-direction spatial frequency coordinates in the Fourier plane. Two different $H(f_x, f_y)$ are designed for dual-modal single-pixel sensing. For bright-field imaging mode, $H(f_x, f_y)$ is a constant. For edge-enhanced imaging mode, we design $H(f_x, f_y)$ using spiral phase contrast imaging based on the vortex beam [28]. Accordingly, $H(f_x, f_y)$ can be written as:
$$H_x(f_x, f_y) = \exp\left[ i (\phi + C_1) \right], \tag{2}$$

$$H_y(f_x, f_y) = \exp\left( i C_2 \right), \tag{3}$$
where $H_x(f_x, f_y)$ and $H_y(f_x, f_y)$ are the optical spectrum transfer functions for edge-enhanced (XLP) and bright-field (YLP) imaging, respectively; $f_x = u/(\lambda f)$, $f_y = v/(\lambda f)$, $\phi = \arctan(v/u)$, and $C_m$ ($m = 1, 2$) are constants. The phase profiles required of our metasurface are $\Psi_x = \phi + C_1$ and $\Psi_y = C_2$ under illumination by an XLP and a YLP incident beam, respectively. Next, the $E_M(x, y)$ field distributions of both modes are modulated by the corresponding patterns [34] projected onto the SLM. The inner product $D_j$ between the patterns $P_j(x, y)$ and $E_M(x, y)$ is measured by the single-pixel detector:
$$D_j = \sum_x \sum_y P_j(x, y)\, E_M(x, y), \quad j = 1, 2, \ldots, n. \tag{4}$$
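As a numerical illustration, the spatial filtering of Equation (1) with the two transfer functions above, followed by the bucket measurement of Equation (4), can be sketched in NumPy (a toy scalar-field sketch on the discrete FFT grid; `filter_4f` and `bucket_measurement` are hypothetical helper names, not code from the paper):

```python
import numpy as np

def filter_4f(E_in, xlp=True):
    """Spatial filtering by the metasurface in the Fourier plane of a 4f system.
    xlp=True  -> spiral-phase filter H_x = exp(i*phi): edge enhancement.
    xlp=False -> constant filter H_y (with C_2 = 0): bright-field pass-through."""
    fy = np.fft.fftfreq(E_in.shape[0])[:, None]
    fx = np.fft.fftfreq(E_in.shape[1])[None, :]
    H = np.exp(1j * np.arctan2(fy, fx)) if xlp else np.ones(E_in.shape)
    return np.fft.ifft2(H * np.fft.fft2(E_in))

def bucket_measurement(P, E_M):
    """Single-pixel detector reading D_j: inner product of pattern and field."""
    return np.sum(P * E_M)
```

Because both filters are phase-only (|H| = 1), the filtered field conserves energy, and the bright-field branch returns the input field unchanged up to numerical precision.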
Accordingly, we complete the acquisition of the modulated target scene in both modes. The Alternating Direction Method of Multipliers (ADMM) framework is then utilized to reconstruct the target image $O$. Introducing an auxiliary variable $Q$, the objective function becomes:
$$(O, Q) = \operatorname{argmin} \left\| A O - D \right\|^2 + TV(Q), \quad \text{s.t. } O = Q, \tag{5}$$
where $A \in \mathbb{R}^{m \times n}$ denotes the modulation matrix ($m$ modulation patterns, each consisting of $n$ pixels), and $D \in \mathbb{R}^{m \times 1}$ is the measurement vector. $TV(Q)$ is the total-variation regularization term. The minimization in Equation (5) can be solved through the following distributed sub-problems,
$$\begin{aligned} O^{k+1} &= \operatorname{argmin} \left\| A O - D \right\|^2 + \frac{\rho}{2} \left\| O^k - Q^k - \frac{1}{\rho} W^k \right\|^2, \\ Q^{k+1} &= \operatorname{argmin}\; TV(Q) + \frac{\rho}{2} \left\| Q^k - O^{k+1} + \frac{1}{\rho} W^k \right\|^2, \\ W^{k+1} &= W^k + \rho \left( O^{k+1} - Q^{k+1} \right), \end{aligned} \tag{6}$$
where the superscript $k$ denotes the iteration index and $\rho$ is a hyper-parameter. The $O$ sub-problem has the closed-form solution:
$$O^{k+1} = O^k + \alpha A^{-1} \left( D - A Q^{k+1} + W^{k+1} \right), \tag{7}$$
where $\alpha$ is a hyper-parameter. When the iteration converges, we can reconstruct the target's bright-field or edge-enhanced image as requested.
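The ADMM loop above can be sketched as follows. This is a minimal illustration, not the authors' implementation: for brevity the TV proximal step is replaced by a simple soft-thresholding (ℓ1) prox, and the quadratic sub-problem is solved exactly rather than with the gradient-style update of Equation (7); `admm_reconstruct` is a hypothetical helper name:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm (stand-in for the TV prox)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_reconstruct(A, D, lam=1e-3, rho=1.0, iters=200):
    """Minimize ||A O - D||^2 + lam * ||Q||_1  s.t.  O = Q  (scaled-dual ADMM)."""
    m, n = A.shape
    O, Q, W = np.zeros(n), np.zeros(n), np.zeros(n)   # W: scaled dual variable
    lhs_inv = np.linalg.inv(2.0 * A.T @ A + rho * np.eye(n))
    AtD2 = 2.0 * A.T @ D
    for _ in range(iters):
        O = lhs_inv @ (AtD2 + rho * (Q - W))          # quadratic O sub-problem
        Q = soft_threshold(O + W, lam / rho)          # prox step (TV surrogate)
        W = W + O - Q                                 # dual ascent
    return O
```

With a well-conditioned modulation matrix and a small regularization weight, the iterates converge to (approximately) the least-squares solution, mirroring the role of Equations (6) and (7).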
In addition, we emphasize that different patterns correspond to different modulation schemes, chiefly Fourier, Hadamard, and random modulation, which we describe in detail next. This work mainly uses Fourier modulation, but this choice does not affect the generality of our method, as we demonstrate in Section 3.3.

2.2. Fourier Modulation

Fourier modulation projects sinusoidal patterns onto the target scene and captures a one-dimensional signal with the bucket detector [34]. The Fourier basis pattern $P$ can be expressed as:

$$P(x, y) = 0.5 + 0.5 \sin\left( 2\pi (f_x x + f_y y) + \phi \right), \tag{8}$$
where $\phi$ denotes the initial phase. We use three-step phase shifting to sample images. Each coefficient in Fourier space $F(f_x, f_y)$ is derived from three sinusoidal patterns with the same spatial frequency and different initial phases:

$$F(f_x, f_y) = \left[ 2 D_0 - D_{2\pi/3} - D_{4\pi/3} \right] + \sqrt{3}\, i \cdot \left( D_{2\pi/3} - D_{4\pi/3} \right), \tag{9}$$

where $D_0$, $D_{2\pi/3}$, and $D_{4\pi/3}$ are the measurements corresponding to the illumination patterns $P(x, y, 0)$, $P(x, y, 2\pi/3)$, and $P(x, y, 4\pi/3)$, respectively. Because of the conjugate-symmetric property of the Fourier spectrum, we only need to measure the upper half of the Fourier coefficients.
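A small NumPy sketch of this three-step scheme (`fourier_sp_coefficient` is a hypothetical helper name; frequencies are in cycles per image): expanding the three phase-shifted sine patterns shows that, for a real scene, the measured coefficient equals the corresponding FFT coefficient of the scene times a fixed complex factor of $3i/2$.

```python
import numpy as np

def fourier_sp_coefficient(E, fx, fy):
    """Measure one Fourier coefficient of scene E by three-step phase shifting."""
    Ny, Nx = E.shape
    y, x = np.mgrid[0:Ny, 0:Nx]
    D = []
    for phi in (0.0, 2 * np.pi / 3, 4 * np.pi / 3):
        # Sinusoidal illumination pattern of Equation (8)
        P = 0.5 + 0.5 * np.sin(2 * np.pi * (fx * x / Nx + fy * y / Ny) + phi)
        D.append(np.sum(P * E))        # bucket (single-pixel) measurement
    D0, D1, D2 = D
    # Three-step phase-shifting formula of Equation (9)
    return (2 * D0 - D1 - D2) + np.sqrt(3) * 1j * (D1 - D2)
```

The constant $3i/2$ cancels when the full spectrum is assembled and inverse-transformed, so it does not affect the reconstructed image up to a global scale.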
Then, the objective function can be transformed into:
$$(O, Q) = \operatorname{argmin} \left\| \mathcal{F}(O) - F \right\|^2 + TV(Q), \quad \text{s.t. } O = Q, \tag{10}$$

which can be solved as in Equations (6) and (7).

2.3. Hadamard Modulation

Hadamard modulation is based on the Hadamard transform. By applying an inverse Hadamard transform to a delta function $\delta_H(u, v)$, the Hadamard basis pattern $P_H(x, y)$ can be obtained [3,35,36]:

$$P_H(x, y) = \frac{1}{2} \left[ 1 + \mathcal{H}^{-1}\left\{ \delta_H(u, v) \right\} \right], \tag{11}$$

where $\mathcal{H}\{\cdot\}$ and $\mathcal{H}^{-1}\{\cdot\}$ denote the Hadamard transform and the inverse Hadamard transform, respectively, and

$$\delta_H(u, v) = \begin{cases} 1, & u = u_0,\ v = v_0, \\ 0, & \text{otherwise}. \end{cases} \tag{12}$$
Each Hadamard coefficient $H(u, v)$ is acquired from two measurements: one under a Hadamard basis pattern $P_H(x, y)$ and one under the inverse pattern $1 - P_H(x, y)$. The coefficient is obtained as their difference:

$$H(u, v) = D_{+1} - D_{-1}, \tag{13}$$

where $D_{+1}$ and $D_{-1}$ are the measurements corresponding to illumination by $P_H(x, y)$ and $1 - P_H(x, y)$, respectively.
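The differential Hadamard scheme can be sketched in NumPy as follows (a 1D toy sketch using a Sylvester-construction Hadamard matrix; the helper names are hypothetical). Since the differences $D_{+1} - D_{-1}$ equal the full ±1 Hadamard projections, applying the transform once more and dividing by $n$ recovers the scene exactly:

```python
import numpy as np

def hadamard(n):
    """Sylvester-construction Hadamard matrix (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def hadamard_sp_measure(E_flat):
    """Differential single-pixel acquisition with binary Hadamard patterns."""
    n = E_flat.size
    H = hadamard(n)
    P = (1 + H) / 2                 # binary {0,1} patterns P_H
    D_plus = P @ E_flat             # measurements under P_H
    D_minus = (1 - P) @ E_flat      # measurements under 1 - P_H
    return D_plus - D_minus         # = H @ E_flat, the Hadamard spectrum

def hadamard_sp_reconstruct(d):
    """Invert the spectrum: for Sylvester matrices, H @ H = n * I."""
    n = d.size
    return hadamard(n) @ d / n
```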
Then, the objective function can be transformed into:
$$(O, Q) = \operatorname{argmin} \left\| \mathcal{H}(O) - H \right\|^2 + TV(Q), \quad \text{s.t. } O = Q, \tag{14}$$

which can be solved as in Equations (6) and (7).

2.4. Binary Random Modulation

The binary random pattern $P_j$ is generated by a binary pseudo-random number generator [37]. Because each pattern is only one bit deep, the projection can keep pace with the full refresh rate of the SLM, greatly increasing the modulation speed. The objective function can be solved directly by referring to Equations (5)–(7).
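A toy sketch of binary random modulation (`random_sp_reconstruct` is a hypothetical helper name; for illustration the reconstruction is plain least squares on an over-complete measurement set, rather than the ADMM framework of Section 2.1):

```python
import numpy as np

def random_sp_reconstruct(E_flat, m, seed=0):
    """Simulate acquisition with m one-bit random patterns, then recover the
    n-pixel scene by least squares (requires m >= n for exact recovery)."""
    rng = np.random.default_rng(seed)
    n = E_flat.size
    A = rng.integers(0, 2, size=(m, n)).astype(float)  # 1-bit {0,1} patterns
    D = A @ E_flat                                     # bucket measurements
    return np.linalg.lstsq(A, D, rcond=None)[0]        # least-squares recovery
```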

3. Simulations and Analysis

3.1. Design of Metasurface

To realize the proposed dual-modal metasurface-based device, we designed the dual-functional metasurface described in Section 2. When the polarization of the incident beam is XLP, the metasurface works as an edge filter in the Fourier plane; when the polarization is YLP, it works as a bright-field filter. We used the polarization-dependent propagation phase of the nanobrick to realize the two phase profiles mentioned above, $\Psi_x$ and $\Psi_y$. The simulation results were calculated with the FDTD Solutions software, using periodic boundary conditions in the x and y directions and a perfectly matched layer in the z direction. The mesh accuracy is set to 2, the monitor is placed 3 μm from the nanobricks, material dispersion is included, and plane-wave sources are used. Figure 2a displays a side view of the designed unit cell, consisting of a silicon nanobrick on a fused silica substrate. The nanobricks are periodically arranged with a fixed square lattice constant P = 360 nm and a height H = 500 nm. The propagation phase can essentially cover 0 to 2π by varying the major semi-axis a and minor semi-axis b of the nanobrick. The schematic for computing the transmission coefficients ($t_x$, $t_y$) and phase shifts ($\delta_x$, $\delta_y$) is shown in Figure 2b.
The simulated phase distribution $\delta_x$ for an XLP incident beam as a function of a and b is shown in Figure 2c, and the corresponding transmission coefficient distribution $t_x$ in Figure 2d. Similarly, the simulated phase distribution $\delta_y$ for a YLP incident beam is shown in Figure 2e, and the corresponding transmission coefficient distribution $t_y$ in Figure 2f. Accordingly, we chose an appropriate nanobrick size (a, b) to obtain the phase combination ($\delta_x$, $\delta_y$) equal to ($\Psi_x$, $\Psi_y$). We discretized the 0 to 2π range into 16 phase levels and picked the 16 corresponding nanobricks from our library based on the minimum phase difference and the maximum transmission coefficient. The 16 selected nanobricks used to construct the metasurface of Section 2 are listed in Table 1. The designed metasurface consists of 109 nanobricks along both the x and y directions, and the working wavelength of the incident laser is 0.9 μm.
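The selection rule described above, minimum phase difference first with maximum transmission as a tie-breaker, can be sketched as follows (a toy sketch with a synthetic library; `pick_nanobricks`, `phase_tol`, and the library arrays are hypothetical and are not the simulated data of Table 1):

```python
import numpy as np

def pick_nanobricks(target_phases, lib_phase, lib_trans, phase_tol=0.2):
    """For each target phase, pick the library element whose simulated phase is
    closest (modulo 2*pi); among candidates within phase_tol of the best phase
    match, prefer the one with the highest transmission coefficient."""
    picks = []
    for tp in target_phases:
        err = np.abs(np.angle(np.exp(1j * (lib_phase - tp))))  # wrapped error
        candidates = np.flatnonzero(err <= err.min() + phase_tol)
        picks.append(candidates[np.argmax(lib_trans[candidates])])
    return np.array(picks)
```

With 16 target phases spaced by 2π/16, this returns one library index per phase level, trading a small, bounded phase error for higher transmission.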
To verify that the selected nanobricks are appropriate, we calculated the ideal and realized phase profiles of the designed metasurface. Figure 3a shows the ideal phase profile $\Psi_x$ under XLP illumination, and Figure 3b the ideal phase profile $\Psi_y$ under YLP illumination. Figure 3c,d display the realized phase profiles of our designed metasurface under XLP and YLP illumination, respectively. The phase distributions in Figure 3a,c are largely similar, albeit not entirely consistent, because only 16 different nanobricks are used. These deviations do not significantly impact the function, as the spiral characteristic of the designed phase profile is well maintained. Similarly, the phase distributions in Figure 3b,d exhibit some inconsistencies due to the limited number of nanobrick types, but these deviations are within acceptable tolerances. Further simulated results are shown in Figure 4: Figure 4a,b show the far-field intensity and phase distributions of the designed metasurface under XLP plane-wave illumination, and Figure 4c,d the corresponding distributions under YLP illumination. The doughnut-shaped intensity distribution is transformed into a Gaussian intensity distribution by switching the polarization of the incident beam, and the phase pattern converts from a spiral-like distribution to an approximately constant distribution in the central area. These results confirm that our polarization-dependent metasurface is well designed.

3.2. Full-Process Simulations

To validate the feasibility of our proposed device, we simulate the whole process described in Section 2. Based on MATLAB and FDTD Solutions, we first simulate the optical computing of Equation (1), contributed by a 4f system embedded with our designed metasurface. The computing results are shown in Figure 5. The input images are a "BIT"-plus-cardiogram image, an infrared image [38], and a USAF resolution target. When the polarization of the incident beam is XLP, the designed metasurface works as an edge filter in the Fourier plane; when the polarization is YLP, it works as a bright-field filter. Under YLP illumination, the bright-field filtering outputs are generally fainter than the ground truth but acceptable; under XLP illumination, the edge information after filtering is clearly visible. Moreover, Figure 5c shows details along the orange line on the USAF target, which maintains sharp textures and enhanced edge information. We conclude that optical computing based on the designed metasurface can efficiently achieve a dynamic switch between bright-field filtering and edge-enhanced filtering at 0.9 μm by changing the polarization of the incident beam.
Next, we couple these intermediate results with the corresponding Fourier basis patterns and reconstruct the relevant 256 × 256 images in both modes. We obtain the final results at different sampling ratios and compare them with those of conventional single-pixel imaging [34], as shown in Figure 6. The imaging results are generally fainter than those of conventional single-pixel imaging, but the overall details are still well maintained. As the sampling ratio rises, the structures and details are recovered more clearly. The details along the orange line on the USAF target show that the device maintains sharp textures and edge information even at a low sampling ratio. We conclude that our proposed dual-modal metasurface-based device can indeed achieve high-fidelity dual-modal sensing.

3.3. Generalization Analysis

In addition, we investigate the generalization of our dual-modal device with respect to modulation. As shown in Figure 7, we applied Hadamard patterns [3,35,36], Fourier patterns [34], and random binary patterns [37] to couple the target information in the two modes, obtaining 6554 measurements of a 128 × 128 cameraman image under each modulation. The figures show that all of the bright-field and edge-enhanced images are well reconstructed. Although the random method performs slightly worse than the other modulations, these results still illustrate that various patterns can be applied to our device.

3.4. Robustness Analysis

In practical scenarios, factors such as dark current and thermal noise usually degrade the imaging quality. To better approximate a real scene, we simulated the whole acquisition and reconstruction process under different noise levels, adding Gaussian white noise to the one-dimensional measurements at a sampling ratio of 10%. As shown in Figure 8, the reconstructed images retain most details under different noise levels. Edge-enhanced images are more sensitive to noise than bright-field images because their energy is concentrated in high-frequency bands. In brief, even at relatively high noise levels, our device can still obtain acceptable edge-enhanced or bright-field images, so we consider the proposed device to have a degree of noise robustness.
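Injecting measurement noise of this kind can be sketched as follows (a sketch under the assumption that the noise level is specified as a target SNR in dB; `add_measurement_noise` is a hypothetical helper name, and the paper does not state how its noise levels were parameterized):

```python
import numpy as np

def add_measurement_noise(D, snr_db, seed=0):
    """Add white Gaussian noise to the 1D bucket measurements D so that the
    signal-to-noise ratio equals snr_db decibels."""
    rng = np.random.default_rng(seed)
    noise_power = np.mean(D ** 2) / (10.0 ** (snr_db / 10.0))
    return D + rng.normal(0.0, np.sqrt(noise_power), size=D.shape)
```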
In addition, random errors may arise during fabrication of the metasurface due to unstable factors such as the environment. Robustness analysis is important because these random errors may alter or even degrade the experimental results; simulating the process helps analyze the robustness and set reasonable accuracy requirements for the actual fabrication. During fabrication, the central position of a nanobrick is relatively accurate, whereas its shape is harder to control and has a relatively large influence on phase modulation. Therefore, we kept the height of the nanobricks constant and added random errors mainly to the major and minor axes of each nanobrick. Three simulations were conducted with random errors in the ranges −5 to 0 nm, 0 nm, and 0 to 5 nm, at a sampling ratio of 10%. The bright-field image obtained under YLP illumination is shown in Figure 9a, and the edge-enhanced image obtained under XLP illumination in Figure 9b. The results under the different error ranges vary slightly, but the differences are not obvious; the overall image quality remains acceptable, indicating that our metasurface design can tolerate a degree of fabrication error.

4. Biomedical Applications

In biomedical diagnosis, segmenting cell substructures allows the analysis of clinical parameters related to volume and shape [39]. Bright-field imaging is highly redundant yet beneficial for complex segmentation tasks. However, medical images usually have low contrast and complex microstructure distributions, so edge-image acquisition is one of the critical technologies in medical image processing. For segmentation, edge-enhanced images containing morphologies [40,41] are used to confirm the target's boundaries. Many researchers have exploited edge enhancement in medical segmentation, effectively improving segmentation accuracy through methods such as the Edge Attention Network (ET-Net) [40] and KiU-Net [41]. These works demonstrate the important role of edge information in medical segmentation. Our device, which incorporates optical computing, can directly output such bright-field and edge-enhanced images. It can work as the front end of neural networks for feature extraction and directly provide edge information for network training, thereby reducing training parameters and computation.
To validate that our device can be smoothly combined with a neural network, we used biomedical images modulated by our device to train the classical Unet [42] for cell segmentation. The initial dataset comes from serial-section transmission electron microscopy of the Drosophila first instar larva ventral nerve cord (VNC) [43] and contains 30 training images of 512 × 512 pixels. We expanded the dataset to 1200 images by randomly cropping the original training images into 128 × 128 pixel patches. In this dataset, the ground-truth segmentations are manually sketched, with cells and membranes marked in white and black, respectively. The framework of Unet is shown in Figure 10; it has a contracting path to capture context and a symmetric expanding path that enables precise localization. We used the normal initialization method with biases initialized to 0 and the Adam solver for gradient optimization. The weight decay was 1 × 10⁻³ and was decreased by a factor of 0.1 at 50 and 350 epochs. We utilized Cross-Entropy loss, BCE loss, and Dice loss to train Unet. Feeding the corresponding bright-field and edge-enhanced images into the trained Unet yields segmentation results for cells and membranes. Figure 10 validates that the outputs of our device can be efficiently applied to extract target features. We believe that the combination of bright-field and edge-enhanced images provides a low-power-consumption approach to medical segmentation under a rationally designed network.
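Among the loss terms mentioned above, the Dice loss admits a particularly compact form. A framework-agnostic NumPy sketch of the soft Dice term might look like this (`dice_loss` is a hypothetical helper name, not the authors' training code):

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss for binary segmentation masks with values in [0, 1].
    Returns ~0 for a perfect overlap and ~1 for disjoint masks."""
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)
```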

5. Conclusions

In summary, we report a near-infrared dual-modal single-pixel sensing device based on a polarization-dependent metasurface, which realizes switchable edge-enhanced and bright-field imaging. Results show that it achieves high-fidelity dual-modal single-pixel sensing at 0.9 μm with a degree of noise robustness, and we explored its potential in biomedical image analysis. The advantages of the dual-modal device lie in three aspects. First, it realizes dual-modal sensing in a simple optical system: by rotating a polarizer to change the polarization of the incident beam, the designed polarization-dependent metasurface functions as different filters for different sensing tasks. Second, unlike existing edge-sensitive single-pixel methods, this device can be used with all illumination patterns; it generalizes across different illumination patterns, and the same patterns can be utilized for both modes, even for different sensing tasks. These properties leave room for further optimization toward high-accuracy imaging. Third, the optical analog computing based on our designed metasurface processes target scenes at high speed and with low power consumption. Moreover, devices of this kind, which incorporate optical computing, can work as the front end of neural networks for extracting the required information from highly redundant target scenes. We therefore envision that our device, combined with neural networks, can pave a new path for further intriguing sensing tasks.

Author Contributions

Conceptualization, R.Y. and W.W.; methodology, R.Y. and W.W.; software, R.Y. and W.W.; validation, R.Y. and W.W.; formal analysis, Y.H. and L.B.; investigation, R.Y. and W.W.; resources, Q.H.; writing—original draft preparation, R.Y. and W.W.; writing—review and editing, Y.H. and L.B.; supervision, Q.H., Y.H. and L.B.; project administration, Q.H., Y.H. and L.B.; funding acquisition, Q.H., Y.H. and L.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (61971045, 62131003, 61991451, 62275022).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Huang, K.; Fang, J.; Yan, M.; Wu, E.; Zeng, H. Wide-field mid-infrared single-photon upconversion imaging. Nat. Commun. 2022, 13, 1077. [Google Scholar] [CrossRef]
  2. Chen, C.P.; Li, H.; Wei, Y.; Xia, T.; Tang, Y.Y. A local contrast method for small infrared target detection. IEEE Trans. Geosci. Remote Sens. 2013, 52, 574–581. [Google Scholar] [CrossRef]
  3. Radwell, N.; Mitchell, K.J.; Gibson, G.M.; Edgar, M.P.; Bowman, R.; Padgett, M.J. Single-pixel infrared and visible microscope. Optica 2014, 1, 285–289. [Google Scholar] [CrossRef]
  4. d’Acremont, A.; Fablet, R.; Baussard, A.; Quin, G. CNN-based target recognition and identification for infrared imaging in defense systems. Sensors 2019, 19, 2040. [Google Scholar] [CrossRef] [PubMed]
  5. Liu, H.C.; Yang, B.; Guo, Q.; Shi, J.; Guan, C.; Zheng, G.; Mühlenbernd, H.; Li, G.; Zentgraf, T.; Zhang, S. Single-pixel computational ghost imaging with helicity-dependent metasurface hologram. Sci. Adv. 2017, 3, e1701477. [Google Scholar] [CrossRef] [PubMed]
  6. Wang, Y.; Huang, K.; Fang, J.; Yan, M.; Wu, E.; Zeng, H. Mid-infrared single-pixel imaging at the single-photon level. Nat. Commun. 2023, 14, 1073. [Google Scholar] [CrossRef]
  7. Vodopyanov, K.L. Laser-Based Mid-Infrared Sources and Applications; John Wiley & Sons: New York, NY, USA, 2020. [Google Scholar]
  8. Hermes, M.; Morrish, R.B.; Huot, L.; Meng, L.; Junaid, S.; Tomko, J.; Lloyd, G.R.; Masselink, W.T.; Tidemand-Lichtenberg, P.; Pedersen, C.; et al. Mid-IR hyperspectral imaging for label-free histopathology and cytology. J. Opt. 2018, 20, 023002. [Google Scholar] [CrossRef]
  9. Wang, Z.; Wang, Z.; Zheng, Y.; Chuang, Y.Y.; Satoh, S. Learning to Reduce Dual-Level Discrepancy for Infrared-Visible Person Re-Identification. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019. [Google Scholar]
  10. Solli, D.R.; Jalali, B. Analog optical computing. Nat. Photonics 2015, 9, 704–706. [Google Scholar] [CrossRef]
  11. Yang, H.; Xie, Z.; He, H.; Zhang, Q.; Li, J.; Zhang, Y.; Yuan, X. Switchable imaging between edge-enhanced and bright-field based on a phase-change metasurface. Opt. Lett. 2021, 46, 3741–3744. [Google Scholar] [CrossRef]
  12. Badri, S.H.; Gilarlue, M.; SaeidNahaei, S.; Kim, J.S. Narrowband-to-broadband switchable and polarization-insensitive terahertz metasurface absorber enabled by phase-change material. J. Opt. 2022, 24, 025101. [Google Scholar] [CrossRef]
  13. Badri, S.H.; SaeidNahaei, S.; Kim, J.S. Polarization-sensitive tunable extraordinary terahertz transmission based on a hybrid metal–vanadium dioxide metasurface. Appl. Opt. 2022, 61, 5972–5979. [Google Scholar] [CrossRef]
  14. Li, Y.B.; Li, L.L.; Cai, B.G.; Cheng, Q.; Cui, T.J. Holographic leaky-wave metasurfaces for dual-sensor imaging. Sci. Rep. 2015, 5, 1–7. [Google Scholar] [CrossRef] [PubMed]
  15. Ye, W.; Zeuner, F.; Li, X.; Reineke, B.; He, S.; Qiu, C.W.; Liu, J.; Wang, Y.; Zhang, S.; Zentgraf, T. Spin and wavelength multiplexed nonlinear metasurface holography. Nat. Commun. 2016, 7, 11930. [Google Scholar] [CrossRef]
  16. Wang, Z.; Li, T.; Soman, A.; Mao, D.; Kananen, T.; Gu, T. On-chip wavefront shaping with dielectric metasurface. Nat. Commun. 2019, 10, 3547. [Google Scholar] [CrossRef] [PubMed]
  17. Chen, S.; Liu, W.; Li, Z.; Cheng, H.; Tian, J. Metasurface-empowered optical multiplexing and multifunction. Adv. Mater. 2020, 32, 1805912. [Google Scholar] [CrossRef]
  18. Hu, Y.; Wang, X.; Luo, X.; Ou, X.; Li, L.; Chen, Y.; Yang, P.; Wang, S.; Duan, H. All-dielectric metasurfaces for polarization manipulation: Principles and emerging applications. Nanophotonics 2020, 9, 3755–3780. [Google Scholar] [CrossRef]
  19. Rubin, N.A.; Chevalier, P.; Juhl, M.; Tamagnone, M.; Chipman, R.; Capasso, F. Imaging polarimetry through metasurface polarization gratings. Opt. Express 2022, 30, 9389–9412. [Google Scholar] [CrossRef]
  20. Deng, Y.; Cai, Z.; Ding, Y.; Bozhevolnyi, S.I.; Ding, F. Recent progress in metasurface-enabled optical waveplates. Nanophotonics 2022. [Google Scholar] [CrossRef]
  21. Joseph, S.; Sarkar, S.; Joseph, J. Grating-coupled surface plasmon-polariton sensing at a flat metal–analyte interface in a hybrid-configuration. ACS Appl. Mater. Interfaces 2020, 12, 46519–46529. [Google Scholar] [CrossRef]
  22. Lio, G.E.; Ferraro, A.; Kowerdziej, R.; Govorov, A.O.; Wang, Z.; Caputo, R. Engineering Fano-Resonant Hybrid Metastructures with Ultra-High Sensing Performances. Adv. Opt. Mater. 2022, 2203123. [Google Scholar] [CrossRef]
  23. Abdollahramezani, S.; Hemmatyar, O.; Adibi, A. Meta-optics for spatial optical analog computing. Nanophotonics 2020, 9, 4075–4095. [Google Scholar] [CrossRef]
  24. Zangeneh-Nejad, F.; Sounas, D.L.; Alù, A.; Fleury, R. Analogue computing with metamaterials. Nat. Rev. Mater. 2021, 6, 207–225. [Google Scholar] [CrossRef]
  25. Zhao, Z.; Wang, Y.; Ding, X.; Li, H.; Fu, J.; Zhang, K.; Burokur, S.N.; Wu, Q. Compact logic operator utilizing a single-layer metasurface. Photonics Res. 2022, 10, 316–322. [Google Scholar] [CrossRef]
  26. He, S.; Wang, R.; Luo, H. Computing metasurfaces for all-optical image processing: A brief review. Nanophotonics 2022. [Google Scholar] [CrossRef]
  27. Badloe, T.; Lee, S.; Rho, J. Computation at the speed of light: Metamaterials for all-optical calculations and neural networks. Adv. Photonics 2022, 4, 064002. [Google Scholar] [CrossRef]
  28. Huo, P.; Zhang, C.; Zhu, W.; Liu, M.; Zhang, S.; Zhang, S.; Chen, L.; Lezec, H.J.; Agrawal, A.; Lu, Y.; et al. Photonic Spin-Multiplexing Metasurface for Switchable Spiral Phase Contrast Imaging. Nano Lett. 2020, 20, 2791–2798. [Google Scholar] [CrossRef]
  29. Xiao, T.; Yang, H.; Yang, Q.; Xu, D.; Wang, R.; Chen, S.; Luo, H. Realization of tunable edge-enhanced images based on computing metasurfaces. Opt. Lett. 2022, 47, 925–928. [Google Scholar] [CrossRef]
  30. Wang, Y.; Yang, Q.; He, S.; Wang, R.; Luo, H. Computing Metasurfaces Enabled Broad-Band Vectorial Differential Interference Contrast Microscopy. ACS Photonics 2022. [Google Scholar] [CrossRef]
  31. Zeng, B.; Huang, Z.; Singh, A.; Yao, Y.; Azad, A.K.; Mohite, A.D.; Taylor, A.J.; Smith, D.R.; Chen, H.T. Hybrid graphene metasurfaces for high-speed mid-infrared light modulation and single-pixel imaging. Light Sci. Appl. 2018, 7, 51. [Google Scholar] [CrossRef]
  32. Yan, J.; Wang, Y.; Liu, Y.; Wei, Q.; Zhang, X.; Li, X.; Huang, L. Single pixel imaging based on large capacity spatial multiplexing metasurface. Nanophotonics 2022, 11, 3071–3080. [Google Scholar] [CrossRef]
  33. Yan, J.; Wei, Q.; Liu, Y.; Geng, G.; Li, J.; Li, X.; Li, X.; Wang, Y.; Huang, L. Single pixel imaging key for holographic encryption based on spatial multiplexing metasurface. Small 2022, 18, 2203197. [Google Scholar] [CrossRef] [PubMed]
  34. Zhang, Z.; Ma, X.; Zhong, J. Single-pixel imaging by means of Fourier spectrum acquisition. Nat. Commun. 2015, 6, 6225. [Google Scholar] [CrossRef] [PubMed]
  35. Duarte, M.F.; Davenport, M.A.; Takhar, D.; Laska, J.N.; Sun, T.; Kelly, K.F.; Baraniuk, R.G. Single-pixel imaging via compressive sampling. IEEE Signal Process. Mag. 2008, 25, 83–91. [Google Scholar] [CrossRef]
  36. Watts, C.M.; Shrekenhamer, D.; Montoya, J.; Lipworth, G.; Hunt, J.; Sleasman, T.; Krishna, S.; Smith, D.R.; Padilla, W.J. Terahertz compressive imaging with metamaterial spatial light modulators. Nat. Photonics 2014, 8, 605–609. [Google Scholar] [CrossRef]
  37. Sun, B.; Edgar, M.P.; Bowman, R.; Vittert, L.E.; Welsh, S.; Bowman, A.; Padgett, M.J. 3D computational imaging with single-pixel detectors. Science 2013, 340, 844–847. [Google Scholar] [CrossRef]
  38. St-Charles, P.; Bilodeau, G.; Bergevin, R. Online Mutual Foreground Segmentation for Multispectral Stereo Videos. 2019. Available online: https://www.polymtl.ca/litiv/codes-et-bases-de-donnees (accessed on 1 December 2022).
  39. Seo, H.; Badiei Khuzani, M.; Vasudevan, V.; Huang, C.; Ren, H.; Xiao, R.; Jia, X.; Xing, L. Machine learning techniques for biomedical image segmentation: An overview of technical aspects and introduction to state-of-art applications. Med. Phys. 2020, 47, e148–e167. [Google Scholar] [CrossRef]
  40. Zhang, Z.; Fu, H.; Dai, H.; Shen, J.; Pang, Y.; Shao, L. ET-Net: A generic edge-attention guidance network for medical image segmentation. In Proceedings of the Medical Image Computing and Computer Assisted Intervention–MICCAI, Shenzhen, China, 13–17 October 2019; Springer: Cham, Switzerland, 2019; pp. 442–450. [Google Scholar]
  41. Valanarasu, J.M.J.; Sindagi, V.A.; Hacihaliloglu, I.; Patel, V.M. KiU-Net: Towards accurate segmentation of biomedical images using over-complete representations. In Proceedings of the Medical Image Computing and Computer Assisted Intervention–MICCAI: 23rd International Conference, Lima, Peru, 4–8 October 2020; Springer International Publishing: Cham, Switzerland, 2020; pp. 363–373. [Google Scholar]
  42. Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention (MICCAI), Munich, Germany, 5–9 October 2015; Springer: Cham, Switzerland, 2015; Volume 9351, pp. 234–241. [Google Scholar]
  43. Web Page of the EM Segmentation Challenge. Available online: http://brainiac2.mit.edu/isbi_challenge/ (accessed on 1 March 2023).
Figure 1. The framework of the proposed metasurface-based device for near-infrared dual-modal single-pixel sensing.
Figure 2. Design of the polarization-dependent metasurface. (a) Side view of a typical unit cell with the period (P), height (H), and varying cross sizes (a and b). (b) Schematic of the numerical calculation of the transmission coefficients (tx, ty) and phase shifts (δx, δy). (c,d) Simulated propagation phase δx and transmission coefficient distribution tx for an XLP incident beam. (e,f) Simulated propagation phase δy and transmission coefficient distribution ty for a YLP incident beam.
Figure 3. (a,b) Ideal phase profiles of the designed metasurface under XLP and YLP illumination, respectively. (c,d) Realized phase profiles of the designed metasurface under XLP and YLP illumination, respectively.
Figure 4. (a,b) Simulated intensity and phase distributions, respectively, of our designed metasurface under XLP illumination. (c,d) Simulated intensity and phase distributions, respectively, of our designed metasurface under YLP illumination.
Figure 5. Metasurface-based computing results for bright-field filtering and edge-enhanced filtering. (a–c) Computing results for a “BIT” plus cardiogram image, an infrared image, and a USAF resolution target, respectively.
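The two filtering modes of Figure 5 can be emulated in a simple scalar-optics model: an all-pass pupil passes the bright-field image unchanged, while a spiral (vortex) phase mask, as used in spiral phase contrast imaging [28], produces isotropic edge enhancement. The sketch below is a minimal Fourier-plane filtering simulation under these assumptions, not the full electromagnetic model of the metasurface; the mask choices and test object are illustrative.

```python
import numpy as np

def fourier_filter(img, mask):
    """Apply a pupil-plane mask in a 4f-style scalar model and return intensity amplitude."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    return np.abs(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

h = w = 64
yy, xx = np.mgrid[-h // 2:h // 2, -w // 2:w // 2]

bright_mask = np.ones((h, w), dtype=complex)    # all-pass -> bright-field image
spiral_mask = np.exp(1j * np.arctan2(yy, xx))   # vortex phase -> edge enhancement
spiral_mask[h // 2, w // 2] = 0.0               # suppress the singular DC sample

# binary disc as a test object
obj = ((xx ** 2 + yy ** 2) < 15 ** 2).astype(float)

bright = fourier_filter(obj, bright_mask)   # reproduces the object
edges = fourier_filter(obj, spiral_mask)    # energy concentrates at the disc boundary
```

The spiral mask suppresses uniform regions and highlights the disc rim, which is the qualitative behavior of the edge-enhanced channel in Figure 5.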
Figure 6. The proposed dual-modal single-pixel sensing with our device, compared with conventional single-pixel imaging [34].
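Conventional single-pixel imaging, as compared in Figure 6, reconstructs a scene from a sequence of inner products between the scene and structured illumination patterns recorded by a bucket detector. Ref. [34] acquires the Fourier spectrum; the sketch below illustrates the same measure-then-invert principle with an orthogonal Hadamard basis instead, chosen purely for simplicity of exact inversion.

```python
import numpy as np

n = 16          # scene resolution: n x n pixels
N = n * n

# Sylvester construction of an N x N Hadamard pattern matrix (+1/-1 entries)
H = np.array([[1.0]])
while H.shape[0] < N:
    H = np.block([[H, H], [H, -H]])

scene = np.zeros((n, n))
scene[4:12, 6:10] = 1.0       # simple rectangular object
x = scene.ravel()

# one single-pixel (bucket) measurement per pattern: y_k = <H_k, x>
y = H @ x

# orthogonality (H H^T = N I) gives exact reconstruction
x_rec = (H.T @ y) / N
```

With a complete orthogonal basis the reconstruction is exact; compressive variants [35] use fewer patterns at the cost of an inverse-problem solver.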
Figure 7. The proposed dual-modal single-pixel sensing results of our device under different modulations.
Figure 8. The proposed dual-modal single-pixel sensing results of our device under different noise levels.
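The noise study of Figure 8 can be emulated by perturbing the bucket measurements and scoring the reconstruction fidelity; the additive Gaussian noise model and PSNR metric below are our illustrative assumptions, not necessarily the paper's exact protocol.

```python
import numpy as np

def psnr(ref, est, peak=1.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((ref - est) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
n = 16
N = n * n

# Sylvester-construction Hadamard pattern matrix
H = np.array([[1.0]])
while H.shape[0] < N:
    H = np.block([[H, H], [H, -H]])

x = np.zeros(N)
x[40:80] = 1.0                # toy scene

def reconstruct(sigma):
    """Reconstruct from bucket signals corrupted by Gaussian noise of std sigma."""
    y = H @ x + rng.normal(0.0, sigma, N)
    return (H.T @ y) / N

low_noise_psnr = psnr(x, reconstruct(0.1))
high_noise_psnr = psnr(x, reconstruct(10.0))   # heavier noise -> lower PSNR
```

Averaging over the orthogonal basis attenuates the measurement noise by a factor of sqrt(N) in the reconstruction, which is one source of the noise robustness seen in single-pixel schemes.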
Figure 9. The proposed dual-modal single-pixel sensing of our device under different fabrication errors. (a) Bright-field imaging results under different fabrication errors; (b) edge-enhanced imaging results under different fabrication errors.
Figure 10. Single-pixel segmentation flowchart and corresponding results.
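Segmentation outputs such as those in Figure 10 are commonly scored against ground truth with the Dice coefficient; the metric choice here is our illustrative assumption, though networks such as U-Net [42] are routinely evaluated this way.

```python
import numpy as np

def dice(pred, gt):
    """Dice similarity between two binary masks (1.0 = perfect overlap)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        return 1.0   # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(pred, gt).sum() / denom

gt = np.zeros((8, 8)); gt[2:6, 2:6] = 1        # 16-pixel square ground truth
pred = np.zeros((8, 8)); pred[3:7, 2:6] = 1    # prediction shifted by one row

# overlap is 12 pixels, so dice = 2 * 12 / (16 + 16) = 0.75
```

A one-row misalignment of a 4 × 4 mask already drops the score to 0.75, which makes Dice a sensitive check on whether the device's output images preserve boundary detail for downstream segmentation.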
Table 1. Metasurface parameters of the 16 selected nanobricks.
NUM | a (nm) | b (nm) | δx (rad) | tx   | δy (rad) | ty
1   |  78    | 160    |  3.07    | 0.85 | 0.62     | 0.40
2   |  82    | 156    | -2.81    | 0.83 | 0.32     | 0.74
3   |  86    | 152    | -2.32    | 0.83 | 0.36     | 0.68
4   |  88    | 150    | -2.11    | 0.88 | 0.44     | 0.53
5   |  92    | 160    | -1.46    | 0.92 | 0.23     | 0.95
6   |  96    | 160    | -1.10    | 1.00 | 0.38     | 0.94
7   | 104    |  40    |  1.48    | 0.96 | 0.69     | 0.98
8   | 104    | 158    | -0.60    | 0.96 | 0.53     | 0.91
9   | 112    | 152    | -0.24    | 0.99 | 0.49     | 0.91
10  | 116    |  40    |  1.86    | 0.95 | 0.72     | 0.98
11  | 126    | 144    |  0.18    | 0.98 | 0.50     | 0.92
12  | 130    |  40    |  2.33    | 0.97 | 0.76     | 0.98
13  | 140    |  40    |  2.73    | 0.90 | 0.79     | 0.98
14  | 144    | 136    |  0.62    | 0.96 | 0.50     | 0.92
15  | 152    |  40    | -3.14    | 0.78 | 0.82     | 0.98
16  | 160    | 130    |  1.03    | 0.85 | 0.47     | 0.92
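Given Table 1, each metasurface pixel can be realized by picking the nanobrick whose simulated phase pair (δx, δy) is closest, modulo 2π, to the pair required locally by the two target phase profiles. The helper below is an illustrative selection rule: the function names are ours, and nearest-wrapped-phase matching is a common design heuristic rather than necessarily the authors' exact procedure.

```python
import numpy as np

# Table 1 rows: (a_nm, b_nm, delta_x, t_x, delta_y, t_y)
NANOBRICKS = np.array([
    ( 78, 160,  3.07, 0.85, 0.62, 0.40),
    ( 82, 156, -2.81, 0.83, 0.32, 0.74),
    ( 86, 152, -2.32, 0.83, 0.36, 0.68),
    ( 88, 150, -2.11, 0.88, 0.44, 0.53),
    ( 92, 160, -1.46, 0.92, 0.23, 0.95),
    ( 96, 160, -1.10, 1.00, 0.38, 0.94),
    (104,  40,  1.48, 0.96, 0.69, 0.98),
    (104, 158, -0.60, 0.96, 0.53, 0.91),
    (112, 152, -0.24, 0.99, 0.49, 0.91),
    (116,  40,  1.86, 0.95, 0.72, 0.98),
    (126, 144,  0.18, 0.98, 0.50, 0.92),
    (130,  40,  2.33, 0.97, 0.76, 0.98),
    (140,  40,  2.73, 0.90, 0.79, 0.98),
    (144, 136,  0.62, 0.96, 0.50, 0.92),
    (152,  40, -3.14, 0.78, 0.82, 0.98),
    (160, 130,  1.03, 0.85, 0.47, 0.92),
])

def phase_distance(target, actual):
    """Absolute phase difference wrapped to [0, pi]."""
    return np.abs(np.angle(np.exp(1j * (target - actual))))

def pick_nanobrick(target_dx, target_dy):
    """Index of the nanobrick whose (delta_x, delta_y) best matches the required pair."""
    err = (phase_distance(target_dx, NANOBRICKS[:, 2])
           + phase_distance(target_dy, NANOBRICKS[:, 4]))
    return int(np.argmin(err))
```

Applying `pick_nanobrick` to the XLP and YLP target phase maps of Figure 3a,b yields a per-pixel layout of cross sizes (a, b), which is how a single polarization-dependent structure can encode both the bright-field and edge-enhancement profiles.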
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Yan, R.; Wang, W.; Hu, Y.; Hao, Q.; Bian, L. Polarization-Dependent Metasurface Enables Near-Infrared Dual-Modal Single-Pixel Sensing. Nanomaterials 2023, 13, 1542. https://doi.org/10.3390/nano13091542

