Article

Enhanced Single-Beam Multiple-Intensity Phase Retrieval Using Holographic Illumination

Cheng Xu, Hui Pang, Axiu Cao and Qiling Deng
1 School of Physics, University of Electronic Science and Technology of China, Chengdu 610054, China
2 Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu 610209, China
* Author to whom correspondence should be addressed.
Photonics 2022, 9(3), 187; https://doi.org/10.3390/photonics9030187
Submission received: 20 January 2022 / Revised: 4 March 2022 / Accepted: 12 March 2022 / Published: 15 March 2022
(This article belongs to the Topic Optical and Optoelectronic Materials and Applications)

Abstract

Single-beam multiple-intensity iterative phase retrieval is a high-precision, lens-free computational imaging method that reconstructs the complex-valued distribution of an object from a volume of axially captured diffraction intensities using a post-processing algorithm. However, for objects with slowly varying wavefronts, the method may suffer from convergence stagnation owing to the lack of diversity between the captured intensity patterns. In this paper, a novel technique to enhance phase retrieval using holographic illumination is proposed. A special computer-generated hologram is designed that generates multiple significantly different images at the required distances. The incident plane wave is first modulated by the hologram, and the exit wave is then used to illuminate the object. Benefiting from this holographic illumination, remarkable intensity changes are produced in the given detector planes, which is conducive to fast and high-accuracy reconstruction. Simulations and optical experiments are performed to verify the feasibility of the proposed method.

1. Introduction

Structural properties of an object, such as curvature, micro-shape, and layer depth, are carried by the phase information. However, because light waves oscillate at extremely high frequencies, general detectors such as charge-coupled devices (CCDs) are sensitive only to intensity. The phase lost in the detection process can be retrieved by reconstruction methods, which fall mainly into three categories: interferometry [1,2], the transport of intensity equation (TIE) [3,4], and iterative phase retrieval algorithms (IPR) [5,6]. Interferometry adds a reference beam to the imaging system, and the strict registration required between the object beam and the reference beam places demands on the stability of the experimental environment. TIE uses the intensity difference of the target beam along the optical axis to solve for the 2D phase distribution; however, it is not suitable for general complex wavefronts. Compared with the former two methods, IPR places the emphasis on the reconstruction algorithm while keeping the imaging system simple. Considerable progress in IPR has sparked a revolution in applications such as super-resolution imaging [7,8], wavefront sensing [9,10], and optical encryption [11]. IPR was first proposed by Gerchberg and Saxton [12]. In this method, the target wavefront propagates alternately between the object plane and the imaging plane, and the corresponding amplitude constraint is imposed on each plane. The process is repeated until the difference between the calculated result and the desired intensity distribution reaches an acceptable error. However, this method uses only a single recorded light field and is very sensitive to the initial phase guess, resulting in poor imaging quality and slow reconstruction, which limits its further application.
To tackle this problem, multiple-intensity phase retrieval was proposed. As a representative imaging modality, the ptychographical iterative engine (PIE) [13] can reconstruct the complete complex amplitude distribution of a sample by laterally scanning a probe over multiple overlapping regions. The large data redundancy ensures the convergence and robustness of PIE. However, PIE requires several tens of images to be captured and takes a long time to record, so it is not suitable for fast-moving or rapidly changing samples. Unlike lateral scanning, single-beam multiple-intensity reconstruction (SBMIR) records a smaller number of diffraction intensities along the optical axis and is still able to reconstruct the wavefront. SBMIR is a lower-cost, more compact, and procedurally simpler method that has been successfully applied in coherent diffraction imaging [14], lens-free imaging [15,16], etc.
In SBMIR, the convergence speed is determined by the intensity differences among the captured diffraction patterns. Hence, it is difficult to recover objects with slowly varying wavefronts. To deal with this problem, many methods have been proposed, such as speckle illumination [17,18,19], where a diffuser is placed upstream of the sample to introduce a speckle field that enhances the diversity of the axial measurements. Similar methods, such as microlens array modulation [20], multimode optical fiber illumination [21], and spherical wave illumination [22,23], were proposed to achieve rapid changes of the diffraction field within a short distance. In addition, algorithmic improvements such as disordered wavefront propagation [24], non-equal-interval propagation [25], adaptive support [26], and relaxed constraints [27] were also proposed to overcome convergence stagnation.
In this paper, an enhanced single-beam multiple-intensity phase retrieval method is proposed. Instead of plane wave illumination, holographic illumination is used to provide considerable intensity variation in the axial direction, which is beneficial to stable and unique phase retrieval without ambiguity or stagnation. The idea of holographic illumination is inspired by the principle of 3D holographic projection [28,29,30], in which a single hologram can generate intensity distributions on multiple planes.
In this work, several holograms are designed to generate different types of images so as to compare their capability of reconstructing the complex amplitude light field. The normalized cross-correlation (NCC) is introduced to reveal how the images used to generate the holograms should be selected. The effects of various parameters on the reconstruction accuracy and speed of the proposed method are also investigated. Simulations show that our method can successfully recover both rough and smooth objects. In experiments, amplitude and phase objects are used to verify the method.

2. Principle

Figure 1 shows the imaging system of the phase retrieval method with holographic illumination. A collimated incident beam illuminates the hologram, and the exit wave propagates forward to illuminate the sample. A CCD camera is first located downstream of the sample at a distance of z0. After one recording, the camera is moved by a fixed interval to capture the next diffraction pattern. These measurement steps are repeated until the nth pattern is recorded. Using an iterative phase retrieval algorithm, the complex amplitude of the sample can be reconstructed from the n collected diffraction intensities. The function of the hologram is also illustrated at the top of Figure 1: when a laser beam passes through this hologram, a series of images is generated at certain distances. It is worth noting that the positions of the n projected images coincide with the n recording planes one by one, which introduces significant intensity changes into the recorded diffraction patterns of the sample.
In order to produce different types of images, two hologram design methods are adopted. One is the random superposition method [31]: the target amplitude in each plane is multiplied by a random phase and then back-propagated to the hologram plane. This technique is well suited to generating a series of simple images composed of points or lines, such as Group 1 and Group 2 in Figure 3a. When a hologram is designed to produce complicated images such as Group 3, another method called noniterative projection (NIP) should be employed [32]. In this method, the estimated phase hologram passes through each plane one by one, and a relaxed constraint is imposed on the estimated field. The wave-field is then propagated backward through each plane using the same procedure. The final hologram is obtained from the phase of the wave-field at the hologram plane. The above diffraction propagation is implemented using the angular spectrum method:
U_n'(x,y) = \mathcal{F}^{-1}\left\{ \mathcal{F}\{U_n(x,y)\}\, \exp\!\left[ j\,2\pi Z \sqrt{1/\lambda^2 - f_x^2 - f_y^2} \right] \right\},
where F{·} and F⁻¹{·} denote the Fourier transform and its inverse, respectively, λ is the wavelength of the incident laser beam, f_x and f_y are the spatial frequencies, U_n(x,y) is the wave-field to be propagated, U_n′(x,y) is the propagated wave-field, and Z is the propagation distance.
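As an illustration (not the authors' code), a minimal NumPy sketch of the angular spectrum propagator defined above, together with the random superposition hologram design described earlier, is given below; the function names propagate and random_superposition_hologram and the suppression of evanescent components are our assumptions.

```python
import numpy as np

def propagate(u, z, wavelength, dx):
    """Angular spectrum propagation of a complex field u over distance z.

    u          : 2D complex array (square), sampled with pitch dx [m]
    z          : propagation distance [m] (negative z propagates backward)
    wavelength : wavelength [m]
    """
    n = u.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(fx, fx)
    # Transfer function H = exp(j*2*pi*Z*sqrt(1/lambda^2 - fx^2 - fy^2));
    # evanescent components (negative square-root argument) are suppressed.
    arg = 1.0 / wavelength**2 - fx**2 - fy**2
    h = np.exp(1j * 2 * np.pi * z * np.sqrt(np.maximum(arg, 0.0)))
    h[arg < 0] = 0.0
    return np.fft.ifft2(np.fft.fft2(u) * h)

def random_superposition_hologram(targets, distances, wavelength, dx):
    """Random superposition design: each target amplitude is multiplied by a
    random phase and back-propagated to the hologram plane; the phase of the
    summed field is kept as the phase-only hologram."""
    field = np.zeros_like(targets[0], dtype=complex)
    for amp, z in zip(targets, distances):
        rand_phase = np.exp(1j * 2 * np.pi * np.random.rand(*amp.shape))
        field += propagate(amp * rand_phase, -z, wavelength, dx)
    return np.angle(field)  # phase-only hologram
```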
With the known hologram distribution Holo(x,y) and the recorded diffraction intensities of the sample I1, I2, …, In, phase retrieval proceeds as follows (a code sketch of the full loop is given after the list). For simplicity, wave propagation is abbreviated as Prop{U, Z}, where U is the wave-field and Z is the propagation distance.
(1)
The plane wave passes through the hologram and propagates to the sample over a diffraction distance of z − z0. The initial guess of the sample is given by U0, with constant amplitude and phase. The exit wave-field from the sample can then be expressed as US = U0 ∗ Prop{Holo, z − z0}.
(2)
The transmitted wavefront US propagates forward to the first recording plane, where the complex amplitude can be written as U1 = Prop{US, z0}. The phase is kept and the amplitude is replaced by the square root of the measured intensity, giving the modified light field U1′ = I1^(1/2) ∗ exp[j·angle{U1}].
(3)
The updated wave-field U1′ passes through each recording plane one by one, and the same constraint is applied at each plane.
(4)
The light field Un′ at the last recording plane is back-propagated to the sample plane over a distance of z0 + (n − 1) ∗ d. The estimated complex wave-field at the sample plane can be written as US′ = Prop{Un′, −[z0 + (n − 1) ∗ d]}.
(5)
The modulation of the hologram is then removed. The distribution of the sample is updated by U0′ = US′ ./ Prop{Holo, z − z0}, and one round of iteration is completed.
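For concreteness, the following is a minimal Python sketch of steps (1)–(5), reusing the propagate function sketched above; the function name retrieve, the fixed iteration count, and the plain element-wise division used to remove the illumination are our assumptions rather than the authors' implementation.

```python
import numpy as np

def retrieve(holo, intensities, z, z0, d, wavelength, dx, n_iter=200):
    """Sketch of SBMIR with holographic illumination.

    holo        : complex transmission of the hologram, e.g. exp(j*phase)
    intensities : list [I1, ..., In] of recorded diffraction patterns
    z           : hologram-to-first-recording-plane distance [m]
    z0          : sample-to-first-recording-plane distance [m]
    d           : spacing between successive recording planes [m]
    """
    n = len(intensities)
    # Illumination field at the sample plane (step 1).
    illum = propagate(holo, z - z0, wavelength, dx)
    sample = np.ones_like(holo, dtype=complex)   # initial guess U0
    for _ in range(n_iter):
        u = sample * illum                       # exit wave US
        for k in range(n):
            dz = z0 if k == 0 else d             # steps 2-3: plane to plane
            u = propagate(u, dz, wavelength, dx)
            # Keep the phase, replace the amplitude by the measurement.
            u = np.sqrt(intensities[k]) * np.exp(1j * np.angle(u))
        # Step 4: back-propagate from the last plane to the sample plane.
        u = propagate(u, -(z0 + (n - 1) * d), wavelength, dx)
        # Step 5: remove the holographic illumination (a small regularizer
        # may be needed in practice where the illumination is weak).
        sample = u / illum
    return sample
```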
The iteration is terminated when the difference between the desired target and the reconstructed result reaches an acceptable error, which can be evaluated by the structural similarity index measure (SSIM):
\mathrm{SSIM} = \frac{(2 u_T u_R + c_1)(2 \sigma_{T,R} + c_2)}{(u_T^2 + u_R^2 + c_1)(\sigma_T^2 + \sigma_R^2 + c_2)},
where u_T and u_R are the mean values of the target and the reconstructed result, respectively, σ_{T,R} is their covariance, σ_T and σ_R are the standard deviations of the target and the reconstructed result, respectively, and c1 and c2 are small constants that avoid division by zero.
Peak signal-to-noise ratio (PSNR) is also introduced to evaluate the reconstructed image quality:
\mathrm{PSNR} = 10 \log_{10}\!\left\{ \frac{m^2\, i_{\max}^2}{\sum_x \sum_y \left[ i(x,y) - i'(x,y) \right]^2} \right\},
where m is the image size, i(x,y) is the target image, i_max is the maximal value of i(x,y), and i′(x,y) is the reconstructed image. Higher SSIM and PSNR values indicate better reconstruction quality.
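The two metrics can be evaluated, for example, as follows; this is a minimal sketch of the global (single-window) SSIM and the PSNR as defined above, and the particular values of c1 and c2 are our assumptions.

```python
import numpy as np

def ssim_global(target, recon, c1=1e-4, c2=1e-4):
    """Global SSIM as defined above (one window over the whole image);
    c1 and c2 only guard against division by zero."""
    ut, ur = target.mean(), recon.mean()
    st, sr = target.std(), recon.std()
    cov = ((target - ut) * (recon - ur)).mean()
    return ((2 * ut * ur + c1) * (2 * cov + c2)) / \
           ((ut**2 + ur**2 + c1) * (st**2 + sr**2 + c2))

def psnr(target, recon):
    """PSNR following the definition above: m^2 * i_max^2 over the summed
    squared error, in decibels (m is the image side length)."""
    m = target.shape[0]
    mse_sum = np.sum((target - recon) ** 2)
    return 10 * np.log10(m**2 * target.max() ** 2 / mse_sum)
```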

3. Simulation

To investigate the performance of the proposed method, two types of objects are used for simulation, as shown in Figure 2. One is a rough object with the “baboon” image as the amplitude and the “peppers” image as the phase. The amplitude is normalized and the phase is scaled to the range [0, π]. The object is sampled with 200 × 200 pixels and zero-padded to 300 × 300 pixels. The other is a smooth object, which consists of a constant amplitude of 300 × 300 pixels and a vortex phase of 200 × 200 pixels in the center. The topological charge of the optical vortex is set to two. The sampling interval of the object plane and the recording planes is 7.4 μm. The initial distance between the sample and the recording plane is z0 = 10 mm, the plane interval is d = 2 mm, the number of recording planes is n = 4, and the wavelength of the laser is 658 nm.
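For reference, the smooth test object could be generated under the stated parameters roughly as sketched below (constant amplitude, centered 200 × 200 vortex phase of topological charge two); the variable names are ours.

```python
import numpy as np

# Sketch of the smooth test object: unit amplitude over 300 x 300 pixels with a
# centered 200 x 200 vortex phase of topological charge two (the stated sampling
# interval is 7.4 um per pixel).
N, M, charge = 300, 200, 2
y, x = np.mgrid[-M // 2:M // 2, -M // 2:M // 2]
vortex = np.mod(charge * np.arctan2(y, x), 2 * np.pi)  # vortex phase in [0, 2*pi)
phase = np.zeros((N, N))
pad = (N - M) // 2
phase[pad:pad + M, pad:pad + M] = vortex
smooth_object = np.exp(1j * phase)                     # constant (unit) amplitude
```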
First, three holograms are designed to produce three groups of images as different holographic illuminations (HI) in order to reveal the influence of the hologram target images on the phase retrieval capability. The images in Group 1 are simple and contain only one point at different positions. By contrast, the images in Group 2 and Group 3 are more complicated, and the images of Group 2 contain more zero values. The distance between the hologram and the first projection plane is z = 100 mm. The sampling interval and resolution of the hologram are the same as those of the object. The obtained holograms and their reconstructed images are shown in Figure 3b.
When the vortex phase object is illuminated through these holograms, the recorded diffraction patterns are as presented in Figure 4. The three holographic illuminations are denoted HI-group1, HI-group2, and HI-group3. For comparison, the captured intensity distributions under plane wave illumination (PWI) are also given in Figure 4. Under PWI, the diffraction patterns show no obvious change. In contrast, a significant intensity difference can be observed in the HI modes, since the diffracted light of the sample is superimposed with the modulated light from the hologram. The diffraction patterns of object 1 are not shown, owing to the larger visually apparent intensity changes that already exist in both the HI and PWI modes for this rough object.
The iteration convergence curves and the reconstructed results of the two objects are shown in Figure 5. It can be clearly seen that, as the number of iterations increases, the three proposed HI modes converge much faster than the traditional PWI for both objects. After 100 iterations, PWI can only reconstruct the contour of the rough object and cannot recover the smooth vortex phase. Using HI, both objects are well retrieved. Among them, HI-group3 has the fastest convergence rate and the highest accuracy, followed by HI-group2 and then HI-group1. To compare the results quantitatively, Table 1 is presented. The mean normalized cross-correlation (MNCC) is introduced as a new parameter to represent the similarity of the diffraction patterns of the sample, defined by:
\mathrm{MNCC} = \frac{1}{n}\left[ \mathrm{corr2}(I_1, I_2) + \cdots + \mathrm{corr2}(I_{i-1}, I_i) + \mathrm{corr2}(I_i, I_1) \right],
where i = 1, 2, …, n, and the operator corr2(·) computes the 2D cross-correlation coefficient. The smaller the MNCC value, the more pronounced the change in diffraction intensity. Table 1 shows that the MNCC value of PWI is around 0.9, meaning the differences among the recorded intensities are very small. In contrast, the MNCC values of the three HI methods are all less than 0.5, indicating large differences between the recorded intensity maps. Moreover, HI-group3 has the smallest MNCC, which suggests that complex target images are more suitable for holographic illumination. The iteration time in Table 1 represents the number of iterations required to achieve convergence. For the rough object, traditional PWI needed 2687 iterations, while HI required at most 246 iterations, a more than tenfold speed-up; with HI-group3 the speed-up even exceeds thirty times. For the vortex phase, PWI cannot reconstruct the correct result because the differences between the recorded images are too small, and the SSIM of the reconstructed image is only 0.1661. In contrast, the SSIM values of the reconstruction results with the HI methods are all greater than 0.9. From the relationship of the above three parameters in Table 1, it can be concluded that a small MNCC value yields the highest SSIM with fewer iterations, which proves that MNCC is a feasible measure of the diversity of intensity variations. It also provides a numerical index for the design of the hologram: holograms should be selected with a small MNCC value.
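A minimal sketch of the MNCC computation defined above follows; corr2 is implemented here as the 2D correlation coefficient (as in MATLAB's corr2), which is our assumption about the intended operator.

```python
import numpy as np

def corr2(a, b):
    """2D correlation coefficient of two intensity maps."""
    a0, b0 = a - a.mean(), b - b.mean()
    return np.sum(a0 * b0) / np.sqrt(np.sum(a0**2) * np.sum(b0**2))

def mncc(intensities):
    """Mean NCC over consecutive recorded intensities, closing the cycle with
    the pair (I_n, I_1), following the definition above."""
    n = len(intensities)
    pairs = [(intensities[k], intensities[(k + 1) % n]) for k in range(n)]
    return sum(corr2(a, b) for a, b in pairs) / n
```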
We also investigate the influence of the number of measurement planes on the reconstruction performance. Figure 6a,b show the phase convergence curves of object 1 and object 2, respectively, for different numbers of captured diffraction patterns, with HI-group3 adopted. When the number of measurements is two, the captured intensities cannot provide enough redundant information for the algorithm to converge. When three or more planes are used, convergence is ensured, and the more planes are used, the faster the convergence. In general, the convergence speed and imaging quality can be guaranteed by recording at least four planes. Figure 6c shows the phase convergence accuracy of object 1 under different illumination modes and different numbers of measurement planes after 100 iterations. With an increasing number of intensity measurements, the convergence accuracy also improves; however, for plane wave illumination the SSIM remains very low. Increasing the number of planes may help, but it also increases the measurement time. Figure 6e shows the difference between the reconstructed and ideal phase of object 1 using HI-group3, with a maximum error on the order of 10−3 rad. In comparison, the maximum phase error with the PWI method, shown in Figure 6f, is 1.0134 rad. For the vortex phase object, the reconstructed phase error using PWI shown in Figure 6h is up to 2.9818 rad, while the error with HI-group3 shown in Figure 6g is on the order of 10−9 rad, which proves the effectiveness of our method.
Subsequently, the influence of the measurement interval d on the reconstruction result is discussed. Four recording planes are used. Figure 7a,c show the phase convergence accuracy of object 1 and object 2 under different types of illumination at intervals of d = 1, 1.5, 2, 2.5, and 3 mm. When the measurement interval is greater than 2 mm, HI can reconstruct the phase accurately, while PWI needs an interval of more than 7 mm to reconstruct object 1 accurately and fails for object 2 at every interval. Figure 7b,d show the evolution of the reconstructed image and the SSIM value with the measurement interval and the number of iterations using HI-group3. From left to right, as the number of iterations increases from 10 to 100, the reconstruction quality at each interval improves. Similarly, from top to bottom, as the measurement interval grows from 1 to 3 mm for the same number of iterations, the accuracy of the recovered phase also improves gradually. The rightmost column shows the reconstruction results of PWI for comparison; these images are blurry, indicating that our method can successfully recover the sample with fast calculation and short measurement distances.

4. Experiments

To further prove the effectiveness of the proposed method, experiments are conducted with the imaging setup shown in Figure 8. A collimated laser beam with a wavelength of 658 nm from a parallel light tube is incident on the SLM (HOLOEYE PLUTO VIS, 1920 × 1080 pixels with a pixel size of 8 μm) after passing through an aperture and a polarizer. A computer-generated hologram, designed as described above, is loaded on the SLM; the modulated light from the SLM illuminates the sample through a beam-splitting prism and then projects onto a CCD (HR16000CTLGEC, 8-bit, 4896 × 3248 pixels with a pixel size of 7.4 μm). The CCD is mounted on a translation stage and moved with an interval step of 2 mm; four diffraction intensity patterns are recorded. The initial distance between the sample and the CCD is 90.3 mm, measured by the principle of grating diffraction [33]. The distance between the SLM and the CCD is 200 mm, calibrated from the position of the sharpest first holographic projection plane.
A “rabbit” pattern is first used as the amplitude sample, and the reconstruction results are shown in Figure 9. In Figure 9a, as the number of iterations increases, the original method using PWI still suffers from image blur, and the improvement in image quality is not significant. With the proposed method using HI, the result after 60 iterations is visually better than that of the conventional method after 300 iterations. The contour of the rabbit is clearly formed from the beginning with HI, and increasing the number of iterations effectively reduces the speckle noise. To compare the two methods quantitatively, the marked lines of the results at 300 iterations are extracted and plotted in Figure 9b. The recovered result of the proposed method shows a higher imaging contrast. In addition, the logarithm of the mean square error (LMSE) is adopted to evaluate the convergence of the two methods; it is calculated from the difference between the square root of the recorded intensity and the retrieved amplitude on the first plane, and a lower LMSE value indicates better reconstruction quality. The LMSE curves of the two methods are shown in Figure 9c: as the number of iterations increases, the reconstruction error decreases with HI, while it remains almost unchanged with PWI, which proves the effectiveness of our method.
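As an illustration, the LMSE can be computed as sketched below; the use of a base-10 logarithm and the function name lmse are our assumptions.

```python
import numpy as np

def lmse(recorded_intensity, retrieved_field):
    """Log of the mean square error between the measured amplitude (square root
    of the intensity recorded on the first plane) and the amplitude of the
    retrieved field propagated to that plane (base-10 log assumed here)."""
    err = np.sqrt(recorded_intensity) - np.abs(retrieved_field)
    return np.log10(np.mean(err ** 2))
```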
Secondly, a quantified phase resolution plate is tested to evaluate the ability to retrieve a smooth wavefront. The retrieved phase distributions of the sample with the traditional method and the proposed method are shown in Figure 10a,b, respectively. Group 4 is enlarged at the bottom, and the cross-section phase distributions of groups 4-2 and 4-3 are plotted in Figure 10c,d, respectively. The lines in group 4-1, with a resolution of 31.1 μm, can be distinguished with the original PWI method, whereas the lines in group 4-3, with a resolution of 24.8 μm, can be discerned with the proposed HI method. Moreover, in order to evaluate the accuracy of phase recovery quantitatively, the lines recovered in groups 3-5 and 3-3 are selected and the corresponding phase values are converted into depth values. The phase value φ and the depth value h are related by φ = 2πh(1.4565 − 1)/λ, where 1.4565 is the refractive index of the phase plate. The real depth measured by a stylus profiler (Dektak XT, Bruker, Karlsruhe, Germany) is 477 nm. The two sets of depth data are shown in Figure 10e,f. The retrieved depths marked by the red dashed lines using HI are 454.4, 455.7, and 457.7 nm, corresponding to a depth error of about 4.5%, while the depth recovered by PWI is clearly much smaller than the true value and its error is larger. It is worth noting that the reconstructed results could be influenced by noise in the experiment, such as Gaussian, speckle, and Poisson noise. In addition, the displacement accuracy of the motorized translation stage and the dark current noise of the CCD affect the imaging results. We will strive to overcome these limitations in future work.
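As a back-of-the-envelope check of the phase-to-depth conversion (our own arithmetic, not the authors' processing script):

```python
import numpy as np

# Phase-to-depth conversion: phi = 2*pi*h*(n - 1)/lambda, with n = 1.4565 and
# lambda = 658 nm.
wavelength, n_plate = 658e-9, 1.4565

def phase_from_depth(h):
    return 2 * np.pi * h * (n_plate - 1) / wavelength

def depth_from_phase(phi):
    return phi * wavelength / (2 * np.pi * (n_plate - 1))

print(phase_from_depth(477e-9))      # ~2.08 rad for the 477 nm step measured by the profiler
print(depth_from_phase(1.99) * 1e9)  # ~456 nm, close to the retrieved depths above
```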

5. Discussion

To date, a number of hologram design methods for generating multiple planar images have been reported, such as noniterative projection (NIP), sequential GS (SGS) [34], global GS (GGS) [35], IFTA [36], non-convex optimization [37], binary optimization [38], and others [39,40]. To reveal the influence of the hologram reconstruction accuracy on multi-distance phase retrieval, three hologram design methods are employed and compared. The parameters are the same as in the simulation section, and the number of iterations in SGS and GGS is 100. Figure 11 shows the holographic reconstruction results. To the naked eye, the reconstruction results of NIP and SGS show large speckle noise, whereas the results of GGS are clearer and have higher image contrast. In terms of the NCC value, the reconstruction accuracy from low to high is NIP, SGS, and GGS.
Then, the above three holograms are used as holographic illumination to reconstruct the sample (the vortex phase in Figure 2b). The iteration convergence curves are shown in Figure 12. All the methods make the reconstruction algorithm converge, and the convergence speeds differ only slightly. To compare the reconstruction performance quantitatively, the MNCC of the recorded intensities, the iteration time, and the SSIM of the reconstructed sample are shown in Table 2. The hologram designed by the GGS algorithm yields an MNCC value of the recorded diffraction intensities as low as 0.0590, meaning that the differences between the intensities are obvious, which leads to rapid convergence: it takes only 117 iterations to converge, while NIP takes 140. Holograms with higher reconstruction quality thus enable faster phase recovery, but the improvement is slight. Therefore, general methods can be used to design the holograms for holographic illumination.

6. Conclusions

In conclusion, a new phase retrieval method using holographic illumination is proposed. The hologram is used to generate significant variation between the measurements. Compared with the original plane wave illumination method, the proposed method offers faster reconstruction, better imaging quality, and the ability to reconstruct smooth phase objects. The present method opens new possibilities for other phase retrieval methods based on multi-intensity modulation.

Author Contributions

Conceptualization, C.X. and H.P.; Methodology, C.X. and H.P.; Software, C.X. and A.C.; Validation, C.X.; Writing—original draft preparation, C.X.; Writing—review and editing, H.P.; Supervision, Q.D.; Funding acquisition, H.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by National Natural Science Foundation of China (61905251).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Schwider, J.; Burow, R.; Elssner, K.-E.; Grzanna, J.; Spolaczyk, R.; Merkel, K. Digital wave-front measuring interferometry: Some systematic error sources. Appl. Opt. 1983, 22, 3421–3432.
  2. Han, G.; Kim, S. Numerical correction of reference phases in phase-shifting interferometry by iterative least-squares fitting. Appl. Opt. 1994, 33, 7321–7325.
  3. Gorthi, S.S.; Schonbrun, E. Phase imaging flow cytometry using a focus-stack collecting microscope. Opt. Lett. 2012, 37, 707–709.
  4. Wild, W.J. Linear phase retrieval for wave-front sensing. Opt. Lett. 1998, 23, 573–575.
  5. Latychevskaia, T. Three-Dimensional Structure from Single Two-Dimensional Diffraction Intensity Measurement. Phys. Rev. Lett. 2021, 127, 063601.
  6. Latychevskaia, T. Iterative phase retrieval for digital holography: Tutorial. J. Opt. Soc. Am. A 2019, 23, D31–D40.
  7. Katkovnik, V.; Shevkunov, I.; Petrov, N.V.; Egiazarian, K. Computational super-resolution phase retrieval from multiple phase-coded diffraction patterns: Simulation study and experiments. Optica 2017, 4, 786–794.
  8. Kocsis, P.; Shevkunov, I.; Katkovnik, V.; Egiazarian, K. Single exposure lensless subpixel phase imaging: Optical system design, modelling, and experimental study. Opt. Lett. 2020, 28, 4625–4637.
  9. Wang, B.Y.; Han, L.; Yang, Y.; Yue, Q.Y.; Guo, C.S. Wavefront sensing based on a spatial light modulator and incremental binary random sampling. Opt. Lett. 2018, 42, 603–606.
  10. Li, R.; Cao, L. Complex wavefront sensing based on alternative structured phase modulation. Appl. Opt. 2021, 60, A48–A53.
  11. Velez-Zea, A.; Ramirez, J.F.B.; Torroba, R. Optimized random phase encryption. Opt. Lett. 2018, 43, 3558–3561.
  12. Gerchberg, R.W.; Saxton, W.O. A practical algorithm for the determination of phase from image and diffraction plane pictures. Optik 1972, 35, 237–250.
  13. Rodenburg, J.M.; Faulkner, H.M.L. A phase retrieval algorithm for shifting illumination. Appl. Phys. Lett. 2004, 85, 4795–4797.
  14. Shen, C.; Tan, J.; Wei, C.; Liu, Z. Coherent diffraction imaging by moving a lens. Opt. Express 2016, 24, 16520–16529.
  15. Pedrini, G.; Osten, W. Wave-front reconstruction from a sequence of interferograms recorded at different planes. Opt. Lett. 2005, 30, 833–835.
  16. Guo, C.; Zhao, Y.; Tan, J.; Liu, S.; Liu, Z. Adaptive lens-free computational coherent imaging using autofocusing quantification with speckle illumination. Opt. Express 2018, 26, 14407–14420.
  17. Anand, A.; Chhaniwal, V.K.; Almoro, P.E.; Pedrini, G.; Osten, W. Shape and deformation measurements of 3D objects using volume speckle field and phase retrieval. Opt. Lett. 2009, 34, 1522–1524.
  18. Almoro, P.E.; Gundu, P.N.; Hanson, S.G. Numerical correction of aberrations via phase retrieval with speckle illumination. Opt. Lett. 2009, 34, 521–523.
  19. Almoro, P.E.; Pedrini, G.; Gundu, P.N.; Osten, W. Enhanced wavefront reconstruction by random phase modulation with a phase diffuser. Opt. Lasers Eng. 2011, 49, 252–257.
  20. Yazdani, R.; Fallan, H. Wavefront sensing for a Shack–Hartmann sensor using phase retrieval based on a sequence of intensity patterns. Appl. Opt. 2017, 56, 1358–1364.
  21. Liu, Y.; Liu, Q.; Li, Y.; Zhang, J.; He, Z. High-resolution multi-planar coherent diffraction imaging with multimode fiber source. Opt. Lasers Eng. 2021, 140, 106530.
  22. Claus, D.; Pedrini, G.; Osten, W. Iterative phase retrieval based on variable wavefront curvature. Appl. Opt. 2017, 56, 134–137.
  23. He, X.; Veetil, S.P.; Jiang, Z.; Kong, Y.; Wang, S.; Liu, C. High-speed coherent diffraction imaging by varying curvature of illumination with a focus tunable lens. Opt. Express 2020, 28, 25655–25663.
  24. Binamira, J.F.; Almoro, P.E. Accelerated single-beam multiple-intensity reconstruction using unordered propagations. Opt. Lett. 2019, 44, 3130–3133.
  25. Xu, C.; Yuan, W.; Cao, A.; Xue, L.; Deng, Q.; Pang, H.; Fu, Y. Enhancing multi-distance phase retrieval via unequal interval measurements. Photonics 2021, 48, 8020048.
  26. Buco, C.R.L.; Almoro, P.E. Enhanced multiple-plane phase retrieval using adaptive support. Opt. Lett. 2019, 44, 6045–6048.
  27. Falaggis, K.; Kozacki, T.; Kujawinska, M. Accelerated single-beam wavefront reconstruction techniques based on relaxation and multiresolution strategies. Opt. Lett. 2013, 38, 1660–1662.
  28. Dorsch, R.G.; Lohmann, A.W.; Sinzinger, S. Fresnel ping-pong algorithm for two-plane computer-generated hologram display. Appl. Opt. 1994, 33, 869–875.
  29. Hernandez, O.; Papagiakoumou, E.; Tanese, D.; Fidelin, K.; Wyart, C.; Emiliani, V. Three-dimensional spatiotemporal focusing of holographic patterns. Nat. Commun. 2016, 7, 11928.
  30. Wakunami, K.; Hsieh, P.Y.; Oi, R.; Senoh, T.; Sasaki, H.; Ichihashi, Y.; Okui, M.; Huang, Y.P.; Yamamoto, K. Projection-type see-through holographic three-dimensional display. Nat. Commun. 2016, 7, 12954.
  31. Leseberg, D. Computer-generated three-dimensional image holograms. Appl. Opt. 1992, 31, 223–229.
  32. Velez-Zea, A.; Torroba, R. Noniterative multiplane holographic projection. Appl. Opt. 2020, 59, 4377–4384.
  33. Xu, C.; Pang, H.; Cao, A.; Deng, Q. Enhanced multiple-plane phase retrieval using a transmission grating. Opt. Lasers Eng. 2022, 149, 106810.
  34. Makowski, M.; Sypek, M.; Kolodziejczyk, A.; Mikula, G.; Suszek, J. Iterative design of multiplane holograms: Experiments and applications. Opt. Eng. 2007, 46, 045802.
  35. Piestun, R.; Spektor, B.; Shamir, J. Wave fields in three dimensions: Analysis and synthesis. J. Opt. Soc. Am. A 1996, 1837–1848.
  36. Makey, G.; Yavuz, Ö.; Kesim, D.K.; Turnalı, A.; Elahi, P.; Ilday, S.; Tokel, O.; Ilday, F.Ö. Breaking crosstalk limits to dynamic holography using orthogonality of high-dimensional random vectors. Nat. Photonics 2019, 13, 251–256.
  37. Zhang, J.; Pegard, N.; Zhong, J.; Adesnik, H.; Waller, L. 3D computer-generated holography by non-convex optimization. Optica 2017, 4, 1306–1313.
  38. Lee, B.; Kim, D.; Lee, S.; Chen, C.; Lee, B. High-contrast, speckle-free, true 3D holography via binary CGH optimization. Sci. Rep. 2022, 12, 2811.
  39. Zhang, H.; Zhou, C.; Shui, X.; Yu, Y. Computer-generated full-color phase-only hologram using a multiplane iterative algorithm with dynamic compensation. Appl. Opt. 2022, 61, B262–B270.
  40. Pi, D.; Liu, J. Computer-generated hologram based on reference light multiplexing for holographic display. Appl. Sci. 2021, 11, 7199.
Figure 1. Imaging system of the proposed phase retrieval method.
Figure 2. Two test objects: (a) rough object and (b) smooth object.
Figure 3. Three types of patterns: (a) the target images of the holograms; (b) the holograms and their reconstruction results.
Figure 4. The recorded diffraction patterns of the vortex phase under different types of illumination.
Figure 5. The iteration convergence curves and the reconstructed results under different types of illumination. (a,b) are the reconstruction results of the amplitude and the phase of object 1, respectively.
Figure 6. The reconstruction performance with different numbers of intensity measurements. (a,b) are the phase convergence curves of object 1 and object 2, respectively, using HI-group3. (c,d) are the phase convergence accuracy of object 1 and object 2, respectively, in terms of the number of intensity measurements for different types of illumination. (e,f) are the differences between the simulated and reconstructed phases of object 1 using HI-group3 and PWI, respectively. (g,h) are the differences between the simulated and reconstructed phases of object 2 using HI-group3 and PWI, respectively.
Figure 7. The reconstruction performance with different measurement intervals. (a,c) are the phase convergence accuracy curves of object 1 and object 2 in terms of the sequential measurement distance for different types of illumination. (b,d) are the reconstructed phase images of object 1 and object 2 as the number of iterations and the measurement interval increase.
Figure 8. The experimental imaging setup of the proposed phase retrieval.
Figure 9. The reconstruction results of the designed pattern “rabbit”. (a) The reconstructed amplitude images using the original and the proposed method after 60, 120, 150, and 300 iterations, respectively. (b) The intensity contrast along the marked lines in (a). (c) LMSE curves for an increasing number of iterations. The white bars in (a) indicate a length of 200 μm.
Figure 10. The reconstruction results of a quantified phase resolution plate. (a,b) are the reconstructed images using the original method and the proposed method, respectively. (c,d) are the cross-section phase distributions of groups 4-2 and 4-3, respectively. (e,f) are the depth values of the retrieved lines in groups 3-5 and 3-3, respectively.
Figure 11. The reconstruction results of the holograms designed by the three modified GS algorithms.
Figure 12. The iteration convergence curves of the vortex phase using the three modified GS algorithms.
Table 1. The reconstruction performance under different types of illumination.
Sample              | Performance    | PWI    | HI-Group1 | HI-Group2 | HI-Group3
Modulus (object 1)  | MNCC           | 0.8654 | 0.4076    | 0.2637    | 0.0809
                    | Iteration time | 2687   | 246       | 103       | 81
                    | SSIM (Modulus) | 1.0000 | 1.0000    | 1.0000    | 1.0000
                    | PSNR (Modulus) | 79.84  | 72.33     | 76.46     | 76.37
Phase (object 2)    | MNCC           | 0.9084 | 0.4710    | 0.3790    | 0.1127
                    | Iteration time | 8828   | 575       | 180       | 140
                    | SSIM (Phase)   | 0.4018 | 0.9484    | 1.0000    | 1.0000
                    | PSNR (Phase)   | 14.33  | 24.47     | 120.61    | 26.9
Table 2. The reconstruction performance under different types of the holographic design method.
Method | MNCC   | Iteration Times | SSIM (Phase)
NIP    | 0.1127 | 140             | 1.0000
SGS    | 0.1172 | 129             | 1.0000
GGS    | 0.0590 | 117             | 1.0000
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
