Multi-Aperture Joint-Encoding Fourier Ptychography for a Distributed System
Abstract
1. Introduction
2. Methods
2.1. Image Formation Model
2.2. Pixel Super-Resolution
- The initial estimate of the high-resolution target's intensity image is obtained by up-sampling the average of the low-resolution images captured with all coded apertures.
- Generation of low-resolution light-field estimates: the high-resolution target is first Fourier transformed, and the optical system then applies a low-pass filter, with the optical transfer function of the i-th coded aperture acting as the filter. The intensity information recorded by the detector is then obtained via an inverse Fourier transform.
- Implementation of spatial-domain intensity constraints: the amplitude of the estimated optical field is replaced with the actually captured low-resolution intensity image, while the phase information is kept unchanged, yielding the updated estimate of the optical field.
- Updating of the high-resolution target spectrum: the amplitude-replaced light field is up-sampled and transformed to the frequency domain, which is then used to update the spectral region selected by the coded aperture.
- Repetition of the updating process: Steps 2 to 4 are repeated for all acquired low-resolution images, which completes one iteration.
- Iteration until convergence: Steps 2 to 5 are repeated until the algorithm converges, completing the reconstruction of the high-resolution target spectrum. An inverse Fourier transform is then applied to obtain the high-resolution complex amplitude of the target. The goal of the iteration is to minimize the difference between the estimated light-field information and the actually acquired low-resolution images.
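The reconstruction loop above can be sketched as follows. This is an illustrative NumPy implementation under simplifying assumptions (nearest-neighbour up/down-sampling, update of the spectrum over the non-zero support of each optical transfer function); the function name and parameters are placeholders, not the authors' code:

```python
import numpy as np

def fourier_ptychography_psr(lowres_imgs, otfs, up=4, n_iter=20):
    """Sketch of steps 1-6. lowres_imgs: list of (h, w) intensity images,
    one per coded aperture; otfs: matching (up*h, up*w) optical transfer
    functions defined on the high-resolution grid."""
    # Step 1: initialise from the up-sampled average of all measurements.
    avg = np.mean(lowres_imgs, axis=0)
    hires = np.kron(avg, np.ones((up, up)))        # nearest-neighbour up-sampling
    spectrum = np.fft.fft2(np.sqrt(hires))         # amplitude = sqrt(intensity)
    for _ in range(n_iter):                        # step 6: iterate to convergence
        for img, otf in zip(lowres_imgs, otfs):    # step 5: one full pass = one iteration
            # Step 2: low-pass filter in the Fourier domain, then sample
            # the resulting field at the detector resolution.
            field = np.fft.ifft2(spectrum * otf)
            lowres_est = field[::up, ::up]
            # Step 3: enforce the measured amplitude, keep the phase.
            replaced = np.sqrt(img) * np.exp(1j * np.angle(lowres_est))
            # Step 4: up-sample the corrected field and update the spectrum
            # region selected by this coded aperture.
            new_spec = np.fft.fft2(np.kron(replaced, np.ones((up, up))))
            mask = otf != 0
            spectrum[mask] = new_spec[mask]
    return np.fft.ifft2(spectrum)                  # high-res complex amplitude
```

In practice the sampling, normalization, and stopping criterion would follow the system's calibrated forward model; the structure of the loop is what matters here.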
2.3. Joint Reconstruction of Multiple Apertures
- Gaussian pyramid decomposition is first performed on the aligned high-resolution single-aperture reconstruction results.
- Creation of a Laplacian pyramid: each level of the Gaussian pyramid is interpolated to the dimensions of the next-finer level, and the difference between that finer level and the interpolated image forms the corresponding Laplacian layer.
- The fusion is performed layer by layer, starting from the top layer of the Laplacian pyramids of the sub-aperture high-resolution reconstruction results. The fused Laplacian pyramid is obtained with the fusion rule of maximum absolute value at high frequencies and average value at low frequencies. Finally, the inverse of the decomposition is applied to the fused pyramid to obtain the fused reconstructed image. The whole multi-scale fusion framework based on Laplacian pyramid decomposition is shown in Figure 6.
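A minimal sketch of this fusion framework in NumPy, with 2×2 block averaging standing in for the Gaussian filtering and image sizes assumed divisible by 2 to the number of levels; the authors' filters and level count may differ:

```python
import numpy as np

def down2(img):
    """Gaussian-pyramid step, approximated by 2x2 block averaging."""
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def up2(img):
    """Interpolate a level back to the size of the next-finer level."""
    return np.kron(img, np.ones((2, 2)))

def laplacian_pyramid(img, levels):
    pyr = []
    for _ in range(levels):
        low = down2(img)
        pyr.append(img - up2(low))   # band-pass (Laplacian) layer
        img = low
    pyr.append(img)                  # top, low-frequency residual
    return pyr

def fuse(img_a, img_b, levels=3):
    pa = laplacian_pyramid(img_a, levels)
    pb = laplacian_pyramid(img_b, levels)
    # Fusion rule: maximum absolute value at high frequencies,
    # average value at the low-frequency top layer.
    fused = [np.where(np.abs(a) >= np.abs(b), a, b)
             for a, b in zip(pa[:-1], pb[:-1])]
    fused.append(0.5 * (pa[-1] + pb[-1]))
    # Inverse of the decomposition: collapse from the top down.
    img = fused[-1]
    for lap in reversed(fused[:-1]):
        img = up2(img) + lap
    return img
```

Because each Laplacian layer is exactly the residual of its interpolation step, collapsing the pyramid of a single image reproduces that image, so the fusion only alters content where the inputs disagree.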
3. Results
3.1. Experimental Setup
3.2. Spatial Resolution Results
3.3. Temperature Results
3.4. Discussion on Method Robustness
- SNR: We carried out simulation experiments on low-resolution images under different noise conditions and obtained the corresponding reconstruction results. Taking the noiseless condition as a reference, the reconstructions were evaluated with the structural similarity index (SSIM), as shown in Figure 12. When the image SNR fell to about 5.4, the SSIM of the reconstruction dropped below 0.2 and the reconstruction quality became unacceptable. The proposed method therefore places certain requirements on the SNR of the captured images, and our subsequent work will focus on further improving the SNR of the infrared system.
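The noise experiment above can be sketched as follows; `add_noise_at_snr` and the global (un-windowed) SSIM below are simplified stand-ins, assumed for illustration, for the paper's simulation pipeline and the windowed SSIM usually reported:

```python
import numpy as np

def add_noise_at_snr(img, snr_db, rng=None):
    """Degrade an image with Gaussian noise to a target SNR in dB."""
    rng = rng or np.random.default_rng(0)
    signal_power = np.mean(img ** 2)
    noise_power = signal_power / 10 ** (snr_db / 10)
    return img + rng.normal(0.0, np.sqrt(noise_power), img.shape)

def global_ssim(x, y, data_range=1.0):
    """Single-window SSIM over the whole image (a global simplification)."""
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

Sweeping `snr_db` and plotting `global_ssim` of each reconstruction against the noiseless reference reproduces the kind of robustness curve discussed here.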
- Elliptical-aperture scanning speed: the rotation speed must be chosen to balance image SNR against acquisition efficiency; the rotation must be slow enough to give each individual image sufficient SNR, yet fast enough to keep the acquisition time of the entire system acceptable.
- Light source: In the experiment, we also attempted to use heated metal plates as infrared targets. Because the maximum heating temperature of the metal plate was 130 °C, the SNR of the collected images was low, which degraded the reconstruction results. We therefore used a blackbody as the light source, since it provides stable infrared radiation at high temperature and with high SNR.
- Vibration and camera stability: in conventional synthetic-aperture high-resolution imaging schemes, the sub-wavelength phase accuracy required forces the sub-apertures to be highly stable. In contrast, the camera-stability requirements of the proposed method are comparable to those of a conventional single-aperture payload, i.e., the platform only needs to be free of large vibrations and other instabilities during the exposure time of a single image.
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Zhu, L.; Suomalainen, J.; Liu, J.; Hyyppä, J.; Kaartinen, H.; Haggren, H. A Review: Remote Sensing Sensors. In Multi-Purposeful Application of Geospatial Data; Rustamov, R.B., Hasanova, S., Zeynalova, M.H., Eds.; InTech: Houston, TX, USA, 2018; ISBN 978-1-78923-108-3. [Google Scholar]
- Lu, D.; Liu, Z. Hyperlenses and Metalenses for Far-Field Super-Resolution Imaging. Nat. Commun. 2012, 3, 1205. [Google Scholar] [CrossRef] [PubMed]
- Tippie, A.E.; Kumar, A.; Fienup, J.R. High-Resolution Synthetic-Aperture Digital Holography with Digital Phase and Pupil Correction. Opt. Express 2011, 19, 12027–12038. [Google Scholar] [CrossRef] [PubMed]
- Rogalski, A.; Martyniuk, P.; Kopytko, M. Challenges of Small-Pixel Infrared Detectors: A Review. Rep. Prog. Phys. 2016, 79, 046501. [Google Scholar] [CrossRef] [PubMed]
- Barnard, K.J. High-Resolution Image Reconstruction from a Sequence of Rotated and Translated Frames and Its Application to an Infrared Imaging System. Opt. Eng. 1998, 37, 247. [Google Scholar] [CrossRef]
- Harris, J.L. Diffraction and Resolving Power. J. Opt. Soc. Am. 1964, 54, 931. [Google Scholar] [CrossRef]
- Dong, C.; Loy, C.C.; He, K.; Tang, X. Image Super-Resolution Using Deep Convolutional Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 38, 295–307. [Google Scholar] [CrossRef] [PubMed]
- Dong, C.; Loy, C.C.; Tang, X. Accelerating the Super-Resolution Convolutional Neural Network. In Computer Vision—ECCV 2016; Leibe, B., Matas, J., Sebe, N., Welling, M., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2016; Volume 9906, pp. 391–407. ISBN 978-3-319-46474-9. [Google Scholar]
- Baker, S.; Kanade, T. Limits on Super-Resolution and How to Break Them. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Hilton Head, SC, USA, 13–15 June 2000. [Google Scholar]
- Lin, Z.; Shum, H.Y. Fundamental Limits of Reconstruction-Based Superresolution Algorithms under Local Translation. IEEE Trans. Pattern Anal. Mach. Intell. 2004, 26, 83–97. [Google Scholar] [CrossRef]
- Nguyen, N.; Milanfar, P. A Wavelet-Based Interpolation-Restoration Method for Superresolution. Circuits Syst. Signal Process. 2000, 19, 321–338. [Google Scholar] [CrossRef]
- Irani, M.; Peleg, S. Improving Resolution by Image Registration. CVGIP Graph. Models Image Process. 1991, 53, 231–239. [Google Scholar] [CrossRef]
- Sohail, M. Geometric Superresolution Using an Optical Rectangular Mask. Opt. Eng. 2012, 51, 013203. [Google Scholar] [CrossRef]
- Sohail, M.; Mudassar, A.A. Geometric Superresolution by Using an Optical Mask. Appl. Opt. 2010, 49, 3000. [Google Scholar] [CrossRef]
- Haq, I.U.; Mudassar, A.A. Geometric Super-Resolution Using Negative Rect Mask. Optik 2018, 168, 323–341. [Google Scholar] [CrossRef]
- Wang, B.; Zou, Y.; Zuo, C.; Sun, J.; Hu, Y. Pixel Super Resolution Imaging Method Based on Coded Aperture Modulation; SPIE: Bellingham, WA, USA, 2021; Volume 11761, p. 1176111. [Google Scholar]
- Tao, Y.; Muller, J.-P. A Novel Method for Surface Exploration: Super-Resolution Restoration of Mars Repeat-Pass Orbital Imagery. Planet. Space Sci. 2016, 121, 103–114. [Google Scholar] [CrossRef]
- La Grassa, R.; Cremonese, G.; Gallo, I.; Re, C.; Martellato, E. YOLOLens: A Deep Learning Model Based on Super-Resolution to Enhance the Crater Detection of the Planetary Surfaces. Remote Sens. 2023, 15, 1171. [Google Scholar] [CrossRef]
- Wang, C.; Zhang, Y.; Zhang, Y.; Tian, R.; Ding, M. Mars Image Super-Resolution Based on Generative Adversarial Network. IEEE Access 2021, 9, 108889–108898. [Google Scholar] [CrossRef]
- Wu, Y.; Cheng, W.; Wen, C. A Method to Determine the Parameters of Infrared Camera in the Electron Optic Tracking System of UAV. In Proceedings of the AOPC 2020: Optical Sensing and Imaging Technology, Beijing, China, 30 November–2 December 2020; SPIE: Bellingham, WA, USA, 2020; Volume 11567, pp. 139–144. [Google Scholar]
- Xiang, M.; Pan, A.; Zhao, Y.; Fan, X.; Zhao, H.; Li, C.; Yao, B. Coherent Synthetic Aperture Imaging for Visible Remote Sensing via Reflective Fourier Ptychography. Opt. Lett. 2021, 46, 29. [Google Scholar] [CrossRef] [PubMed]
- Miyamura, N.; Suzumoto, R.; Ikari, S.; Nakasuka, S. Conceptual Optical Design of a Synthetic Aperture Telescope by Small Satellite Formation Flying for GEO Remote Sensing. Aerosp. Technol. Jpn. 2020, 18, 101–107. [Google Scholar] [CrossRef]
- Fienup, J.P. Direct-Detection Synthetic-Aperture Coherent Imaging by Phase Retrieval. Opt. Eng. 2017, 56, 113111. [Google Scholar] [CrossRef]
- Wu, Y.; Hui, M.; Li, W.; Liu, M.; Dong, L.; Kong, L.; Zhao, Y. MTF Improvement for Optical Synthetic Aperture System via Mid-Frequency Compensation. Opt. Express 2021, 29, 10249. [Google Scholar] [CrossRef]
- Underwood, C.; Pellegrino, S.; Lappas, V.; Bridges, C.; Taylor, B.; Chhaniyara, S.; Theodorou, T.; Shaw, P.; Arya, M.; Breckinridge, J.; et al. Autonomous Assembly of a Reconfigurable Space Telescope (AAReST)—A CubeSat/Microsatellite Based Technology Demonstrator. In Proceedings of the 27th Annual AIAA/USU Conference on Small Satellites, Logan, UT, USA, 10–15 August 2013. [Google Scholar]
- Zhang, Q.; Liu, Y.; Blum, R.S.; Han, J.; Tao, D. Sparse Representation Based Multi-Sensor Image Fusion: A Review. Inf. Fusion 2017, 40, 57–75. [Google Scholar] [CrossRef]
- Zhou, Z.; Li, S.; Wang, B. Multi-Scale Weighted Gradient-Based Fusion for Multi-Focus Images. Inf. Fusion 2014, 20, 60–72. [Google Scholar] [CrossRef]
- Blum, R.S.; Liu, Z. Multi-Sensor Image Fusion and Its Applications; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
- Zheng, G.; Horstmeyer, R.; Yang, C. Wide-Field, High-Resolution Fourier Ptychographic Microscopy. Nat. Photonics 2013, 7, 739–745. [Google Scholar] [CrossRef]
- Dong, S.; Bian, Z.; Shiradkar, R.; Zheng, G. Sparsely Sampled Fourier Ptychography. Opt. Express 2014, 22, 5455–5464. [Google Scholar] [CrossRef] [PubMed]
- Sun, J.; Chen, Q.; Zhang, Y. Sampling Criteria for Fourier Ptychographic Microscopy in Object Space and Frequency Space. Opt. Express 2016, 24, 15765–15781. [Google Scholar] [CrossRef] [PubMed]
- Zheng, G.; Shen, C.; Jiang, S.; Song, P.; Yang, C. Concept, Implementations and Applications of Fourier Ptychography. Nat. Rev. Phys. 2021, 3, 207–223. [Google Scholar] [CrossRef]
- Adelson, E.H.; Anderson, C.H.; Bergen, J.R.; Burt, P.J.; Ogden, J.M. Pyramid Methods in Image Processing. RCA Eng. 1984, 29, 33–41. [Google Scholar]
- Burt, P.J.; Adelson, E.H. The Laplacian Pyramid as a Compact Image Code. In Readings in Computer Vision; Elsevier: Amsterdam, The Netherlands, 1987. [Google Scholar]
- Pierrehumbert, R.T. Infrared Radiation and Planetary Temperature. Phys. Today 2011, 64, 33–38. [Google Scholar] [CrossRef]
Wang, T.; Xiang, M.; Liu, F.; Liu, J.; Dong, X.; Wang, S.; Li, G.; Shao, X. Multi-Aperture Joint-Encoding Fourier Ptychography for a Distributed System. Remote Sens. 2024, 16, 1017. https://doi.org/10.3390/rs16061017