A Review of Ghost Imaging via Sparsity Constraints

Different from conventional imaging methods, which are based on the first-order correlation of light fields, ghost imaging (GI) obtains image information through the high-order mutual correlation of light fields from two paths, with the object appearing in only one path. As a new optical imaging technology, GI not only provides capabilities beyond those of conventional imaging methods, but also offers a new viewpoint on the physical mechanism of imaging. It has many potential applications, such as remote sensing, snapshot spectral imaging, thermal X-ray diffraction imaging, and imaging through scattering media. In this paper, we mainly review our research on ghost imaging via sparsity constraints (GISC) and briefly discuss its applications and theoretical prospects.


Introduction
With the concept of the photon introduced by Albert Einstein in 1905, a statistical description of light waves became indispensable for characterizing the fluctuations of light sources. Classical and quantum optical coherence theories were established by Emil Wolf in 1954 [1][2][3][4] and Roy J. Glauber in 1960 [5,6], respectively, to interpret the statistical properties of light fields. From the viewpoint of optical coherence theory, conventional imaging methods are based on the first-order correlation of light fields; the attempt to extract spatial information of an object from the high-order correlation of light fields can be traced back to the famous HBT experiment, conducted by R. Hanbury Brown and R. Q. Twiss in 1956 [7,8] and based on the second-order autocorrelation of light fields in the time domain. In 1995, Shih's group first demonstrated a novel imaging modality, called GI [9], by correlating the output of a single-pixel (bucket) photodetector, which collected light transmitted through or reflected from an object, with the output of a high spatial-resolution scanning photodetector or photodetector array whose illumination had not interacted with the object. The physics of GI has been thoroughly discussed over the past decades [10][11][12][13][14][15][16][17]. Theory and experiment have shown that both quantum two-photon interference and classical intensity-fluctuation correlations can be used for GI. In this review, for the sake of consistency, we focus on the GI research performed by our group, which is mainly based on classical coherence theory.
The remainder of the paper is organized as follows. In Section 2, we introduce the theoretical framework of GISC, emphasizing that the ensemble average required by GI can hardly be satisfied in real applications. In Section 3, we present a brief review of the applied research on GISC conducted at the Shanghai Institute of Optics and Fine Mechanics (SIOM), including two imaging applications in real space, GISC LiDAR [18,19] illuminated by pseudo-thermal light and the GISC spectral camera [20] illuminated by natural light, and two imaging applications in Fourier space, X-ray Fourier-transform GISC [21] and the lensless Wiener-Khinchin telescope based on the high-order spatial autocorrelation of thermal light [22]. Section 4 is devoted to a discussion of the prospects of GISC.

Theoretical Framework of GISC
According to GI theory [10,16,17,23], we have

⟨∆I_t(x_t) ∆I_r(x_r)⟩ = O(x_t, x_r), (1)

where ∆I_t(x_t) and ∆I_r(x_r) are the intensity fluctuations of the test and reference arms, respectively, O(x_t, x_r) is a representation of the object's image (such as an image in real or Fourier-transform space), and ⟨···⟩ denotes the ensemble average. When the intensity in the test arm is detected by a bucket or point detector and the corresponding coordinate x_t is fixed, O(x_t, x_r) can be expressed as O(x_r).
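The correlation in Equation (1) can be illustrated numerically. The following is a minimal sketch (all sizes and values are illustrative, not taken from the experiments): a 1-D "double slit" object is illuminated by independent thermal-like speckle patterns, a bucket detector sums the transmitted light, and the image emerges from the average of intensity-fluctuation products.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D object: a "double slit" transmission function.
n = 64
obj = np.zeros(n)
obj[20:24] = 1.0
obj[40:44] = 1.0

# N speckle realizations: reference-arm intensities I_r (thermal-like,
# exponentially distributed) and the bucket signal I_t in the test arm.
N = 20000
I_r = rng.exponential(scale=1.0, size=(N, n))
I_t = I_r @ obj                      # bucket detector: total transmitted light

# Average of intensity-fluctuation products, i.e. <dI_t dI_r(x_r)>.
dI_t = I_t - I_t.mean()
dI_r = I_r - I_r.mean(axis=0)
O_hat = (dI_t[:, None] * dI_r).mean(axis=0)
# O_hat now approximates obj: close to 1 on the slits, close to 0 elsewhere.
```

The finite-N average replaces the ensemble average of Equation (1), which is exactly the practical limitation discussed next.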
Since the sampling is always limited in practice, the solution of Equation (1) can be approximately presented as

Ô_r(x_r) = (1/N) Σ_{i=1}^{N} ∆I_t^{(i)}(x_t) ∆I_r^{(i)}(x_r), (2)

where N is the number of samples. Many algorithms have been proposed to improve the imaging quality of GI [24,25]. In terms of statistical signal processing, O(x_r) corresponds to an expectation and Ô_r(x_r) denotes an estimator. To further improve the optimization results, prior information about the object, which in many cases can be characterized as some kind of sparsity constraint, should be applied in the image retrieval. Equation (2) can then be expressed as

min_O { µ_1 f_1{⟨∆I_t ∆I_r⟩ − O} + µ_2 f_2{O} }, (3)

where the first term is the data fidelity term, the second term is the regularization term corresponding to the prior information about the object, and µ_1 and µ_2 are weight coefficients. Equation (3) is the regularized framework of GISC. At present, the data fidelity term in Equation (3) is usually approximated as [26][27][28]

min_O { µ_1 f_1{Y − AO} + µ_2 f_2{O} }, (4)

so that existing optimization results such as compressive sensing (CS) [29][30][31] can be applied directly, where Y is the reshaped intensity distribution I_t(x_t) in the test arm and the measurement matrix A consists of the intensity distributions I_r(x_r) in the reference arm. Another benefit of this approximation is that the imaging efficiency and fidelity of a GI facility can be discussed in terms of information theory [32,33]. It has been demonstrated that speckle patterns can be optimized by maximizing the mutual information between the sampling signals and the probability distribution function of the target's images [34]. Another way to describe an image's statistical prior with a sparse representation is dictionary learning [35,36], and measurement-matrix optimization based on dictionary learning and the mutual coherence of the matrix has also been demonstrated [37].
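As a concrete (purely illustrative) instance of this approximated model, the sketch below solves min ||Y − AO||² + µ||O||₁ by iterative soft thresholding (ISTA). The Gaussian matrix A is a hypothetical stand-in for calibrated speckle patterns, and the object and parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse 1-D object and an under-sampled measurement Y = A O.
n, m = 100, 40                      # 40 samples << 100 unknowns (sub-Nyquist)
obj = np.zeros(n)
obj[[10, 35, 70]] = [1.0, 0.6, 0.8]
A = rng.standard_normal((m, n)) / np.sqrt(m)  # stand-in for speckle patterns
Y = A @ obj

def ista(A, Y, mu=0.01, iters=1000):
    """Iterative soft thresholding for min ||Y - A O||^2 + mu ||O||_1."""
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient
    O = np.zeros(A.shape[1])
    for _ in range(iters):
        O = O + A.T @ (Y - A @ O) / L                         # gradient step
        O = np.sign(O) * np.maximum(np.abs(O) - mu / L, 0.0)  # shrinkage
    return O

O_hat = ista(A, Y)
# The three non-zero entries of obj are recovered even though m < n.
```

With µ → 0 the solution approaches basis pursuit; in practice µ trades noise robustness against shrinkage bias.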

GISC LiDAR
As an important detection tool, LiDAR has been widely used in remote sensing in recent decades [38,39]. Traditional imaging LiDAR can be classified into two types: a scanning mode and a non-scanning mode (such as flash LiDAR) [40][41][42]. In the scanning mode, a real-space image of a target is obtained by scanning the target region point by point with a pulsed laser [41], which makes it difficult to image a high-speed moving target. The non-scanning mode, characterized by a high-resolution imaging system and a pulsed flash laser, can cover the whole field of the target and obtain a real-space image in a single exposure [42]. However, because the light intensity reflected by the target is divided among many small pixels of a charge-coupled device (CCD) camera, the detection sensitivity is poor, and the detection distance of non-scanning LiDAR is limited by the signal-to-noise ratio (SNR) of the received photons distributed over the aperture of the imaging system. It thus remains difficult for traditional LiDARs to achieve high-resolution images at long ranges. In addition, all these imaging methods require a sampling rate at or beyond the Nyquist limit. GISC LiDAR was proposed to obtain a high-resolution 2D or 3D real-space image at a sampling rate significantly below the Nyquist rate with a time-resolved single-pixel detector, giving it both high efficiency in information extraction and high sensitivity in detection [18,19].
Generally speaking, GISC in real space requires that the measurement distance (between the detector and the light source) in the reference arm equal the illuminating distance (from the light source to the target) in the test arm, and that all of the reflected intensity in the test arm be collected by a bucket detector. However, in remote sensing the distance to the target, i.e., the illuminating distance, is usually unknown, so matching the measurement distance to the illuminating distance is not a reasonable option. In addition, the optical aperture in practice is very limited, so only a small fraction of the intensity reflected from the remote target can be collected. Therefore, two key issues must be addressed in GISC LiDAR: (1) the light fields on the target and on the reference detection plane should remain correlated even when the test arm is far longer than the reference arm, and (2) high-quality images should still be reconstructable from only a small fraction of the reflected light.
The second-order longitudinal correlation length of a thermal source can be expressed as [43]

∆z ≈ 2λ(z/D)², (5)

where λ is the wavelength, z is the illuminating distance from the light source to the target, and D is the diameter of the pseudo-thermal light source. Research has shown that 3D GI can still be achieved as long as the difference between the illuminating and measurement distances is within the spatial longitudinal correlation length of the pseudo-thermal light source [44]. In 2009, a proof-of-principle experiment of far-field GI in real space was completed [45]. According to the statistical characteristics of random speckle fields, the sum of the intensity over multiple speckles is proportional to the total speckle intensity; the intensity correlation between the reference and test arms therefore still holds when the receiving aperture is not much smaller than the transmitting aperture, so only part of the reflected light is needed to achieve an equivalent 'bucket' detection and realize 3D images in 3D GISC LiDAR [46]. These results not only give theoretical support for 3D GISC LiDAR, but also provide a guide for system design. Figure 1a presents a typical schematic of a GISC LiDAR system [19]. A conjugate relationship between the target surface and the sensitive plane of the CCD camera in the reference arm ensures the correlation between the intensity distributions on the corresponding target and camera planes. A pseudo-thermal light source, which consists of a pulsed laser, a diffuser such as a rotating ground-glass disk controlled by a high-precision motor, and a lens f, forms random speckle patterns on stop 1. The light emitted from the source passes through stop 2 and is then divided by a beam splitter (BS) into an object path and a reference path. In the object path, the speckle pattern on stop 1 is imaged onto the target plane by an objective lens f_0. The photons reflected from the target are collected by a light concentrator and then pass through an interference filter into a photomultiplier tube (PMT) connected to a high-speed digitizer (an equivalent time-resolved single-pixel bucket detector). In the reference path, the CCD camera is placed on an image plane of stop 1.
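For orientation, the scale of the longitudinal correlation length can be estimated with the commonly quoted scaling ∆z ~ 2λ(z/D)² (the exact prefactor depends on the source aperture); the numbers below are illustrative, not those of the reported experiments.

```python
# Back-of-envelope estimate of the longitudinal correlation length of a
# pseudo-thermal speckle field, using the scaling dz ~ 2 * lambda * (z/D)**2
# (the exact prefactor depends on the aperture shape).
wavelength = 532e-9   # m, a typical pulsed-laser wavelength (assumption)
z = 1.0e3             # m, illuminating distance, ~1 km as in the experiments
D = 5e-3              # m, source diameter (illustrative value)

dz = 2 * wavelength * (z / D) ** 2
print(f"longitudinal correlation length ~ {dz:.0f} m")
```

At kilometre ranges this length is very large, which is why a moderate mismatch between the measurement and illuminating distances is tolerable in practice.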
In 2012, GISC LiDAR was first demonstrated experimentally with a target 900 m away in the real atmosphere [18]. A year later, a 3D image of a real scene at a range of approximately 1.0 km was demonstrated [19]. However, compared with traditional non-scanning LiDAR, GISC LiDAR usually requires many measurements over a long data-acquisition time to acquire the object's information. In remote sensing applications, the LiDAR system is often placed on a moving platform such as an aircraft or a vehicle, or the object itself is a high-speed moving target, so the relative motion between the LiDAR system and the target during the sampling process inevitably results in motion blur [47][48][49][50]. Besides the algorithms and strategies that have been proposed to overcome motion blur due to constant relative speeds [49,50], two new approaches have recently been developed to further improve the imaging performance. One is to shorten the sampling time by either increasing the sampling speed or decreasing the number of samples. The other is to adopt traditional motion-compensation techniques to stare at the scene to be imaged during the sampling process.

A. Fast Sampling Methods
(a) Prefabricated pseudo-thermal light source. As depicted in Figure 1a, a reference CCD camera is placed in the reference path to record the intensity distributions simultaneously, so the sampling speed is constrained by the frame rate of the camera. A prefabricated pseudo-thermal light source has therefore been proposed to break through this speed constraint. The key to the technique is to accurately address incident positions on a diffuser with a high-precision motor so as to produce repeatable pseudo-thermal speckle patterns. All reference intensity distributions are recorded at low speed in advance, eliminating the camera recording during the imaging process [51][52][53].
(b) Structured 3D image reconstruction algorithms. As described above, the GISC technique exploits the sparsity of a natural target as a constraint and can reconstruct the target at a sampling rate far below the Nyquist limit. In 3D image reconstruction, the orthogonal properties of the target at different distances can also be used for better reconstruction. Taking these into consideration, the number of measurements required for a successful 3D reconstruction can be greatly reduced, which is also conducive to fast imaging [19,54].

B. Motion Compensation Techniques
Both the inertial navigation technique and the opto-electronic detection technique [55,56] are suitable for motion compensation. An inertial navigation system mainly consists of a global positioning system (GPS) and an inertial measurement unit (IMU); it makes use of the GPS information of both the LiDAR system and the target, together with the IMU information of the LiDAR system, to compensate for the relative motion. The traditional opto-electronic detection technique, in contrast, relies on conventional thermal or visible-light images of reference markers. The optimization program for motion compensation is [49]

µ̂ = argmin_µ D(µ), (6)

with the discrepancy D(µ) between the calculated signal and the recorded data given by

D(µ) = ||Y − A(µ)O(µ)||², (7)

where µ is the unknown constant speed of the target and A(µ) is the measurement matrix dependent on the relative speed µ of the target.
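The speed search behind this optimization can be sketched in a toy 1-D model (the sizes, the integer-shift motion model, and all statistics are hypothetical): for each candidate speed µ, the recorded speckle patterns are shifted into the target's co-moving frame to build A(µ), the image is estimated by least squares, and the speed with the smallest residual D(µ) is selected.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: a 1-D object moving at a constant integer speed (pixels/frame).
# Frame i illuminates the shifted object with speckle pattern A[i]; a bucket
# detector records y_i.
n, N = 60, 300
obj = np.zeros(n)
obj[25:30] = 1.0
mu_true = 2
A = rng.exponential(1.0, size=(N, n))
y = np.array([A[i] @ np.roll(obj, mu_true * i) for i in range(N)])

def discrepancy(mu):
    """Residual D(mu) after a least-squares reconstruction at speed mu."""
    # A(mu): each speckle pattern shifted back into the co-moving frame.
    A_mu = np.array([np.roll(A[i], -mu * i) for i in range(N)])
    O, *_ = np.linalg.lstsq(A_mu, y, rcond=None)
    return float(np.linalg.norm(y - A_mu @ O))

D = [discrepancy(mu) for mu in range(5)]
mu_hat = int(np.argmin(D))           # recovers mu_true
```

Only at the correct speed do the compensated measurements become consistent with a single static image, so D(µ) dips sharply at µ = µ_true.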
With the techniques mentioned above, airborne 3D GISC LiDAR imaging with 0.48 m horizontal resolution and 0.5 m range resolution at a height of approximately 1.04 km was experimentally demonstrated in 2016 [57] (Figure 2). As shown in Figure 2c, the system is capable of motion deblurring and achieves a successful 3D reconstruction from a high-speed platform. In the past decade, GISC LiDAR has undergone leapfrog development from principle demonstrations with static platforms and static targets to moving platforms and moving targets [19,57], and it is promising for many fields such as high-resolution airborne remote sensing, underwater detection and imaging, and traffic monitoring. However, further theoretical and experimental research is still needed in many respects. For example, a unified theory of GISC LiDAR based on information theory and light-field encoding, 1-bit compressive sensing theory [58] for GISC with photon-counting detection, and the image-quality evaluation of GISC LiDAR are still open questions.

GISC Spectral Camera
Spectral imaging is a multidimensional data-acquisition technology combining spectroscopy and image analysis, which captures a 3D spectral data-cube (x, y, λ) of the scene to be imaged. With both spatial and spectral resolving capabilities, it is extremely effective and vital for scene surveys and the extraction of detailed information. Conventional spectral imaging, with its point-to-point correspondence between the object and image planes, requires time-consuming scanning along either a spatial or a wavelength axis, since the 3D spectral data-cube is detected by a 2D detector [59]. Hence, its application in fields such as ultra-fast imaging is restricted [60,61]. Recently, remarkable snapshot spectral imaging techniques have been widely developed to acquire a 3D spectral data-cube in a single exposure [62][63][64][65][66]. However, since the correlation between pixels and wavelengths of the 3D data-cube is not utilized in these methods, their information-acquisition efficiency is below the Shannon limit [67,68]. Another important category of snapshot spectral imaging is coded-aperture spectral imaging, such as the coded aperture snapshot spectral imager (CASSI), which utilizes a binary-coded mask and an equilateral prism to modulate the light fields and captures a spectral image with a single-shot 2D measurement [69][70][71][72]. By combining this with the compressive sampling principle, the sampling efficiency of photoelectric detection is improved.
GI with true thermal light has been demonstrated by detecting the temporal fluctuations of light fields, with the imaging information obtained from the second-order correlation of light fields in the time domain [73,74]. This scheme requires that the time resolution of the detector be close to or less than the coherence time of the light field, which may be as short as femtoseconds for true thermal light [8,20,74]. Unlike other existing imaging methods, the GISC spectral camera modulates the image over the whole spectral band: the true thermal light is modulated into a spatially fluctuating pseudo-thermal light field by a spatial random phase modulator. In the GISC spectral camera, the spectral images of the object are obtained from the second-order spatial mutual correlation of light fields; therefore, high time resolution is not required, and the information of the object can be acquired in a single-shot measurement. As shown in Figure 3, the GISC spectral camera consists of (1) an imaging module, which projects the image of an object on the object plane 'a' onto the first image plane 'b'; (2) a spatial random phase modulator, which disperses the image at different wavelengths like a random grating and modulates the image to generate speckle patterns in plane 'c' [20,75,76]; (3) a microscope objective, which magnifies the speckle patterns in plane 'c'; and (4) a CCD detector recording the magnified speckle patterns. According to the theory of the GISC spectral camera [20], the correlation function of intensity fluctuations between the pre-determined reference arm and the test arm takes the form

∆G^(2)(r_i, λ_l) = Σ_{l'} T(r_i, λ_{l'}) ⊗ g^(2)_{l,l'}(r_i), (8)

where T(r_i, λ_l) denotes the spectral images of the object, g^(2)_{l,l'} is the normalized second-order correlation of the light fields at wavelengths λ_l and λ_{l'}, determined by the height standard deviation ω and transverse correlation length ζ of the spatial random phase modulator, the refractive index n, and the wavenumber k_l = 1/λ_l, and ⊗ denotes convolution. Equation (8) specifies that the spectral images of an object can be separated from the correlation function of intensity fluctuations ∆G^(2)(r_i, λ_l), while the spatial and spectral resolutions are determined by the normalized second-order correlation of light fluctuations corresponding to different pixels and wavelengths in the first image plane 'b'. Based on this model, we set up a prototype of the GISC spectral camera. To improve the reconstruction quality at a low sampling rate, structured-sparsity and low-effective-rank constraints are applied in practice [77]. Figure 4 shows the experimental results of spectral imaging of vehicles, concrete ground, trees, etc., and indicates that the prototype is capable of snapshot spectral imaging of natural scenes. As a new optical imaging technology, the GISC spectral camera provides a unique solution for the spectral imaging of dynamic processes. This GISC imaging scheme may also be extended to the acquisition of other multi-dimensional information (such as polarization), to ultra-fast measurements [60,61], and to super-resolution imaging [78]. With further improvement, we believe the GISC spectral camera can be applied not only to remote sensing but also to the consumer market.
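The separation of spectral images by second-order mutual correlation can be illustrated with a toy model (the dimensions and iid speckle statistics are illustrative assumptions, not the prototype's): every (pixel, wavelength) pair is assigned its own pre-calibrated speckle pattern, a single detector frame sums all contributions, and correlating that one frame with each calibration speckle recovers the full data-cube.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy single-shot spectral camera: n_pix object points x n_lam wavelengths,
# each mapped to its own pre-calibrated speckle pattern on an M-pixel detector.
n_pix, n_lam, M = 8, 3, 200000
T = rng.random((n_lam, n_pix))                    # ground-truth spectral cube
S = rng.exponential(1.0, size=(n_lam, n_pix, M))  # calibration speckles (iid)
I = np.tensordot(T, S, axes=2)                    # ONE detector frame, shape (M,)

# Second-order mutual correlation of fluctuations recovers the cube.
dI = I - I.mean()
dS = S - S.mean(axis=-1, keepdims=True)
T_hat = dS @ dI / M                               # shape (n_lam, n_pix)
```

The separation works because speckles belonging to different pixels and wavelengths are (by assumption) mutually uncorrelated, mirroring the decorrelation produced by the spatial random phase modulator.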

X-ray Fourier-Transform GISC
Since X-ray diffraction in crystals was discovered in 1912, X-ray crystallography has become a powerful tool for exploring and analyzing the internal structures of complex materials, such as biomolecules and nanomaterials [79][80][81]. However, the structural information of many important molecular materials, such as membrane proteins, is still out of reach, because it is difficult to grow macroscopic crystals of these materials. Meanwhile, with the rapid development of nanoscience and biology, there is an urgent need to observe the internal structures of samples in their natural states rather than in crystals. In 1999, the coherent diffraction imaging (CDI) method was proposed to extend X-ray crystallography to the imaging of noncrystalline structures at the nanoscale, by illuminating the samples with coherent X-rays and recording the diffraction patterns in the far field [82][83][84]. Thereafter, Fresnel CDI and ptychography were proposed to circumvent the intrinsic restriction on sample size in classical CDI [85][86][87][88]. Nevertheless, the diffraction pattern within the beam stop is missing in classical CDI, and an additional technique is needed to recover the low-frequency information.
Above all, owing to the requirements for high coherence and brightness, currently only synchrotron radiation and X-ray free-electron laser sources can be used for X-ray CDI applications, and coherent imaging with laboratory X-ray sources remains unexplored.
In Fourier-transform GI (FGI), the Fourier-transform diffraction pattern of a sample can be obtained in the Fresnel region by calculating the second-order correlation of spatially incoherent light [10]. This has been demonstrated with an X-ray pseudo-thermal source [21], establishing the feasibility of high-resolution microscopy with a tabletop X-ray source. Owing to the lack of effective X-ray beam splitters, in practice the correlated light fields are generated by simply shuttling the sample in and out of the optical path between the source and the detector. In 2015, Candès' group proposed a method of phase retrieval from coded diffraction patterns to achieve better image quality in classical CDI, in which the sample is encoded by a group of random masks in real space. However, because of the destructive effect of X-ray irradiation, it is almost impossible to encode the sample multiple times in classical CDI. Fortunately, different from CDI techniques, in Fourier-transform GISC the diffraction pattern of a sample in the test path can be non-locally encoded by inserting modulation components in the reference path [89]. Therefore, multiple encodings of the sample can easily be realized, and better image quality may be expected.
Figure 5 shows the principle of coded Fourier-transform GISC [89]. A thermal light beam is split into two paths. The sample is placed in the test path, and a point detector records the intensity signals of this path. In the reference path, a mask is inserted and a panel detector records the intensity distributions. The distances between the beam splitter and the detector in the two paths are equal, and the mask's position in the reference arm is the same as the sample's position in the test arm. The correlation function between the intensity fluctuations of the two arms is proportional to the squared modulus of a convolution, which can be expressed as [90]

∆G^(2)(u) ∝ |F{t(x)} ⊗ F{s*(x)}|², (9)

where t(x) is the transmission function of the sample, s*(x) is the conjugate of the mask's transmission function, ⊗ denotes convolution, and F represents a Fourier transform. Therefore, the Fourier-transform diffraction pattern of the sample, encoded by the mask in real space, can be obtained from the intensity correlation between the two paths. The mask in the reference path can "non-locally" encode the information of the sample in the test path. For pseudo-thermal light, it is easy to modulate the speckle fields in a predetermined manner, so the masks in the reference path can be altered to obtain different encoded patterns. Obviously, when s(x) = 1, Equation (9) degenerates to the conventional Fourier diffraction pattern of the sample. Based on the recombination and reutilization of the correlated intensity distributions of the light fields, a spatial multiplexing reconstruction method has been proposed to improve the sampling efficiency and image quality of Fourier-transform GISC.
This method can greatly reduce the scale of the sensing matrix in the sensing equation and makes reconstruction feasible with just a few measurements [91].
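The structural claims around the coded pattern — that the correlation yields a coherent diffraction pattern of the mask-encoded sample and reduces to the plain pattern when the mask is removed — can be checked with the convolution theorem. The sketch below is purely numerical (no propagation physics); the sample and mask are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

n = 128
t = np.zeros(n)
t[50:54] = 1.0
t[74:78] = 1.0                 # a double-slit sample, as in the simulations
s = rng.random(n)              # an arbitrary real mask (illustrative)

def coded_pattern(t, s):
    """Coherent diffraction pattern of the encoded sample t(x) s*(x)."""
    return np.abs(np.fft.fft(t * np.conj(s))) ** 2

# Circular convolution of F{t} and F{s*}, evaluated via the FFT.  By the
# convolution theorem it equals n * F{t s*}, so |conv/n|^2 reproduces the
# coded pattern; with s(x) = 1 the pattern reduces to the plain |F{t}|^2.
T, S = np.fft.fft(t), np.fft.fft(np.conj(s))
conv = np.fft.ifft(np.fft.fft(T) * np.fft.fft(S))
```

This mirrors the statement in the text that the non-locally encoded correlation pattern coincides with the coherent diffraction pattern of the encoded sample.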
Figure 6 shows the simulation results of encoded X-ray Fourier-transform GISC with a wavelength of 1 nm and distances d_1 = 10 cm and d_2 = 30 cm, respectively. The sample in the test path is a double slit with a separation d_slit = 200 µm, as shown in Figure 6a, and two masks were adopted in the simulation. The first is the same double slit rotated by 90°, as shown in Figure 6b; Figure 6c is the corresponding non-locally encoded Fourier-transform GISC diffraction pattern. Clearly, it is a coherent diffraction pattern of the encoded sample. Using the second mask, a concentric-circle phase mask with a period of 75 µm (Figure 6d), the encoded FGI diffraction pattern shown in Figure 6e was obtained, which is almost identical to the encoded diffraction pattern in Figure 6f obtained from a conventional coherent diffraction imaging simulation. The image can then be reconstructed by utilizing phase retrieval algorithms with sparsity constraints [92][93][94][95][96][97][98][99][100][101].
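The phase-retrieval step can be sketched with a classic error-reduction iteration of the Gerchberg-Saxton/Fienup type, alternating between the measured Fourier modulus and real-space constraints (support and non-negativity). This is a simplified 1-D illustration under invented data, not the specific algorithms of Refs. [92-101].

```python
import numpy as np

rng = np.random.default_rng(5)

# Recover a non-negative object confined to a known support from the modulus
# of its Fourier transform (error-reduction iteration).
n = 128
obj = np.zeros(n)
obj[10:20] = rng.random(10)
support = np.zeros(n, dtype=bool)
support[10:20] = True
modulus = np.abs(np.fft.fft(obj))        # the "measured" diffraction data

x = rng.random(n) * support              # random non-negative start
errs = []
for _ in range(500):
    X = np.fft.fft(x)
    errs.append(float(np.linalg.norm(np.abs(X) - modulus)))
    X = modulus * np.exp(1j * np.angle(X))        # impose measured modulus
    x = np.fft.ifft(X).real
    x = np.where(support & (x > 0), x, 0.0)       # impose support, positivity
```

The modulus error is non-increasing for this projection pair; in practice hybrid input-output variants and sparsity priors are used to escape stagnation.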
Figure 7 is a photograph of our tabletop X-ray GISC microscope facility. The whole system is placed in a vacuum chamber and controlled remotely. A micro-focus X-ray tube with a center energy of 1.25 keV is adopted as the light source. X-rays emitted from the tube are filtered to obtain monochromatic X-rays, then pass through a pinhole array and illuminate a designed diffuser to produce high-contrast X-ray speckles. A controllable pseudo-thermal X-ray source is obtained by shifting the diffuser transversely. A translation device can automatically switch the sample and the modulation masks in and out of the beam. A CCD camera with a pixel size of 15 µm is placed in the beam to detect the X-ray intensity fluctuations. By exploiting X-ray Fourier-transform GISC, there is great hope of realizing tabletop high-resolution 3D imaging of complex materials such as biomolecules and nanomaterials. As a unique lensless imaging scheme, Fourier-transform GISC may also be introduced to neutron optics [102], where its novel capability to image crystalline and noncrystalline samples, especially micro-magnetic structures, may bring important applications to various scientific frontiers.

Lensless Wiener-Khinchin Telescope
Image resolution is always an important issue in scientific research and engineering applications, and much excellent work and discussion on resolution has been contributed by researchers. For conventional imaging technologies, which are based on the point-to-point correspondence between the object and image planes, the resolution can be analyzed accurately with the transfer function of the imaging system [103,104]. However, for imaging systems based on random phase modulation of the wavefront [20,64,105,106,107,108], the point-to-point correspondence no longer exists, so their resolution is difficult to obtain directly from a transfer function; it can instead be obtained by analyzing the high-order correlation of light fields. Based on the high-order spatial autocorrelation of thermal light, a lensless Wiener-Khinchin telescope was proposed [22]. It can acquire an image of an object in a single-shot measurement simply with a spatial random phase modulator.
As shown in Figure 8, the lensless Wiener-Khinchin telescope consists of a spatial random phase modulator and a CCD detector, whose positions are fixed.The object is illuminated by a true thermal light source.
According to the theoretical analysis in [22], the second-order spatial autocorrelation of the detected intensity satisfies

G_{I_t}(r + ∆r, r) ∝ |F{I_0(r_0)}_{r_0 → f_{r_0}}|² ⊗ g_h^(2)(∆r_0/(2λz_1)), (10)

where I_0(r_0) is the intensity in the object plane, r_0 and r are the coordinates in the object plane and on the sensitive surface of the CCD detector, respectively, and g_h^(2)(∆r_0/(2λz_1)) is the normalized second-order correlation of the incoherent impulse response. Equation (10) indicates that the squared modulus of the Fourier transform F{I_0(r_0)}_{r_0 → f_{r_0}} of the intensity distribution I_0(r_0) on the object plane can be separated from G_{I_t}(r + ∆r, r), and that the resolution is determined by g_h^(2)(∆r_0/(2λz_1)). The object I_0(r_0) can then be reconstructed by utilizing phase retrieval algorithms with sparsity constraints [92][93][94][95][96][97][98][99][100][101]. In some cases, only the amplitude information of an object is of interest, and this can be used as a constraint to greatly improve the reconstruction quality in phase retrieval [109]. Images of two targets (a double slit and a starfish), illuminated by a xenon lamp and detected by the lensless Wiener-Khinchin telescope, are shown in Figure 9; the parameters were z_1 = 0.15 m, z_2 = 12 mm for the double slit and z_1 = 1 m, z_2 = 4 mm for the starfish.
Compared with lensless CS imaging [72,105,110] and lensless GI [21,74,111], the lensless Wiener-Khinchin telescope requires neither a measurement matrix nor a calibration process, although its field of view (FOV) is limited by the memory effect [112][113][114][115] of the system. The lensless Wiener-Khinchin telescope may find many applications, such as astronomical observation and X-ray remote sensing. Moreover, by regarding the scattering medium as the spatial random phase modulator, it may also open a door to a quantitative description of imaging through scattering media [116][117][118][119][120].
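The principle named in the telescope's title can be verified directly: by the Wiener-Khinchin theorem, the (circular) spatial autocorrelation of a pattern and the squared modulus of its Fourier transform form a Fourier pair. A short numerical check with an arbitrary stand-in intensity pattern:

```python
import numpy as np

rng = np.random.default_rng(6)

# Wiener-Khinchin theorem: F{autocorrelation of I0} = |F{I0}|^2.
n = 256
I0 = rng.random(n)                      # stand-in object-plane intensity
autocorr = np.array([float(np.sum(I0 * np.roll(I0, k))) for k in range(n)])
power = np.abs(np.fft.fft(I0)) ** 2
# np.fft.fft(autocorr) is real up to rounding and equals `power`.
```

This is the relation that lets the modulus of F{I_0} be extracted from an intensity autocorrelation, after which only the Fourier phase remains to be retrieved.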

Discussion and Future Work
Before the work of Emil Wolf [1][2][3][4], most researchers viewed the fluctuations of light as uninteresting "noise", and statistical methods were regarded as a "second-class" approach in optics, as pointed out by Joseph W. Goodman [121]. However, as a computational imaging technology based on the high-order correlation of light, GISC not only can be applied to remote sensing, scattering imaging, X-ray diffraction imaging, and many other areas, but also opens a way to combine optical imaging technology with modern information science to maximize the information gain. The research on GISC is, of course, far from closed. In GI, the signals in both the reference and test arms are detected in each measurement, whereas in the present optimal-estimation framework of GISC, the fidelity term is approximated by the conventional min f_1{Y − AO} model, in which only one signal y_i ∈ Y is considered per measurement once the system has been calibrated. A self-consistent and more accurate optimal-estimation framework for GISC, together with the development of more powerful techniques for light-field modulation, light-field fluctuation detection, and electronics specially designed for GISC, remain pressing challenges on the way to demonstrating ghost imaging applications whose image quality outshines that of conventional imaging methods in the near future.

Figure 2 .
Figure 2. The result of the 3D GISC LiDAR reconstruction; the color bar denotes the distance to the GISC LiDAR: (a) the airborne 3D GISC LiDAR system, (b) a photo of the target, (c) the structured 3D GISC reconstruction [57].

Figure 4 .
Figure 4. Experimental results for a natural scene 1 km away. (a) Reconstructed spectral images from the GISC spectral camera; (b) the original image recorded by the monochrome monitoring camera.

Figure 6 .
Figure 6. Simulation results of X-ray Fourier-transform GISC. (a) is the sample, (b,d) are the masks, (c,e) are the non-locally encoded X-ray Fourier-transform GISC diffraction patterns reconstructed from the intensity correlations with (b,d) inserted in the reference path, respectively. (f) is the corresponding coded diffraction pattern obtained from coherent diffraction imaging [89].

Figure 7 .
Figure 7. A photograph of the tabletop X-ray GISC microscope facility in SIOM.

Figure 8 .
Figure 8. Schematic of a lensless Wiener-Khinchin telescope based on high-order spatial autocorrelation of thermal light [22].

Figure 9 .
Figure 9. Photographs and reconstructed images of two objects: (a) the photograph of a double-slit, (b) the photograph of a starfish, (c,d) the reconstructed images of the double-slit and starfish, respectively.