1. Introduction
In recent years, single-photon counting lidar has been widely used for three-dimensional (3D) imaging under weak-light conditions because of its single-photon sensitivity and picosecond time resolution. The technology has many significant applications, such as long-distance imaging and underwater target detection [1]. Its core is time-correlated single-photon counting (TCSPC), the most effective way to measure the temporal structure of weak optical signals, based on the accurate measurement of photon arrival times and time differences [2,3]. In its early stages, TCSPC was mainly used to measure fluorescence lifetimes [4,5,6,7,8,9,10,11,12]. It was later applied to laser ranging [13,14,15,16,17,18] and gradually developed toward long-distance 3D laser imaging based on the time-of-flight (TOF) algorithm [19,20,21,22,23,24,25,26]. Recently, Li et al. achieved imaging of targets 45 km away [27], while Kirmani et al. [28] and Liu et al. [29] imaged objects by detecting the first photon in a TCSPC system.
In a typical TCSPC system, imaging results are strongly affected by environmental noise because of the ultra-high sensitivity of single-photon detectors (SPDs). To date, most research teams have used range-gated technology to reduce the impact of this noise [20,24,27]. In addition, Kong et al. used two Geiger-mode avalanche photodiodes and an AND gate to reduce false alarms caused by environmental noise [30], and Huang et al. proposed a method that suppresses environmental noise by exploiting correlated photons and spatial correlations [31].
In this paper, we combine an SPD, high-repetition-rate pulsed laser illumination and TCSPC technology to design a new imaging scheme. The scheme offers noise suppression, high-resolution imaging, accurate analysis of photon flight times and 3D imaging. Three experiments demonstrate its capabilities. The first is a de-noising experiment, which shows that our scheme can still image the object when the signal-to-noise ratio (SNR) drops to −13 dB, improving the structural similarity index (SSIM) [32] of the imaging results by a factor of 10. The second verifies the lateral and axial resolutions of the scheme, with an axial resolution of 4.2 mm. The third is 3D imaging with a photometric stereo algorithm, which uses multiple detectors to reconstruct a 3D image of the object. The scheme also has the potential to enable turbulence-free imaging [33], non-line-of-sight imaging [34] and imaging in complicated environments.
2. Theoretical Analysis
In our scheme, the core idea is to use TCSPC technology to obtain the number and arrival time of the signal photons. Aided by range-gated technology, the environmental noise can be significantly reduced. With this information, the TOF algorithm and the photometric stereo algorithm can then be used to reconstruct a 3D model of the object.
2.1. TCSPC
TCSPC technology, which is mainly used to detect low-intensity, high-repetition-frequency pulsed laser light, was developed on the basis of photon-counting technology. It works as follows: the pulsed laser emits a pulse and simultaneously generates a synchronization signal that triggers the START input of the TCSPC module to begin timing, as shown in Figure 1a. Each photon detected by the SPD triggers the STOP input of the TCSPC module to end timing. The TCSPC module can thus record the arrival time of each photon relative to the emitted laser pulse. Finally, after statistical processing, a histogram of the number of photons versus arrival time is obtained, as shown in Figure 1b.
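As an illustration, the START–STOP statistics described above can be sketched in a few lines of Python. This is a toy model, not the interface of a real TCSPC module; the bin width, delay values and photon counts below are invented for the example:

```python
import numpy as np

def tcspc_histogram(start_times, stop_times, bin_width, n_bins):
    """Histogram the photon delays (STOP - START), the core TCSPC step."""
    delays = np.asarray(stop_times) - np.asarray(start_times)
    edges = np.arange(n_bins + 1) * bin_width
    counts, _ = np.histogram(delays, bins=edges)
    return counts

# Toy example: 800 echo photons clustered near a 20.5 ns delay,
# plus 200 background photons spread uniformly over the 100 ns window.
rng = np.random.default_rng(0)
signal = rng.normal(20.5e-9, 0.1e-9, 800)
noise = rng.uniform(0.0, 100e-9, 200)
counts = tcspc_histogram(np.zeros(1000), np.concatenate([signal, noise]),
                         bin_width=1e-9, n_bins=100)
peak_bin = int(np.argmax(counts))   # the histogram peak marks the echo delay
```

The peak of the histogram directly gives the photon flight time used later by the TOF algorithm.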
2.2. Detection Probability of Signal Photon Events
As described in [28,35], suppose the object point (x, y) is illuminated by a pulse s(t), and denote the echo pulse reflected back to the SPD by r(t). The theoretical photon flux arriving at the SPD is then

r(t) = α(x, y) s(t − 2z/c) + b,    (1)

where 2z represents the flight distance of the laser to the target point and back, c represents the speed of light, α(x, y) represents the reflectivity of the target and b represents the background photon rate. Considering that the quantum efficiency of the SPD is η and its dark-count rate is d, the actual photon rate λ(t) detected by the SPD is

λ(t) = η r(t) + d.    (2)

Let the pulsed-laser period be T; then the average number of photons received by the SPD in one period is

Λ(x, y) = ∫_0^T λ(t) dt.    (3)

According to the Poisson probability model [36], for each pulse emitted by the laser, the number of photons k received by the detector satisfies the distribution

P(k) = Λ^k e^(−Λ) / k!.    (4)

When the SPD does not record any photon, that is k = 0, the probability is P(0) = e^(−Λ(x, y)).
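The Poisson statistics above are easy to check numerically. The following minimal Python sketch (symbols as defined in this section; the mean value used in the example is invented) computes the per-pulse detection probability 1 − P(0):

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability of detecting exactly k photons when the
    mean photon number per period is lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

def detection_probability(lam):
    """Probability of at least one detected photon: 1 - P(0) = 1 - e^(-lam)."""
    return 1.0 - poisson_pmf(0, lam)

# At a mean of 0.05 photons per period the detector fires on ~4.9% of pulses.
p_detect = detection_probability(0.05)
```

In the low-flux regime typical of single-photon lidar, this detection probability is small, so most laser pulses produce no detection event at all.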
2.3. Imaging Scheme Based on TCSPC
In our scheme, the dead time of the SPD is longer than the pulse width of the laser, so the SPD records at most one photon per pulse. The number N(x, y) of photons received by the SPD while the scanning mirror dwells on each scanned point is therefore

N(x, y) = [1 − e^(−Λ(x, y))] / (f T),    (5)

where T denotes the pulsed-laser period and f denotes the scanning frequency of the scanning mirror, so that 1/(f T) is the number of pulses fired per point and 1 − e^(−Λ) is the per-pulse detection probability. The pixel value of target point (x, y) is represented by N(x, y). When the whole object has been scanned by the scanning mirror, the two-dimensional image of the target is obtained by rearranging the pixel values. Moreover, because the flight time and the number of the photons are obtained accurately by the TCSPC module, the depth information of the object can be calculated with the TOF algorithm. Finally, multiple SPDs are used to observe the object from different angles; combined with the photometric stereo algorithm, a 3D model of the object can also be obtained.
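The rearrangement step can be sketched as follows. This is a simplified model with invented array names; in the real system the counts and delays would come from the TCSPC module:

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def reconstruct(counts, mean_delays, shape):
    """Rearrange per-point photon counts into a 2D intensity image and
    convert per-point mean photon delays to depth via TOF: D = c * t / 2."""
    intensity = np.asarray(counts, dtype=float).reshape(shape)
    depth = C * np.asarray(mean_delays, dtype=float).reshape(shape) / 2.0
    return intensity, depth

# Toy 2x2 scan: four scan points with their photon counts and mean delays.
counts = [120, 80, 95, 40]
delays = [20e-9, 20e-9, 21e-9, 22e-9]   # seconds
intensity, depth = reconstruct(counts, delays, (2, 2))
```

A 20 ns round-trip delay corresponds to a range of about 3 m, so neighbouring pixels with nanosecond-level delay differences map to centimetre-scale depth structure.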
3. Experimental Setup and Results
To verify the TCSPC-based 3D imaging scheme proposed in this paper, three groups of experiments are performed to confirm its de-noising ability, its high-resolution imaging ability and its 3D imaging based on the photometric stereo algorithm. We use a pulsed laser with a wavelength of 532 nm as the light source, which generates 30 ps pulses at a repetition rate of 10 MHz. The SPD module is the PDM series manufactured by Micro Photon Devices; its dead time is 80 ns and its quantum efficiency at 532 nm is about 45%. A HydraHarp 400, with a time resolution down to 1 ps and a maximum count rate of up to 12 Mcps per channel, serves as the TCSPC module, and a field-programmable gate array (FPGA) is used as the main controller.
3.1. Noise Reduction Experiment Based on Range-Gated Technology
The experimental setup is depicted in Figure 2. The FPGA triggers the laser to emit a pulse and simultaneously sends a synchronization signal to the HydraHarp 400 to trigger the START signal and begin timing. After each fixed number of pulses produced by the laser (1024 pulses in our experiment), the FPGA rotates the scanning mirror by one step, and the SPD records the corresponding photon signal. Since the pulse width of the laser (30 ps) is much shorter than the dead time of the SPD (80 ns), the SPD can respond to at most one photon per pulse, so the number of signal photons detected at each scan point is given by Equation (5). After the entire object has been scanned by the scanning mirror, the computer reconstructs the image of the object by rearranging the photon signals from the HydraHarp 400 according to the preset scanning pattern of the mirror.
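In software terms, the range gate simply keeps photon events whose delays fall inside the expected echo window and discards the rest. A minimal sketch (the gate position and width below are invented for illustration):

```python
import numpy as np

def range_gate(delays, gate_start, gate_width):
    """Keep only photon delays inside [gate_start, gate_start + gate_width),
    rejecting background photons outside the expected echo window."""
    delays = np.asarray(delays)
    mask = (delays >= gate_start) & (delays < gate_start + gate_width)
    return delays[mask]

# Echo expected near 20 ns; a 2 ns gate removes most of the uniformly
# distributed background events.
events = np.array([3e-9, 19.7e-9, 20.1e-9, 55e-9, 90e-9])
gated = range_gate(events, 19e-9, 2e-9)   # keeps the two echo photons
```

Because background counts are spread over the whole period while signal counts concentrate in the gate, a narrow gate improves the effective SNR roughly in proportion to the period-to-gate ratio.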
The experimental results are shown in Figure 3. During the experiment, the environmental noise is kept at around 10,000 counts per second (cps), while the signal is decreased in steps from 3000 cps to 500 cps. Imaging results with and without the range-gated technology are compared. As shown in Figure 3, the image quality with gating is clearly better than that without gating. Even when the signal decreases to 500 cps, i.e., an SNR of −13 dB, the gated results can still resolve the object while the ungated image cannot. The SSIM, defined in Equation (6), is used to evaluate the quality of the reconstructed images:

SSIM(x, y) = [(2 μx μy + C1)(2 σxy + C2)] / [(μx² + μy² + C1)(σx² + σy² + C2)],    (6)

where μx, μy, σx, σy and σxy are the local means, standard deviations and cross-covariance of images x and y. Furthermore, C1 = (k1 L)² and C2 = (k2 L)², where L is the specified dynamic-range value (10 bits in this experiment). As shown in Figure 4, the SSIM of the reconstructed images with gating is 10 times higher than that without gating.
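For reference, a simplified single-window SSIM, evaluated over the whole image rather than averaged over local windows as in the full method of [32], can be written as:

```python
import numpy as np

def ssim_single_window(x, y, L=2**10 - 1, k1=0.01, k2=0.03):
    """Simplified SSIM of Equation (6) evaluated on one global window.
    L is the dynamic range (10-bit images here); k1 and k2 are the usual
    small stabilizing constants from [32]."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

An image compared with itself gives SSIM = 1; any brightness offset or structural mismatch drives the value below 1.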
3.2. High-Resolution Imaging Experiments for Complex Objects
In this part, two sets of experiments verify the resolution of the scheme, covering both the lateral and the axial resolution. First, the lateral resolution of our scheme can be estimated from the far-field divergence of the Gaussian beam,

Δx = 4 λ z / (π d),    (7)

where λ is the wavelength of the laser, d is the beam-waist diameter of the laser and z is the distance from the laser to the target. To demonstrate the lateral resolution, we choose a dinosaur model as the imaging object. The experimental procedure is essentially the same as in Section 3.1. Figure 5 shows the original object and the imaging result; the image depth is 8 bits and the acquisition time for one frame is about 6.7 s.
Then, to evaluate the axial resolution, a whiteboard is placed on a stepper motor (TSA150-B, repeat positioning accuracy better than 7 µm). The stepper motor moves ten times, 10 mm each time. The distance moved by the whiteboard is evaluated with the TOF algorithm, defined as

D = c Δt / 2,    (8)

where D is the distance, c is the speed of light and Δt is the measured time interval. In this experiment, the theoretical total time jitter consists mainly of the pulse width of the laser σ_laser, the time jitter of the TCSPC module σ_TCSPC and the time jitter of the SPD σ_SPD, combined as

σ_total = √(σ_laser² + σ_TCSPC² + σ_SPD²).    (9)

As shown in Figure 6a, the total time jitter measured in the experiment is 90 ps, so the single-shot distance error of our scheme is ΔD = c σ_total / 2 ≈ 13.5 mm. Because we take the average of 10 points near the peak position as each measurement value, the actual measurement error is reduced to ΔD / √10 ≈ 4.2 mm. Figure 6b shows the measurement results and Figure 6c shows the measurement errors, which are less than ±1.5 mm.
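The error figures above follow directly from the measured jitter; a minimal check of the arithmetic (c is the speed of light, and the √n averaging gain assumes independent samples):

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def axial_error(total_jitter, n_avg=1):
    """Distance error from timing jitter, dD = c * sigma / 2, reduced by
    sqrt(n_avg) when n_avg independent points near the peak are averaged."""
    return C * total_jitter / 2.0 / math.sqrt(n_avg)

single_shot = axial_error(90e-12)          # ~13.5 mm for 90 ps total jitter
averaged = axial_error(90e-12, n_avg=10)   # ~4.3 mm after 10-point averaging
```

The averaged value of about 4.3 mm matches the 4.2 mm axial resolution reported above.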
3.3. 3D Imaging Experiment Based on Photometric Stereo Algorithm
The photometric stereo algorithm is a method for reconstructing the 3D model of an object from multiple images. It captures images of the object from different directions and then obtains the 3D information by calculating the reflectivity and the surface normal vectors [37]. Let the direction of the light source be l, the unit normal vector of a point on the object surface be n, the reflectivity be ρ and the intensity of the corresponding image pixel be I. We then have the Lambert model

I = ρ lᵀ n,    (10)

where ᵀ represents the matrix-transpose operator and the normal can be parameterized by p and q, the two gradient directions of the object surface, as n = (−p, −q, 1)ᵀ / √(p² + q² + 1). Assuming three known, non-coplanar light directions stacked as the rows of a matrix L, Equation (10) becomes

I = ρ L n.    (11)

Because L is a full-rank matrix, the scaled normal vector can be expressed as ρ n = L⁻¹ I, with ρ = |L⁻¹ I|. The depth information is then obtained by integrating the normal vectors over the object surface.
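The per-pixel inversion of Equation (11) can be sketched as follows; the light directions, albedo and normal below are invented for a toy round-trip check:

```python
import numpy as np

def solve_normal(L, I):
    """Invert the Lambert model I = rho * L @ n for one pixel, given three
    non-coplanar unit light directions as the rows of L."""
    g = np.linalg.solve(L, I)   # g = rho * n
    rho = np.linalg.norm(g)
    return rho, g / rho

# Toy check: build intensities from a known normal and recover it.
L = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
L /= np.linalg.norm(L, axis=1, keepdims=True)  # unit light directions
n_true = np.array([0.0, 0.0, 1.0])
I = 0.8 * L @ n_true          # pixel intensities for albedo rho = 0.8
rho, n = solve_normal(L, I)
```

With more than three detectors, `np.linalg.solve` would be replaced by a least-squares solve, which also averages down measurement noise.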
Figure 7 shows the experimental setup. Three SPDs, calibrated in advance, are used to observe the 3D object (a rabbit model) from different directions. The experimental data are processed with the photometric stereo algorithm to obtain a 3D image of the object. As shown in Figure 8, the method reconstructs the 3D model of the object well.
4. Conclusions
In summary, a 3D imaging scheme based on TCSPC technology has been proposed and demonstrated. In the scheme, we adopt range-gated technology to reduce environmental noise, improving the SSIM of the imaging results by a factor of 10. The TOF algorithm and the photometric stereo algorithm are used for three-dimensional imaging of objects. Experiments show that the proposed scheme achieves high lateral resolution and an axial resolution of 4.2 mm. Finally, the proposed imaging scheme has the potential to be widely used for imaging in complicated environments, especially remote, hazy and underwater scenes.
Author Contributions
Conceptualization and methodology, H.Z., Y.Z. and C.F.; software, C.F. and G.W.; validation, C.F., H.Z. and H.C.; formal analysis, G.W.; investigation, Y.H.; resources, H.C.; data curation, C.F.; writing–original draft preparation, C.F.; writing–review and editing, C.F., H.Z., Y.Z. and J.L.; visualization, H.C. and J.L.; project administration and Supervision, J.S. and Z.X. All authors have read and agreed to the published version of the manuscript.
Funding
This research was supported by Shanxi Key Research and Development Project (Grant No. 2019ZDLGY09-10); Key Innovation Team of Shaanxi Province (Grant No. 2018TD-024) and 111 Project of China (Grant No.B14040).
Conflicts of Interest
The authors declare no conflict of interest.
References
- Chhabra, P.; Maccarone, A.; McCarthy, A.; Buller, G.; Wallace, A. Discriminating underwater lidar target signatures using sparse multi-spectral depth codes. In Proceedings of the 2016 Sensor Signal Processing for Defence (SSPD), Edinburgh, UK, 22–23 September 2016; pp. 1–5.
- Becker, W. Advanced Time-Correlated Single Photon Counting Techniques; Springer Series in Chemical Physics; Springer: Berlin/Heidelberg, Germany, 2005.
- Buller, G.S.; Collins, R.J. Single-photon generation and detection. Meas. Sci. Technol. 2010, 21, 012002.
- Ware, W.R. Fluorescence Lifetime Measurements by Time Correlated Single Photon Counting; Technical Report; University of Minnesota: Minneapolis, MN, USA, 1969.
- Ware, W.R. Transient luminescence measurements. Creat. Detect. Excit. State 1971, 1, 213.
- Schuyler, R.; Isenberg, I. A monophoton fluorometer with energy discrimination. Rev. Sci. Instrum. 1971, 42, 813–817.
- Bachrach, R. A photon counting apparatus for kinetic and spectral measurements. Rev. Sci. Instrum. 1972, 43, 734–737.
- Binkert, T.; Tschanz, H.; Zinsli, P. The measurement of fluorescence decay curves with the single-photon counting method and the evaluation of rate parameters. J. Lumin. 1972, 5, 187–217.
- Lewis, C.; Ware, W.R.; Doemeny, L.J.; Nemzek, T.L. The Measurement of Short-Lived Fluorescence Decay Using the Single Photon Counting Method. Rev. Sci. Instrum. 1973, 44, 107–114.
- Love, L.C.; Shaver, L. Time correlated single photon technique: Fluorescence lifetime measurements. Anal. Chem. 1976, 48, 364A–371A.
- Becker, W.; Bergmann, A.; König, K.; Tirlapur, U. Picosecond fluorescence lifetime microscopy by TCSPC imaging. In Multiphoton Microscopy in The Biomedical Sciences; International Society for Optics and Photonics: Bellingham, WA, USA, 2001; Volume 4262, pp. 414–419.
- Becker, W.; Bergmann, A.; Hink, M.; König, K.; Benndorf, K.; Biskup, C. Fluorescence lifetime imaging by time-correlated single-photon counting. Microsc. Res. Tech. 2004, 63, 58–66.
- Degnan, J.J. Satellite laser ranging: Current status and future prospects. IEEE Trans. Geosci. Remote Sens. 1985, GE-23, 398–413.
- Massa, J.S.; Wallace, A.M.; Buller, G.S.; Fancey, S.; Walker, A.C. Laser depth measurement based on time-correlated single-photon counting. Opt. Lett. 1997, 22, 543–545.
- Massa, J.S.; Buller, G.S.; Walker, A.C.; Cova, S.; Umasuthan, M.; Wallace, A.M. Time-of-flight optical ranging system based on time-correlated single-photon counting. Appl. Opt. 1998, 37, 7298–7304.
- Degnan, J.J. Photon-counting multikilohertz microlaser altimeters for airborne and spaceborne topographic measurements. J. Geodyn. 2002, 34, 503–549.
- Ren, M.; Gu, X.; Liang, Y.; Kong, W.; Wu, E.; Wu, G.; Zeng, H. Laser ranging at 1550 nm with 1-GHz sine-wave gated InGaAs/InP APD single-photon detector. Opt. Express 2011, 19, 13497–13502.
- Chen, S.; Liu, D.; Zhang, W.; You, L.; He, Y.; Zhang, W.; Yang, X.; Wu, G.; Ren, M.; Zeng, H.; et al. Time-of-flight laser ranging and imaging at 1550 nm using low-jitter superconducting nanowire single-photon detection system. Appl. Opt. 2013, 52, 3241–3245.
- McCarthy, A.; Collins, R.J.; Krichel, N.J.; Fernández, V.; Wallace, A.M.; Buller, G.S. Long-range time-of-flight scanning sensor based on high-speed time-correlated single-photon counting. Appl. Opt. 2009, 48, 6241–6251.
- McCarthy, A.; Ren, X.; Della Frera, A.; Gemmell, N.R.; Krichel, N.J.; Scarcella, C.; Ruggeri, A.; Tosi, A.; Buller, G.S. Kilometer-range depth imaging at 1550 nm wavelength using an InGaAs/InP single-photon avalanche diode detector. Opt. Express 2013, 21, 22098–22113.
- Henriksson, M.; Larsson, H.; Grönwall, C.; Tolt, G. Continuously scanning time-correlated single-photon-counting single-pixel 3-D lidar. Opt. Eng. 2016, 56, 031204.
- Gordon, K.; Hiskett, P.; Lamb, R. Advanced 3D imaging lidar concepts for long range sensing. In Advanced Photon Counting Techniques VIII; International Society for Optics and Photonics: Bellingham, WA, USA, 2014; Volume 9114, p. 91140G.
- Kostamovaara, J.; Huikari, J.; Hallman, L.; Nissinen, I.; Nissinen, J.; Rapakko, H.; Avrutin, E.; Ryvkin, B. On laser ranging based on high-speed/energy laser diode pulses and single-photon detection techniques. IEEE Photonics J. 2015, 7, 1–15.
- Pawlikowska, A.M.; Halimi, A.; Lamb, R.A.; Buller, G.S. Single-photon three-dimensional imaging at up to 10 kilometers range. Opt. Express 2017, 25, 11919–11931.
- Kang, Y.; Li, L.; Liu, D.; Li, D.; Zhang, T.; Zhao, W. Fast long-range photon counting depth imaging with sparse single-photon data. IEEE Photonics J. 2018, 10, 1–10.
- Li, Z.; Wu, E.; Pang, C.; Du, B.; Tao, Y.; Peng, H.; Zeng, H.; Wu, G. Multi-beam single-photon-counting three-dimensional imaging lidar. Opt. Express 2017, 25, 10189–10195.
- Li, Z.P.; Huang, X.; Cao, Y.; Wang, B.; Li, Y.H.; Jin, W.; Yu, C.; Zhang, J.; Zhang, Q.; Peng, C.Z.; et al. Single-photon computational 3D imaging at 45 km. arXiv 2019, arXiv:1904.10341.
- Kirmani, A.; Venkatraman, D.; Shin, D.; Colaço, A.; Wong, F.N.; Shapiro, J.H.; Goyal, V.K. First-photon imaging. Science 2014, 343, 58–61.
- Liu, X.; Shi, J.; Wu, X.; Zeng, G. Fast first-photon ghost imaging. Sci. Rep. 2018, 8, 5012.
- Kong, H.J.; Kim, T.H.; Jo, S.E.; Oh, M.S. Smart three-dimensional imaging ladar using two Geiger-mode avalanche photodiodes. Opt. Express 2011, 19, 19323–19329.
- Huang, P.; He, W.; Gu, G.; Chen, Q. Depth imaging denoising of photon-counting lidar. Appl. Opt. 2019, 58, 4390–4394.
- Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
- Tobin, R.; Halimi, A.; McCarthy, A.; Laurenzis, M.; Christnacher, F.; Buller, G.S. Three-dimensional single-photon imaging through obscurants. Opt. Express 2019, 27, 4590–4611.
- Xin, S.; Nousias, S.; Kutulakos, K.N.; Sankaranarayanan, A.C.; Narasimhan, S.G.; Gkioulekas, I. A theory of fermat paths for non-line-of-sight shape reconstruction. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 6800–6809.
- Li, S.; Zhang, Z.; Ma, Y.; Zeng, H.; Zhao, P.; Zhang, W. Ranging performance models based on negative-binomial (NB) distribution for photon-counting lidars. Opt. Express 2019, 27, A861–A877.
- Snyder, D.L. Random Point Processes; Wiley: Hoboken, NJ, USA, 1975.
- Holroyd, M.; Lawrence, J.; Humphreys, G.; Zickler, T. A photometric approach for estimating normals and tangents. ACM Trans. Graph. (TOG) 2008, 27, 133.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).