# Three-Dimensional Reconstruction of Light Field Based on Phase Similarity


## Abstract


## 1. Introduction

## 2. Principles

#### 2.1. Light Field Imaging and EPI Principle

Fixing two of the four light field coordinates yields a two-dimensional restriction of the 4D light field; for example, L_{t*, y*} = L(s, t*, x, y*). Similarly, other restrictions can be defined in the same way. For example, L_{s*, t*} is a sub-aperture image of a particular angle in the light field image. Figure 1b shows 7 × 7 sub-aperture images extracted from the 4D light field data, in which each sub-aperture image is 512 pixels × 512 pixels. L_{s*, x*} and L_{t*, y*} are referred to as EPIs; an EPI can be understood as a horizontal or vertical two-dimensional slice of the 4D light field data, as shown in Figure 1c.
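The slicing described here can be sketched with NumPy, assuming the 4D light field is stored as an array `L[s, t, x, y]` (the array name is illustrative; the 7 × 7 × 512 × 512 shape follows Figure 1b):

```python
import numpy as np

# Illustrative 4D light field: 7 x 7 angular views of 512 x 512 pixels,
# indexed as L[s, t, x, y] following the two-plane parameterization.
L = np.random.rand(7, 7, 512, 512)

s_star, t_star = 3, 3   # fixed angular coordinates (central view)
y_star = 256            # fixed vertical spatial coordinate

# Sub-aperture image L_{s*, t*}: fix the angle, keep all spatial coordinates.
sub_aperture = L[s_star, t_star, :, :]   # shape (512, 512)

# Horizontal EPI L_{t*, y*}: fix (t*, y*), keep (s, x) -- a slice across views.
epi = L[:, t_star, :, y_star]            # shape (7, 512)
```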

#### 2.2. The Principle of 3D Reconstruction of Light Field Based on Phase Similarity

## 3. Calibration

where u represents the distance between the object plane z_0 and the main lens, v represents the distance between the imaging plane and the main lens, and f is the focal length of the main lens.
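With these definitions, the three quantities are related by the standard Gaussian thin-lens equation:

```latex
\frac{1}{u} + \frac{1}{v} = \frac{1}{f}
```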

The relationship between object points (x^c, y^c, z^c) and image points in the MLA plane can be represented as:

where x^m is the distance from the MLA’s center to the position where the light passes through the MLA plane.

We can obtain the relationship between depth z^c and disparity values Δx from Equation (8). According to the disparity calculation relationship, Equation (7) can be rewritten as Equation (9).
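The depth–disparity relationship has the inverse-proportional form common to light field cameras; a minimal sketch, with `k` standing in as a placeholder for the calibrated constants of Equations (8) and (9) (the symbol `k` is an assumption, not from the paper):

```python
import numpy as np

def depth_from_disparity(disparity, k=1.0, eps=1e-9):
    """Convert a disparity map to depth z^c.

    `k` lumps together the calibrated quantities (focal lengths and the
    baseline between adjacent sub-apertures); it is a placeholder for the
    constants of Equations (8)-(9), which come from the camera calibration.
    """
    disparity = np.asarray(disparity, dtype=float)
    # Depth is inversely proportional to disparity; eps guards against
    # division by zero where the disparity estimate is degenerate.
    return k / np.maximum(np.abs(disparity), eps)

depths = depth_from_disparity([0.5, 1.0, 2.0], k=10.0)
```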

## 4. Disparity Calculation

Find the points in an EPI image (s, x) of the central-view structured light field whose phase values are most similar to that of the target point (s*, x*) in the light field slice. The slope of the line through (s*, x*) can then be obtained by linear fitting of these points, as shown in Figure 5. Every point similarly yields a corresponding slope, from which the disparity map is obtained. The detailed implementation steps are as follows:
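A minimal sketch of the per-point procedure, assuming the EPI’s unwrapped phase is given as a 2D array `epi_phase[s, x]` (the names and the nearest-phase matching rule are illustrative):

```python
import numpy as np

def disparity_at(epi_phase, s_star, x_star):
    """Estimate disparity at (s*, x*) by phase similarity.

    For each angular row s of the EPI, find the column x whose phase is
    closest to the phase of the target point, then fit a line through the
    matches; the slope is the disparity (pixel shift per view).
    """
    target = epi_phase[s_star, x_star]
    n_views = epi_phase.shape[0]
    matches = np.array([np.argmin(np.abs(epi_phase[s] - target))
                        for s in range(n_views)])
    slope, _ = np.polyfit(np.arange(n_views), matches, 1)
    return slope

# Synthetic EPI with a monotonic phase ramp shifted by 2 px per view,
# so the true disparity (line slope) is 2.
phase = np.array([np.arange(64, dtype=float) - 2.0 * s for s in range(7)])
d = disparity_at(phase, s_star=3, x_star=32)
```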

## 5. Hardware and Experiments

#### 5.1. Hardware Implementation

#### 5.2. Experimental Results

## 6. Discussion and Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References


**Figure 1.** Light field imaging. (**a**) Light field EPI principle; (**b**) multi-view images; (**c**) epipolar plane image.

**Figure 4.** Sub-aperture imaging principle. (**a**) Light field image raw data; (**b**) sub-aperture image extraction method.

**Figure 6.** 3D reconstruction system based on a light field. (**a**) Schematic diagram; (**b**) hardware implementation.

**Figure 9.** The two objects projected with horizontal and vertical fringes. (**a**,**e**) fringe images; (**b**,**f**) wrapped phase images; (**c**,**g**) unwrapped phase images; (**d**,**h**) disparity images based on phase similarity.

**Figure 10.** 3D reconstruction results of different methods. (**a**) Traditional EPI for object 1; (**b**) our approach for object 1; (**c**) traditional EPI for object 2; (**d**) our approach for object 2.

| Methods | Data, Object 1 (mm) | Data, Object 2 (mm) | Error, Object 1 (mm) | Error, Object 2 (mm) |
|---|---|---|---|---|
| Ground truth | 12 | 17 | | |
| Traditional EPI | 12.9422 | 18.2741 | 0.9422 | 1.2741 |
| Our approach | 12.3179 | 17.3865 | 0.3179 | 0.3865 |
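The error columns are the absolute deviations of the measured heights from ground truth, which can be checked directly:

```python
# Measured heights (mm) for (object 1, object 2), from the table above.
measured = {"Traditional EPI": (12.9422, 18.2741),
            "Our approach":    (12.3179, 17.3865)}
truth = (12.0, 17.0)  # ground-truth heights (mm)

# Error = |measured - ground truth|, rounded to match the table.
errors = {method: tuple(round(abs(v - t), 4) for v, t in zip(vals, truth))
          for method, vals in measured.items()}
# errors["Our approach"] == (0.3179, 0.3865)
```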

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Feng, W.; Gao, J.; Qu, T.; Zhou, S.; Zhao, D.
Three-Dimensional Reconstruction of Light Field Based on Phase Similarity. *Sensors* **2021**, *21*, 7734.
https://doi.org/10.3390/s21227734
