Communication

Augmented Reality Vector Light Field Display with Large Viewing Distance Based on Pixelated Multilevel Blazed Gratings

Jiacheng Shi, Jianyu Hua, Fengbin Zhou, Min Yang and Wen Qiao

1 School of Optoelectronic Science and Engineering & Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou 215006, China
2 Key Lab of Advanced Optical Manufacturing Technologies of Jiangsu Province & Key Lab of Modern Optical Technologies of Education Ministry of China, Soochow University, Suzhou 215006, China
3 SVG Optronics, Co., Ltd., Suzhou 215026, China
* Author to whom correspondence should be addressed.
Photonics 2021, 8(8), 337; https://doi.org/10.3390/photonics8080337
Submission received: 10 July 2021 / Revised: 12 August 2021 / Accepted: 14 August 2021 / Published: 16 August 2021
(This article belongs to the Special Issue Smart Pixels and Imaging)

Abstract

Glasses-free augmented reality (AR) 3D display has attracted great interest for its ability to merge virtual 3D objects with real scenes naturally, without the aid of any wearable devices. Here we propose an AR vector light field display based on a view combiner and an off-the-shelf projector. The view combiner is sparsely covered with pixelated multilevel blazed gratings (MBG) for the projection of perspective virtual images. Multi-order diffraction of the MBG is designed to increase the viewing distance and the vertical viewing angle. In a 20-inch prototype, multiple sets of 16 horizontal views form smooth parallax. The viewing distance of the 3D scene is larger than 5 m, the vertical viewing angle is 15.6°, and the light efficiencies of all views are larger than 53%. We demonstrate that the displayed virtual 3D scene retains natural motion parallax and high brightness while having a consistent occlusion effect with real objects. This research can be extended to applications in areas such as human–computer interaction, entertainment, education, and medical care.

1. Introduction

With the aid of optics and computer science, augmented reality (AR) enriches the physical world by overlaying virtual contents and information and offers unprecedented visual experiences for people [1].
Taking advantage of advances in glasses-free 3D display [2,3,4,5,6,7,8], AR technology can be combined with glasses-free 3D display to eliminate the need for wearable devices. There is a huge demand for glasses-free AR 3D display in a wide range of applications, including human–computer interaction, entertainment, education, and medical care [9,10]. Glasses-free AR 3D displays can be divided into reflection-type AR displays [11,12] and optical see-through (OST) displays [13,14,15,16,17,18,19,20,21,22].
A compact reflection-type AR 3D display with a size of 152 × 76 mm was proposed based on a reflective polarizer (RP), a conventional lens array, and a 2D display [12]. In a reflection-type AR 3D display, however, virtual contents are overlaid on virtual images of the physical world rather than on the physical world itself. Moreover, 3D displays based on a conventional parallax barrier, lenticular lens array, or micro-lens array form self-repeating views. Thus, motion parallax is limited, and the presented virtual depth information may be incorrect because of image flipping.
The OST glasses-free 3D display is another type of glasses-free AR 3D display, which allows people to perceive the real physical world directly through a transparent combiner. A monochromatic AR 3D display was proposed based on a digital holographic projector and a holographic screen; the viewing angle was 20.8°, and the diffraction efficiency of the holographic combiner (73.6 × 41.4 mm2) at a wavelength of 532 nm was 52.9% [23]. In another strategy, a holographic optical element with a size of 80 × 80 mm2 acted as a lenticular lens array under the Bragg-matched condition [24]. Our group designed a spatial multiplexing holographic see-through combiner based on pixelated metagratings, and a 32-inch AR 3D display was demonstrated with a viewing angle of 47° [25]. However, the light efficiency of the metagratings is low (~40% in theory and ~12% in experiment). In fact, since holographic optical elements are generally adopted as the transparent combiner, it remains challenging to achieve a large-scale OST 3D display with high efficiency.
Here we propose a glasses-free AR 3D display based on pixelated multilevel blazed gratings (MBG) with the features of large format and high light efficiency. A 20-inch AR view combiner covered with pixelated MBG arrays redirects the mixed image from the projector to multiple sets of 16 horizontal views at different viewing distances. Light efficiency is crucial in AR display in order to achieve high brightness and high contrast; the light efficiency of the MBG-based view combiner is no less than 53%. Moreover, the viewing distance for motion parallax is enlarged to more than 5 m, and the vertical viewing angle is improved to more than 15.6°. Finally, we show that the virtual 3D scenes presented by the view combiner not only preserve natural motion parallax, a large viewing distance, and high light efficiency but also exhibit a consistent occlusion effect with real objects. The proposed glasses-free AR 3D display can potentially be used in head-up displays and exhibition displays for efficient human–environment interaction.

2. Design and Principle of the AR Vector Light Field Display System

As shown in Figure 1a, a virtual 3D scene can be formed by a view combiner. Unlike a microlens-array-based light field display, the so-called "vector light field display" has no self-repeating views: it manipulates the emitting direction of the light beam from each pixel of the view combiner so that convergent views are formed. Each pixelated MBG is aperiodic and has a consistent height. Light passing through a pixelated MBG forms multiple diffraction orders in a series of focal planes (Figure 1b). Multiple sets of 16 horizontal parallax views at different distances are thus formed with the aid of the view combiner, which creates a large range of viewing distances.
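To make the view-steering idea concrete, the sketch below uses the scalar grating equation to estimate the period and in-plane orientation a pixelated grating would need to deflect normally incident light toward a chosen convergent viewpoint. This is our illustration under simplified assumptions (normal incidence, a single design order, example coordinates), not the authors' design code:

```python
import numpy as np

WAVELENGTH = 532e-9   # design wavelength from the paper (m)
ORDER = 5             # the paper's design diffraction order (K = 5)

def grating_for_pixel(pixel_xy, viewpoint_xyz):
    """Estimate the period and in-plane orientation of a pixelated grating that
    steers normally incident light from pixel (x, y, 0) toward a viewpoint
    (xv, yv, zv). A scalar grating-equation sketch only; the real device also
    accounts for the off-axis projector geometry, which is omitted here."""
    px, py = pixel_xy
    vx, vy, vz = viewpoint_xyz
    d = np.array([vx - px, vy - py, vz], dtype=float)
    d /= np.linalg.norm(d)                        # unit vector toward the viewpoint
    sin_theta = np.hypot(d[0], d[1])              # sine of the required deflection angle
    period = ORDER * WAVELENGTH / sin_theta       # grating equation: m*lambda = period*sin(theta)
    orientation = np.degrees(np.arctan2(d[1], d[0]))  # azimuth of the grating vector
    return period, orientation

# Example (assumed numbers): a pixel 10 cm right of center, viewpoint 2.2 m in front
period, azimuth = grating_for_pixel((0.10, 0.0), (0.0, 0.0, 2.2))
print(f"period ~ {period * 1e6:.1f} um, orientation ~ {azimuth:.0f} deg")
```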
A harmonic diffractive optical element (HDOE) utilizes high-order diffracted light [26,27]. The optical path difference between adjacent zones of the structure is an integer multiple of the design wavelength, so the focal length corresponding to light of other wavelengths is:
$$f(\lambda) = \frac{K\lambda_0}{m\lambda}\, f_0 \tag{1}$$
where K (K > 1) is the harmonic diffraction coefficient, m is the diffraction order, λ0 is the design wavelength of the harmonic diffractive element, λ is the wavelength of the incident light, and f0 is the focal length of the harmonic diffractive element. When the wavelength of the incident light λ coincides with the design wavelength λ0 and the harmonic diffraction coefficient K equals the diffraction order m, the diffraction efficiency at this wavelength reaches its maximum at the designed order m.
The height h of the pixelated MBG shown in Figure 1a can be determined according to the following equation [26,27]:
$$h = \frac{K\lambda_0}{n(\lambda_0) - 1} \tag{2}$$
Increasing K increases both the height and the minimum line width of the structure. We set λ0 to 532 nm, and the refractive index n(λ0) is 1.61. When K equals 5, the height of the structure is 4.36 μm.
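Equations (1) and (2) are easy to verify numerically. The short sketch below is our own check using only the values stated in the text (K = 5, λ0 = 532 nm, n(λ0) = 1.61), with f0 normalized to 1:

```python
def focal_length(wavelength, m, K=5, lam0=532e-9, f0=1.0):
    """Eq. (1): focal length of a harmonic diffractive element at order m, in units of f0."""
    return K * lam0 / (m * wavelength) * f0

def structure_height(K=5, lam0=532e-9, n0=1.61):
    """Eq. (2): height of the multilevel blazed grating for harmonic coefficient K."""
    return K * lam0 / (n0 - 1.0)

print(f"h = {structure_height() * 1e6:.2f} um")                 # -> 4.36 um, as stated
print(f"f(532 nm, m = 5) = {focal_length(532e-9, 5):.2f} f0")   # -> 1.00 f0 at the design order
```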
The diffraction energy can be dispersed into adjacent orders by varying the height of the structure. The height is quantized into 16 steps. According to scalar diffraction theory, when a light beam is incident from air onto the base material of the MBG and a height variable factor is added to the structure height, the diffraction efficiency at diffraction order m can be determined by [26,27,28]:
$$\eta_m^{16} = \left\{ \frac{\sin\!\left[\pi\!\left(m - \dfrac{h(1+\varepsilon)\left[\sqrt{n^2(\lambda)-\sin^2\theta}\,-\cos\theta\right]}{\lambda}\right)\right]}{\sin\!\left[\pi\!\left(m - \dfrac{h(1+\varepsilon)\left[\sqrt{n^2(\lambda)-\sin^2\theta}\,-\cos\theta\right]}{\lambda}\right)\Big/16\right]} \cdot \frac{\sin(\pi m/16)}{\pi m} \right\}^2 \tag{3}$$
where ε (−1 < ε < 1) is the height variable factor and θ is the angle of incidence.
Figure 2a shows the theoretical dependence of the diffraction efficiency on the structural height according to Equation (3) with θ = 0°. The diffraction efficiency of the fifth-order diffracted light reaches a maximum value of 71.73% for a structural height of 4.36 μm, while the intensities of the remaining orders are very low. When ε = −0.096 and the structural height is decreased to 3.94 μm, the diffraction efficiencies of the third to sixth orders are 4%, 31%, 30.9%, and 2.9%, respectively. As a result, the range of viewing distances can be tuned through the structural height h, which allocates the intensity distribution to the desired diffraction orders. We further conducted an FDTD simulation of the radiation pattern along the propagation direction z. As shown in Figure 2b, the viewing distance of an MBG array is almost doubled in theory when the height decreases from 4.36 to 3.94 μm. The simulation results are consistent with Equations (1)–(3).
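Equation (3) can be evaluated directly to reproduce the behavior shown in Figure 2a. The sketch below is our own numerical check, not the authors' simulation code; it uses only values stated in the text (n = 1.61, λ = 532 nm, θ = 0°, 16 height levels):

```python
import numpy as np

def eta_16(m, h, eps=0.0, n=1.61, lam=532e-9, theta=0.0):
    """Eq. (3): diffraction efficiency of a 16-level blazed grating at order m,
    for nominal height h, height variable factor eps, and incidence angle theta (rad)."""
    phi = h * (1 + eps) * (np.sqrt(n**2 - np.sin(theta)**2) - np.cos(theta)) / lam
    x = np.pi * (m - phi)
    denom = np.sin(x / 16)
    # sin(x)/sin(x/16) -> 16 in the limit of an exactly blazed order (x -> 0)
    ratio = 16.0 if abs(denom) < 1e-9 else np.sin(x) / denom
    return (ratio * np.sin(np.pi * m / 16) / (np.pi * m)) ** 2

for h in (4.36e-6, 3.94e-6):
    print(f"h = {h * 1e6:.2f} um:", {m: f"{eta_16(m, h):.1%}" for m in range(3, 7)})
# h = 4.36 um: the 5th order dominates (~72%); other orders are negligible
# h = 3.94 um: the energy splits mainly between the 4th and 5th orders (~31% each)
```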

3. Experimental Results of Glasses-Free AR Vector Light Field Display

The fabrication of the view combiner is challenging. Firstly, a pre-cleaned glass substrate was coated with a 5 μm thick positive photoresist (RJZ-390, RUIHONG Electronics Chemicals, Shenzhen, China). A self-developed laser direct writing lithography system (IGRAPHER820, SVG Optronics, Suzhou, China) was adopted to pattern the pixelated MBGs at high throughput. The characteristic machining size of the system is 0.253 μm; as a result, the digital data required for the whole device amounted to about 3.3 TB. It took only around 1.5 days to fabricate the 20-inch view combiner covered with pixelated MBG arrays. After photolithography, the view combiner was developed in NaOH solution and blown dry.
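The quoted data volume is consistent with the writing grid. The estimate below is our back-of-the-envelope check; the assumption of roughly one to two bytes per addressable point is ours, since the actual data encoding is not stated:

```python
# Rough estimate of the lithography data volume for the 20-inch view combiner.
# The frame size (466.3 x 262.3 mm^2) is taken from Table 1; the 0.253 um
# characteristic machining size is quoted in the text.
frame_w_um, frame_h_um = 466.3e3, 262.3e3
spot_um = 0.253
points = (frame_w_um / spot_um) * (frame_h_um / spot_um)   # ~1.9e12 addressable points
for bytes_per_point in (1, 2):                             # assumed encoding density
    print(f"{bytes_per_point} B/point -> ~{points * bytes_per_point / 1e12:.1f} TB")
# ~1.9-3.8 TB, the same order of magnitude as the ~3.3 TB quoted above
```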
In our experiment, we built a prototype of the proposed AR vector light field display based on MBG arrays, as shown in Figure 3a. The typical parameters of the prototype are listed in Table 1. A commercial 4K projector (VPL-VW278, SONY, Tokyo, Japan) is integrated to project the coded mixed images. The view combiner is covered by 1920 × 1080 pixelated MBGs with varied orientations and periods. Figure 3b shows a single voxel, which contains 4 × 4 pixelated MBGs (Figure 3c), and each pixel corresponds to a viewpoint. A total of 16 designed horizontal convergent views can be formed by the 480 × 270 voxelated MBG arrays.
In order to achieve high transparency, the pixels are sparsely arranged with a fill factor of 25%. The size of each pixelated MBG is 121.44 × 121.44 μm2, and each voxel is 0.97152 × 0.97152 mm2.
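The coded mixed image is therefore a spatial interleaving of the 16 perspective views: within every 4 × 4 voxel, each pixel carries one pixel of one view. The paper does not give its rendering code, so the sketch below, including the row-major assignment of view indices to positions inside a voxel, is an illustrative assumption:

```python
import numpy as np

def interleave_views(views):
    """Build a coded mixed image from 16 perspective views.
    views: array of shape (16, 270, 480), one 480 x 270 image per view as in the paper.
    Returns a 1080 x 1920 mixed image in which each 4 x 4 block (one voxel) holds one
    pixel from each view; the row-major view ordering inside a voxel is an assumption."""
    n_views, vh, vw = views.shape
    assert n_views == 16
    mixed = np.zeros((vh * 4, vw * 4), dtype=views.dtype)
    for k in range(16):
        r, c = divmod(k, 4)            # position of view k inside every voxel
        mixed[r::4, c::4] = views[k]   # scatter view k onto its sub-pixel lattice
    return mixed

views = np.random.rand(16, 270, 480)   # placeholder perspective images
print(interleave_views(views).shape)   # -> (1080, 1920), the projector resolution
```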
Due to the multiple orders of diffraction, multiple sets of views are formed at different distances. The third-, fourth-, fifth-, and sixth-order diffracted light forms views at 95, 140, 220, and 525 cm, respectively, creating a large range of viewing distances, as shown in Figure 3d. The corresponding angular resolutions are 1.5°, 1.3°, 1.0°, and 0.8°, respectively. We measured the actual diffraction efficiency with an optical power meter (PM400K1, Thorlabs, Newton, NJ, USA). The diffraction efficiencies of the fourth- and fifth-order diffracted light are 12.61% and 40.54%, respectively, leading to a total light efficiency of no less than 53.15%. Moreover, the high diffraction orders have a long focal depth, providing an extended viewing distance; as shown in Figure 3d, the sixth-order diffracted light contributes to a viewing distance larger than 3 m. We lit views 1–16 successively to study the radiance pattern of each view individually. The intensity distribution of the 16 separate views of the fifth order is shown in Figure 3e.
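These numbers are mutually consistent, as the short check below shows; it only combines values quoted above and in Table 1 (fill factor from the pixel and pitch sizes, angular separation as the viewing angle divided by the 16 views):

```python
pixel = 121.44e-6    # pixelated MBG size (m)
pitch = 242.88e-6    # pixel pitch within a voxel (m), from Table 1
print(f"fill factor = {pixel**2 / pitch**2:.0%}")                # -> 25%

for distance_m, viewing_angle_deg in ((0.95, 24.0), (5.25, 12.8)):   # from Table 1
    print(f"@ {distance_m} m: angular separation ~ {viewing_angle_deg / 16:.2f} deg/view")
# -> ~1.5 deg at 0.95 m and ~0.8 deg at 5.25 m, matching the measured angular resolution
```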
We tested the characteristics of the AR vector light field display by projecting hybrid-perspective images on the view combiner. The projected virtual 3D scene can be observed at viewing distances ranging from 0.65 m to 6.60 m in front of the view combiner. We captured photos with a camera (D810, Nikon, Tokyo, Japan) at viewing distances of 95, 140, 220, and 525 cm, where the third, fourth, fifth, and sixth diffraction orders are formed. Figure 4 and Videos S1 and S2 show different perspectives of the virtual 3D dinosaur and a real Rubik's cube at multiple viewing distances. The captured virtual dinosaur retains an image contrast of 200:1 and smooth motion parallax, like a natural object. The virtual 3D image has a consistent occlusion effect with the physical Rubik's cube behind the view combiner. Moreover, the viewing depth is enlarged to more than 5 m, and the vertical viewing angle is extended to more than 15.6° (Videos S3 and S4).

4. Discussion

In summary, a 20-inch AR vector light field display was built by integrating a 4K projector with a view combiner. We use pixelated MBGs with a designed structure height to effectively expand the viewing distance. The light efficiency of the off-axis MBGs is no less than 53.15%. Multiple sets of 16 views at different viewing distances form horizontal parallax with an angular resolution ranging from 0.8° at a viewing distance of 5.25 m to 1.5° at a viewing distance of 0.95 m. The image resolution of a single view is 480 × 270.
Slight fluctuations of view intensity are observed in the Supplementary videos. Firstly, the non-uniform radiance pattern between views can be ascribed to fabrication errors and misalignment between the view combiner and the projected perspective images. Secondly, the fluctuation of the total intensity is due to the transition from one view to another. The radiance pattern of each view can be optimized through the design of the MBG patterns to further reduce the intensity fluctuation.

5. Conclusions

The proposed AR vector light field display system has the advantages of a simple setup, high definition, high brightness, and a large viewing distance. Moreover, the virtual 3D object retains the motion parallax of natural objects and has a consistent occlusion effect with the real physical scene.
The 20-inch view combiner covered with surface-relief structures can be mass-replicated by nanoimprinting technology. The proposed system has extensive application prospects in human–computer interaction, education, communication, product design, advertising, and vehicle displays.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/photonics8080337/s1. Video S1: Motion parallax of the virtual 3D dinosaur and a physical Rubik's cube captured at a distance of 140 cm. Video S2: Motion parallax of the virtual 3D dinosaur and a physical Rubik's cube captured at a distance of 220 cm.

Author Contributions

Data curation, F.Z.; Formal analysis, J.S.; Funding acquisition, W.Q.; Methodology, J.H.; Software, M.Y.; Supervision, W.Q.; Visualization, J.H.; Writing—original draft, J.S.; Writing—review and editing, J.S. and W.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Science Foundation of China (NSFC) (61975140), the Leading Technology of Jiangsu Basic Research Plan (BK20192003), the Suzhou Natural Science Foundation of China (SYG201930), and the project of the Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors thank Zhen Zhou and Ruibin Li for the useful discussions and suggestions on the nanofabrication process.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Yin, K.; He, Z.; Xiong, J.; Zou, J.; Li, K.; Wu, S. Virtual Reality and Augmented Reality Displays: Advances and Future Perspectives. J. Phys. Photonics 2021, 3, 022010.
2. Geng, J. Three-dimensional display technologies. Adv. Opt. Photonics 2013, 5, 456–535.
3. Wan, W.; Qiao, W.; Huang, W.; Zhu, M.; Fang, Z.; Pu, D.; Ye, Y.; Liu, Y.; Chen, L. Efficient fabrication method of nano-grating for 3D holographic display with full parallax views. Opt. Express 2016, 24, 6203–6212.
4. Wan, W.; Qiao, W.; Huang, W.; Zhu, M.; Ye, Y.; Chen, X.; Chen, L. Multiview holographic 3D dynamic display by combining a nano-grating patterned phase plate and LCD. Opt. Express 2017, 25, 1114–1122.
5. Wan, W.; Qiao, W.; Pu, D.; Li, R.; Wang, C.; Hu, Y.; Duan, H.; Guo, L.J.; Chen, L. Holographic sampling display based on metagratings. iScience 2019, 23, 100773.
6. Hirayama, R.; Martinez Plasencia, D.; Masuda, N.; Subramanian, S. A volumetric display for visual, tactile and audio presentation using acoustic trapping. Nature 2019, 575, 320–323.
7. An, J.; Won, K.; Kim, Y.; Hong, J.Y.; Kim, H.; Kim, Y.; Song, H.; Choi, C.; Kim, Y.; Seo, J.; et al. Slim-panel holographic video display. Nat. Commun. 2020, 11, 5568.
8. Zhou, F.; Hua, J.; Shi, J.; Qiao, W.; Chen, L. Pixelated Blazed Gratings for High Brightness Multiview Holographic 3D Display. IEEE Photonics Technol. Lett. 2020, 32, 283–286.
9. Carmigniani, J.; Furht, B.; Anisetti, M.; Ceravolo, P.; Damiani, E.; Ivkovic, M. Augmented reality technologies, systems and applications. Multimed. Tools Appl. 2011, 51, 341–377.
10. Rolland, J.P.; Fuchs, H. Optical Versus Video See-Through Head-Mounted Displays in Medical Visualization. Presence 2000, 9, 287–309.
11. Li, Q.; Deng, H.; Pang, S.; Jiang, W.; Wang, Q. A Reflective Augmented Reality Integral Imaging 3D Display by Using a Mirror-Based Pinhole Array. Appl. Sci. 2019, 9, 3124.
12. Li, Q.; He, W.; Deng, H.; Zhong, F.Y.; Chen, Y. High-performance reflection-type augmented reality 3D display using a reflective polarizer. Opt. Express 2021, 29, 9446–9453.
13. Zhang, H.L.; Deng, H.; Li, J.J.; He, M.Y.; Li, D.H.; Wang, Q.H. Integral imaging-based 2D/3D convertible display system by using holographic optical element and polymer dispersed liquid crystal. Opt. Lett. 2019, 44, 387–390.
14. Su, Y.; Cai, Z.; Liu, Q.; Shi, L.; Zhou, F.; Huang, S.; Guo, P.; Wu, J. Projection-type dual-view holographic three-dimensional display and its augmented reality applications. Opt. Commun. 2018, 428, 216–226.
15. Lee, J.H.; Yanusik, I.; Choi, Y.; Kang, B.; Hwang, C.; Park, J.; Nam, D.; Hong, S. Automotive augmented reality 3D head-up display based on light-field rendering with eye-tracking. Opt. Express 2020, 28, 29788–29804.
16. Hong, K.; Yeom, J.; Jang, C.; Hong, J.; Lee, B. Full-color lens-array holographic optical element for three-dimensional optical see-through augmented reality. Opt. Lett. 2014, 39, 127–130.
17. Su, Y.; Cai, Z.; Liu, Q.; Shi, L.; Zhou, F.; Wu, J. Binocular holographic three-dimensional display using a single spatial light modulator and a grating. J. Opt. Soc. Am. A Opt. Image Sci. Vis. 2018, 35, 1477–1486.
18. Hong, J.Y.; Park, S.G.; Lee, C.K.; Moon, S.; Kim, S.J.; Hong, J.; Kim, Y.; Lee, B. See-through multi-projection three-dimensional display using transparent anisotropic diffuser. Opt. Express 2016, 24, 14138–14151.
19. Li, G.; Lee, D.; Jeong, Y.; Cho, J.; Lee, B. Holographic display for see-through augmented reality using mirror-lens holographic optical element. Opt. Lett. 2016, 41, 2486–2489.
20. Krajancich, B.; Padmanaban, N.; Wetzstein, G. Factored Occlusion: Single Spatial Light Modulator Occlusion-capable Optical See-through Augmented Reality Display. IEEE Trans. Vis. Comput. Graph. 2020, 26, 1871–1879.
21. Zhang, H.L.; Deng, H.; Ren, H.; He, M.Y.; Li, D.H.; Wang, Q.H. See-through 2D/3D compatible integral imaging display system using lens-array holographic optical element and polymer dispersed liquid crystal. Opt. Commun. 2020, 456, 124615.
22. Mu, C.T.; Tseng, S.H.; Chen, C.H. See-through holographic display with randomly distributed partial computer generated holograms. Opt. Express 2020, 28, 35674–35681.
23. Wakunami, K.; Hsieh, P.Y.; Oi, R.; Senoh, T.; Sasaki, H.; Ichihashi, Y.; Okui, M.; Huang, Y.P.; Yamamoto, K. Projection-type see-through holographic three-dimensional display. Nat. Commun. 2016, 7, 12954.
24. Deng, H.; Chen, C.; He, M.Y.; Li, J.J.; Zhang, H.L.; Wang, Q.H. High-resolution augmented reality 3D display with use of a lenticular lens array holographic optical element. J. Opt. Soc. Am. A Opt. Image Sci. Vis. 2019, 36, 588–593.
25. Shi, J.; Qiao, W.; Hua, J.; Li, R.; Chen, L. Spatial multiplexing holographic combiner for glasses-free augmented reality. Nanophotonics 2020, 9, 3003–3010.
26. Sweeney, D.W.; Sommargren, G.E. Harmonic diffractive lenses. Appl. Opt. 1995, 34, 2469–2475.
27. Faklis, D.; Morris, G.M. Spectral properties of multiorder diffractive lenses. Appl. Opt. 1995, 34, 2462–2468.
28. Yang, L.; Cui, Q.; Liu, T.; Xue, C. Effects of manufacturing errors on diffraction efficiency for multilayer diffractive optical elements. Appl. Opt. 2011, 50, 32.
Figure 1. Schematic diagram of the AR vector light field display system. (a) Schematic of the AR 3D display based on a view combiner with pixelated MBG arrays. The 16 views form horizontal parallax along the x direction. Pixelated MBG arrays with a height of h produce four sets of views along the propagation direction z for an extended viewing distance; (b) Schematic diagram of the viewpoints formed by the pixelated MBG array in different focal planes (the third-, fourth-, fifth-, and sixth-order diffractions are shown).
Figure 2. Simulated properties of the MBG. (a) Dependence of the theoretical diffraction efficiency on the structural height h of the MBG. The red, green, blue, and cyan lines represent the diffraction efficiency versus structural height h for diffraction orders 3, 4, 5, and 6, respectively; (b) FDTD simulation of the light field distribution along the propagation direction z for an MBG array with a height of 4.36 μm and 3.94 μm.
Figure 3. View combiner and its optical properties. (a) Photo of the 20-inch view combiner; (b) Microscopic picture of a voxel composed of 4 × 4 pixels; (c) 3D microscopic photo of a pixelated MBG, captured by a laser confocal microscope (LEXT OLS4100, OLYMPUS, Tokyo, Japan). The artificial color indicates the structural variation in height; (d) Intensity distribution of the 16 viewpoints at different viewing distances. The z-axis indicates the viewing distance in front of the view combiner; (e) Photo of the 16 viewpoints at the fifth diffraction order and the intensity distribution curves of viewpoints 1–16 at the fifth order.
Figure 4. Virtual 3D dinosaur and a physical Rubik's cube captured from the left, middle, and right views at viewing distances of 95 (a–c), 140 (d–f), 220 (g–i), and 525 cm (j–l), respectively. The f and m in the figure denote the viewing distance and the diffraction order, respectively. The f-number and exposure time of the camera are f/2.2 (50 mm focal length) and 1/100 s, respectively.
Table 1. Typical parameters of 20-inch AR vector light field display.
Frame size: 466.3 × 262.3 mm2
View number: 16 × 4
Pitch size: 242.88 μm
Pixel size: 121.44 × 121.44 μm2
Angular separation: 1.5° @ 0.95 m to 0.8° @ 5.25 m
Viewing angle: 24° @ 0.95 m to 12.8° @ 5.25 m
Resolution: 1920 × 1080
Refresh rate: 60 Hz
Central wavelength | spectral bandwidth of the projector: green, 551 nm | 48 nm
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

