1. Introduction
With the aid of optics and computer science, augmented reality (AR) enriches the physical world by overlaying virtual content and information, offering unprecedented visual experiences [1].
Taking advantage of advances in glasses-free 3D display [2,3,4,5,6,7,8], AR technology can be combined with glasses-free 3D display to avoid any wearable devices. There is huge demand for glasses-free AR 3D display in a wide range of applications, including human–computer interaction, entertainment, education, and medical care [9,10]. Glasses-free AR 3D display can be divided into reflection-type AR display [11,12] and optical see-through (OST) display [13,14,15,16,17,18,19,20,21,22].
A compact reflection-type AR 3D display with a size of 152 × 76 mm was proposed based on a reflective polarizer (RP), a conventional lens array, and a 2D display [12]. In reflection-type AR 3D display, however, virtual content is overlaid on virtual images of the physical world rather than on the physical world itself. Moreover, 3D displays based on a conventional parallax barrier, lenticular lens array, or micro-lens array form self-repeating views. Thus, motion parallax is limited, and the presented virtual depth information may be false because of image flips.
The OST glasses-free 3D display is another type of glasses-free AR 3D display that allows people to directly perceive the real physical world through a transparent combiner. A monochromatic AR 3D display was proposed based on a digital holographic projector and a holographic screen; the viewing angle was 20.8°, and the diffraction efficiency of the holographic combiner (73.6 × 41.4 mm²) at a wavelength of 532 nm was 52.9% [23]. In another strategy, a holographic optical element with a size of 80 × 80 mm² acted as a lenticular lens array under the Bragg-matched condition [24]. Our group designed a spatially multiplexed holographic see-through combiner based on pixelated metagratings, and a 32-inch AR 3D display was demonstrated with a viewing angle of 47° [25]. However, the light efficiency of the metagrating is low (~40% in theory and ~12% in experiment). In fact, since holographic optical elements are generally adopted as the transparent combiner, it is challenging to achieve a large-scale OST 3D display with high efficiency.
Here we propose a glasses-free AR 3D display based on pixelated multilevel blazed gratings (MBGs) featuring large format and high light efficiency. A 20-inch AR view combiner covered with pixelated MBG arrays redirects the mixed image from the projector into multiple sets of 16 horizontal views at different viewing distances. Light efficiency is crucial in AR display in order to achieve high brightness and high contrast; the efficiency of the MBG-based view combiner is no less than 53%. Moreover, the viewing distance for motion parallax is enlarged to more than 5 m, and the vertical viewing angle is improved to more than 15.6°. Finally, we show that 3D virtual scenes presented by the view combiner not only preserve natural motion parallax, a large viewing distance, and high light efficiency but also exhibit a consistent occlusion effect with natural objects. The proposed glasses-free AR 3D display can potentially be used in head-up displays and exhibition displays for efficient human–environment interaction.
2. Design and Principle of the AR Vector Light Field Display System
As shown in Figure 1a, a virtual 3D scene can be formed by a view combiner. Unlike microlens-array-based light field display, the so-called “vector light field display” has no self-repeating views: it manipulates the emitting direction of the light beam from each pixel of the view combiner so that convergent views are formed. Each pixelated MBG is aperiodic and has a consistent height. The light passing through the pixelated MBGs forms multiple diffraction orders in a series of focal planes (Figure 1b). Multiple sets of 16 horizontal-parallax views at different distances are formed with the aid of the view combiner, which creates a large range of viewing distances.
A harmonic diffractive optical element (HDOE) utilizes high-order diffracted light [26,27]. The optical path difference between adjacent zones of the structure is an integer multiple of the design wavelength, so the focal length for light of other wavelengths is

f(m, λ) = Kλ0 f0/(mλ),    (1)

where K (K > 1) is the harmonic diffraction coefficient, m represents the diffraction order, λ0 is the design wavelength of the harmonic diffractive element, λ is the wavelength of the incident light, and f0 is the focal length of the harmonic diffractive element. When the wavelength λ of the incident light coincides with the design wavelength λ0 and the harmonic diffraction coefficient K equals the diffraction order m, the diffraction efficiency reaches its maximum at the designed order m.
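The scaling of the focal planes with diffraction order can be sketched numerically. The snippet below assumes the standard harmonic-lens relation f(m, λ) = Kλ0 f0/(mλ) with an illustrative focal length f0 = 1 (not a parameter of the fabricated device):

```python
# Sketch of the harmonic focal-length relation f(m, lambda) = K*lambda0*f0/(m*lambda).
# f0 = 1.0 is an illustrative assumption, not a device parameter.

def harmonic_focal_length(m, wavelength, K=5, design_wavelength=532e-9, f0=1.0):
    """Focal length (in units of f0) of the m-th diffraction order."""
    return K * design_wavelength * f0 / (m * wavelength)

# At the design wavelength, the designed order m = K focuses exactly at f0:
print(harmonic_focal_length(m=5, wavelength=532e-9))  # 1.0

# Higher orders focus closer and lower orders farther, which is what spreads
# the convergent views of the combiner over a range of viewing distances.
for m in (3, 4, 5, 6):
    print(m, round(harmonic_focal_length(m, 532e-9), 3))
```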
The height h of the pixelated MBG shown in Figure 1a can be determined according to the following equation [26,27]:

h = Kλ0/(n − 1).    (2)
By increasing K, the height and minimum line width of the structure are increased. We set λ0 as 532 nm, and the refractive index n is 1.61. When K equals 5, the height of the structure is 4.36 μm.
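The quoted structure height can be checked directly from Equation (2) with the values given in the text (λ0 = 532 nm, n = 1.61, K = 5), including the reduced height obtained with the height variable factor ε = −0.096 discussed below:

```python
# Numerical check of Equation (2): h = K * lambda0 / (n - 1).

def mbg_height(K, design_wavelength=532e-9, n=1.61):
    """Structure height (m) of the multilevel blazed grating."""
    return K * design_wavelength / (n - 1)

h = mbg_height(K=5)
print(h * 1e6)  # ≈ 4.36 (μm), matching the stated height

# Applying the height variable factor eps = -0.096 scales h by (1 + eps):
eps = -0.096
print(h * (1 + eps) * 1e6)  # ≈ 3.94 (μm), the reduced height
```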
The diffraction energy can be dispersed to adjacent orders by varying the height of the structure; here, the height is quantized into 16 steps. According to scalar diffraction theory, when a light beam is incident from air onto the base material of the MBG and a height variable factor is added to the structure height, the diffraction efficiency at diffraction order m can be determined by [26,27,28]

η_m = sinc²[m − (1 + ε)h(√(n² − sin²θ) − cos θ)/λ],    (3)

where ε (−1 < ε < 1) is the height variable factor, θ represents the angle of the incident light, and sinc(x) = sin(πx)/(πx).
Figure 2a shows the theoretical dependence of the diffraction efficiency on the structural height according to Equation (3) with θ = 0°. The diffraction efficiency of the 5th-order diffracted light reaches a maximum value of 71.73% for a structural height of 4.36 μm, and the light intensities of the remaining orders are very low. When ε = −0.096 and the structural height is decreased to 3.94 μm, the diffraction efficiencies of the third to sixth orders are 4%, 31%, 30.9%, and 2.9%, respectively. As a result, the range of viewing distances can be tuned by the structural height h to allocate the intensity distribution to the desired diffraction orders. We further conducted an FDTD simulation of the radiation pattern along the propagation direction z. As shown in Figure 2b, the viewing distance of an MBG array is almost doubled in theory when the height decreases from 4.36 to 3.94 μm. The simulation results are consistent with Equations (1)–(3).
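The redistribution of energy between orders can be sketched with an idealized scalar model, η_m = sinc²(m − K(1 + ε)), i.e., Equation (3) evaluated at normal incidence and the design wavelength. This sketch neglects the 16-step quantization and interface losses included in Figure 2a, so the absolute percentages differ from the quoted values; only the qualitative redistribution is reproduced:

```python
import math

def sinc2(x):
    """Normalized sinc squared: (sin(pi x)/(pi x))**2."""
    if x == 0:
        return 1.0
    return (math.sin(math.pi * x) / (math.pi * x)) ** 2

def order_efficiency(m, eps=0.0, K=5):
    """Idealized scalar efficiency of order m at normal incidence and the
    design wavelength: eta_m = sinc^2(m - K*(1 + eps)). Quantization and
    Fresnel losses are ignored."""
    return sinc2(m - K * (1 + eps))

# Full design height (eps = 0): all energy in the designed order m = K = 5.
print(order_efficiency(5))  # 1.0 in this idealized model

# Reduced height (eps = -0.096): energy splits mainly between orders 4 and 5,
# with small contributions at orders 3 and 6, as in Figure 2a.
for m in (3, 4, 5, 6):
    print(m, round(order_efficiency(m, eps=-0.096), 3))
```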
3. Experimental Results of Glasses-Free AR Vector Light Field Display
The fabrication of the view combiner is challenging. Firstly, a pre-cleaned glass substrate was coated with a 5 μm thick positive photoresist (RJZ-390, RUIHONG Electronics Chemicals, Shenzhen, China). A self-developed laser direct writing lithography system (IGRAPHER820, SVG Optronics, Suzhou, China) was adopted to pattern the pixelated MBGs at high throughput. The characteristic machining size of the system is 0.253 μm, so the digital data required for the whole device amounted to about 3.3 TB. It took only around 1.5 days to fabricate the 20-inch view combiner covered with pixelated MBG arrays. After photolithography, the view combiner was developed in NaOH solution and blown dry.
In our experiment, we built a prototype of the proposed AR vector light field display based on MBG arrays, as shown in Figure 3a. The typical parameters of the prototype are listed in Table 1. A commercial 4K projector (VPL-VW278, SONY, Tokyo, Japan) is integrated to project the coded mixed images. The view combiner is covered by a 1920 × 1080 array of pixelated MBGs with varied orientations and periods.
Figure 3b shows a single voxel, which contains 4 × 4 pixelated MBGs (Figure 3c); each pixel corresponds to a viewpoint. A total of 16 designed horizontal convergent views can be formed by the 480 × 270 voxelated MBG arrays. In order to achieve high transparency, the pixels are sparsely arranged with a filling factor of 25%. The size of each pixelated MBG is 121.44 × 121.44 μm², and each voxel is 0.97152 × 0.97152 mm².
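The stated layout figures are mutually consistent, as a quick geometry check shows: 4 × 4 pixels occupy half of the voxel pitch in each direction, giving the 25% area fill factor, and 480 × 270 voxels at the quoted pitch give a diagonal of about 21 inches, consistent with the nominal 20-inch format:

```python
import math

# Layout parameters quoted in the text.
pixel = 121.44e-6    # side of one pixelated MBG (m)
voxel = 0.97152e-3   # voxel pitch (m)
nx, ny = 480, 270    # voxel counts (horizontal, vertical)

# 4 x 4 pixels span half the voxel pitch per axis -> 25% area fill factor.
fill = (4 * pixel / voxel) ** 2
print(round(fill, 4))  # 0.25

# Panel size and diagonal (in inches) from the voxel grid.
width, height = nx * voxel, ny * voxel
diag_inch = math.hypot(width, height) / 0.0254
print(round(diag_inch, 1))  # ≈ 21.1, i.e., roughly the nominal 20-inch format
```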
Due to multiple orders of diffraction, multiple sets of views are formed at different distances. The third, fourth, fifth, and sixth orders of diffracted light form views at 95, 140, 220, and 525 cm, respectively, yielding a large range of viewing distances, as shown in Figure 3d. The corresponding angular resolutions are 1.5°, 1.3°, 1.0°, and 0.8°, respectively. We tested the actual diffraction efficiency with an optical power meter (PM400K1, Thorlabs, Newton, NJ, USA). The diffraction efficiencies of the fourth- and fifth-order diffracted light are 12.61% and 40.54%, respectively, leading to a total light efficiency of no less than 53.15%. Moreover, the high diffraction orders have a long focal depth, providing an extended viewing distance; as shown in Figure 3d, the sixth-order diffracted light contributes to a viewing distance larger than 3 m. We lit views 1–16 successively to study the radiance pattern of each view individually. The intensity distribution of the 16 separate views of the fifth order is shown in Figure 3e.
We tested the characteristics of the AR vector light field display by projecting hybrid-perspective images onto the view combiner. The projected virtual 3D scene can be observed at viewing distances ranging from 0.65 m to 6.60 m in front of the view combiner. We captured photos with a camera (D810, Nikon, Tokyo, Japan) at viewing distances of 95, 140, 220, and 525 cm, where the third, fourth, fifth, and sixth diffraction orders are formed. Figure 4 and Videos S1 and S2 show different perspectives of the virtual 3D dinosaur and a real Rubik’s cube at multiple viewing distances. The captured virtual dinosaur retains an image contrast of 200:1 and smooth motion parallax, like a natural object. The virtual 3D image has a consistent occlusion effect with the physical Rubik’s cube behind the view combiner. Moreover, the viewing depth is enlarged to more than 5 m, and the vertical viewing angle is extended to more than 15.6° (Videos S3 and S4).
4. Discussion
In summary, a 20-inch AR vector light field display was built by integrating a 4K projector with a view combiner. We used pixelated MBGs with a designed structure height to effectively extend the viewing distance. The light efficiency of the off-axis MBGs is no less than 53.15%. Multiple sets of 16 views at different viewing distances provide horizontal parallax with an angular resolution ranging from 0.8° at a viewing distance of 5.25 m to 1.5° at a viewing distance of 0.95 m. The image resolution of a single view is 480 × 270.
Slight fluctuations of view intensity are observed in the supplementary videos. Firstly, the non-uniform radiance pattern between views can be ascribed to fabrication errors and misalignment between the view combiner and the projected perspective images. Secondly, the fluctuation of the total intensity is due to the transition from one view to another. The radiance pattern of each view can be optimized through the design of the MBG patterns to further reduce the intensity fluctuation.
5. Conclusions
The proposed AR vector light field display system has the advantages of a simple setup, high definition, high brightness, and a large viewing distance. Moreover, the virtual 3D object retains the motion parallax of natural objects and has a consistent occlusion effect with the real physical scene.
The 20-inch view combiner covered with surface-relief structures can be mass-replicated by nanoimprint technology. The proposed system has extensive application prospects in human–computer interaction, education, communication, product design, advertisement, and vehicle display.
Supplementary Materials
The following are available online at https://www.mdpi.com/article/10.3390/photonics8080337/s1. Video S1: Motion parallax of the virtual 3D dinosaur and a physical Rubik’s cube captured at a distance of 140 cm. Video S2: Motion parallax of the virtual 3D dinosaur and a physical Rubik’s cube captured at a distance of 220 cm.
Author Contributions
Data curation, F.Z.; Formal analysis, J.S.; Funding acquisition, W.Q.; Methodology, J.H.; Software, M.Y.; Supervision, W.Q.; Visualization, J.H.; Writing—original draft, J.S.; Writing—review and editing, J.S. and W.Q. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Natural Science Foundation of China (NSFC) (61975140), Leading Technology of Jiangsu Basic Research Plan BK20192003, Suzhou Natural Science Foundation of China SYG201930, and the project of the Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions.
Data Availability Statement
The data presented in this study are available on request from the corresponding author.
Acknowledgments
The authors thank Zhen Zhou and Ruibin Li for the useful discussions and suggestions on the nanofabrication process.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Yin, K.; He, Z.; Xiong, J.; Zou, J.; Li, K.; Wu, S.T. Virtual Reality and Augmented Reality Displays: Advances and Future Perspectives. J. Phys. Photonics 2021, 3, 022010.
- Geng, J. Three-dimensional display technologies. Adv. Opt. Photonics 2013, 5, 456–535.
- Wan, W.; Qiao, W.; Huang, W.; Zhu, M.; Fang, Z.; Pu, D.; Ye, Y.; Liu, Y.; Chen, L. Efficient fabrication method of nano-grating for 3D holographic display with full parallax views. Opt. Express 2016, 24, 6203–6212.
- Wan, W.; Qiao, W.; Huang, W.; Zhu, M.; Ye, Y.; Chen, X.; Chen, L. Multiview holographic 3D dynamic display by combining a nano-grating patterned phase plate and LCD. Opt. Express 2017, 25, 1114–1122.
- Wan, W.; Qiao, W.; Pu, D.; Li, R.; Wang, C.; Hu, Y.; Duan, H.; Guo, L.J.; Chen, L. Holographic sampling display based on metagratings. iScience 2019, 23, 100773.
- Hirayama, R.; Martinez Plasencia, D.; Masuda, N.; Subramanian, S. A volumetric display for visual, tactile and audio presentation using acoustic trapping. Nature 2019, 575, 320–323.
- An, J.; Won, K.; Kim, Y.; Hong, J.Y.; Kim, H.; Kim, Y.; Song, H.; Choi, C.; Kim, Y.; Seo, J.; et al. Slim-panel holographic video display. Nat. Commun. 2020, 11, 5568.
- Zhou, F.; Hua, J.; Shi, J.; Qiao, W.; Chen, L. Pixelated Blazed Gratings for High Brightness Multiview Holographic 3D Display. IEEE Photonics Technol. Lett. 2020, 32, 283–286.
- Carmigniani, J.; Furht, B.; Anisetti, M.; Ceravolo, P.; Damiani, E.; Ivkovic, M. Augmented reality technologies, systems and applications. Multimed. Tools Appl. 2011, 51, 341–377.
- Rolland, J.P.; Fuchs, H. Optical Versus Video See-Through Head-Mounted Displays in Medical Visualization. Presence 2000, 9, 287–309.
- Li, Q.; Deng, H.; Pang, S.; Jiang, W.; Wang, Q. A Reflective Augmented Reality Integral Imaging 3D Display by Using a Mirror-Based Pinhole Array. Appl. Sci. 2019, 9, 3124.
- Li, Q.; He, W.; Deng, H.; Zhong, F.Y.; Chen, Y. High-performance reflection-type augmented reality 3D display using a reflective polarizer. Opt. Express 2021, 29, 9446–9453.
- Zhang, H.L.; Deng, H.; Li, J.J.; He, M.Y.; Li, D.H.; Wang, Q.H. Integral imaging-based 2D/3D convertible display system by using holographic optical element and polymer dispersed liquid crystal. Opt. Lett. 2019, 44, 387–390.
- Su, Y.; Cai, Z.; Liu, Q.; Shi, L.; Zhou, F.; Huang, S.; Guo, P.; Wu, J. Projection-type dual-view holographic three-dimensional display and its augmented reality applications. Opt. Commun. 2018, 428, 216–226.
- Lee, J.H.; Yanusik, I.; Choi, Y.; Kang, B.; Hwang, C.; Park, J.; Nam, D.; Hong, S. Automotive augmented reality 3D head-up display based on light-field rendering with eye-tracking. Opt. Express 2020, 28, 29788–29804.
- Hong, K.; Yeom, J.; Jang, C.; Hong, J.; Lee, B. Full-color lens-array holographic optical element for three-dimensional optical see-through augmented reality. Opt. Lett. 2014, 39, 127–130.
- Su, Y.; Cai, Z.; Liu, Q.; Shi, L.; Zhou, F.; Wu, J. Binocular holographic three-dimensional display using a single spatial light modulator and a grating. J. Opt. Soc. Am. A Opt. Image Sci. Vis. 2018, 35, 1477–1486.
- Hong, J.Y.; Park, S.G.; Lee, C.K.; Moon, S.; Kim, S.J.; Hong, J.; Kim, Y.; Lee, B. See-through multi-projection three-dimensional display using transparent anisotropic diffuser. Opt. Express 2016, 24, 14138–14151.
- Li, G.; Lee, D.; Jeong, Y.; Cho, J.; Lee, B. Holographic display for see-through augmented reality using mirror-lens holographic optical element. Opt. Lett. 2016, 41, 2486–2489.
- Krajancich, B.; Padmanaban, N.; Wetzstein, G. Factored Occlusion: Single Spatial Light Modulator Occlusion-capable Optical See-through Augmented Reality Display. IEEE Trans. Vis. Comput. Graph. 2020, 26, 1871–1879.
- Zhang, H.L.; Deng, H.; Ren, H.; He, M.Y.; Li, D.H.; Wang, Q.H. See-through 2D/3D compatible integral imaging display system using lens-array holographic optical element and polymer dispersed liquid crystal. Opt. Commun. 2020, 456, 124615.
- Mu, C.T.; Tseng, S.H.; Chen, C.H. See-through holographic display with randomly distributed partial computer generated holograms. Opt. Express 2020, 28, 35674–35681.
- Wakunami, K.; Hsieh, P.Y.; Oi, R.; Senoh, T.; Sasaki, H.; Ichihashi, Y.; Okui, M.; Huang, Y.P.; Yamamoto, K. Projection-type see-through holographic three-dimensional display. Nat. Commun. 2016, 7, 12954.
- Deng, H.; Chen, C.; He, M.Y.; Li, J.J.; Zhang, H.L.; Wang, Q.H. High-resolution augmented reality 3D display with use of a lenticular lens array holographic optical element. J. Opt. Soc. Am. A Opt. Image Sci. Vis. 2019, 36, 588–593.
- Shi, J.; Qiao, W.; Hua, J.; Li, R.; Chen, L. Spatial multiplexing holographic combiner for glasses-free augmented reality. Nanophotonics 2020, 9, 3003–3010.
- Sweeney, D.W.; Sommargren, G.E. Harmonic diffractive lenses. Appl. Opt. 1995, 34, 2469–2475.
- Faklis, D.; Morris, G.M. Spectral properties of multiorder diffractive lenses. Appl. Opt. 1995, 34, 2462–2468.
- Yang, L.; Cui, Q.; Liu, T.; Xue, C. Effects of manufacturing errors on diffraction efficiency for multilayer diffractive optical elements. Appl. Opt. 2011, 50, 32.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).