Fast Multiple Projection Image Method for VR and AR Near-eye 3D Display

We propose a computer-generated hologram (CGH) method for near-eye virtual reality (VR) and augmented reality (AR) 3D display. A 3D object located near the hologram plane is projected onto a projection plane to obtain a plurality of projected images at different angles. The hologram is calculated by superposing the convolutions of the projected images with their corresponding point spread functions (PSFs). Holographic 3D display systems are designed and developed with an LED as the illumination source, a 4f optical filtering system, a lens as the eyepiece for near-eye VR display, and a holographic optical element (HOE) as the combiner for near-eye AR display. The results show that the proposed method is about 38 times faster than the conventional point cloud method, and computation speed can be increased further by reducing the number of distinct viewing angles used in the calculation. The proposed method is flexible and produces speckle-noise-free, high-quality VR and AR 3D images with correct focus and defocus behavior.


Introduction
Holographic display is a promising candidate for 3D display, and computer-generated holograms (CGHs) make holographic 3D display practical to implement [1,2]. High-resolution holograms such as rainbow holograms and Fresnel holograms have been proposed [3,4], and many dynamic holographic display solutions have been implemented [5][6][7][8][9][10]. However, progress on holographic 3D display remains difficult because of several constraints: the computational cost of large 3D point cloud data, the limited bandwidth of the spatial light modulator (SLM), and the speckle noise of the reconstructed 3D images caused by coherent illumination sources.
While these difficulties persist, virtual reality (VR) and augmented reality (AR) displays have become emerging technologies. In a VR display, the human eye views a virtual image through the display device, whereas in an AR display users view real scenes and virtual images simultaneously [11]. In the AR case, the virtual image is presented to the human eye through an optical system and an AR eyepiece, while ambient light from real objects passes through the AR eyepiece and reaches the eye without distortion. Combining holography with VR or AR display to achieve 3D display is therefore intuitive and desirable, as it avoids the accommodation-vergence conflict of binocular-parallax-based 3D displays [12,13].
The Fresnel hologram is often chosen to produce 3D display with a large depth of field using a coherent source such as a laser. However, it is not well suited to holographic VR or AR display because of the large imaging distance between the illumination source and the Fresnel hologram. In addition, with a coherent illumination source, speckle noise degrades the image quality of the holographic 3D display.
Based on these observations, we propose a new method for holographic VR/AR display. In contrast to conventional hologram computation methods such as point-cloud-based, layer-based, and triangular-mesh-based algorithms [14][15][16][17][18][19], it offers faster hologram computation.
In this method, a plurality of images at different angles are projected onto a projection plane near the hologram plane. The optical field on the hologram plane is then computed by superposing the convolutions of the 2D projected images with point spread functions (PSFs) corresponding to propagation from the projection plane to the hologram plane along the appropriate directions. To reduce speckle noise, an LED is chosen as the light source of the optical display system. A 4f optical filtering system filters out unwanted noise; one lens is used as the eyepiece for VR display and a holographic lens for AR display. The proposed approach is lightweight and portable, which gives it potential for VR or AR 3D display applications.

The hologram projection geometry is shown in Figure 1; two projection planes are drawn for ease of explanation, since the object projection is multidirectional. In Figure 1(a), A and B are two points in space, and layer1 and layer2 are two projection planes. On each projection plane, the rays projected from the 3D object points share the same direction. p1A and p1B are the projections of points A and B on layer1, and p2A and p2B are their projections on layer2. The projection of the 3D object points onto one projection plane forms a projected image corresponding to that projection angle. In the hologram calculation, the illumination direction of each point on the different projection planes is set as the propagation direction, as shown in Figure 1(b). The complex amplitude distribution on the holographic plane H is the superposition of light from all projection planes. When the hologram is reconstructed, beamlets in these directions are diffracted from the hologram.
Similar to integral imaging 3D display, the overlapping regions of the different beamlets form the 3D object [20,21], and the propagation directions of the beamlets determine the outcome of the 3D display.

Methods
To simplify the calculation, the projection layers are assumed to coincide in a single plane. On this projection plane, projection images of the 3D object are obtained at the designed projection angles. The light field on the holographic plane H is then calculated from the projected images, direction by direction. The propagation direction of the light for a projected image is $\theta_{x_i}$, and the angular interval in the x direction is $\Delta\theta_{x_i}$, as shown in Figure 2. The distance from the projection plane to the holographic plane H is z. On the holographic plane, the light from a point on the projected image falls between the boundaries $x_{h1_i}$ and $x_{h2_i}$.

Figure 2. The hologram projection angle and boundary
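The beamlet boundaries on the hologram plane follow directly from the projection angle, the angular interval, and the propagation distance z. A minimal numerical sketch (the function and variable names are mine, not from the paper):

```python
import numpy as np

def beam_boundaries(theta_deg, dtheta_deg, z_mm, x0_mm=0.0):
    """Boundaries on the hologram plane of the beamlet leaving a point at
    x0 on the projection plane, travelling at angle theta with angular
    spread dtheta, after propagating a distance z."""
    lo = np.deg2rad(theta_deg - dtheta_deg / 2.0)
    hi = np.deg2rad(theta_deg + dtheta_deg / 2.0)
    # The beamlet edges are straight rays, so the footprint is set by tan().
    x_h1 = x0_mm + z_mm * np.tan(lo)
    x_h2 = x0_mm + z_mm * np.tan(hi)
    return x_h1, x_h2

# Example: 0.9 deg projection angle, 0.9 deg interval, z = 10 mm
x1, x2 = beam_boundaries(theta_deg=0.9, dtheta_deg=0.9, z_mm=10.0)
```

For small angles the footprint width is approximately z·Δθ, which is why a larger angular interval or a larger z spreads each beamlet over more hologram pixels.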
The PSF for light propagating from the projection plane to the holographic plane can be expressed as

$$\mathrm{PSF}_{i,j}(x,y) = \begin{cases} \exp\!\left(jk\sqrt{x^2+y^2+z^2}\right), & x_{h1_i} \le x \le x_{h2_i},\ y_{h1_j} \le y \le y_{h2_j} \\ 0, & \text{otherwise} \end{cases}$$

where $k = 2\pi/\lambda$ is the wavenumber. The complex amplitude on the holographic plane H is the superposition of the convolution of each projected image with its corresponding PSF:

$$U(x,y) = \sum_{i=1}^{I}\sum_{j=1}^{J} im_{i,j}(x,y) \ast \mathrm{PSF}_{i,j}(x,y)$$

where I and J are the numbers of projected images in the x and y directions, respectively. The complex amplitude U(x, y) on the hologram plane can be further encoded as a double-phase hologram or as an off-axis amplitude hologram by introducing a reference wave [22,23]. The proposed approach therefore offers a good degree of freedom for downstream visualization tasks.
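The superposition of convolutions can be sketched numerically as follows. This is a minimal illustration, not the paper's implementation: the windowed-spherical-wavelet PSF model, the 8 µm sampling pitch, and the grid sizes are my assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

wavelength = 528e-9   # LED centre wavelength (from the paper)
pitch = 8e-6          # assumed sampling pitch on the hologram plane
z = 10e-3             # projection-to-hologram distance (from the paper)
k = 2 * np.pi / wavelength

def psf(theta_x_deg, theta_y_deg, dtheta_deg, n=64):
    """Spherical wavelet windowed to the angular footprint of one beamlet
    (a modelling assumption, not the paper's exact PSF)."""
    ax = (np.arange(n) - n // 2) * pitch
    x, y = np.meshgrid(ax, ax, indexing="ij")
    field = np.exp(1j * k * np.sqrt(x**2 + y**2 + z**2))
    half = np.deg2rad(dtheta_deg) / 2
    # Keep only samples whose direction from the source point lies
    # inside this beamlet's angular interval.
    win = (np.abs(np.arctan2(x, z) - np.deg2rad(theta_x_deg)) <= half) & \
          (np.abs(np.arctan2(y, z) - np.deg2rad(theta_y_deg)) <= half)
    return field * win

def hologram(images, angles_x, angles_y, dtheta_deg):
    """U(x, y) = sum over i, j of (im_ij convolved with PSF_ij), via FFT."""
    U = np.zeros(np.shape(images[0][0]), dtype=complex)
    for i, tx in enumerate(angles_x):
        for j, ty in enumerate(angles_y):
            U += fftconvolve(np.asarray(images[i][j], dtype=complex),
                             psf(tx, ty, dtheta_deg), mode="same")
    return U
```

Because each of the I×J convolutions is done with FFTs, the cost grows with the number of projection directions rather than the number of object points, which is the source of the speedup reported later.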
From the above, the simplified model consists of a single projection plane on which the 3D object yields multiple projected images, as shown in Figure 3. Convolving each projected image with its PSF and summing the results yields the complex amplitude distribution of the 3D object. For an object point p with amplitude $A_p$, its projection onto the projection plane along the direction $(\theta_{x_i}, \theta_{y_j})$ has coordinates $(X_p, Y_p)$, and the projected image $im_{i,j}$ takes the value $A_p$ at $(X_p, Y_p)$.
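Generating one projected image from a point cloud can be sketched as below. The shift of each point along the projection direction by z·tan(θ), the sign convention, and the pixel-accumulation scheme are my assumptions for illustration.

```python
import numpy as np

def project(points, amps, theta_x_deg, theta_y_deg, pitch, n):
    """Project a point cloud onto the projection plane along direction
    (theta_x, theta_y). `points` is an (N, 3) array of (x, y, z), where z
    is each point's signed distance from the projection plane (assumed
    convention); `amps` holds the point amplitudes A_p."""
    tx, ty = np.tan(np.deg2rad([theta_x_deg, theta_y_deg]))
    Xp = points[:, 0] + points[:, 2] * tx   # X_p = x_p + z_p * tan(theta_x)
    Yp = points[:, 1] + points[:, 2] * ty   # Y_p = y_p + z_p * tan(theta_y)
    im = np.zeros((n, n))
    ix = np.round(Xp / pitch).astype(int) + n // 2
    iy = np.round(Yp / pitch).astype(int) + n // 2
    ok = (ix >= 0) & (ix < n) & (iy >= 0) & (iy < n)
    # Accumulate amplitudes of points that land in the same pixel.
    np.add.at(im, (ix[ok], iy[ok]), amps[ok])
    return im
```

Running this once per direction pair (i, j) produces the I×J projected images that feed the convolution step.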

Experiments and result analysis
Our experiment uses the off-axis amplitude hologram principle to realize holographic 3D display. A liquid-crystal-on-silicon (LCoS) SLM modulates the holograms; its parameters are listed in Table 1. A green Osram LED with 1 W power, a center wavelength of 528 nm, a light-emitting area of 1 mm × 1 mm, and a bandwidth of 28 nm is used as the illumination source, and the center wavelength is used for the hologram calculation. The maximum diffraction angle of the SLM is determined by its pixel size pix (Eq. (6)). Given that the distance between the projection plane and the holographic plane is 10 mm, the PSF can be calculated. A plane wave perpendicular to the holographic plane is assumed as the reference wave; after the unwanted noise is filtered out by the 4f system, the image is finally incident on the human eye through the eyepiece lens4. The lenses used for collimation, the 4f optical filtering system, and the eyepiece are all cemented doublet achromatic lenses with a focal length of 50 mm and a diameter of 30 mm. The distance from the LCoS to the eye is then less than 25 cm, making this a relatively compact display system.

Under the same experimental conditions, we show another sample to demonstrate that our method can focus and defocus objects within a scene. The test scene contains a stationary coral and a swimming fish 3D model (constructed in Unity3D), as shown in Figure 8. Eight images at a 0.9° angular interval in both the x and y directions are projected by Unity3D to generate the frames for the hologram calculation. In Figure 8(a), the camera is aimed at both objects, and both are in focus and appear sharp because the coral and the fish are close to each other. In Figures 8(b) and (c), the fish and the coral are far apart.
Depending on the camera focus, each object becomes sharp or blurred ((b) focused on the coral, (c) focused on the fish). The results in Figures 7 and 8 show that the proposed holographic 3D VR display offers high image quality, is free of speckle noise, and realizes true 3D display.
For the AR experiment, a volume holographic optical element (HOE) is used to produce the AR image owing to its angular and wavelength selectivity [22][23][24]. The recording of the HOE is illustrated in Figure 9.
The light emitted from the laser is filtered and collimated by lens1 and then split into two beams by the beam splitter (BS). The transmitted beam passes through lens2 to form a convergent spherical wave, which interferes on the holographic material with the plane wave reflected by mirror1 and mirror2, forming the HOE. The laser used for HOE recording is a single-mode semiconductor laser with a wavelength of 532 nm and a power of 400 mW. Lens2 is a commercially available aspheric lens with a focal length of 40 mm and a diameter of 50 mm (Thorlabs AL5040M-A, NA = 0.55). The recording material is a commercial holographic film (Ultimate Holography, U04), which requires an exposure density of ~600 μJ/cm² [25]. Using this optical recording system, the angular selectivity of a reflective volume HOE recorded without lens2 was measured to be within ±5°. With the SLM described above, the diffraction angle is within ±1.8911°, so most of the light reflected by the SLM is diffracted with reasonably high efficiency. Figures 10(a) and (b) show the schematic and the optical AR display system, respectively. The angle between the illumination light and the HOE is approximately 30°. The holographic 3D real image reproduced by the optical system is imaged by the holographic lens and directed into the human eye.
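The ±1.8911° diffraction half-angle quoted above follows from the grating equation, taking the smallest pattern the SLM can display as a two-pixel period. The 8 µm pixel size below is my assumption (Table 1 is not reproduced in this text), chosen because with the 528 nm centre wavelength it reproduces the quoted figure:

```python
import numpy as np

wavelength = 528e-9   # LED centre wavelength used for the hologram
pitch = 8e-6          # assumed LCoS pixel size (not stated in this text)

# Grating equation with the smallest resolvable period of 2 pixels:
#   sin(theta_max) = wavelength / (2 * pitch)
theta_max = np.degrees(np.arcsin(wavelength / (2 * pitch)))
# theta_max is approximately 1.8911 degrees
```

Since the HOE's measured angular selectivity (±5°) comfortably exceeds this diffraction cone, essentially all of the SLM's diffracted light satisfies the Bragg condition of the HOE.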
Ambient light from real objects does not satisfy the angular selectivity of the designed reflective HOE and therefore reaches the human eye directly, forming an augmented-reality holographic 3D display.
Figure 10. Holographic 3D AR display system: (a) schematic of the display system; (b) optical display system.
The diameter of the produced HOE for AR display is about 40 mm, as shown in Figure 11 (a).
The 3D model used for the AR experiment is the same as in the second VR experiment. The depth-measurement process is as follows. With the camera focused on a virtual image (either the fish or the coral), we move a tangible object away from the camera. When the captured image of the real object becomes sharp, the distance between that object and the camera is the depth at which the camera images sharply, which is also the distance between the virtual image and the camera. In our display experiment, when the image of the fish is sharp, the image of the coral and the external environment are blurred by defocus. The results show that the proposed method produces correct depth cues, and hence the accommodation-vergence conflict can be eliminated. (Figure 11(c): focused on the coral, at 3 m.)

Discussion and Conclusion
We have demonstrated a holographic method that successfully creates near-eye 3D VR and AR displays. Our method computes VR holograms about 38 times faster than the traditional point cloud method, and a further increase in speed is possible by using GPU acceleration for the convolution calculation. The VR results show that the proposed method achieves correct focus and defocus of 3D objects, and the AR results likewise show good image quality with full 3D display capability. More importantly, using an LED as the illumination source yields speckle-noise-free VR/AR images, so the image quality is better than with coherent illumination. Another advantage is that the depth of the image can easily be controlled through the SLM. The proposed method is flexible enough to serve either VR or AR applications: the two display modes differ only in the eyepiece, not in the hologram calculation, so the same accelerated calculation applies to both VR and AR display.
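The scale of the speedup can be understood from rough operation counts: the point cloud method accumulates one spherical wavelet per object point over every hologram pixel (of order N·M work for N points and M pixels), whereas the proposed method performs only I·J FFT-based convolutions (of order I·J·M·log M). A back-of-the-envelope sketch with illustrative, hypothetical numbers (the actual speedup depends on the scene and implementation; the paper reports about 38×):

```python
import math

M = 1920 * 1080   # hologram pixels (illustrative SLM resolution)
N = 100_000       # object points (hypothetical scene size)
I = J = 8         # 8 x 8 projection directions, as in the experiments

point_cloud_ops = N * M                        # one wavelet per point per pixel
projection_ops = I * J * M * math.log2(M)      # I*J FFT convolutions

speedup = point_cloud_ops / projection_ops
```

Because the projection-method cost is independent of N, the advantage grows with scene complexity, and dropping viewing angles (reducing I·J) trades angular resolution directly for speed, as noted in the abstract.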
Using an LED as the illumination source also has potential drawbacks. Because of the LED's large light-emitting area and poor spatial coherence, it is more difficult to produce images with a large depth of field. Another disadvantage is that when the distance between the hologram and the 3D object exceeds a certain threshold (e.g. 20 mm), the blur of the VR/AR image becomes more evident. While this may not matter much for near-eye 3D display, careful control of the source coherence and of the light propagation could help mitigate these problems.

Conflicts of Interest:
The authors declare no conflict of interest.