Communication

Isotropic Two-Dimensional Differentiation Based on Dual Dynamic Volume Holograms

Pin Wang, Houxin Fan, Yaping Zhang, Yongwei Yao, Bing Zhang, Wenlong Qin and Ting-Chung Poon
1 Yunnan Provincial Key Laboratory of Modern Information Optics, Kunming University of Science and Technology, Kunming 650500, China
2 Kunming Power Supply Bureau, Yunnan Power Grid Co., Ltd., Kunming 650011, China
3 Center for Optical and Electromagnetic Research, State Key Laboratory for Modern Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
4 Bradley Department of Electrical and Computer Engineering, Virginia Tech, Blacksburg, VA 24061, USA
* Author to whom correspondence should be addressed.
Photonics 2023, 10(7), 828; https://doi.org/10.3390/photonics10070828
Submission received: 16 June 2023 / Revised: 8 July 2023 / Accepted: 12 July 2023 / Published: 17 July 2023
(This article belongs to the Special Issue Holographic Information Processing)

Abstract

We study the use of two dynamic thick holograms to realize isotropic two-dimensional (2D) differentiation under Bragg diffraction. Acousto-optic modulators (AOMs) are used as dynamic volume holograms. Using a single volume hologram, we can accomplish a first-order derivative operation, corresponding to selective edge extraction of an image. Since the AOM is a 1D spatial light modulator, filtering of the image only occurs along the direction of the sound propagation. To achieve 2D image processing, two AOMs are used within a Mach–Zehnder interferometer (MZI). By aligning one AOM along the x-direction on the upper arm of the interferometer and another AOM along the y-direction on the lower arm, we accomplish the sum of two first-derivative operations, leading to isotropic edge extraction. We have performed both computer simulations and optical experiments to verify the proposed idea. The system provides additional operations in optical computing using AOMs as dynamic holograms.

1. Introduction

The edge information in images usually carries characteristic information about objects. Indeed, edge extraction technology [1,2,3,4] has always been an important research topic in image processing. It has applications such as image enhancement [5], image restoration [6], and image segmentation [7]. Various edge extraction operators in digital image processing [8] have been used to obtain edge information. Recent optical edge extraction techniques include the use of a vortex beam in a standard 4-f coherent imaging system [9,10]. In the context of holographic image processing, Pan et al. have proposed a method for edge extraction using time-varying vortex beam in optical scanning holography for incoherent image processing [11].
Acousto-optic interaction provides a powerful means for optical information processing. Through Bragg diffraction, acousto-optic interaction can control the spatial frequency content of light signals [12,13]. Compared to digital image processing techniques, the use of acousto-optic interaction in optical imaging systems can achieve dynamic, real-time programmable laser beam modulation [14,15,16,17]. An acousto-optic modulator (AOM) can be considered a dynamic thick hologram and is a type of spatial light modulator based on acousto-optic interaction [18]; it is widely used in the real-time processing of light waves due to its high reliability, fast response speed, and programmability. Acousto-optic transfer functions have been used to describe the profile of the diffracted light field in the acousto-optic interaction between light waves and a sound column. By multiplying the Fourier transform of the image to be processed by the acousto-optic transfer function in the frequency domain [15] and performing an inverse Fourier transform, a processed image can be obtained. Balakshy first pointed out the use of acousto-optic interaction to control the optical image structure [19], and Xia et al. [20] were the first to perform image edge extraction experimentally using an AOM operated under the Bragg condition, achieving a first-order differentiation operation along a given spatial direction, e.g., $\partial/\partial x$. Dual AOMs can be used effectively for various differential operations. Cao et al. [21] used two cascaded AOMs to achieve the effect of a second-order differential operation, e.g., $\partial^2/\partial x^2$. Banerjee et al. [22] used two orthogonally modulated cascaded AOMs to achieve a mixed-direction differential operation, e.g., $\partial^2/\partial x\,\partial y$. Recently, Zhang et al. [23] provided a detailed review of various image processing methods using AOMs and, in addition, achieved single-sided notch filtering by deviating the incident angle from the Bragg angle. In passing, we want to point out that Voloshinov [24] first used anisotropic Bragg diffraction in crystals to perform image edge extraction.
In certain cases, it is necessary to focus on edge information in anisotropic as well as isotropic manners [25,26]. Previous research utilizing dual AOMs for edge extraction had limitations and could not perform isotropic edge extraction. A Mach–Zehnder interferometer (MZI) optical system based on two AOMs has been proposed to overcome such limitations [23] but has never been implemented. This paper aims to implement isotropic and anisotropic edge extraction using two acousto-optic modulators (AOMs) in a Mach–Zehnder interferometer. For isotropic edge extraction, we specifically implement the operation $\partial/\partial x + \partial/\partial y$ optically, which is not achievable with the earlier dual-AOM systems mentioned above.

2. Image Processing with One AOM

2.1. Principle of AOM and Acousto-Optic Transfer Function

Due to the acousto-optic interaction between light and sound in the AOM, the diffracted light contains the result of the modulation of the light signal. When the acoustic wave propagating in the medium is a plane wave, and the angle between the incident plane wave of light and the z-axis satisfies Equation (1), we have Bragg diffraction. The angle of incidence $\phi_{inc}=\phi_B$ is called the Bragg angle. As shown in Figure 1, after the acousto-optic interaction, only two beams of diffracted light, the 0th- and 1st-order diffraction, are produced. It can be seen that the incident light and the 0th-order beam have the same direction, and the 0th-order beam differs in direction by $2\phi_B$ from the 1st-order beam. In practice, we do not have a plane wave of sound but rather a finite sound column of width $L$, as shown in Figure 1. In this situation, the sound field spreads as it propagates in the medium, and acousto-optic interaction occurs even when the direction of the incident plane wave of light is not exactly at $\phi_B$, generating multiple orders of diffracted light.
The Bragg angle $\phi_B$ is determined by the wavelengths of the acoustic wave and the incident light [12,18]:
$$\phi_B=\sin^{-1}\!\left(\frac{\lambda}{2\Lambda}\right), \tag{1}$$
where $\lambda$ is the wavelength of the incident light in the acoustic medium, and $\Lambda$ is the wavelength of the acoustic wave.
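As a quick numerical check of Equation (1), the following minimal sketch uses the parameter values quoted in Section 2.1; dividing the 532 nm free-space wavelength by the stated refractive index of 2.3 to obtain the in-medium wavelength is our assumption about the calculation, not a statement from the paper.

```python
import numpy as np

# Parameter values quoted in Section 2.1, used here purely as a numerical check
lam0 = 532e-9                 # free-space wavelength of the laser [m]
n = 2.3                       # refractive index of the acousto-optic crystal
Lam = 0.03e-3                 # acoustic wavelength Lambda [m]
L = 8e-3                      # width of the sound column [m]

lam = lam0 / n                # optical wavelength inside the acoustic medium
phi_B = np.arcsin(lam / (2 * Lam))   # Bragg angle, Eq. (1): ~3.85 mrad
Q = 2 * np.pi * L * lam / Lam**2     # Klein-Cook parameter (defined after Eq. (2))

print(f"phi_B = {phi_B * 1e3:.2f} mrad,  Q = {Q:.1f}")
```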
When an optical image is incident at the Bragg angle, we can describe the interaction of the image and the AOM by using a spatial transfer function. Figure 1 depicts the interaction between the incident light field $\psi_{inc}(x)$ and the sound waves.
The transfer function $H_0$ for the 0th-order beam can be expressed as [23]
$$H_0\!\left(\frac{k_x\Lambda}{\pi}\right)=\frac{\mathcal{F}\{\psi_0(x)\}}{\mathcal{F}\{\psi_{inc}(x)\}}=\exp\!\left(j\frac{Q\Lambda k_x}{4\pi}\right)\left\{\cos\!\left[\sqrt{\left(\frac{Q\Lambda k_x}{4\pi}\right)^2+\left(\frac{\alpha}{2}\right)^2}\,\right]+j\frac{Q\Lambda k_x}{4\pi}\,\mathrm{sinc}\!\left[\sqrt{\left(\frac{Q\Lambda k_x}{4\pi}\right)^2+\left(\frac{\alpha}{2}\right)^2}\,\right]\right\}. \tag{2}$$
Here, $\mathcal{F}$ denotes the Fourier transform; $\mathcal{F}\{\psi_{inc}(x)\}$ represents the spectrum of the incident light entering the AOM, with $k_x$ being the spatial radian frequency along the $x$-direction; and $\mathcal{F}\{\psi_0(x)\}$ represents the spectrum of the output 0th-order diffracted light. $\alpha$ denotes the peak phase delay of the light due to the passage of the sound wave, which is proportional to the sound pressure within the AOM. Additionally, $\mathrm{sinc}(x)=\sin(x)/x$, $j=\sqrt{-1}$, and finally, $Q=2\pi L\lambda/\Lambda^2$ is the Klein–Cook parameter [27].
Equation (2) shows that the image processing is along the $x$-direction. Note that the $x$-direction and the $x'$-direction are essentially the same, given the small Bragg angle typical of AOMs.
We have generated a circle pattern of 1024 × 1024 pixels as the original input object, assuming that the incident beam has two transverse dimensions, i.e., $\psi_{inc}(x,y)=O(x,y)$, in the simulation. The diameter is 500 pixels, corresponding to 4 mm. The free-space wavelength of the incident light is 532 nm, and the refractive index of the acousto-optic crystal in the AOM is 2.3, giving $\phi_B\approx 3.85$ mrad. The carrier frequency of the AOM used is 120 MHz, and the wavelength of the acoustic wave in the AOM is $\Lambda$ = 0.03 mm. The peak phase delay of the light through the acousto-optic medium, $\alpha$, is chosen to be $\pi$. Finally, the width of the piezoelectric transducer providing the sound wave signal at the lower end of the AOM is $L$ = 8 mm, giving the Klein–Cook parameter $Q=14$. Figure 2a shows the circle pattern, and Figure 2b gives the output according to Equation (3) for two transverse dimensions:
$$\psi_0(x,y)=\mathcal{F}^{-1}\!\left\{H_0\!\left(\frac{k_x\Lambda}{\pi}\right)\cdot\mathcal{F}\{O(x,y)\}\right\}. \tag{3}$$
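For reference, the simulation of Equations (2) and (3) that produces Figure 2b can be sketched in a few lines of NumPy. This is a minimal illustration of ours, not the authors' code; the FFT sign convention and the rasterization of the circular object are our assumptions.

```python
import numpy as np

def H0(kx, Q, Lam, alpha):
    """0th-order acousto-optic transfer function of Eq. (2).
    Note: np.sinc is normalized, so sin(x)/x = np.sinc(x/pi)."""
    xi = Q * Lam * kx / (4 * np.pi)
    root = np.sqrt(xi**2 + (alpha / 2)**2)
    return np.exp(1j * xi) * (np.cos(root) + 1j * xi * np.sinc(root / np.pi))

# Simulation grid: 1024 x 1024 pixels, circle of 500 px diameter = 4 mm
N = 1024
dx = 4e-3 / 500                                  # pixel pitch [m]
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x)
O = (np.hypot(X, Y) <= 2e-3).astype(float)       # input object O(x, y)

# AOM parameters from Section 2.1
lam0, n, Lam, L, alpha = 532e-9, 2.3, 0.03e-3, 8e-3, np.pi
Q = 2 * np.pi * L * (lam0 / n) / Lam**2

# Spatial radian frequencies along the sound (x) direction
kx = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
KX, KY = np.meshgrid(kx, kx)

# Eq. (3): multiply the object spectrum by H0 and transform back
spectrum = np.fft.fftshift(np.fft.fft2(O))
psi0 = np.fft.ifft2(np.fft.ifftshift(H0(KX, Q, Lam, alpha) * spectrum))
edges_x = np.abs(psi0)                           # x-direction edges only, cf. Figure 2b
```

Displaying `edges_x` should reproduce the unidirectional edge map of Figure 2b: variations along x are highlighted, while edges running parallel to x are suppressed.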
Clearly, we see that there is edge extraction along the $x$-direction. Since the acoustic wave propagates along the $x$-direction, the filtering effect is along the $x$-direction [see Figure 1], and thus no edge information is extracted along the $y$-direction. The edge extraction is therefore unidirectional (anisotropic), and the filtering effect of a single AOM approximates a first-order differential operator, which can be explained as follows.
When $\frac{Q\Lambda k_x}{4\pi}\ll\frac{\alpha}{2}$, Equation (2) becomes
$$H_0\!\left(\frac{k_x\Lambda}{\pi}\right)=\exp\!\left(j\frac{Q\Lambda k_x}{4\pi}\right)\left[\cos\!\left(\frac{\alpha}{2}\right)+j\frac{Q\Lambda k_x}{4\pi}\,\mathrm{sinc}\!\left(\frac{\alpha}{2}\right)\right]. \tag{4}$$
Now, under the condition $\cos(\alpha/2)=0$, the above equation becomes
$$H_0\!\left(\frac{k_x\Lambda}{\pi}\right)=\exp\!\left(j\frac{Q\Lambda k_x}{4\pi}\right)j\frac{Q\Lambda k_x}{4\pi}\,\mathrm{sinc}\!\left(\frac{\alpha}{2}\right)=\exp\!\left(j\frac{Q\Lambda k_x}{4\pi}\right)jBk_x, \tag{5}$$
where $B=\frac{Q\Lambda}{4\pi}\,\mathrm{sinc}\!\left(\frac{\alpha}{2}\right)$ is a constant. Finally, we arrive at Equation (6):
$$H_0\!\left(\frac{k_x\Lambda}{\pi}\right)=\exp\!\left(j\frac{Q\Lambda k_x}{4\pi}\right)jBk_x\propto jk_x. \tag{6}$$
From Equation (6), it can be seen that the 0th-order diffracted light is the result of high-pass filtering of the input light profile. Since $\mathcal{F}\{\partial\psi_{inc}(x)/\partial x\}=jk_x\,\mathcal{F}\{\psi_{inc}(x)\}$, the transfer function $H_0(k_x\Lambda/\pi)$ in Equation (3) has the effect of a first-order differential operation, which is evidenced by combining Equations (3) and (6) to obtain, with $\psi_{inc}(x,y)=O(x,y)$,
$$\psi_0(x,y)=\mathcal{F}^{-1}\!\left\{H_0\!\left(\frac{k_x\Lambda}{\pi}\right)\cdot\mathcal{F}\{\psi_{inc}(x,y)\}\right\}=\mathcal{F}^{-1}\{jk_x\,\mathcal{F}\{O(x,y)\}\}=\frac{\partial O(x,y)}{\partial x}. \tag{7}$$
It should be noted that the linearity in $k_x$ of the exponential term in Equation (6) merely causes a shift in the position of the processed output image, which is ignored as it is inconsequential to the image processing. That is the reason why we give $jk_x$ as the final expression for $H_0(k_x\Lambda/\pi)$ in Equation (6).
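As a numerical sanity check of the approximation chain in Equations (4)–(6), the short sketch below reuses the `H0` function and parameters from the previous listing; it is again an illustration of ours and not part of the paper.

```python
# With alpha = pi we have cos(alpha/2) = 0, and for Q*Lam*kx/(4*pi) << alpha/2 the
# full transfer function of Eq. (2) should reduce to j*B*kx (up to a near-unity phase).
B = Q * Lam / (4 * np.pi) * np.sinc(alpha / (2 * np.pi))   # B = (Q*Lam/4pi) sinc(alpha/2)
kx_small = np.linspace(-0.02, 0.02, 5) * 4 * np.pi / (Q * Lam)
print(H0(kx_small, Q, Lam, alpha))    # full Eq. (2)
print(1j * B * kx_small)              # linearized form leading to Eq. (6)
```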
Figure 3 shows a practical version of a single-AOM optical setup. The incident light is emitted from the laser, passes through the pinhole, and is transformed into a plane wave by the collimating lens L1. It then passes through the object $O(x,y)$, which serves as the system input in the form of the incident light beam profile; the AOM is placed between the object plane and the imaging lens L2, which images the object onto the CCD. An iris is used in the system to select the desired diffraction order. We shall use Figure 3 as part of the system to implement anisotropic filtering in the subsequent section.

2.2. Rotation of AOM on the x′−y′ Plane

To demonstrate the anisotropic edge extraction capability of the acousto-optic modulation system, we rotate the AOM around the z-axis by an angle $\theta$ in the $x'$–$y'$ plane, as shown in Figure 4.
According to the rotation transformation, the $x'$–$y'$ coordinates are transformed into a new set of coordinates $x''$–$y''$ as
$$\begin{pmatrix}x''\\y''\end{pmatrix}=\begin{pmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{pmatrix}\begin{pmatrix}x'\\y'\end{pmatrix}, \tag{8}$$
where $x''$ and $y''$ are the rotated coordinates. Correspondingly, the rotation relationship in the spatial-frequency domain is given by
$$\begin{pmatrix}k_{x''}\\k_{y''}\end{pmatrix}=\begin{pmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{pmatrix}\begin{pmatrix}k_x\\k_y\end{pmatrix}. \tag{9}$$
Hence, after the rotation, the processing direction is along the $x''$-direction instead of the $x$-direction.
Therefore, the transfer function describing the AOM modulation of the incident light after a rotation by an angle $\theta$ is now written as $H_0(k_x\Lambda/\pi;\theta)$. In terms of $k_x$ and $k_y$, we have, using Equation (9),
$$H_0\!\left(\frac{k_x\Lambda}{\pi};\theta\right)=\exp\!\left(j\frac{Q\Lambda(k_x\cos\theta+k_y\sin\theta)}{4\pi}\right)\left\{\cos\!\left[\sqrt{\left(\frac{Q\Lambda(k_x\cos\theta+k_y\sin\theta)}{4\pi}\right)^2+\left(\frac{\alpha}{2}\right)^2}\,\right]+j\frac{Q\Lambda(k_x\cos\theta+k_y\sin\theta)}{4\pi}\,\mathrm{sinc}\!\left[\sqrt{\left(\frac{Q\Lambda(k_x\cos\theta+k_y\sin\theta)}{4\pi}\right)^2+\left(\frac{\alpha}{2}\right)^2}\,\right]\right\}, \tag{10}$$
and the processed image is given by
$$\psi_0(x,y)=\mathcal{F}^{-1}\!\left\{H_0\!\left(\frac{k_x\Lambda}{\pi};\theta\right)\mathcal{F}\{O(x,y)\}\right\}. \tag{11}$$
Figure 5a,b show the outputs when the AOM is rotated by $\theta=-\pi/4$ and $\theta=\pi/4$, respectively. By adjusting the rotation angle of the AOM, anisotropic edge extraction can be achieved along a direction determined by the angle of rotation.
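In simulation, the rotated filter of Equation (10) is simply the one-dimensional transfer function of Equation (2) evaluated along the rotated spatial-frequency coordinate $k_x\cos\theta+k_y\sin\theta$. A minimal sketch continuing the earlier listing is given below; the helper name `H0_rot` and the reuse of the circular object are our own choices.

```python
def H0_rot(kx, ky, theta, Q, Lam, alpha):
    """Rotated 0th-order transfer function, Eq. (10): the 1-D filter of Eq. (2)
    acting along the rotated direction x''."""
    return H0(kx * np.cos(theta) + ky * np.sin(theta), Q, Lam, alpha)

# Eq. (11): anisotropic edge extraction along a direction set by theta
spectrum = np.fft.fftshift(np.fft.fft2(O))
psi0_rot = np.fft.ifft2(np.fft.ifftshift(H0_rot(KX, KY, np.pi / 4, Q, Lam, alpha) * spectrum))
edges_rot = np.abs(psi0_rot)   # cf. Figure 5: edges running along the theta direction are suppressed
```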

3. Mach–Zehnder Interferometer (MZI) with Dual Acousto-Optic Modulators

From the simulations in the last section, it can be seen that when a single acousto-optic modulator (AOM) is used to modulate the image, the edge information of the incident light $O(x,y)$ is extracted approximately along the direction of the acoustic wave in the AOM, and there is no filtering effect in the direction perpendicular to the direction of sound propagation, giving anisotropic filtering. In some image processing applications, however, isotropic filtering is desired. To achieve this, the use of two orthogonally oriented AOMs within the two arms of a Mach–Zehnder interferometer has been proposed [23]. The system is shown in Figure 6.
Within the interferometer, each arm uses an AOM for modulation. The incident light is emitted from the laser, passes through the pinhole and is transformed into a plane wave by the collimating lens L1. The plane wave then illuminates the input object O x , y , which is then split by a 1:1 beam splitter (BS1). The two split beams are modulated by AOM1 and AOM2, respectively, under the Bragg condition. L2 and L3 are imaging lenses that image the input object onto the CCD. Iris1 and Iris2 are used to select the zeroth-order beams. The beamsplitter BS2 is used to add the two images from the two arms.
In principle, the AOM can be rotated arbitrarily in the $x'$–$y'$ plane, as shown in Figure 1. Assuming that the travelling sound wave of AOM1 in the upper arm of the interferometer is along the x-direction ($\theta=0$) and the sound wave of AOM2 in the lower arm is along the y-direction ($\theta=\pi/2$), we obtain the filtered light field
$$\psi_0(x,y)=\mathcal{F}^{-1}\!\left\{H_0\!\left(\frac{k_x\Lambda}{\pi};\theta=0\right)\mathcal{F}\{O(x,y)\}\right\}+\mathcal{F}^{-1}\!\left\{H_0\!\left(\frac{k_x\Lambda}{\pi};\theta=\frac{\pi}{2}\right)\mathcal{F}\{O(x,y)\}\right\}. \tag{12}$$
Figure 7a,c show the original circular and rectangular input profiles, respectively, and Figure 7b,d show the corresponding outputs. The same AOM parameters have been used as in the simulations for Figure 2.
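In terms of the earlier sketches, Equation (12) is just the coherent sum of two 0th-order outputs, one filtered along x and one along y. The listing below is again a minimal illustration under the same assumptions, not the authors' code.

```python
# Eq. (12): AOM1 along x (theta = 0) in one arm, AOM2 along y (theta = pi/2) in the other
spectrum = np.fft.fftshift(np.fft.fft2(O))
psi_x = np.fft.ifft2(np.fft.ifftshift(H0_rot(KX, KY, 0.0,       Q, Lam, alpha) * spectrum))
psi_y = np.fft.ifft2(np.fft.ifftshift(H0_rot(KX, KY, np.pi / 2, Q, Lam, alpha) * spectrum))
psi0 = psi_x + psi_y              # isotropic edge map, cf. Figure 7b,d
intensity = np.abs(psi0)**2       # what the CCD records, cf. Eq. (14) and Figure 9c
```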
Optical experiments have also been performed using the dual-AOM MZI system shown in Figure 6. The laser is a green laser with a wavelength of 532 nm. The pinhole and collimating lens L1 are used to collimate the light into a plane wave. The circular pattern on the object plane has a diameter of 4 mm. The refractive index of the AOM crystal is 2.3, and the carrier frequency of the AOM is 120 MHz. The focal lengths of the imaging lenses L2 and L3 are 150 mm. Photographs (taken with an MMRY UC900C CCD camera) of the experimental results are shown in Figure 8. Figure 8a shows the anisotropic edge extraction result for the circular pattern with only AOM1 operating, at a rotation angle of $\theta=\pi/4$, based on Equation (11); it can be observed that the edge information along the $\theta=\pi/4$ direction was not preserved, consistent with the simulation result in Figure 5b. Figure 8b shows the isotropic edge extraction result for the rectangular pattern using the dual AOMs along the $x$- and $y$-directions. The rectangular pattern has dimensions of 4 mm × 4 mm.
We observe that in the output in Figure 8b for the rectangular pattern, there are edge extractions along the x- and y-directions; in addition, however, there appear to be bright spots at the four corners, which is not fully consistent with the result shown in Figure 7d. This discrepancy can be fully explained if we rewrite Equation (12) and employ the result of Equation (7) for each direction to obtain
$$\psi_0(x,y)=\frac{\partial O(x,y)}{\partial x}+\frac{\partial O(x,y)}{\partial y}, \tag{13}$$
and this clearly supports the result shown in Figure 7d. Now, since the CCD only captures intensity, i.e., $|\psi_0(x,y)|^2$, what is displayed in Figure 8b is
$$|\psi_0(x,y)|^2=\left|\frac{\partial O(x,y)}{\partial x}+\frac{\partial O(x,y)}{\partial y}\right|^2=\left(\frac{\partial O(x,y)}{\partial x}\right)^2+\left(\frac{\partial O(x,y)}{\partial y}\right)^2+2\,\frac{\partial O(x,y)}{\partial x}\frac{\partial O(x,y)}{\partial y}. \tag{14}$$
Note that the first two terms give the results shown in Figure 9b. The cross term, however, being the product of the two first derivatives, is appreciable only at the four corners, where both derivatives are simultaneously nonzero, and therefore gives bright spots at the corners. The superposition of all the terms in Equation (14) gives the result shown in Figure 9c in simulations and in Figure 8b from the optical experiment, where the four corners of the square are also emphasized. We therefore have consistent results from both the simulations and the optical experiments.
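The corner brightening predicted by Equation (14) can also be reproduced independently of the acousto-optic model by replacing the two 0th-order outputs with simple finite-difference derivatives of a rectangle. This is a sketch of ours; the array size and rectangle placement are arbitrary choices.

```python
import numpy as np

N = 512
O_rect = np.zeros((N, N))
O_rect[128:384, 128:384] = 1.0        # rectangular object

dOx = np.gradient(O_rect, axis=1)     # dO/dx
dOy = np.gradient(O_rect, axis=0)     # dO/dy
edge_terms = dOx**2 + dOy**2          # first two terms of Eq. (14): the plain edge map
cross_term = 2 * dOx * dOy            # appreciable only at the corners, where both derivatives are nonzero
intensity = (dOx + dOy)**2            # full Eq. (14), cf. Figures 8b and 9c
```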

4. Conclusions

We have presented an AOM-based method for real-time edge extraction that enables both isotropic and anisotropic edge extraction. By rotating the AOM, anisotropic edge extraction can be achieved, extracting edge information along the direction of sound propagation. We have also implemented a previously proposed Mach–Zehnder interferometric optical system based on dual AOMs, in which the use of two AOMs realizes the sum of two first-order differentiation operations, an operation not achievable with previous dual-AOM systems. We have verified our approach through both computer simulations and optical experiments.

Author Contributions

Conceptualization, P.W. and H.F.; methodology, software, P.W. and H.F.; validation, P.W. and W.Q.; analysis, review and editing, Y.Z., Y.Y. and B.Z.; original draft preparation, P.W. and H.F.; review and editing, T.-C.P. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to acknowledge the support of this work by the National Natural Science Foundation of China (Grant No. 62275113), Yunnan Provincial Science and Technology Department (Xing Dian Talent Support Program), Research Project of Research Center for Analysis and Measurement Kunming University of Science and Technology (Grant No. 2021P20193103002), and Youth Fund of Yunnan Provincial Department of Science and Technology (Grant No. 202201AU070159) from Yunnan Province.

Data Availability Statement

The data that support the results within this paper and other findings of the study are available from the corresponding authors upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ren, Z.; Lam, E.Y.; Zhao, J. Acceleration of autofocusing with improved edge extraction using structure tensor and Schatten norm. Opt. Express 2020, 10, 14712–14728.
2. Wang, R.D.; Zhang, Y.P.; Wang, F.; Zhu, X.F.; Li, C.G.; Zhang, Y.A.; Xu, W. Edge extraction based on optical scanning holography system with annular pupils. Chin. J. Lasers 2019, 46, 0109001.
3. Zhang, Y.P.; Poon, T.-C.; Tsang, P.W.; Wang, R.D.; Wang, L. Review on feature extraction for 3-D incoherent image processing using optical scanning holography. IEEE Trans. Ind. Inform. 2019, 15, 6146–6154.
4. Zhang, Y.P.; Wang, R.D.; Tsang, P.W.; Poon, T.-C. Sectioning with edge extraction in optical incoherent imaging processing. OSA Continuum 2020, 3, 698–708.
5. Bi, G.L.; Xu, Z.J.; Zhao, J.; Sun, Q. Multispectral image enhancement based on irradiation-reflection model and bounded operation. Acta Phys. Sin. 2015, 64, 78–86.
6. Gao, J.Y.; Xu, H.L.; Shao, K.L.; Yin, H. An adaptive edge detection method based on local edge feature descriptor. Chin. J. Lasers 2020, 47, 193–201.
7. Vo, A.; Oraintara, S. A study of relative phase in complex wavelet domain: Property, statistics and applications in texture image retrieval and segmentation. Signal Process.-Image 2010, 25, 28–46.
8. Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 3rd ed.; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2008; pp. 61–121.
9. Zhou, Y.; Feng, S.; Nie, S.; Ma, J.; Yuan, C. Image edge enhancement using Airy spiral phase filter. Opt. Express 2016, 24, 25258–25268.
10. Situ, G.H.; Pedrini, G.; Osten, W. Spiral phase filtering and orientation-selective edge detection/enhancement. J. Opt. Soc. Am. A 2009, 26, 1788–1797.
11. Pan, Y.J.; Jia, W.; Yu, J.J.; Dobson, K.; Zhou, C.H.; Wang, Y.T.; Poon, T.-C. Edge extraction using a time-varying vortex beam in incoherent digital holography. Opt. Lett. 2014, 39, 4176–4179.
12. Zhang, Y.P.; Poon, T.-C. Modern Information Optics with MATLAB; Cambridge University Press and Higher Education Press: Cambridge, UK, 2023; Chapter 7.
13. Chrostowski, J.; Delisle, C. Bistable optical switching based on Bragg diffraction. Opt. Commun. 1982, 41, 71–74.
14. Pieper, R.J.; Poon, T.-C. An acoustooptic FM receiver demonstrating some principles of modern signal processing. IEEE Trans. Educ. 1985, 28, 11–17.
15. Chatterjee, M.R.; Poon, T.-C.; Sitter, D.N. Transfer function formalism for strong acousto-optic Bragg diffraction of light beams with arbitrary profiles. Acta Acust. United Acust. 1990, 71, 81–92.
16. Chen, S.T.; Chatterjee, M.R. Dual-input hybrid acousto-optic set-reset flip-flop and its nonlinear dynamics. Appl. Opt. 1997, 36, 3147–3154.
17. Wang, T.S.; Zhang, C.; Aleksov, A.; Salama, I.; Kar, A. Gaussian beam diffraction by two-dimensional refractive index modulation for high diffraction efficiency and large deflection angle. Opt. Express 2017, 25, 16002.
18. Korpel, A. Acousto-Optics; Marcel Dekker: New York, NY, USA, 1988; pp. 43–93.
19. Balakshy, V.I. Scanning of images. Sov. J. Quantum Electron. 1979, 6, 965–971.
20. Xia, J.G.; Dunn, D.B.; Poon, T.-C.; Banerjee, P.P. Image edge enhancement by Bragg diffraction. Opt. Commun. 1996, 128, 1–7.
21. Cao, D.Q.; Banerjee, P.P.; Poon, T.-C. Image edge enhancement with two cascaded acousto-optic cells with contrapropagating sound. Appl. Opt. 1998, 37, 3007–3014.
22. Banerjee, P.P.; Cao, D.Q.; Poon, T.-C. Basic image-processing operations by use of acousto-optics. Appl. Opt. 1997, 36, 3086–3089.
23. Zhang, Y.P.; Fan, H.X.; Poon, T.-C. Optical image processing using acousto-optic modulators as programmable volume holograms: A review. Chin. Opt. Lett. 2022, 20, 29–38.
24. Voloshinov, V.B.; Babkina, T.M.; Molchanov, V.Y. Two-dimensional selection of optical spatial frequencies by acousto-optic methods. Opt. Eng. 2002, 41, 1273.
25. Sharma, M.K.; Joseph, J.; Senthilkumaran, P. Selective edge enhancement using shifted anisotropic vortex filter. J. Optics-UK 2013, 42, 1–7.
26. Dobson, K.K.; Jia, W.; Poon, T.-C. Anisotropic edge enhancement in optical scanning holography with spiral phase filtering. Chin. Opt. Lett. 2016, 14, 010006.
27. Klein, W.R.; Cook, B.D. A unified approach to ultrasonic light diffraction. IEEE Trans. Sonics Ultrason. 1967, 14, 123.
Figure 1. Acousto-optic modulator under Bragg diffraction. (P.S.: green arrows: direction of incident light, 1st order and 0th order light; dark blue arrow: direction of propagating sound waves).
Figure 2. (a) Original input object; (b) Output according to Equation (3).
Figure 3. Practical single-AOM optical setup. (P.S.: green arrow: direction of incident light; dark blue arrow: direction of propagating sound waves).
Figure 4. Rotation of AOM.
Figure 5. (a) θ = −π/4; (b) θ = π/4.
Figure 6. Dual AOM optical system using a Mach–Zehnder interferometer. (P.S.: green arrow: direction of incident light; dark blue arrow: direction of propagating sound waves).
Figure 7. Using two AOMs with one in the x-direction and the other in the y-direction: (a,c) original input; (b,d) output beam profiles corresponding to (a,c).
Figure 8. Results of optical experiment: (a) output profile for an input circular pattern; (b) output profile for the rectangular beam profile.
Figure 9. Results of simulations of a rectangular object. (a) Original rectangular input object; (b) output field distribution $\psi_0(x,y)$ according to Equation (12); (c) output intensity $|\psi_0(x,y)|^2$.