A Separation Method of Superimposed Gratings in Double-Projector Fringe Projection Proﬁlometry Using a Color Camera

Featured Application: Inspired by the characteristics of the red and blue channels of color cameras, this paper proposes a separation method for superimposed gratings in double-projector fringe projection profilometry. With this method, the superimposed grating can be separated effectively without complex projection coding or separation algorithms. No additional device is required, and there is no constraint on device placement. At the same time, the measurement efficiency is increased by 50%.

Abstract: Fringe projection profilometry has been intensively studied for several decades. However, due to the limited field of view of a single projector, when measuring objects with complex surfaces there are always shadow areas in the captured images, resulting in missing measurement data in the dark areas. To solve this problem, systems with two projectors and a single camera have been employed. Not only are the shadow areas reduced, but system recalibration and multiple measurements are not needed, improving measurement efficiency. Nevertheless, separating each projector's pattern from the superimposed fringe is a difficult problem. A color camera has three color channels (R, G, and B); when it is applied to fringe projection profilometry, it acquires three times as much information as a monochrome camera. Because the overlap between the red- and blue-light spectral responses of color cameras is small, the crosstalk between these two channels can be ignored. This paper proposes a method that projects red and blue fringe patterns from two projectors and exploits the characteristics of the red and blue channels of the color camera to separate the superimposed grating pattern. The original patterns can be recovered completely and easily. To demonstrate the effectiveness of the superimposed-fringe separation, a simulation and experiments were carried out. Both showed that the superimposed fringe can be separated correctly, proving that our method is feasible.


Introduction
Three-dimensional shape measurement is widely applied to reverse engineering, product inspection, and physical imitation. Fringe projection profilometry has become one of the most important methods due to the advantages it offers: high speed and precision, noncontact and full-field measurement, and simple data processing. For a three-dimensional object with a complex surface, the measuring range of a single projector and a single camera is restricted: the information in the shadow area is lost, resulting in incomplete measurement data. To cope with this problem, double-projector structured-light three-dimensional measurement systems have been developed [1][2][3][4][5]. Y. Jin et al. [1] measured the dimensions of holes based on a double-projector system. In addition to minimizing occlusions, the double-projector structured-light three-dimensional measurement system has other advantages, such as increasing the projector light intensity, reducing the number of images needed for scanning, and removing the bimodal multi-path [2]. However, projecting from the two projectors sequentially lengthens the experiment; if the two projectors can run at the same time, the measurement becomes more efficient. A method of separating each projector's pattern from the superimposed fringe is therefore urgently needed. Yu et al. [2] chose two groups of patterns temporally shifted at different rates and used the DFT to decouple them along the time axis. Griesser et al. [6] treated a projector and a camera as a module and set two modules in opposite directions; as a result, the patterns projected by the opposite projector could blind the other camera, avoiding pattern superposition. Maimone et al. [7] exploited the fact that a fixed unit of one projector and one camera sees a clear version of its own pattern and blurred versions of those from other units, so the overlapping patterns can be distinguished by their blur. Wang Jianfeng et al. [8] recovered the depth information of overlapped and non-overlapped regions by considering the correlation between multiple projectors and the infrared images, as well as that between the infrared images themselves. Tardif et al. [9] described intensity-blending algorithms for correcting the overlap area. Yan Zengqiang et al. [10] proposed hierarchical patterns that can be separated from each other. Je Changsoo et al. [11] separated the superimposed grating by taking partial derivatives of color fringes in different directions from the different projections. Xiang Sen et al. [12] categorized the interfered regions into flat regions and boundary regions under the guidance of texture segments, then applied different Markov random field (MRF) models to calculate the final depth results. Petkovic Tomislav et al. [13] uploaded a specifically selected group of temporal phase shifts to each projector, resulting in simple and efficient separation of the projected patterns. All of the aforementioned studies either kept a special positional relationship between the projectors, or projected specific patterns and dealt with the superimposed images using complicated algorithms.
In our previous work, we designed a particular projection order for the phase-shifting gratings, but six images had to be captured to acquire a wrapped phase from two projectors with the four-step phase-shifting method [14]. In this study, we used a color camera for fringe projection profilometry. The two projectors projected red and blue stripes, respectively; the color camera captured them simultaneously, and its optical characteristics were used to separate the superimposed grating. The proposed method requires neither complex projection coding nor separation algorithms, and there is no special requirement for device placement. With the four-step phase-shifting method, four images are sufficient to obtain the wrapped phases from both projectors.
The rest of this paper is organized as follows. Section 2 analyzes optical characteristics of the color camera and presents the proposed method. Sections 3 and 4 respectively describe the simulation and experiments. Section 5 discusses the advantages of the proposed method and Section 6 summarizes our chief conclusions.

Optical Characteristics of the Color Camera
Color cameras are divided into single-chip and three-chip color cameras. A color camera has red, green, and blue color channels. A single-chip color camera uses a Bayer filter so that each pixel records one of the three colors, and the camera's processing unit performs spatial color interpolation to obtain the other two. A three-chip color camera has three semiconductor sensors, each corresponding to one primary color, and a prism separates the primary colors of the received light. The first type is significantly cheaper, while the second provides better-quality color images [15]. In color cameras, the spectra of the red, green, and blue channels are made to overlap so that there are no color-blind regions in the spectrum [16], but the overlaps between the different channel pairs are not alike. Generally speaking, the overlaps between the green channel and the other two channels are serious, whereas the red and blue channels have very little color crosstalk. We used a single-chip color camera; its spectral response curve is shown in Figure 1. In our projectors, the wavelength of the blue LEDs was 459 nm, and that of the red LEDs was 618 nm. Figure 1 shows that the quantum efficiency of the red channel at 459 nm and that of the blue channel at 618 nm were very low. This means that red and blue superimposed fringe patterns can be separated well.
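Because the red- and blue-channel crosstalk is negligible at these wavelengths, the captured color image can simply be split by channel to recover the two projectors' patterns. A minimal sketch (not from the paper; the function name and the assumption of an (H, W, 3) RGB array are ours):

```python
import numpy as np

def split_projector_patterns(color_image: np.ndarray):
    """Split a captured color image into the two projector patterns.

    Because red/blue crosstalk is negligible, the red channel
    approximates the red projector's fringe and the blue channel the
    blue projector's fringe. Assumes an (H, W, 3) RGB array.
    """
    red_fringe = color_image[:, :, 0].astype(np.float64)   # red projector
    blue_fringe = color_image[:, :, 2].astype(np.float64)  # blue projector
    return red_fringe, blue_fringe
```

Note that a real single-chip camera first demosaics the Bayer pattern; the sketch assumes that step has already been done by the camera's processing unit.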

Double-Projector Fringe Projection Profilometry with Red and Blue Light
As mentioned above, the red and blue pass bands are well separated in color cameras. We propose to project red and blue fringe patterns from the two projectors and to use the characteristics of the red and blue channels of the color camera to separate the superimposed grating pattern. Figure 2 shows the structure of the system.
Phase-shifting profilometry is one of the most popular phase-extraction methods because it can eliminate interference from ambient light and surface reflectivity [17]. In our study, the four-step phase-shifting method was used, with a phase shift of π/2. The fringe images of a four-step phase-shifting algorithm with equal phase steps can be described as

I_i(x, y) = a(x, y) + b(x, y) cos[φ(x, y) + (i − 1)π/2], i = 1, 2, 3, 4, (1)

where i is the i-th phase shift, a(x, y) represents the average intensity of the fringe brightness and background illumination, and b(x, y) represents the intensity modulation of the fringe contrast and surface reflectivity. φ(x, y) is the wrapped phase, which can be calculated by

φ(x, y) = arctan{[I_4(x, y) − I_2(x, y)] / [I_1(x, y) − I_3(x, y)]}. (2)

Since the four-quadrant arctangent only ranges from −π to π, the phase value provided by Equation (2) has 2π discontinuities. Therefore, we used the dual-frequency method to obtain the unwrapped phase map:

Φ(x, y) = φ(x, y) + 2πk(x, y), (3)

where Φ(x, y) represents the unwrapped phase and k(x, y) is the fringe order representing the phase jumps.
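Equations (1) and (2) can be verified numerically: with a π/2 step, I_4 − I_2 = 2b sin φ and I_1 − I_3 = 2b cos φ, so the four-quadrant arctangent recovers φ regardless of a and b. A minimal sketch (the function name is ours):

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Four-step phase shifting with a pi/2 step, Equation (2).

    From I_i = a + b*cos(phi + (i-1)*pi/2):
      I4 - I2 = 2*b*sin(phi),  I1 - I3 = 2*b*cos(phi),
    so arctan2 returns the wrapped phase in (-pi, pi]; the offset a
    and modulation b cancel out.
    """
    return np.arctan2(i4 - i2, i1 - i3)
```

This is why phase shifting is robust to ambient light (absorbed into a) and surface reflectivity (absorbed into b).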

Simulation
In order to demonstrate the effectiveness of the proposed method, a simulation was carried out. Because only red and blue stripes were projected, the responses of the three channels of the color camera can be described by Equations (4)–(6):

I_r(x, y) = α_rr I_pr(x, y) + α_br I_pb(x, y), (4)
I_g(x, y) = α_rg I_pr(x, y) + α_bg I_pb(x, y), (5)
I_b(x, y) = α_rb I_pr(x, y) + α_bb I_pb(x, y), (6)

where I_r(x, y), I_g(x, y), and I_b(x, y) are the images of the three channels; I_pr(x, y) and I_pb(x, y) are the projected red and blue stripes; and each α is a response coefficient whose first subscript is the projection color and whose second is the channel (e.g., α_br is the response of the red channel to the blue projection). We set α_rr and α_bb to 0.8; α_rg to 0.2; and α_bg, α_br, and α_rb to 0.03. The red fringe was vertical and the blue fringe was horizontal so that they could easily be distinguished. Figures 3 and 4 show the results. The images measured 1140 × 912 pixels, consistent with the resolution of the projectors we used. As shown in Figures 3 and 4, both the low-frequency and the high-frequency superimposed gratings were separated well.
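The linear crosstalk model of Equations (4)–(6) with the stated coefficients can be sketched directly (the function and constant names are ours):

```python
import numpy as np

# Crosstalk coefficients from the simulation. First subscript is the
# projection color, second is the camera channel (Equations (4)-(6)).
A_RR, A_BB = 0.8, 0.8    # own-channel responses
A_RG, A_BG = 0.2, 0.03   # green-channel responses to red / blue
A_BR, A_RB = 0.03, 0.03  # cross responses between red and blue

def camera_channels(i_pr, i_pb):
    """Apply the linear crosstalk model to the projected patterns.

    i_pr, i_pb: projected red and blue stripe intensities (scalars
    or numpy arrays). Returns the red, green, and blue channel images.
    """
    i_r = A_RR * i_pr + A_BR * i_pb
    i_g = A_RG * i_pr + A_BG * i_pb
    i_b = A_RB * i_pr + A_BB * i_pb
    return i_r, i_g, i_b
```

With α_br = α_rb = 0.03, each separated channel contains only a 3.75% (0.03/0.8) leakage of the other projector's fringe, which the phase-shifting arctangent largely cancels.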
Using Equations (2) and (3), we unwrapped the wrapped phase. As shown in Figures 5 and 6, the wrapped phase was successfully unwrapped into an absolute phase, and the cross profiles showed good linearity. The simulation results revealed that the color crosstalk between the red and blue channels did not affect the phase.
Figure 6. Blue-channel image phase unwrapping: (a) low-frequency wrapped phase; (b) high-frequency wrapped phase; (c) unwrapped phase; (d) cross profile of (a); (e) cross profile of (b); (f) cross profile of (c).
We then simulated the three-dimensional measurement. The measured object was a hemisphere with a radius of 300 mm, as shown in Figure 7, and the spatial scale of the image was set to 1 mm/pixel. Figure 8 shows the simulated phase images captured by the camera, where (a) is the low-frequency phase-shifting image and (b) is the high-frequency phase-shifting image; color crosstalk was added to both. Based on the proposed method, we acquired the measurement results from the left (red) and right (blue) projections and then fitted a hemisphere to the 3D data.
Figure 9 and Table 1 show the fitting results. Table 1 shows that the measurement accuracy is under 100 microns. On the basis of the successful separation of the left and right projection results, the accuracy of the 3D measurement with the proposed method was confirmed. To account for ambient light and the surface reflectivity of the object, we carried out experiments to demonstrate the separation effect in actual measurements.
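The dual-frequency unwrapping of Equation (3) picks the fringe order k(x, y) that makes the high-frequency wrapped phase agree with the unit-frequency phase scaled by the frequency ratio. A minimal sketch of this standard temporal-unwrapping step, under our assumption that the unit-frequency wrapped phase is jump-free and serves as the absolute reference (the function name is ours):

```python
import numpy as np

def unwrap_dual_frequency(phi_low, phi_high, freq_ratio):
    """Temporal phase unwrapping with two fringe frequencies.

    phi_low    : wrapped phase of the unit-frequency fringe (assumed
                 free of 2*pi jumps, so it is an absolute reference)
    phi_high   : wrapped phase of the high-frequency fringe
    freq_ratio : f_high / f_low (6 in the paper's experiments)

    k(x, y) is the nearest integer making phi_high + 2*pi*k agree
    with the scaled low-frequency phase (Equation (3)).
    """
    k = np.round((freq_ratio * phi_low - phi_high) / (2 * np.pi))
    return phi_high + 2 * np.pi * k
```

The rounding step is what makes the method tolerant of small phase noise: an error in phi_low of less than π/freq_ratio still yields the correct fringe order.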

Results
We constructed an apparatus similar to the simulated one and measured a complex 3D object in order to demonstrate the validity and applicability of the proposed method. The system included a color camera (Model: GS3-U3-41C6C-C, FLIR, Wilsonville, OR, USA) with a 25 mm focal-length lens and two digital projectors (Model: LightCrafter 4500, Texas Instruments DLP® Technology, Dallas, TX, USA), as shown in Figure 10. We used an Agrippa statue as the measured object due to its complex surface. The left projector was loaded with red vertical stripes, and the right projector with blue horizontal stripes. The color camera, placed between the two projectors, captured the superimposed gratings. We adopted the dual-frequency four-step phase-shifting method to obtain the unwrapped phase map; the frequency of the low-frequency gratings was 1, and that of the high-frequency gratings was 6. Figure 11 shows the pictures of the superimposed gratings.
Figure 11. The superimposed gratings.
In Figure 11, the upper pictures are low-frequency and the bottom ones are high-frequency. By separating the color channels, we obtained 16 pictures of high- and low-frequency fringes from the two projectors (Figures 12 and 13). According to Equations (2) and (3), we unwrapped the wrapped phase; the unwrapped phases are shown in Figures 14 and 15.
As shown in Figures 14a and 15a, the red- and blue-channel pictures were used to unwrap the phase directly, with good-quality results. The blue lines in Figures 14a and 15a are the background parts, and we made a linear fit to their cross profiles, as shown in Figures 14b and 15b. R-square represents the goodness of fit: it was 0.9992 in Figure 14b and 1 in Figure 15b, demonstrating excellent linearity. The red lines are cross profiles of the measured object, which show fluctuations; the disordered areas are shadows. Both the background and the measured object had a fine unwrapped phase.
In the double-projector structured-light three-dimensional measurement system, the two projectors projected red and blue stripes, respectively, and the superimposed grating was separated simply, without a complicated coding mode or separation algorithm. The separation results can be used directly in phase unwrapping and subsequent processing without additional filtering operations.
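The R-square linearity check on a background cross profile can be sketched as a degree-1 polynomial fit (a generic sketch, not the paper's code; the function name is ours):

```python
import numpy as np

def r_square_linear(profile):
    """Goodness of a straight-line fit to an unwrapped-phase cross profile.

    R^2 = 1 - SS_res / SS_tot, where SS_res is the residual sum of
    squares after a degree-1 polynomial fit and SS_tot is the total
    sum of squares about the mean. R^2 close to 1 means the profile
    is nearly a perfect line.
    """
    x = np.arange(len(profile))
    slope, intercept = np.polyfit(x, profile, 1)
    fitted = slope * x + intercept
    ss_res = np.sum((profile - fitted) ** 2)
    ss_tot = np.sum((profile - np.mean(profile)) ** 2)
    return 1.0 - ss_res / ss_tot
```

A flat background illuminated by a linear-phase fringe should yield an unwrapped-phase profile that is itself linear, which is why an R-square near 1 confirms correct separation and unwrapping.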


Discussion
In this paper, we adopted the double-projector structured-light three-dimensional measurement system. In [2], the authors described the advantages of this system: (1) it reduces occlusions; (2) it increases the signal-to-noise ratio, since the brightness is doubled compared with a single-projector system; (3) it decreases the number of images required for scanning; and (4) it helps remove multi-path interference. Our method demonstrated the third advantage again: the number of images required was reduced, and the measurement speed was significantly improved. Taking the dual-frequency four-step phase-shifting method as an example, our method needs only eight superimposed gratings to obtain the unwrapped phases for both sides, whereas a traditional single-projector, single-camera system must capture 16 pictures to acquire the same results; the measuring efficiency is thus improved by 50%. Second, our method projects plain sinusoidal gratings without complex coding patterns, and the separation method is simple. Third, our method only replaces the monochrome camera with a color camera and adds no other devices. Finally, our experimental process is not complicated, has no special requirements for the placement of the camera and projectors, and is easy to operate.

Conclusions
In this paper, we put forward a separation method for superimposed gratings in double-projector fringe projection profilometry using a color camera. Owing to the spectral-response characteristics of the color camera, the small amount of color crosstalk between the red and blue channels can be ignored. If red and blue fringes are projected from the two projectors, respectively, the projection information from both sides can be simply separated from the color superimposed grating through the different channels. We validated the method with a simulation and experiments. Compared with a traditional single-projector, single-camera system, our method improved the measurement speed by 50%. It does not require complex projection coding or separation algorithms, adds no additional device, and places no special requirement on device placement.

Data Availability Statement:
The data presented in this study are available in this article.