Article

A Parallax Image Mosaic Method for Low Altitude Aerial Photography with Artifact and Distortion Suppression

1 School of Computer Science and Technology, Changchun University of Science and Technology, Changchun 130022, China
2 Zhongshan Institute of Changchun University of Science and Technology, Zhongshan 528403, China
3 School of Artificial Intelligence, Changchun University of Science and Technology, Changchun 130022, China
4 Science and Technology on Electro-Optical Information Security Control Laboratory, Tianjin 300308, China
* Author to whom correspondence should be addressed.
J. Imaging 2023, 9(1), 5; https://doi.org/10.3390/jimaging9010005
Submission received: 16 November 2022 / Revised: 16 December 2022 / Accepted: 20 December 2022 / Published: 25 December 2022
(This article belongs to the Section Image and Video Processing)

Abstract

In this paper, we propose an aerial image stitching method based on the as-projective-as-possible (APAP) algorithm, aimed at the artifacts, distortions, and stitching failures caused by scarce feature points in multispectral aerial images with parallax. Our method incorporates the accelerated nonlinear diffusion algorithm (AKAZE) into the APAP algorithm. First, we use the fast and stable AKAZE to extract the feature points of the aerial images; then, based on the registration model of the APAP algorithm, we add line protection constraints and global and local similarity constraints to protect the structural information of the images and produce a panorama. Experimental results on several datasets demonstrate that the proposed method is effective when dealing with multispectral aerial images. Our method suppresses artifacts and distortions and reduces incomplete stitching. Compared with state-of-the-art image stitching methods, including APAP and adaptive as-natural-as-possible image stitching (AANAP), and two of the most popular UAV image stitching tools, Pix4D and OpenDroneMap (ODM), our method outperforms them both quantitatively and qualitatively.

1. Introduction

At present, when UAVs perform aerial photography at low altitude, the field of view is relatively small, so a single captured aerial image cannot contain all the information needed for research. In order to obtain a high-resolution panoramic image, single aerial images are stitched into a wide-field-of-view aerial panorama [1], which facilitates research in meteorology, geological surveying, and other fields. It is therefore of great significance to study UAV aerial image mosaicking.
During aerial photography, the UAV shakes due to air flow and other environmental factors, so adjacent aerial images exhibit a certain parallax. In this case, if only a single homography matrix is used to align adjacent images, misalignments and artifacts easily arise, and the scarcity of feature points in multispectral aerial images can cause the mosaic to fail. In view of the above problems, we propose an improved APAP image mosaic method that can seamlessly stitch multispectral aerial images.

2. Related Work

Current aerial image mosaic methods generally fall into two categories: stitching based on image features and stitching based on the position and attitude information of the UAV [2]. Ruizhe Shao et al. proposed a fast UAV image stitching method that uses the position and attitude information of UAV images to improve stitching speed [3]. It quickly finds several anchor points to match and stitch the images. Compared with the most advanced methods, it reduces the time cost, but it cannot be applied when the position and attitude information of the aerial images is not available. Several authors [4,5,6] proposed UAV aerial image mosaic methods based on SURF, SIFT [7,8,9], or KAZE features. The approach in [4] is limited by the time cost of feature point extraction, and the methods in [5,6] suffer from ghosting and blurring when mosaicking images with parallax. In order to improve the efficiency and accuracy of aerial image stitching, Huang et al. [10] proposed matching the features of aerial images with the accelerated nonlinear diffusion (AKAZE) algorithm, aligning the images with the registration model of as-projective-as-possible image stitching (APAP) [11], and blending the overlapping regions with the Laplacian pyramid fusion algorithm [12]. If the error is not properly controlled, however, the method in [10] causes overall distortion of the mosaic when the number of UAV aerial images to be stitched is large. Beyond improving image alignment, more and more researchers focus on reducing the projection distortion of non-overlapping regions. Li [13] and Xiang [14] proposed protecting non-overlapping regions to make the final panoramic mosaic more natural.
However, while eliminating the projection distortion of non-overlapping regions, most algorithms also destroy the structural information in the images, resulting in a discontinuous transition from the overlapping to the non-overlapping regions.
The major work of this paper is as follows:
  • Align the aerial images with the AKAZE algorithm, which balances registration efficiency with the ability to preserve region boundaries in the image while suppressing noise.
  • Introduce local and global similarity constraints and focus on the transition from overlapping to non-overlapping regions during registration.
  • Introduce a line protection constraint to deal with the projection distortion of non-overlapping areas while also preserving the geometric structure of the image.

3. Enhanced Image Mosaic Method Based on APAP

The APAP algorithm can address the artifacts caused by the parallax of aerial images, so we propose an aerial image registration approach with an improved APAP mesh warp. The algorithm flow is shown in Figure 1. First, the aerial images are preprocessed, and their feature points are extracted with the fast and stable AKAZE algorithm instead of SIFT. The APAP registration model is then incorporated and improved: local and global similarity constraints are added during registration to reduce global projection distortion as much as possible. Furthermore, straight lines in the images are detected, and a line protection constraint is added to preserve the structural information of the images and increase the naturalness of the panorama. Excessively overlapped images are not used to create the results. Finally, the mosaic panoramic image is generated.
1. AKAZE module
The KAZE feature is one of the most popular multi-scale two-dimensional feature detection and description algorithms in nonlinear scale space [15]. Traditional methods detect and describe features at different scales by constructing or approximating the Gaussian scale space of the image. However, Gaussian blur does not preserve the natural boundaries of objects, and it smooths detail and noise alike, reducing localization accuracy and distinctiveness. The larger the Gaussian blur, the greater the loss of local detection features in the coarse scale space. KAZE uses nonlinear diffusion filtering [16,17] to detect and describe two-dimensional features in a nonlinear scale space, so that the blurring adapts to the image data, reducing noise while retaining object boundaries and preserving distinctiveness and localization accuracy.
AKAZE is an accelerated version of the KAZE feature. It uses nonlinear diffusion filtering to build the scale space and introduces an efficient modified local difference binary descriptor (M-LDB); the resulting descriptor is invariant to rotation, scale, and illumination [18,19,20]. The AKAZE detector is based on the determinant of the Hessian matrix, and a Scharr filter is used to improve rotation invariance. The local maxima of the detector response over spatial positions are selected as feature points, so the blurring adapts locally to the image and region boundaries are retained while noise is reduced. Compared with the SIFT and SURF algorithms, AKAZE is faster.
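To make this step concrete, the following Python sketch (using OpenCV) detects AKAZE features in a pair of aerial images and matches the binary M-LDB descriptors with a Hamming-distance brute-force matcher and Lowe's ratio test. The function name and the ratio threshold are illustrative assumptions, not taken from the authors' implementation.

```python
import cv2
import numpy as np

def akaze_match(img1_path, img2_path, ratio=0.75):
    """Extract and match AKAZE features between two adjacent aerial images.

    A minimal sketch: the full pipeline also preprocesses the images and
    passes the matched points to the improved APAP mesh warp.
    """
    img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)

    akaze = cv2.AKAZE_create()          # nonlinear scale space + M-LDB descriptor
    kp1, des1 = akaze.detectAndCompute(img1, None)
    kp2, des2 = akaze.detectAndCompute(img2, None)

    # M-LDB is a binary descriptor, so Hamming distance is the natural metric.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des1, des2, k=2)

    # Lowe's ratio test keeps only distinctive matches.
    good = [p[0] for p in knn if len(p) == 2 and p[0].distance < ratio * p[1].distance]

    src = np.float32([kp1[m.queryIdx].pt for m in good])
    dst = np.float32([kp2[m.trainIdx].pt for m in good])
    return src, dst
```

The matched point pairs then serve as the alignment data for the mesh-based registration described in the following subsections.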
2. Global similarity constraint
In order to reduce distortions during image registration, a global similarity constraint is added; without it, the final stitching result may be skewed and deformed. We adopt the method in [21] to compute the global similarity constraint. Assuming that the scale factor $s_i$ and rotation angle $\theta_i$ have been determined for image $I_i$, the global similarity constraint can be expressed as

$$\varphi_g(V) = \sum_{i=1}^{N} \sum_{e_j^i \in E_i} w(e_j^i)^2 \left[ \left( c(e_j^i) - s_i \cos\theta_i \right)^2 + \left( s(e_j^i) - s_i \sin\theta_i \right)^2 \right],$$

where $V$ denotes all grid points, $E_i$ denotes the edges of the grid of image $I_i$, $c(e_j^i)$ and $s(e_j^i)$ are the coefficients of the similarity transformation, and $w(e_j^i)$ is a weight function that assigns more weight to mesh edges farther from the overlapping area. The alignment term dominates in the overlapping areas; for parts far from the overlapping area, where there is no alignment constraint, the similarity term is more important.
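As an illustration, the sketch below evaluates the global similarity term for one image's mesh, recovering the per-edge coefficients $c(e_j^i)$ and $s(e_j^i)$ by fitting a 2D similarity to each deformed edge. This is one common reading of [21]; the authors' exact parameterization may differ.

```python
import numpy as np

def global_similarity_energy(V0, V1, edges, weights, s_i, theta_i):
    """Evaluate the global similarity term for the mesh of one image.

    V0, V1 : (M, 2) arrays of original and deformed grid vertices.
    edges  : list of (j, k) vertex-index pairs forming the grid edges E_i.
    weights: per-edge weights w(e), larger far from the overlapping area.
    s_i, theta_i : target scale and rotation for image I_i.
    Illustrative only; the edge parameterization is an assumption.
    """
    tc, ts = s_i * np.cos(theta_i), s_i * np.sin(theta_i)
    energy = 0.0
    for (j, k), w in zip(edges, weights):
        u = V0[k] - V0[j]                       # original edge vector
        d = V1[k] - V1[j]                       # deformed edge vector
        n2 = float(u @ u)
        # Similarity coefficients (c, s) such that [[c, -s], [s, c]] @ u = d.
        c = (u[0] * d[0] + u[1] * d[1]) / n2
        s = (u[0] * d[1] - u[1] * d[0]) / n2
        energy += w ** 2 * ((c - tc) ** 2 + (s - ts) ** 2)
    return energy
```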
3. Local similarity constraint
To reduce overall shape distortion, to ensure that overlapping and non-overlapping areas are transformed similarly, and to let the transformation transition gradually from overlapping to non-overlapping regions, the local similarity constraint is expressed as

$$\varphi_l(V) = \sum_{i=1}^{N} \sum_{(j,k) \in E_i} \left\| \left( \tilde{v}_k^i - \tilde{v}_j^i \right) - S_{jk}^i \left( v_k^i - v_j^i \right) \right\|^2,$$

where $v_j^i$ denotes the position of an original grid point of image $I_i$, $\tilde{v}_j^i$ denotes the position of the corresponding deformed grid point, and $S_{jk}^i$ is the similarity transformation of edge $(j,k)$. We use the similarity transformation equation of [22] to calculate $S_{jk}^i$.
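A minimal sketch of evaluating this term is shown below. How each edge similarity $S_{jk}^i$ is estimated is delegated to the cited method [22], so the `S` argument is assumed to be supplied by that step.

```python
import numpy as np

def local_similarity_energy(V0, V1, edges, S):
    """Evaluate the local similarity term over the grid edges of one image.

    V0, V1 : (M, 2) arrays of original and deformed grid vertices.
    edges  : list of (j, k) vertex-index pairs.
    S      : dict mapping (j, k) to the 2x2 similarity matrix S_jk from [22].
    """
    energy = 0.0
    for (j, k) in edges:
        target = S[(j, k)] @ (V0[k] - V0[j])    # edge under its local similarity
        residual = (V1[k] - V1[j]) - target     # deviation of the deformed edge
        energy += float(residual @ residual)
    return energy
```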
4. Line protection constraint
When the scene in the image is not planar, adding a similarity constraint may significantly change the structural information in the image, which affects the subsequent stitching. In order to protect the line structures in the image, a line protection constraint is introduced into the geometric constraints of overlapping-area alignment. We adopt the method in [23] to compute the line protection constraint. First, local line segments in the image are selected and then sampled. The line protection constraint can be expressed as

$$\varphi_{ll}(V) = \sum_{l_k \in L} \sum_{m_i \in M_k} \left\| S_i \left( v_{m_1} - v_{m_0} \right) \right\|,$$

where $L$ is the collection of detected line segments, $M_k$ is the set of points sampled on segment $l_k$, $m_0$ and $m_1$ are the two endpoints of the sampled line segment, $v_{m_i}$ is the coordinate of point $m_i$, and $S_i$ is the quadrilateral area formed by the vector $\overrightarrow{m_i m_0}$ and the vector $\overrightarrow{m_1 m_0}$. $S_i$ can be computed as

$$S_i = \left( v_{m_i} - v_{m_0} \right) \times \left( v_{m_1} - v_{m_0} \right).$$
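The sketch below computes $S_i$ as a 2D cross product and accumulates it over the sampled points of every detected segment; when a sampled point stays collinear with the warped endpoints, its contribution is zero. The exact weighting used in [23] may differ, so this is only an illustrative reading of the constraint.

```python
import numpy as np

def cross2(a, b):
    """Signed area of the parallelogram spanned by 2D vectors a and b."""
    return a[0] * b[1] - a[1] * b[0]

def line_protection_energy(lines):
    """Accumulate the line protection term over all sampled line segments.

    `lines` is a list of (v_m0, v_m1, samples): the deformed endpoints of a
    detected segment and the deformed positions of the points sampled on it.
    Each sample contributes the area S_i it spans with the endpoints, which
    vanishes when the sample remains on the warped line.
    """
    energy = 0.0
    for v_m0, v_m1, samples in lines:
        base = np.asarray(v_m1, dtype=float) - np.asarray(v_m0, dtype=float)
        for v_mi in samples:
            s_i = cross2(np.asarray(v_mi, dtype=float) - np.asarray(v_m0, dtype=float), base)
            energy += abs(s_i)
    return energy
```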
Based on the global similarity, local similarity, and line protection constraints, we construct a cost function to estimate the quality of the deformed mesh:

$$f(V) = \alpha_1 \varphi_g(V) + \alpha_2 \varphi_l(V) + \alpha_3 \varphi_{ll}(V),$$

where $\alpha_1$, $\alpha_2$, and $\alpha_3$ are the weight coefficients of the three constraints; we set them to 1, 1, and 1.5, respectively. Our goal is to minimize the cost function to obtain the optimal vertex coordinates of the deformable mesh. Since all terms in the cost function are quadratic, a sparse linear solver is used to solve the optimization problem and obtain the optimal vertex set.
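Because every term is quadratic in the unknown vertex coordinates, minimizing $f(V)$ reduces to a sparse linear least-squares problem. The sketch below assumes the constraint rows (each already scaled by the square root of its weight $\alpha$) have been assembled elsewhere; it is not the authors' solver, only an illustration of the final solve.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import lsqr

def solve_mesh(constraint_rows, rhs, n_vertices):
    """Solve for the optimal deformed mesh vertices.

    constraint_rows : iterable of 1D arrays of length 2 * n_vertices, the
                      linearized alignment/similarity/line-protection rows.
    rhs             : 1D array with one target value per constraint row.
    Returns the (n_vertices, 2) array of optimal vertex coordinates.
    """
    A = sparse.vstack([sparse.csr_matrix(r) for r in constraint_rows])
    x = lsqr(A, np.asarray(rhs, dtype=float))[0]   # sparse least-squares solve
    return x.reshape(n_vertices, 2)
```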

4. Experimental Results

In order to verify the effectiveness of the proposed aerial image mosaic method, we conducted three groups of experiments. In the first group, we performed ablation experiments and compared the image mosaic results of APAP, AANAP [24], and our method. In the second group, we compared the stitching results of our method and ODM, one of the most popular current UAV image stitching tools, on color aerial images. In the third group, we compared the stitching results of ODM, Pix4D, and our method on BGNIR multispectral aerial images.
Hardware and software environment configuration in the experiments:
The server: Intel (R) Core (TM) i9-7960X CPU @ 3.30 GHz, Ubuntu 20.04, 64-bit operating system, 32 GB running memory, NVIDIA Corporation GV100 [TITAN V], Python 3.6.
Local computer: Intel (R) Core (TM) i7-10750H CPU @ 2.60 GHz, Windows 10, Visual C++11, OpenCV 2.4.0.

4.1. Ablation Experiments

In order to verify the effectiveness of the global similarity constraint, local similarity constraint, and line protection constraint in our method, we conducted ablation experiments on the existing forest, road, and bridge dataset. The results are provided in Figure 2. The white dotted boxes in the stitching results mark parts of the overlapping region, and the small image on the right side of each result is an enlargement of the region in the white dotted box. The APAP method is the baseline; its results, shown in Figure 2a–c, exhibit obvious artifacts and severe distortions in the overlapping areas. The results of APAP with local and global similarity constraints are shown in Figure 2d–f: in Figure 2d,e the distortion is reduced but artifacts remain non-negligible, and Figure 2f still shows both artifacts and distortions. The results of the proposed method, which combines APAP with the local and global similarity constraints and the line protection constraint, show that artifacts and distortions are effectively suppressed in the overlapping areas (Figure 2g–i).
We computed the peak signal-to-noise ratio (PSNR), structural similarity (SSIM), and root mean square error (RMSE) of the enlarged areas in Figure 2. The best results are obtained when APAP is combined with the local and global similarity constraints and the line protection constraint. The quantitative data are shown in Table 1. The ablation experiment demonstrates that our method is effective.
We also conducted comparative experiments on images containing straight lines to further verify the effect of the proposed algorithm. The compared methods are APAP and AANAP. The results of APAP, AANAP, and the proposed method are shown in Figure 3, with two areas of each result enlarged. In the results of APAP and AANAP, the artifacts and perspective distortion on the straight lines and the pedestrians on the road are non-negligible, whereas our method successfully mitigates them.

4.2. Construction of UAV Aerial Photography Dataset

We also constructed a dataset from UAV aerial photography. The UAV aerial image dataset includes color aerial images captured in spring, summer, and autumn, as well as BGNIR multispectral aerial images. Figure 4a–c show some real color aerial images, and Figure 5a–c show some real BGNIR multispectral aerial images.

4.3. Color Aerial Image Stitching

We used ODM and our method to conduct color aerial image stitching experiments, stitching 20 adjacent frames into a panoramic image. Experiments were conducted on the spring, summer, and autumn images in the dataset. The experimental results are shown in Figure 6.
The stitching results of ODM and our method in Figure 6a look good visually, and the two methods perform similarly on the spring aerial images. We use white dotted boxes to mark the stitched line structures in the panoramas of ODM and our method on the summer and autumn aerial images in Figure 6b,c. The ODM results in Figure 6b,c show obvious artifacts and distortions of straight-line structures caused by parallax. In contrast, our method protects the lines in the images and successfully handles the parallax, maintaining the integrity of straight-line content.
We quantitatively analyze the two stitching methods using PSNR, SSIM, and RMSE. The specific data are in Table 2, Table 3 and Table 4.
PSNR is an evaluation index based on pixel error; the higher the PSNR, the better the result. The experimental data in Table 2 show that the PSNR values of our method are greater than those of ODM.
The larger the SSIM value, the smaller the difference between the output image and the undistorted image, i.e., the better the image quality. The experimental data in Table 3 show that our method has less distortion and a better stitching effect.
RMSE is an evaluation index of image quality; the smaller the value, the higher the registration accuracy. The experimental data in Table 4 show that the quality of our method is better than that of ODM.
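For reference, the three indices can be computed with scikit-image and NumPy as in the sketch below (the `channel_axis` argument requires scikit-image 0.19 or later). How the evaluation regions were chosen in the paper (the enlarged white-box areas) is not reproduced here.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def mosaic_quality(reference, result):
    """Compute PSNR, SSIM, and RMSE between a reference image and a mosaic crop.

    Both inputs are assumed to be aligned uint8 images of the same size.
    """
    psnr = peak_signal_noise_ratio(reference, result)
    ssim = structural_similarity(reference, result, channel_axis=-1)
    diff = reference.astype(np.float64) - result.astype(np.float64)
    rmse = float(np.sqrt(np.mean(diff ** 2)))
    return psnr, ssim, rmse
```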

4.4. BGNIR Multispectral Aerial Image Stitching

Multispectral imaging acquires ground object radiation in multiple individual bands, so the resulting data contain spectral information from several bands.
In this part of the experiment, ODM, Pix4D, and our method are used to stitch the BGNIR multispectral aerial images. In the experiments, ODM could not stitch without pose information; with pose information it stitched successfully, but the result was relatively poor. Similarly, Pix4D could not stitch without pose information; with pose information it could stitch, but its results were the worst of all the methods. Some experimental results are shown in Figure 7.
From Figure 7, we find that our method produces a natural-looking panorama even though the position and attitude information of the UAV is not used; there are no visible parallax errors or perspective distortions. In contrast, obvious distortions, artifacts, and information loss occur in the results of ODM and Pix4D even with position and attitude information.
Similarly, we use PSNR, SSIM, and RMSE to quantitatively analyze the results of the three methods on the BGNIR multispectral dataset. Because the stitching results of Pix4D are incomplete, we cannot calculate their PSNR, SSIM, and RMSE values. The specific data are shown in Table 5, Table 6 and Table 7.
From the experimental data in Table 5, Table 6 and Table 7, our method is superior to ODM in PSNR, SSIM, and RMSE values.

4.5. Running Time

The stitching running times of ODM, Pix4D, and our method were measured; the results are shown in Table 8. Our method is relatively fast on both color aerial images and BGNIR multispectral aerial images. For color aerial image stitching, ODM and our method can both stitch without position and attitude information, with ODM being slightly slower. On BGNIR multispectral aerial images, ODM and Pix4D fail to stitch without position and attitude information, and their stitching time with position information is about twice that of our method. Therefore, our method is the fastest of the three methods.

5. Discussion

In order to solve the problems of artifacts and distortions when mosaicking aerial images with parallax, and of mosaic failure of multispectral aerial images due to scarce feature points, we use the AKAZE algorithm to extract feature points from aerial images and, on the basis of APAP, add a line protection constraint and global and local similarity constraints to protect image structure information, finally obtaining a panoramic aerial image. Our method produces good results when stitching real color aerial images and BGNIR multispectral aerial images. The experimental results demonstrate that our method performs well in artifact and distortion suppression as well as stitching time. The proposed stitching method has practical application value in land resource surveying and environmental monitoring.

Author Contributions

Conceptualization, J.X. and D.Z.; Investigation, D.Z., Z.R. and F.F.; Formal analysis, Z.R., F.F. and M.F.; Validation, Z.R. and D.Z.; Project administration, M.F. and Y.S.; Data curation, D.Z.; Resources, M.F.; Supervision, J.X. and M.F.; Visualization, D.Z. and Z.R.; Writing—original draft, J.X., D.Z. and M.F.; Writing—review and editing, J.X. and M.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Key Laboratory Fund, grant number 2021JCJQLB055011.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets presented in this study are available from the corresponding authors upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tsai, C.H.; Lin, Y.C. An accelerated image matching technique for UAV orthoimage registration. ISPRS J. Photogramm. Remote Sens. 2017, 128, 130–145.
  2. Cui, S.; Zhong, Y. Multi-Modal Remote Sensing Image Registration Based on Multi-Scale Phase Congruency. In Proceedings of the 2018 10th IAPR Workshop on Pattern Recognition in Remote Sensing (PRRS), Beijing, China, 19–20 August 2018; pp. 1–5.
  3. Shao, R.; Du, C.; Chen, H.; Li, J. Fast Anchor Point Matching for Emergency UAV Image Stitching Using Position and Pose Information. Sensors 2020, 20, 2007.
  4. Zhang, Y.; Zhang, C.; Sha, G. Color Image Mosaic Algorithm Based on Improved SIFT. Comput. Meas. Control 2016, 24, 236–239+247.
  5. Wu, R.; Fang, T.; Li, W. Image Mosaic Method of UAV Aerial Photography Based on SURF Feature. Mod. Manuf. Technol. Equip. 2019, 8, 34–36.
  6. Han, M.; Yan, K.; Qin, G. A Mosaic Algorithm for UAV Aerial Image with Improved KAZE. Acta Autom. Sin. 2019, 45, 305–314.
  7. Huang, H.; Li, X.; Nie, X.; Zhang, Y.; Feng, L. Research on remote sensing image registration based on SIFT algorithm. Laser J. 2021, 42, 97–102.
  8. Xu, K.; Liu, J.; Miao, J.; Liu, F. An improved SIFT algorithm based on adaptive fractional differential. J. Ambient Intell. Humaniz. Comput. 2019, 10, 3297–3305.
  9. Swathi, R.; Srinivas, A. An Improved Image Registration Method Using E-SIFT Feature Descriptor with Hybrid Optimization Algorithm. J. Indian Soc. Remote Sens. 2020, 48, 215–226.
  10. Huang, S.; He, X.; Liao, K.; Chen, D. Research on Image Mosaic Algorithm of UAV Aerial Photography with Parallax. Mapp. Geogr. Inf. 2022, 47, 1–5.
  11. Zaragoza, J.; Chin, T.-J.; Tran, Q.-H.; Brown, M.S.; Suter, D. As-Projective-As-Possible Image Stitching with Moving DLT. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 1285–1298.
  12. Huang, F.; Lin, S. Multi-Band Image Fusion Rules Comparison Based on the Laplace Pyramid Transformation Method. Infrared Technol. 2019, 41, 64–71.
  13. Li, N.; Xu, Y.; Wang, C. Quasi-Homography Warps in Image Stitching. IEEE Trans. Multimed. 2018, 20, 1365–1375.
  14. Xiang, T.-Z.; Xia, G.-S.; Bai, X.; Zhang, L. Image Stitching by Line-guided Local Warping with Global Similarity Constraint. Pattern Recognit. 2018, 83, 481–497.
  15. Mukherjee, P.; Lall, B. Saliency and KAZE features assisted object segmentation. Image Vis. Comput. 2017, 61, 82–97.
  16. Li, P. Image Registration Method Based on Optimized KAZE Algorithm. J. Ordnance Equip. Eng. 2021, 42, 248–253.
  17. Bao, W.; Sang, S.; Shen, X. Remote sensing image registration algorithm based on entropy constrained and KAZE feature extraction. Opt. Precis. Eng. 2020, 28, 1810–1819.
  18. Tareen, S.A.K.; Saleem, Z. A comparative analysis of SIFT, SURF, KAZE, AKAZE, ORB, and BRISK. In Proceedings of the 2018 International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), Sukkur, Pakistan, 3–4 March 2018; pp. 1–10.
  19. Zhu, D.; Wu, D.; Liu, S.; Liu, L. Disparity image feature matching algorithm based on AKAZE and adaptive local affine matching. J. Appl. Opt. 2021, 42, 1048–1055.
  20. Wu, L.; Chen, X. An Image Stitching Algorithm Based on Improved AKAZE Feature and RANSAC. Comput. Eng. 2021, 47, 246–254.
  21. Chen, Y.-S.; Chuang, Y.-Y. Natural Image Stitching with the Global Similarity Prior. In Computer Vision—ECCV 2016; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016.
  22. Igarashi, T.; Igarashi, Y. Implementing as-rigid-as-possible shape manipulation and surface flattening. J. Graph. GPU Game Tools 2009, 14, 17–30.
  23. He, C.; Zhou, J. Mesh-based image stitching algorithm with linear structure protection. J. Image Graph. 2018, 23, 973–983.
  24. Lin, C.-C.; Pankanti, S.U.; Ramamurthy, K.N.; Aravkin, A.Y. Adaptive as-natural-as-possible image stitching. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 1155–1163.
Figure 1. Flow chart of aerial image stitching.
Figure 2. Ablation experiments on the Forest, Road, and Bridge dataset. (a–c) Stitching results of APAP. (d–f) Stitching results of APAP with local and global similarity constraints. (g–i) Stitching results of APAP with local and global similarity constraints and the line protection constraint (one area of each result is enlarged in the figure).
Figure 3. Comparison of image mosaic results. (a) Stitching result of APAP; (b) stitching result of AANAP; (c) stitching result of ours (two areas of each result are enlarged in the figure).
Figure 4. Color aerial dataset images. (a) color aerial images in spring (1920 × 1080 pixels); (b) color aerial images in summer (1472 × 1852 pixels); (c) color aerial images in autumn (1472 × 1852 pixels).
Figure 5. BGNIR multispectral aerial dataset images. (a) Blue spectrum images (2160 × 750 pixels); (b) Green spectrum images (2160 × 940 pixels); (c) Near infrared spectrum images (2160 × 796 pixels).
Figure 6. Panoramas of ODM method and our method on real color aerial dataset. (a) Panoramas for ODM and ours on spring aerial images; (b) Panoramas for ODM and ours on summer aerial images; (c) Panoramas for ODM and ours on autumn aerial images.
Figure 7. Stitching results of ODM, Pix4D, and proposed method on BGNIR multispectral aerial image dataset. (a) Panoramas for ODM, Pix4D, and ours on blue spectrum aerial images; (b) Panoramas for ODM, Pix4D, and ours on green spectrum aerial images; (c) Panoramas for ODM, Pix4D, and ours on near infrared spectral aerial images.
Table 1. Numerical Results of Ablation Experiment (✓ = constraint used, \ = constraint not used).

| Images       | Local Similarity | Global Similarity | Straight Line Protection | PSNR   | SSIM  | RMSE  |
|--------------|------------------|-------------------|--------------------------|--------|-------|-------|
| Forest image | \                | \                 | \                        | 13.542 | 0.119 | 2.876 |
| Forest image | ✓                | ✓                 | \                        | 15.454 | 0.130 | 1.768 |
| Forest image | ✓                | ✓                 | ✓                        | 15.491 | 0.146 | 1.714 |
| Road image   | \                | \                 | \                        | 10.100 | 0.044 | 6.353 |
| Road image   | ✓                | ✓                 | \                        | 10.346 | 0.043 | 6.004 |
| Road image   | ✓                | ✓                 | ✓                        | 11.601 | 0.060 | 4.497 |
| Bridge image | \                | \                 | \                        | 15.781 | 0.237 | 1.718 |
| Bridge image | ✓                | ✓                 | \                        | 15.368 | 0.255 | 1.889 |
| Bridge image | ✓                | ✓                 | ✓                        | 15.886 | 0.267 | 1.677 |
Table 2. PSNR values of two methods on color aerial image mosaic.

| Methods | Spring  | Summer  | Autumn  |
|---------|---------|---------|---------|
| ODM     | 14.6241 | 15.0827 | 14.3579 |
| Ours    | 14.6612 | 16.8629 | 16.2582 |
Table 3. SSIM values of two methods on color aerial image mosaic.

| Methods | Spring | Summer | Autumn |
|---------|--------|--------|--------|
| ODM     | 0.1722 | 0.2057 | 0.2019 |
| Ours    | 0.2246 | 0.3297 | 0.2051 |
Table 4. RMSE values of two methods on color aerial image mosaic.

| Methods | Spring | Summer | Autumn |
|---------|--------|--------|--------|
| ODM     | 2.242  | 2.379  | 2.384  |
| Ours    | 2.223  | 1.339  | 1.539  |
Table 5. PSNR values of ODM and our methods on BGNIR multispectral aerial image mosaic.

| Method | Blue Spectrum | Green Spectrum | Near Infrared Spectrum |
|--------|---------------|----------------|------------------------|
| ODM    | 13.6184       | 14.3390        | 14.3967                |
| Ours   | 18.1874       | 18.4552        | 14.7783                |
Table 6. SSIM values of ODM and our methods on BGNIR multispectral aerial image mosaic.

| Method | Blue Spectrum | Green Spectrum | Near Infrared Spectrum |
|--------|---------------|----------------|------------------------|
| ODM    | 0.4596        | 0.4285         | 0.2148                 |
| Ours   | 0.5806        | 0.5914         | 0.3210                 |
Table 7. RMSE values of ODM and our methods on BGNIR multispectral aerial image mosaic.

| Method | Blue Spectrum | Green Spectrum | Near Infrared Spectrum |
|--------|---------------|----------------|------------------------|
| ODM    | 2.826         | 2.394          | 2.363                  |
| Ours   | 1.066         | 1.029          | 2.164                  |
Table 8. Stitching time of three methods on aerial image mosaic (s).

| Methods | Spring | Summer | Autumn | Blue Spectrum | Green Spectrum | Near Infrared Spectrum |
|---------|--------|--------|--------|---------------|----------------|------------------------|
| ODM     | 34.33  | 44.87  | 38.64  | 63.45         | 72.43          | 70.36                  |
| Pix4D   | \      | \      | \      | 64.02         | 113.97         | 61.01                  |
| Ours    | 21.99  | 35.15  | 25.95  | 30.23         | 37.16          | 37.18                  |