# Seamless Mosaicking of UAV-Based Push-Broom Hyperspectral Images for Environment Monitoring


## Abstract


## 1. Introduction

#### 1.1. Geometric Correction

#### 1.2. Radiometric Correction

They reported a strong correlation ($R^2$ = 0.99) for the normalized difference vegetation index (NDVI). For the individual wavelength bands, $R^2$ was 0.80–0.97 for the red-edge, near-infrared, and red bands. They also noted that surface reflectance is not directly measured by UAV-based cameras: when an image is captured, the image sensor in the camera records the radiant energy (light) received by each pixel as a digital number (DN). To convert the DN to surface reflectance, a radiometric correction must be performed by (1) applying sensor-related corrections to obtain the radiance received by the camera from the DN, and (2) converting the radiance received by the camera to surface reflectance [35].
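As an illustration of step (1), the DN-to-radiance conversion is commonly a per-band linear mapping; the sketch below assumes laboratory-calibrated `gain` and `offset` coefficients (the names and values are illustrative, not from the paper):

```python
import numpy as np

def dn_to_radiance(dn, gain, offset):
    """Sensor-related correction: at-sensor radiance = gain * DN + offset,
    with gain/offset taken from the sensor's radiometric calibration."""
    return gain * dn.astype(np.float64) + offset

# A 2x2 patch of digital numbers for a single spectral band
dn = np.array([[100, 200], [300, 400]])
radiance = dn_to_radiance(dn, gain=0.05, offset=1.0)
```

Step (2), converting radiance to surface reflectance, then requires a reference target of known reflectance, as described in Section 2.2.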

## 2. Materials and Methods

#### 2.1. Data Acquisition

#### 2.2. Data Preprocessing

The reflectance $\rho_{ij\lambda}$ of pixel $(i, j)$ at wavelength $\lambda$ can be calculated using the following equation:

$$\rho_{ij\lambda} = R_{r\lambda}\,\frac{L_{ij\lambda} - D_{ij\lambda}}{T_{\lambda} - D_{ij\lambda}} \quad (1)$$

where $L_{ij\lambda}$ is the radiance of pixel $(i, j)$ in the hyperspectral image, $D_{ij\lambda}$ is the spectral density of the dark current, and $T_{\lambda}$ is the radiance of the reference calibration whiteboard in the hyperspectral image, which has reflectance $R_{r\lambda}$.
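A minimal per-band sketch of this whiteboard-based conversion (assuming, as is common in empirical-line calibration, that the dark-current term is subtracted from both the pixel radiance and the whiteboard radiance; function and variable names are illustrative):

```python
import numpy as np

def radiance_to_reflectance(L, D, T, R_r):
    """Per-pixel reflectance: R_r * (L - D) / (T - D), where L is the pixel
    radiance, D the dark-current term, T the whiteboard radiance, and R_r the
    known whiteboard reflectance."""
    return R_r * (L - D) / (T - D)

L_img = np.array([[13.0, 25.0], [7.0, 1.0]])  # radiances for one band
refl = radiance_to_reflectance(L_img, D=1.0, T=25.0, R_r=0.99)
```

A pixel whose radiance equals the whiteboard radiance recovers the whiteboard reflectance, and a pixel at the dark-current level maps to zero.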

#### 2.3. Image Mosaicking Methodology

#### 2.3.1. Geometrical Rectification

Each image strip is geo-referenced using the position and orientation system (POS) data: the exposure station coordinates $(X_S, Y_S)$, which give the location of the perspective (light beam) center, and the pitch, yaw, and roll angles that describe the flight attitude. The focal length $f$ is derived from the camera calibration report. Because the area coverage is small, the ground is assumed to be a flat plain, and the $Z_A$ coordinate of each ground point $A$ is set as $Z_S - h$, where $h$ is the flight height. The coordinates $(X_A, Y_A)$ of each ground point $A$ are calculated by direct geo-referencing based on the collinearity condition equation [39], as shown in Equation (2). In the equation, the coefficients $a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3$ are calculated from the pitch, yaw, and roll angles to construct the rotation matrix. Each image strip is then rectified and resampled by a bilinear interpolation method to produce a coarsely rectified image with geographical coordinates. This process was carried out in self-developed hyperspectral image processing software programmed in Visual C++.
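The flat-terrain direct geo-referencing described above can be sketched as follows. The rotation-angle convention (axis order and signs) varies between POS systems, so the ZYX composition used here is an assumption for illustration, not the authors' exact implementation:

```python
import numpy as np

def rotation_matrix(pitch, roll, yaw):
    """Build R = [[a1,a2,a3],[b1,b2,b3],[c1,c2,c3]] from attitude angles
    (radians). The ZYX axis order below is one common convention."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cr, 0, sr], [0, 1, 0], [-sr, 0, cr]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def ground_coords(x, y, f, Xs, Ys, Zs, h, R):
    """Project image point (x, y) to ground (X_A, Y_A) via the collinearity
    relations, assuming flat terrain Z_A = Zs - h."""
    a1, a2, a3 = R[0]
    b1, b2, b3 = R[1]
    c1, c2, c3 = R[2]
    dZ = -h  # Z_A - Z_S for flat terrain
    denom = c1 * x + c2 * y - c3 * f
    XA = Xs + dZ * (a1 * x + a2 * y - a3 * f) / denom
    YA = Ys + dZ * (b1 * x + b2 * y - b3 * f) / denom
    return XA, YA

# Nadir case: all angles zero, image point scales to ground by h / f
R = rotation_matrix(0.0, 0.0, 0.0)
XA, YA = ground_coords(0.01, 0.02, f=0.035, Xs=0.0, Ys=0.0, Zs=100.0, h=100.0, R=R)
```

At nadir the mapping reduces to the familiar scale relation $X_A = X_S + h\,x/f$, which is a quick sanity check for any chosen angle convention.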

#### 2.3.2. Image Registration of Neighboring Image Strips

#### 2.3.2.1. SIFT and RANSAC-Based Method

#### 2.3.2.2. The Improved Phase Correlation Method

Suppose that the rotation angle between two neighboring image strips is $\theta_0$ and the scaling factor is $k$; then the spatial relationship between the neighboring image strips can be represented as in Equation (4). The Fourier transforms $F_1$ and $F_2$ of the two strips satisfy Equation (5).

Because the fast Fourier transform is most efficient when the image size is a power of two ($2^p$, where $p$ is an integer), sub-images (Figure 5), which are cut from the original images with a 2-power size, can be used to calculate the transformation parameters first. The translation offset for the original images is then calculated from the locations of the sub-images. There are two overlap patterns: the left–right overlap shown in Figure 5a and the up–down overlap shown in Figure 5b. Using the 2-power sub-images, the image size is reduced, the overlap rate is increased, and the computational efficiency is improved.

- The 2-power sub-images of the reference image ${F}_{1}(x,y)$ and the registered image ${F}_{2}(x,y)$ are set as ${f}_{1}(x,y)$ and ${f}_{2}(x,y)$, respectively. The Canny edge detector is used to extract the edges of ${f}_{1}(x,y)$ and ${f}_{2}(x,y)$, represented as ${f}_{1}^{\prime}(x,y)$ and ${f}_{2}^{\prime}(x,y)$.
- The Fourier transform is applied to ${f}_{1}^{\prime}(x,y)$ and ${f}_{2}^{\prime}(x,y)$, and the amplitude spectra ${M}_{1}(u,v)$ and ${M}_{2}(u,v)$ are obtained. The amplitude spectra are converted from the Cartesian coordinate system to the polar coordinate system, and ${M}_{1}(\theta ,\mathrm{lg}\rho )$ and ${M}_{2}(\theta ,\mathrm{lg}\rho )$ are obtained by a logarithmic operation.
- The traditional phase correlation method is applied to ${M}_{1}(\theta ,\mathrm{lg}\rho )$ and ${M}_{2}(\theta ,\mathrm{lg}\rho )$ to obtain the rotation angle ${\theta}_{0}$ and the scale parameter $k$. Then, ${\theta}_{0}$ and $k$ are used to apply the inverse transformation to the 2-power sub-image of the registered image ${f}_{2}(x,y)$, yielding a transition image ${f}_{3}(x,y)$ with only translation offsets.
- The Canny edge detector is used to obtain the edge image of ${f}_{3}(x,y)$, ${f}_{3}^{\prime}(x,y)$. Then, the phase correlation method is used to calculate the relative translation offset $({x}_{0},{y}_{0})$ between the registered image ${F}_{2}(x,y)$ and ${f}_{3}(x,y)$.
- Based on the calculated ${\theta}_{0}$, $k$, and $({x}_{0},{y}_{0})$, a spatial similarity transformation, combined with a bilinear interpolation resampling procedure, is applied to register the images and produce the geometrically rectified hyperspectral image strips for the subsequent seamless mosaicking.
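The phase correlation core used in the steps above, which recovers a pure translation between two images from the normalized cross-power spectrum, can be sketched as follows (translation-only; the Canny edge extraction and the log-polar rotation/scale recovery are omitted for brevity):

```python
import numpy as np

def phase_correlation_shift(f1, f2):
    """Estimate the integer translation (dy, dx) that maps f2 onto f1
    using the normalized cross-power spectrum (traditional phase correlation)."""
    F1 = np.fft.fft2(f1)
    F2 = np.fft.fft2(f2)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12   # keep only the phase difference
    corr = np.fft.ifft2(cross).real  # ideally a delta at the translation
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap offsets larger than half the image size to negative shifts
    h, w = f1.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

rng = np.random.default_rng(0)
f2 = rng.random((32, 32))               # 2-power sub-image, as in the method
f1 = np.roll(f2, (3, 5), axis=(0, 1))   # f1 is f2 shifted by (3, 5)
dy, dx = phase_correlation_shift(f1, f2)
```

For a cyclic shift the correlation surface is an exact delta, so the peak location gives the offset directly; in practice the peak is broadened by the non-overlapping border and by noise, which is why the method benefits from the edge images and the enlarged overlap of the 2-power sub-images.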

#### 2.3.3. Image Fusion

## 3. Experimental Results

#### 3.1. Geometric Rectification Results of River Course Area

#### 3.2. Image Registration Results

#### 3.3. Image Fusion Results

#### 3.3.1. Urban Scape Image Fusion Results

#### 3.3.2. The Weighted Average Fusion Results of River Course and Forest Area

## 4. Discussion

#### 4.1. Geometric Rectification

#### 4.2. Image Registration

#### 4.3. Image Fusion

## 5. Conclusions

## Author Contributions

## Funding

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

1. Goetz, A.F.H. Three decades of hyperspectral remote sensing of the Earth: A personal view. Remote Sens. Environ. **2009**, 113, S5–S16.
2. Kutser, T.; Hedley, J.; Giardino, C.; Roelfsema, C.; Brando, V.E. Remote sensing of shallow waters—A 50 year retrospective and future directions. Remote Sens. Environ. **2020**, 240, 111619.
3. Banerjee, B.P.; Raval, S.; Cullen, P.J. UAV-hyperspectral imaging of spectrally complex environments. Int. J. Remote Sens. **2020**, 41, 4136–4159.
4. Avtar, R.; Watanabe, T. Unmanned Aerial Vehicle: Applications in Agriculture and Environment; Springer Nature: Basingstoke, UK, 2020.
5. Jang, G.; Kim, J.; Yu, J.-K.; Kim, H.-J.; Kim, Y.; Kim, D.-W.; Kim, K.-H.; Lee, C.W.; Chung, Y.S. Review: Cost-Effective Unmanned Aerial Vehicle (UAV) Platform for Field Plant Breeding Application. Remote Sens. **2020**, 12, 998.
6. Zhang, S.M.; Zhao, G.X.; Lang, K.; Su, B.W.; Chen, X.N.; Xi, X.; Zhang, H.B. Integrated Satellite, Unmanned Aerial Vehicle (UAV) and Ground Inversion of the SPAD of Winter Wheat in the Reviving Stage. Sensors **2019**, 19, 1485.
7. Honkavaara, E.; Rosnell, T.; Oliveira, R.; Tommaselli, A. Band registration of tuneable frame format hyperspectral UAV imagers in complex scenes. ISPRS J. Photogramm. Remote Sens. **2017**, 134, 96–109.
8. Wu, W.B.; Zhang, Z.B.; Zheng, L.J.; Han, C.Y.; Wang, X.M.; Xu, J.; Wang, X.R. Research Progress on the Early Monitoring of Pine Wilt Disease Using Hyperspectral Techniques. Sensors **2020**, 20, 3729.
9. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A.J. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. **2012**, 117, 322–337.
10. Stuart, M.B.; Mcgonigle, A.J.S.; Willmott, J.R. Hyperspectral Imaging in Environmental Monitoring: A Review of Recent Developments and Technological Advances in Compact Field Deployable Systems. Sensors **2019**, 19, 3071.
11. Cao, J.; Leng, W.; Liu, K.; Liu, L.; He, Z.; Zhu, Y. Object-Based Mangrove Species Classification Using Unmanned Aerial Vehicle Hyperspectral Images and Digital Surface Models. Remote Sens. **2018**, 10, 89.
12. Lu, B.; He, Y. Optimal spatial resolution of Unmanned Aerial Vehicle (UAV)-acquired imagery for species classification in a heterogeneous grassland ecosystem. GIScience Remote Sens. **2018**, 55, 205–220.
13. Lu, B.; He, Y. Species classification using Unmanned Aerial Vehicle (UAV)-acquired high spatial resolution imagery in a heterogeneous grassland. ISPRS J. Photogramm. Remote Sens. **2017**, 128, 73–85.
14. Lu, B.; He, Y.; Liu, H.H.T. Mapping vegetation biophysical and biochemical properties using unmanned aerial vehicles-acquired imagery. Int. J. Remote Sens. **2017**, 39, 5265–5287.
15. Büttner, A.; Röser, H.-P. Hyperspectral remote sensing with the UAS "Stuttgarter Adler"—system setup, calibration and first results. Photogramm. Fernerkund. Geoinf. **2014**, 4, 265–274.
16. Barreto, M.A.P.; Johansen, K.; Angel, Y.; McCabe, M.F. Radiometric Assessment of a UAV-Based Push-Broom Hyperspectral Camera. Sensors **2019**, 19, 4699.
17. Angel, Y.; Turner, D.; Parkes, S.; Malbeteau, Y.; Lucieer, A.; McCabe, M.F. Automated Georectification and Mosaicking of UAV-Based Hyperspectral Imagery from Push-Broom Sensors. Remote Sens. **2020**, 12, 34.
18. Habib, A.; Xiong, W.; He, F.; Yang, H.L.; Crawford, M. Improving Orthorectification of UAV-Based Push-Broom Scanner Imagery Using Derived Orthophotos From Frame Cameras. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. **2016**, 10, 262–276.
19. Yang, G.; Li, C.; Wang, Y.; Yuan, H.; Feng, H.; Xu, B.; Yang, X. The DOM Generation and Precise Radiometric Calibration of a UAV-Mounted Miniature Snapshot Hyperspectral Imager. Remote Sens. **2017**, 9, 642.
20. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. **2015**, 108, 245–259.
21. Hagen, N.; Kudenov, M.W. Review of snapshot spectral imaging technologies. Opt. Eng. **2013**, 52, 090901.
22. Suomalainen, J.; Anders, N.; Iqbal, S.; Roerink, G.; Franke, J.; Wenting, P.; Hünniger, D.; Bartholomeus, H.; Becker, R.; Kooistra, L. A Lightweight Hyperspectral Mapping System and Photogrammetric Processing Chain for Unmanned Aerial Vehicles. Remote Sens. **2014**, 6, 11013–11030.
23. Kim, J.-I.; Kim, T.; Shin, D.; Kim, S. Fast and robust geometric correction for mosaicking UAV images with narrow overlaps. Int. J. Remote Sens. **2017**, 38, 2557–2576.
24. Faraji, M.R.; Qi, X.; Jensen, A. Computer vision–based orthorectification and georeferencing of aerial image sets. J. Appl. Remote Sens. **2016**, 10, 036027.
25. Shen, X.; Cao, L.; Coops, N.C.; Fan, H.; Wu, X.; Liu, H.; Wang, G.; Cao, F. Quantifying vertical profiles of biochemical traits for forest plantation species using advanced remote sensing approaches. Remote Sens. Environ. **2020**, 250, 112041.
26. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P.J. Quantitative Remote Sensing at Ultra-High Resolution with UAV Spectroscopy: A Review of Sensor Technology, Measurement Procedures, and Data Correction Workflows. Remote Sens. **2018**, 10, 1091.
27. Koshy, G.; Vishnukumar, S. A hybrid approach to generate visually seamless aerial mosaicks from unmanned aerial vehicles. J. Intell. Fuzzy Syst. **2019**, 36, 2075–2083.
28. Habib, A.; Han, Y.; Xiong, W.; He, F.; Zhang, Z.; Crawford, M. Automated Ortho-Rectification of UAV-Based Hyperspectral Data over an Agricultural Field Using Frame RGB Imagery. Remote Sens. **2016**, 8, 796.
29. Kirsch, M.; Lorenz, S.; Zimmermann, R.; Tusa, L.; Möckel, R.; Hödl, P.; Booysen, R.; Khodadadzadeh, M.; Gloaguen, R. Integration of Terrestrial and Drone-Borne Hyperspectral and Photogrammetric Sensing Methods for Exploration Mapping and Mining Monitoring. Remote Sens. **2018**, 10, 1366.
30. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. **2017**, 9, 1110.
31. Jaud, M.; Le Dantec, N.; Ammann, J.; Grandjean, P.; Constantin, D.; Akhtman, Y.; Barbieux, K.; Allemand, P.; Delacourt, C.; Merminod, B. Direct Georeferencing of a Pushbroom, Lightweight Hyperspectral System for Mini-UAV Applications. Remote Sens. **2018**, 10, 204.
32. Li, X.Y. Principle, Method and Practice of IMU/DGPS Based Photogrammetry. Ph.D. Thesis, Information Engineering University, Zhengzhou, China, 2005.
33. Yuan, X.X.; Zhang, X.P. Theoretical accuracy of direct georeferencing with position and orientation system in aerial photogrammetry. Int. Arch. Photogramm. Remote Sens. Spat. Sci. **2008**, XXXVII Pt B1, 617–622.
34. Moroni, M.; Dacquino, C.; Cenedese, A. Mosaicing of Hyperspectral Images: The Application of a Spectrograph Imaging Device. Sensors **2012**, 12, 10228–10247.
35. Olsson, P.-O.; Vivekar, A.; Adler, K.; Garcia Millan, V.E.; Koc, A.; Alamrani, M.; Eklundh, L. Radiometric Correction of Multispectral UAS Images: Evaluating the Accuracy of the Parrot Sequoia Camera and Sunshine Sensor. Remote Sens. **2021**, 13, 577.
36. Honkavaara, E.; Khoramshahi, E. Radiometric Correction of Close-Range Spectral Image Blocks Captured Using an Unmanned Aerial Vehicle with a Radiometric Block Adjustment. Remote Sens. **2018**, 10, 256.
37. Flight Rules of Light-Weighted UAV, Civil Aviation Administration of China. 2015. Available online: http://www.caac.gov.cn/XXGK/XXGK/GFXWJ/201601/P020170527591647559640.pdf (accessed on 13 November 2021).
38. Hruska, R.; Mitchell, J.; Anderson, M.; Glenn, N.F. Radiometric and Geometric Analysis of Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle. Remote Sens. **2012**, 4, 2736–2752.
39. Wang, S.G. The Principal and Applications on Photogrammetry, 1st ed.; Wuhan University Press: Wuhan, China, 2009.
40. Harder, R.L.; Desmarais, R.N. Interpolation using surface splines. J. Aircr. **1972**, 9, 189–191.
41. Zhang, L.; Huang, W.M.; Zhang, Y.; Xu, Y.M.; Zhou, C. Estimation of Signal-Noise-Ratio for HJ-1/CCD Data. Geospatial Inf. **2013**, 3, 73–75.
42. Tan, X.; Sun, C.; Sirault, X.; Furbank, R.; Pham, T.D. Feature matching in stereo images encouraging uniform spatial distribution. Pattern Recognit. **2015**, 48, 2530–2542.
43. Abdel-Hakim, A.E.; Aly, A.F. CSIFT: A SIFT descriptor with color invariant characteristics. In Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06), New York, NY, USA, 17–22 June 2006.
44. Jakubovic, A.; Velagic, J. Image Feature Matching and Object Detection Using Brute-Force Matchers. In Proceedings of the 2018 International Symposium ELMAR, Zadar, Croatia, 16–19 September 2018.
45. Li, T.T.; Jiang, B.; Tu, Z.Z.; Luo, B.; Tang, J. Image Matching Using Mutual k-Nearest Neighbor Graph. In Intelligent Computation in Big Data Era; Springer: Berlin/Heidelberg, Germany, 2015.
46. Liu, H.; Deng, M.; Xiao, C.; Deng, M. An improved best bin first algorithm for fast image registration. In Proceedings of the 2011 International Conference on Electronic & Mechanical Engineering and Information Technology, Harbin, China, 12–14 August 2011; Volume 1, pp. 355–358.
47. Bellavia, F.; Colombo, C. Is There Anything New to Say About SIFT Matching? Int. J. Comput. Vis. **2020**, 128, 1847–1866.
48. Li, J.; Wang, H.; Zhang, L.; Wang, Z.; Wang, M. The Research of Random Sample Consensus Matching Algorithm in PCA-SIFT Stereo Matching Method. In Proceedings of the 2019 Chinese Control And Decision Conference (CCDC), Nanchang, China, 3–5 June 2019; pp. 3338–3341.
49. Zhou, M.L.; Bai, Z.W.; Yan, X.J. Design of the phase correction based image stitching system. Foreign Electron. Meas. Technol. **2015**, 5, 31–33. (In Chinese)
50. Reddy, B.; Chatterji, B. An FFT-based technique for translation, rotation, and scale-invariant image registration. IEEE Trans. Image Process. **1996**, 5, 1266–1271.
51. Isoardi, R.A.; Osorio, A.R.; Mato, G. Medical Image Registration with Fourier Basis Functions; INTECH Open Access Publisher: Metro Manila, Philippines, 2001.
52. Szeliski, R. Image Alignment and Stitching: A Tutorial. Found. Trends Comput. Graph. Vis. **2007**, 2, 1–104.
53. Luo, Y.T.; Wang, Y.; Zhang, H.M. Image-stitching Algorithm by Combining the Optimal Seam and an Improved Gradual Fusion Method. Infrared Technol. **2018**, 40, 382–387.
54. Gu, Y.; Zhou, G.; Ren, G. Image stitching by combining optimal seam and multi-resolution fusion. J. Image Graphics **2017**, 22, 842–851.
55. Zhang, J.Z.; Zhu, W.Q.; Zheng, Z.T. Research on the similarity of hyperspectral image. Sci. Surv. Mapp. **2013**, 38, 33–36.
56. Jakob, S.; Zimmermann, R.; Gloaguen, R. The Need for Accurate Geometric and Radiometric Corrections of Drone-Borne Hyperspectral Data for Mineral Exploration: MEPHySTo—A Toolbox for Pre-Processing Drone-Borne Hyperspectral Data. Remote Sens. **2017**, 9, 88.
57. de Oca, A.M.; Arreola, L.; Flores, A.; Sanchez, J.; Flores, G. Low-cost multispectral imaging system for crop monitoring. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; pp. 443–451.
58. Lucieer, A.; Malenovský, Z.; Veness, T.; Wallace, L. HyperUAS—Imaging Spectroscopy from a Multirotor Unmanned Aircraft System. J. Field Robot. **2014**, 31, 571–590.
59. Richter, R.; Schläpfer, D.; Muller, A. Operational Atmospheric Correction for Imaging Spectrometers Accounting for the Smile Effect. IEEE Trans. Geosci. Remote Sens. **2011**, 49, 1772–1780.

**Figure 1.** UAV hyperspectral system used in the study: (**a**) M600 Pro UAV platform with the hyperspectral imaging system onboard; (**b**) the ZK-VNIR-FPG480 hyperspectral imaging spectrometry system.

**Figure 3.** True color images of two neighboring image strips (bands 103, 63, and 27): (**a**) urban scape; (**b**) river course; (**c**) forest.

**Figure 5.** Two-power sub-images (the blue and red lines represent the boundaries of the 2-power sub-images): (**a**) the left–right overlap 2-power sub-images; (**b**) the up–down overlap 2-power sub-images.

**Figure 6.** River course GCPs and geometric rectification results: (**a**) GCPs on the GE image; (**b**) CSSF method with 11 GCPs; (**c**) POS-based method without GCPs; (**d**) POS-based geometric rectification method with 4 GCPs, labeled as 2, 6, 8 and 11 in (**a**).

**Figure 7.** SIFT and RANSAC-based image registration results: (**a**) urban scape; (**b**) river course; (**c**) forest.

**Figure 8.** Image registration results using the improved phase correlation method: (**a**) urban scape; (**b**) river course.

**Figure 11.** Comparison of typical spectral curves before and after image mosaicking for the urban scape area: the spectral curves of (**a**) building; (**b**) water; (**c**) soil; (**d**) farmland; (**e**) road are represented in five colors; the sources of the spectral curves are indicated in (**f**) using text in the corresponding colors, where original_left denotes the original left image, original_right the original right image, mosaic the mosaicked image, Rec_left the geo-registered left image, and Rec_right the geo-registered right image.

**Figure 13.** Forest experiment results: (**a**) the fused image by the weighted average method; (**b**) the fused image of multiple forest image strips.

| Parameter | Indicator |
|---|---|
| Spectral range | 400–1000 nm |
| Number of spectral channels | 270 |
| Spectral resolution | 2.8 nm |
| Line array width | 480 pixels |
| Spatial resolution | 9 cm at 100 m |
| Field of view | 26° at 35 mm (lens focal length) |
| A/D conversion | 12 bits |
| Max frame rate | 100 fps |

| Study Area | Flight Height | Side Overlap Rate |
|---|---|---|
| Urban scape | 200 m | 30% |
| River course | 120 m | 50% |
| Forest | 90 m | 20% |

| Method | Direction | Mean Residual | Medium Error | Mean Absolute Deviation | Median Absolute Deviation | Standard Deviation | Maximum Residual |
|---|---|---|---|---|---|---|---|
| CSSF method with 11 GCPs | $x$ | 0.0862 | 0.6954 | 0.6068 | 0.8621 | 0.6900 | 1.0909 |
| | $y$ | −0.1198 | 1.6207 | 1.2474 | 2.2333 | 1.6163 | 2.8232 |
| | $xoy$ | | 1.7636 | | | 1.7574 | |
| POS-based method without GCPs | $x$ | −4.1354 | 4.3593 | 1.1555 | 1.1744 | 1.3792 | 6.8187 |
| | $y$ | −0.0961 | 2.2469 | 2.1094 | 1.9082 | 2.2449 | 3.8767 |
| | $xoy$ | | 4.9043 | | | 2.6347 | |
| POS-based method with 4 GCPs | $x$ | −1.2553 | 1.7413 | 1.026 | 0.5031 | 1.2068 | 3.372 |
| | $y$ | 0.2162 | 1.6208 | 1.3627 | 1.0367 | 1.6063 | 3.1469 |
| | $xoy$ | | 2.3789 | | | 2.0091 | |

| Method | Direction | Mean Residual | Medium Error | Mean Absolute Deviation | Median Absolute Deviation | Standard Deviation | Maximum Residual | Time Cost |
|---|---|---|---|---|---|---|---|---|
| The improved phase correlation method | $x$ | −0.67025 | 1.1664 | 1.5992 | 0.6899 | 1.1019 | 3.5498 | 0.329 s |
| | $y$ | 0.24754 | 1.0813 | 1.307 | 1.5517 | 0.8618 | 3.4393 | |
| | $xoy$ | | 1.5905 | | | 1.3989 | | |
| The SIFT and RANSAC-based method | $x$ | −1.6731 | 1.2104 | 0.95694 | 2.8478 | 1.1326 | 3.0629 | 5.395 s |
| | $y$ | −0.82138 | 1.1552 | 0.97138 | 1.0708 | 0.8137 | 2.7707 | |
| | $xoy$ | | 1.6731 | | | 1.3945 | | |

| Method | Direction | Mean Residual | Medium Error | Mean Absolute Deviation | Median Absolute Deviation | Standard Deviation | Maximum Residual | Time Cost |
|---|---|---|---|---|---|---|---|---|
| The improved phase correlation method | $x$ | −2.7975 | 1.4856 | 1.7581 | 0.95615 | 1.6069 | 3.7709 | 1.169 s |
| | $y$ | −0.95435 | 1.2470 | 0.73139 | 0.41235 | 0.95064 | 2.8895 | |
| | $xoy$ | | 1.9395 | | | 1.8670 | | |
| The SIFT and RANSAC-based method | $x$ | −1.1530 | 1.2191 | 1.7520 | 1.0180 | 1.5961 | 4.2247 | 4.082 s |
| | $y$ | −0.97421 | 1.2613 | 0.63685 | 0.19585 | 0.80109 | 2.5976 | |
| | $xoy$ | | 1.7541 | | | 1.7858 | | |

| Direction | Mean Residual | Medium Error | Mean Absolute Deviation | Median Absolute Deviation | Standard Deviation | Maximum Residual |
|---|---|---|---|---|---|---|
| $x$ | 0.0864 | 0.9065 | 0.7977 | 1.4019 | 0.9024 | 1.3841 |
| $y$ | 0.2166 | 1.7455 | 1.3750 | 2.6205 | 1.7321 | 3.0199 |
| $xoy$ | | 1.9669 | | | 1.9530 | |

**Table 7.** Evaluation of the similarity of spectral curves of different ground objects in the overlapped area before and after image mosaicking (urban scape).

| Ground Objects | The Compared Image | SAC | SC | SID | ED |
|---|---|---|---|---|---|
| building | Rec_left image | 0.9999 | 0.9998 | 0.00002 | 0.0377 |
| | Rec_right image | 0.9997 | 0.9984 | 0.00022 | 0.1114 |
| | Original left image | 0.9999 | 0.9997 | 0.00004 | 0.0407 |
| | Original right image | 0.9998 | 0.9990 | 0.00023 | 0.1313 |
| water | Rec_left image | 0.9989 | 0.9964 | 0.00095 | 0.03670 |
| | Rec_right image | 0.9934 | 0.9863 | 0.00479 | 0.05112 |
| | Original left image | 0.9923 | 0.9894 | 0.00548 | 0.05837 |
| | Original right image | 0.9857 | 0.9776 | 0.01066 | 0.07115 |
| soil | Rec_left image | 0.9951 | 0.9921 | 0.00396 | 0.3190 |
| | Rec_right image | 0.9968 | 0.9969 | 0.00670 | 0.3277 |
| | Original left image | 0.9971 | 0.9930 | 0.00745 | 0.4289 |
| | Original right image | 0.9992 | 0.9968 | 0.00143 | 0.3315 |
| farmland | Rec_left image | 0.9991 | 0.9984 | 0.00209 | 0.1664 |
| | Rec_right image | 0.9997 | 0.9995 | 0.00071 | 0.1927 |
| | Original left image | 0.9986 | 0.9973 | 0.00231 | 0.2403 |
| | Original right image | 0.9995 | 0.9987 | 0.00087 | 0.2628 |
| road | Rec_left image | 0.9999 | 0.9992 | 0.00001 | 0.0457 |
| | Rec_right image | 0.9986 | 0.9111 | 0.00105 | 0.1207 |
| | Original left image | 0.9986 | 0.9574 | 0.00122 | 0.2362 |
| | Original right image | 0.9976 | 0.9091 | 0.00217 | 0.1511 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Yi, L.; Chen, J.M.; Zhang, G.; Xu, X.; Ming, X.; Guo, W.
Seamless Mosaicking of UAV-Based Push-Broom Hyperspectral Images for Environment Monitoring. *Remote Sens.* **2021**, *13*, 4720.
https://doi.org/10.3390/rs13224720
