# Eliminating the Effect of Image Border with Image Periodic Decomposition for Phase Correlation Based Remote Sensing Image Registration


## Abstract


## 1. Introduction

## 2. Related Work

## 3. Methodology

#### 3.1. The Principle of Image Registration Based on Phase Correlation

#### 3.2. Eliminating the Effect of Image Border

- (1) Calculate the discrete Laplacian ${\Delta}_{i}I$ of the original image I;
- (2) Calculate the Discrete Fourier Transform $\tilde{\Delta I}$ of ${\Delta}_{i}I$;
- (3) Calculate the Discrete Fourier Transform $\tilde{p}$ of p by inverting the discrete periodic Laplacian in the frequency domain:$$\tilde{p}({\xi}_{1},{\xi}_{2})=\frac{\tilde{\Delta I}({\xi}_{1},{\xi}_{2})}{2\cos\left(\frac{2\pi {\xi}_{1}}{M}\right)+2\cos\left(\frac{2\pi {\xi}_{2}}{N}\right)-4},\quad ({\xi}_{1},{\xi}_{2})\ne (0,0),$$where $M\times N$ is the image size and $\tilde{p}(0,0)$ is set to $\tilde{I}(0,0)$ so that the mean of the image is preserved;
- (4) Apply the inverse Discrete Fourier Transform to $\tilde{p}({\xi}_{1},{\xi}_{2})$ and acquire the periodic image p.
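The four steps above can be sketched in NumPy. This is a minimal sketch, assuming the interior discrete Laplacian ${\Delta}_{i}I$ sums differences only over neighbours that lie inside the image (no wrap-around), and that the DC coefficient of $\tilde{p}$ is set so the mean of I is preserved, following Moisan's periodic plus smooth decomposition [36]; the function name is illustrative:

```python
import numpy as np

def periodic_component(I):
    """Periodic component p of a grayscale image, per steps (1)-(4)."""
    I = np.asarray(I, dtype=float)
    M, N = I.shape

    # Step (1): interior discrete Laplacian -- each pixel accumulates
    # differences to its existing neighbours only (no wrap-around).
    lap = np.zeros_like(I)
    lap[:-1, :] += I[1:, :] - I[:-1, :]
    lap[1:, :] += I[:-1, :] - I[1:, :]
    lap[:, :-1] += I[:, 1:] - I[:, :-1]
    lap[:, 1:] += I[:, :-1] - I[:, 1:]

    # Step (2): DFT of the Laplacian.
    lap_hat = np.fft.fft2(lap)

    # Step (3): invert the periodic discrete Laplacian, whose Fourier
    # eigenvalues are 2cos(2*pi*xi1/M) + 2cos(2*pi*xi2/N) - 4.
    xi1 = np.arange(M).reshape(-1, 1)
    xi2 = np.arange(N).reshape(1, -1)
    denom = 2 * np.cos(2 * np.pi * xi1 / M) + 2 * np.cos(2 * np.pi * xi2 / N) - 4
    denom[0, 0] = 1.0                  # avoid dividing by zero at DC
    p_hat = lap_hat / denom
    p_hat[0, 0] = I.sum()              # preserve the image mean

    # Step (4): inverse DFT gives the periodic image p.
    return np.real(np.fft.ifft2(p_hat))
```

By construction, the periodic (circular) Laplacian of p matches the interior Laplacian of I, so the periodic extension of p has no discontinuity across the image border.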

#### 3.3. The Framework of Image Registration Based on Phase Correlation

- (1) Eliminate the effect of image border: decompose the reference image R and the sensed image S to acquire the corresponding periodic images ${R}^{\prime}$ and ${S}^{\prime}$. The calculation process is detailed in Section 3.2;
- (2) Determine the scalar s and rotation angle $\theta $: calculate the log-polar Fourier Transform of ${R}^{\prime}$ and ${S}^{\prime}$ by interpolating the multi-layer fractional Fourier Transform [37]. Then calculate their magnitude spectra, respectively, and finally obtain the rotation angle $\theta $ and scalar s by determining the phase difference. For the multi-layer fractional Fourier Transform algorithm, the number of layers is 4, and the corresponding scalars are determined by the MATLAB function histcounts, whose input is the series of radius values of the points in a single radial line. For the construction of the log-polar grid, the number of radial lines is 128, the number of points in each radial line is 128, the logarithmic base is $1.044$, and the minimum radius is set to $0.015$. The log-polar Fourier Transform is interpolated with the bicubic method. In the process of determining the scale and rotation, the involved displacement estimation is implemented by directly calculating the linear phase difference in the frequency domain and fitting a straight line by the least squares method;
- (3) Recover the sensed image ${S}^{\prime}$: according to the estimated rotation $\theta $ and scalar s, correct the angle and scale deformation to acquire a corrected sensed image that differs from the reference image only by a displacement;
- (4) Apply the phase correlation approach again to obtain the translation [30]. The displacement estimation is implemented by directly calculating the linear phase difference in the frequency domain and fitting a straight line by the least squares method.
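The translation step (4) can be sketched with a minimal phase-correlation estimator. This is an integer-pixel sketch only (the function name is illustrative): the paper instead fits the linear phase difference in the frequency domain by least squares to reach subpixel accuracy, which is omitted here:

```python
import numpy as np

def phase_correlation_shift(ref, sen):
    """Integer-pixel translation of `sen` relative to `ref` (same size)
    via the phase correlation peak; subpixel refinement is omitted."""
    R = np.fft.fft2(ref)
    S = np.fft.fft2(sen)
    cross = S * np.conj(R)
    cross /= np.abs(cross) + 1e-12      # normalised cross-power spectrum
    corr = np.real(np.fft.ifft2(cross)) # ideally a delta at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts (the peak wraps around).
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

Because the normalised cross-power spectrum discards magnitude and keeps only phase, the correlation surface peaks sharply at the displacement even when image contrast differs between the two images.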

## 4. Experiment and Analysis

#### 4.1. The Comparison of Image Registration Success Rate for Different Methods

#### 4.2. The Comparison of Displacement Estimate Accuracy for Different Methods

#### 4.3. The Comparison of Scale and Angle Estimation Accuracy for Different Methods

#### 4.4. Further Analysis of Eliminating the Effect of Image Border Methods

## 5. Discussion

## 6. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## References

- Kuglin, C. The phase correlation image alignment method. In Proceedings of the IEEE International Conference on Cybernetics and Society, New York, NY, USA, 23–25 September 1975.
- Roy, D.P. The impact of misregistration upon composited wide field of view satellite data and implications for change detection. IEEE Trans. Geosci. Remote Sens. **2000**, 38, 2017–2032.
- Leprince, S.; Barbot, S.; Ayoub, F.; Avouac, J.P. Automatic and precise orthorectification, coregistration, and subpixel correlation of satellite images, application to ground deformation measurements. IEEE Trans. Geosci. Remote Sens. **2007**, 45, 1529–1558.
- Necsoiu, M.; Leprince, S.; Hooper, D.M.; Dinwiddie, C.L.; McGinnis, R.N.; Walter, G.R. Monitoring migration rates of an active subarctic dune field using optical imagery. Remote Sens. Environ. **2009**, 113, 2441–2447.
- Morgan, G.L.K.; Liu, J.G.; Yan, H. Precise subpixel disparity measurement from very narrow baseline stereo. IEEE Trans. Geosci. Remote Sens. **2010**, 48, 3424–3433.
- Zitova, B.; Flusser, J. Image registration methods: A survey. Image Vis. Comput. **2003**, 21, 977–1000.
- Zhu, N.; Jia, Y.; Ji, S. Registration of Panoramic/Fish-Eye Image Sequence and LiDAR Points Using Skyline Features. Sensors **2018**, 18, 1651.
- Zhang, Z.; Han, D.; Dezert, J.; Yang, Y. A New Image Registration Algorithm Based on Evidential Reasoning. Sensors **2019**, 19, 1091.
- Han, Y.; Oh, J. Automated Geo/Co-Registration of Multi-Temporal Very-High-Resolution Imagery. Sensors **2018**, 18, 1599.
- Chang, X.; Du, S.; Li, Y.; Fang, S. A Coarse-to-Fine Geometric Scale-Invariant Feature Transform for Large Size High Resolution Satellite Image Registration. Sensors **2018**, 18, 1360.
- Leng, C.; Zhang, H.; Li, B.; Cai, G.; Pei, Z.; He, L. Local Feature Descriptor for Image Matching: A Survey. IEEE Access **2018**, 7, 6424–6434.
- Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. **2004**, 60, 91–110.
- Harris, C.; Stephens, M. A combined corner and edge detector. Alvey Vision Conference **1988**, 15, 147–151.
- Mikolajczyk, K.; Schmid, C. An affine invariant interest point detector. In Proceedings of the European Conference on Computer Vision, Copenhagen, Denmark, 28–31 May 2002; pp. 128–142.
- Verdie, Y.; Yi, K.; Fua, P.; Lepetit, V. TILDE: A temporally invariant learned detector. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 5279–5288.
- Lenc, K.; Vedaldi, A. Learning covariant feature detectors. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 11–14 October 2016; pp. 100–117.
- DeTone, D.; Malisiewicz, T.; Rabinovich, A. SuperPoint: Self-Supervised Interest Point Detection and Description. arXiv **2017**, arXiv:1712.07629.
- Dalal, N.; Triggs, B. Histograms of oriented gradients for human detection. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), San Diego, CA, USA, 20–26 June 2005; Volume 1, pp. 886–893.
- Tola, E.; Lepetit, V.; Fua, P. Daisy: An efficient dense descriptor applied to wide-baseline stereo. IEEE Trans. Pattern Anal. Mach. Intell. **2010**, 32, 815–830.
- Simo-Serra, E.; Trulls, E.; Ferraz, L.; Kokkinos, I.; Fua, P.; Moreno-Noguer, F. Discriminative learning of deep convolutional feature point descriptors. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 118–126.
- Tian, Y.; Fan, B.; Wu, F. L2-Net: Deep Learning of Discriminative Patch Descriptor in Euclidean Space. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; Volume 1, p. 6.
- Mishchuk, A.; Mishkin, D.; Radenovic, F.; Matas, J. Working hard to know your neighbor's margins: Local descriptor learning loss. In Advances in Neural Information Processing Systems; Neural Information Processing Systems Foundation, Inc.: Montreal, QC, Canada, 2017; pp. 4826–4837.
- Le Moigne, J.; Campbell, W.J.; Cromp, R.F. An automated parallel image registration technique based on the correlation of wavelet features. IEEE Trans. Geosci. Remote Sens. **2002**, 40, 1849–1864.
- Le Moigne, J. Parallel registration of multisensor remotely sensed imagery using wavelet coefficients. In Wavelet Applications; International Society for Optics and Photonics: Orlando, FL, USA, 1994; Volume 2242, pp. 432–444.
- Zavorin, I.; Le Moigne, J. Use of multiresolution wavelet feature pyramids for automatic registration of multisensor imagery. IEEE Trans. Image Process. **2005**, 14, 770–782.
- Murphy, J.M.; Le Moigne, J.; Harding, D.J. Automatic image registration of multimodal remotely sensed data with global shearlet features. IEEE Trans. Geosci. Remote Sens. **2016**, 54, 1685–1704.
- Argyriou, V.; Vlachos, T. A Study of Sub-pixel Motion Estimation using Phase Correlation. In Proceedings of the British Machine Vision Conference 2006, Edinburgh, UK, 4–7 September 2006; pp. 387–396.
- Tian, Q.; Huhns, M.N. Algorithms for subpixel registration. Comput. Vis. Graph. Image Process. **1986**, 35, 220–233.
- Abdou, I.E. Practical approach to the registration of multiple frames of video images. In Visual Communications and Image Processing '99; International Society for Optics and Photonics: San Jose, CA, USA, 1998; Volume 3653, pp. 371–383.
- Stone, H.S.; Orchard, M.T.; Chang, E.C.; Martucci, S.A. A fast direct Fourier-based algorithm for subpixel registration of images. IEEE Trans. Geosci. Remote Sens. **2001**, 39, 2235–2243.
- Tong, X.; Ye, Z.; Xu, Y.; Liu, S.; Li, L.; Xie, H.; Li, T. A novel subpixel phase correlation method using singular value decomposition and unified random sample consensus. IEEE Trans. Geosci. Remote Sens. **2015**, 53, 4143–4156.
- Dong, Y.; Long, T.; Jiao, W.; He, G.; Zhang, Z. A Novel Image Registration Method Based on Phase Correlation Using Low-Rank Matrix Factorization With Mixture of Gaussian. IEEE Trans. Geosci. Remote Sens. **2018**, 56, 446–460.
- Dong, Y.; Long, T.; Jiao, W. Eliminating Effect of Image Border with Image Periodic Decomposition for Phase Correlation Based Image Registration. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 23–27 July 2018; pp. 4977–4980.
- Ge, P.; Lan, C.; Wang, H. An improvement of image registration based on phase correlation. Optik Int. J. Light Electron Opt. **2014**, 125, 6709–6712.
- Moisan, L. Periodic plus smooth image decomposition. J. Math. Imaging Vis. **2011**, 39, 161–179.
- Dong, Y.; Jiao, W.; Long, T.; He, G.; Gong, C. An Extension of Phase Correlation-Based Image Registration to Estimate Similarity Transform Using Multiple Polar Fourier Transform. Remote Sens. **2018**, 10, 1719.
- Pan, W.; Qin, K.; Chen, Y. An adaptable-multilayer fractional Fourier transform approach for image registration. IEEE Trans. Pattern Anal. Mach. Intell. **2009**, 31, 400–414.

**Figure 1.** The three-dimensional diagrams of different window functions. (**a**) the Blackman window function; (**b**) the raised-cosine window function; (**c**) the flat-top window function. The length of all window functions is 100. For the raised-cosine window function, the roll-off factor $\beta $ is set to 0.25. For the flat-top window function, the stretch factor k is set to 2.7.
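Such 2-D weighting windows can be sketched as outer products of 1-D profiles. In this sketch, `np.blackman` gives the Blackman window, and we assume the raised-cosine window takes the tapered-cosine (Tukey) form with a cosine roll-off of fraction $\beta$ of the support; the paper's exact definition, and the flat-top window with stretch factor k = 2.7, are not reproduced here:

```python
import numpy as np

def tukey(n, beta):
    """Tapered-cosine window: flat centre with cosine roll-off edges
    (assumed form of the raised-cosine window with roll-off beta)."""
    w = np.ones(n)
    taper = int(np.floor(beta * (n - 1) / 2.0))
    if taper > 0:
        t = np.arange(taper + 1)
        edge = 0.5 * (1.0 + np.cos(np.pi * (t / taper - 1.0)))
        w[:taper + 1] = edge            # rising edge: 0 -> 1
        w[-(taper + 1):] = edge[::-1]   # falling edge: 1 -> 0
    return w

# 2-D windows are separable: outer product of the 1-D profiles.
n = 100
blackman_2d = np.outer(np.blackman(n), np.blackman(n))
raised_cosine_2d = np.outer(tukey(n, 0.25), tukey(n, 0.25))
```

Multiplying an image by such a window forces its values toward zero at the border, which suppresses the border discontinuity at the cost of attenuating image content away from the centre.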

**Figure 2.** The resulting images after filtering by different window functions and the corresponding amplitude spectra. (**a**) is the original image; its size is $1000\times 1000$ pixels. (**b**–**d**) are the resulting images after filtering by the Blackman, flat-top and raised-cosine window functions, respectively. (**e**–**h**) are the corresponding amplitude spectra of (**a**–**d**), respectively. It is worth noting that the amplitude spectra are displayed logarithmically for clear presentation.

**Figure 3.** Image decomposition and the corresponding amplitude spectra. (**a**–**c**) are the original image, the periodic image and the smooth image, respectively. (**d**–**f**) are the corresponding amplitude spectra of (**a**–**c**), respectively. Compared to the amplitude spectrum of the original image in (**d**), the cross structure in the amplitude spectrum of the periodic image in (**e**) disappears visually. It is noteworthy that, for other images, the cross structure may not vanish completely, because the cross structure contains two pieces of information: one is produced by the discontinuity of the image border, and the other is the content of the image itself.

**Figure 6.** **A** and **B** represent two images with a displacement between them. The weighted filtering window of each image is denoted by a circle centered at the image. Signal ${S}_{0}$ denotes the information in the overlap that is degraded by the filtering of the weighted window function, while signal ${S}_{1}$ denotes the information in the overlap that is free from this degradation. Likewise, noise ${S}_{2}$ denotes the information outside the overlap that is degraded by the filtering of the weighted window function, while noise ${S}_{3}$ denotes the information outside the overlap that is free from this degradation.

**Table 1.**The comparison of performance for different methods of eliminating the effect of image border in the case of estimating the displacement.

Algorithm | ${\mathit{e}}_{\mathit{x}}$ | ${\mathit{e}}_{\mathit{y}}$ | ${\mathit{X}}_{\mathit{RMSE}}$ | ${\mathit{Y}}_{\mathit{RMSE}}$ |
---|---|---|---|---|
Decomposition | 0.263 | 0.248 | 0.0195 | 0.0197 |
Blackman | 0.268 | 0.249 | 0.0190 | 0.020 |
Flat-top | 0.266 | 0.251 | 0.0195 | 0.021 |
Raised-cosine | 0.265 | 0.254 | 0.0196 | 0.0216 |

**Table 2.** The comparison of registration success rate and scale and angle estimation accuracy for different methods of eliminating the effect of image border.

Measurement | Decomposition Method | Raised-Cosine Window | Blackman Window | Flat-Top Window |
---|---|---|---|---|
Success rate (37 images in total) | 33/37 | 22/37 | 20/37 | 17/37 |
Mean scalar error (unit: ${10}^{-3}$) | 2.40 | 3.19 | 2.64 | 3.64 |
Mean angle error (unit: ${10}^{-1}$) | 1.47 | 8.86 | 5.41 | 9.55 |
RMSE of scalar (unit: ${10}^{-5}$) | 0.66 | 1.69 | 1.34 | 2.12 |
RMSE of angle (unit: °) | 0.04 | 1.05 | 0.57 | 1.18 |

**Table 3.**The comparison of displacement accuracy for different methods of eliminating the effect of image border in the case of images with different sizes. ${e}_{x}$ and ${e}_{y}$ denote the mean absolute error in the x- and y-directions, respectively.

Image Size | Decomposition ${\mathit{e}}_{\mathit{x}}$ | Decomposition ${\mathit{e}}_{\mathit{y}}$ | Blackman ${\mathit{e}}_{\mathit{x}}$ | Blackman ${\mathit{e}}_{\mathit{y}}$ | Flat-top ${\mathit{e}}_{\mathit{x}}$ | Flat-top ${\mathit{e}}_{\mathit{y}}$ | Raised-cosine ${\mathit{e}}_{\mathit{x}}$ | Raised-cosine ${\mathit{e}}_{\mathit{y}}$ |
---|---|---|---|---|---|---|---|---|
$64\times 64$ | $\mathbf{0.244}$ | $\mathbf{0.240}$ | 0.266 | 0.250 | 0.294 | 0.282 | 0.291 | 0.292 |
$128\times 128$ | $\mathbf{0.248}$ | $\mathbf{0.251}$ | 0.252 | 0.253 | 0.256 | 0.263 | 0.254 | 0.254 |
$256\times 256$ | $\mathbf{0.263}$ | $\mathbf{0.248}$ | 0.268 | 0.249 | 0.266 | 0.251 | 0.265 | 0.254 |
$512\times 512$ | 0.253 | 0.251 | 0.252 | 0.251 | 0.253 | $\mathbf{0.250}$ | $\mathbf{0.251}$ | 0.252 |
$1024\times 1024$ | 0.246 | 0.270 | $\mathbf{0.243}$ | $\mathbf{0.241}$ | $\mathbf{0.243}$ | 0.250 | $0.242$ | 0.245 |

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

Dong, Y.; Jiao, W.; Long, T.; Liu, L.; He, G.
Eliminating the Effect of Image Border with Image Periodic Decomposition for Phase Correlation Based Remote Sensing Image Registration. *Sensors* **2019**, *19*, 2329.
https://doi.org/10.3390/s19102329