# Rotation Estimation: A Closed-Form Solution Using Spherical Moments


## Abstract


## 1. Introduction

## 2. Direct Estimation of Rotation Using Spherical Moments

#### 2.1. Spherical Moments

#### 2.2. Closed-Form Solution of Rotation Estimation

- Define the moment vector $\mathbf{w}$ in the same way as ${\mathbf{w}}_{\mathbf{23}}$. The vector can be built from moment products of two or more different orders. For instance, products such as ${m}_{200}{m}_{220}{m}_{300}$, which combine moments of orders 2, 3, and 4, can also be used. Note that $\mathbf{w}$ has to be built from moment products of the same nature and has to include all of them.
- Using Equation (4), compute the matrices ${\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{x}}$, ${\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{y}}$, and ${\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{z}}$ such that we obtain the following:$$\dot{\mathbf{w}}=\left({\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{x}}\,\mathbf{w}\right){\omega}_{x}+\left({\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{y}}\,\mathbf{w}\right){\omega}_{y}+\left({\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{z}}\,\mathbf{w}\right){\omega}_{z}.$$
- Solve the following system:$$\left[\begin{array}{ccc}{\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{x}}^{\top}& \mathbf{0}& \mathbf{0}\\ {\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{y}}^{\top}& \mathbf{0}& \mathbf{I}\\ {\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{z}}^{\top}& -\mathbf{I}& \mathbf{0}\\ \vdots & \vdots & \vdots \\ \mathbf{0}& \mathbf{0}& {\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{z}}^{\top}\end{array}\right]=\mathbf{0},$$
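As a minimal illustration of the velocity relation above, the sketch below shows how the angular velocity ${\omega}$ could be recovered from $\mathbf{w}$ and $\dot{\mathbf{w}}$ by linear least squares. The interaction matrices are random placeholders here (their construction from Equation (4) is not reproduced), and the dimension of the moment vector is an assumption; this is not the full closed-form system of the paper, only the linear structure it rests on.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6  # assumed length of the moment vector w (placeholder)

# Placeholder interaction matrices; in the paper they come from Equation (4).
Lx, Ly, Lz = (rng.standard_normal((n, n)) for _ in range(3))
w = rng.standard_normal(n)

# Synthesize w_dot for a known angular velocity, per the velocity relation:
omega = np.array([0.1, -0.2, 0.05])
w_dot = (Lx @ w) * omega[0] + (Ly @ w) * omega[1] + (Lz @ w) * omega[2]

# Stack the three direction vectors and recover omega by least squares.
A = np.column_stack([Lx @ w, Ly @ w, Lz @ w])  # n x 3
omega_est, *_ = np.linalg.lstsq(A, w_dot, rcond=None)
print(np.allclose(omega_est, omega))  # True
```

Since $\dot{\mathbf{w}}$ is linear in ${\omega}$, stacking the three vectors $({\mathbf{L}}_{{\mathbf{w}}_{\mathbf{23}}/{\omega}_{i}}\,\mathbf{w})$ as columns reduces the recovery to an ordinary least-squares problem.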

#### 2.3. Rotation Estimation and Scene Symmetry

## 3. Validation Results

#### 3.1. Simulation Results

- To show the validity of the proposed method for cameras obeying the unified model, two different camera models are used to compute the images. The first corresponds to a simulated fisheye camera with focal scaling factors ${F}_{x}={F}_{y}=960$ pixels, principal point coordinates ${u}_{x}=240$ and ${u}_{y}=320$ pixels, and distortion parameter $\xi =1.6$. The second corresponds to a conventional camera with focal scaling factors ${F}_{x}={F}_{y}=600$ pixels and principal point coordinates ${u}_{x}=240$ and ${u}_{y}=320$ pixels;
- To validate our approach for large rotational motions;
- To test the effect of translational motion on the accuracy of the estimated rotations.
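For reference, lifting an image point onto the unit sphere under the unified projection model (mirror/distortion parameter $\xi$) can be sketched as follows. The default intrinsics match the simulated fisheye camera above; the function name and the sample pixel are illustrative, not from the paper.

```python
import numpy as np

def lift_to_sphere(u, v, fx=960.0, fy=960.0, ux=240.0, uy=320.0, xi=1.6):
    """Back-project pixel (u, v) to a point on the unit sphere (unified model)."""
    # Normalized image-plane coordinates.
    x = (u - ux) / fx
    y = (v - uy) / fy
    r2 = x * x + y * y
    # Scale factor of the inverse unified projection:
    # solves ||(eta*x, eta*y, eta - xi)|| = 1 for eta.
    eta = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (1.0 + r2)
    return np.array([eta * x, eta * y, eta - xi])

Xs = lift_to_sphere(300.0, 400.0)
print(np.linalg.norm(Xs))  # ~1.0: the lifted point lies on the unit sphere
```

Setting $\xi = 0$ recovers the conventional perspective case, so the same lifting serves both simulated cameras.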

#### 3.2. Real Experiments

#### 3.3. Discussion

## 4. Conclusions

## Supplementary Materials

## Author Contributions

## Funding

## Conflicts of Interest

## Appendix A

## References


**Figure 1.** Results using a fisheye camera model: (**a**) the reference image and (**b**) an example of the rotated images.

**Figure 2.** Simulation results using a fisheye camera model: (**a**) estimation error of the rotation and (**b**) rotation ground truth expressed by Euler angles.

**Figure 3.** Results using a conventional camera model: (**a**) the reference image and (**b**) an example of the rotated images.

**Figure 4.** Simulation results using a perspective camera model: (**a**) estimation error of the rotation and (**b**) rotation ground truth expressed by Euler angles.

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Hadj-Abdelkader, H.; Tahri, O.; Benseddik, H.-E.
Rotation Estimation: A Closed-Form Solution Using Spherical Moments. *Sensors* **2019**, *19*, 4958.
https://doi.org/10.3390/s19224958
