Long-Distance GNSS-Denied Visual Inertial Navigation for Autonomous Fixed-Wing Unmanned Air Vehicles: SO(3) Manifold Filter Based on Virtual Vision Sensor
Abstract
1. Introduction and Outline
2. Mathematical Notation
3. GNSS-Denied Navigation and Visual Inertial Odometry
- The first possibility, known as filtering, consists of employing the visual estimations as additional observations with which to feed the inertial filter. In the case of attitude estimation exclusively (momentarily neglecting the platform position), the GNSS signals are helpful and enable the inertial navigation system to obtain more accurate and less noisy estimations, but they are not indispensable, as driftless attitude estimations can be obtained from the inertial measurement unit (IMU) without them [2]. Proven techniques for attitude estimation in all kinds of platforms that do not rely on GNSS signals include Gaussian filters [40], deterministic filters [41], complementary filters [42], and stochastic filters [43,44]. In the case of complete pose estimation (both attitude and position), it is indispensable to employ the velocity or incremental position observations obtained with VO methods in the absence of the absolute references provided by GNSS receivers. The literature includes cases based on Gaussian filters [45] and, more recently, nonlinear deterministic filters [46], stochastic filters [47], Riccati observers [48], and invariant extended Kalman filters (EKFs) [49].
- The alternative is to employ VO optimizations with the assistance of the inertial navigation estimations, reducing the pose estimation drift inherent to VO. This is known as visual inertial odometry (VIO) [50,51], which can also be combined with image registration to fully eliminate the remaining pose drift. A previous article by the authors, [1], describes how the nonlinear VO optimizations can be enhanced by adding priors based on the inertial attitude and altitude estimations. VIO has matured significantly in the last few years, with detailed reviews available in [50,51,52,53,54].
4. Novelty
- The use of a modified EKF scheme within the navigation filter (Section 8) based on Lie theory, which ensures that the estimated aircraft body attitude is propagated along its tangent space and never deviates from the SO(3) manifold, reducing the error growth inherent to concatenating estimations over a long period of time (a minimal sketch of this on-manifold propagation follows this list).
- The transformation within the VVS (Section 7) of the position estimations obtained by the visual system into incremental displacement measurements, which are unbiased and hence better suited for use as observations within an EKF.
- No information is discarded because the two solutions are not independent, as each simultaneously feeds and is fed by the other. The visual estimations depend on the inertial outputs, while the inertial filter uses the visual incremental displacement estimations as observations. Unlike loosely coupled solutions, the visual estimations for attitude and altitude are not allowed to deviate from the inertial ones above a certain threshold, so they do not drift.
- The two estimations are never fused together, as the inertial solution constitutes the final output, while the sole role of the visual outputs is to act as the previously mentioned incremental sensor.
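The sketch below illustrates the on-manifold propagation and error-injection idea behind the first item. It is a minimal illustration assuming a right (local) perturbation convention; the function names and structure are the author's assumptions for exposition, not the paper's actual implementation.

```python
# Minimal sketch of on-manifold SO(3) attitude propagation and error-state
# injection. Assumes a right (local) perturbation convention; names are
# illustrative, not taken from the paper's code.
import numpy as np

def skew(v):
    """Cross-product (hat) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def so3_exp(theta):
    """Exponential map: rotation vector -> rotation matrix (Rodrigues formula)."""
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.eye(3) + skew(theta)  # first-order approximation near zero
    K = skew(theta / angle)
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def propagate_attitude(R, omega_b, dt):
    """Propagate the body attitude along its tangent space: the increment
    omega_b * dt lives in the Lie algebra, so R never leaves SO(3)."""
    return R @ so3_exp(omega_b * dt)

def inject_error(R, delta_theta):
    """EKF reset step: fold the estimated local attitude error back into the
    nominal state, again without leaving the manifold."""
    return R @ so3_exp(delta_theta)

# Usage: one gyro sample rotates the attitude; no re-orthonormalization needed.
R = np.eye(3)
R = propagate_attitude(R, omega_b=np.array([0.0, 0.0, 0.02]), dt=0.01)
assert np.allclose(R.T @ R, np.eye(3), atol=1e-12)  # still a valid rotation
```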
5. Application
- Being restricted to aerial vehicles, it takes advantage of the extra sensors already present on board these platforms, such as magnetometers and a barometer.
- The fixed-wing limitation arises from the visual system's reliance on the pitot tube to continue navigating when overflying texture-poor terrain [1].
- As exemplified by the two scenarios employed to evaluate the algorithms, the proposed approach can cope with heavy turbulence, which increases the platform accelerations as well as the optical flow between consecutive images. Although not discussed in the results, the turbulence level has a negligible effect on the navigation accuracy as long as it remains below a sufficiently high threshold.
- With the exception of atmospheric pressure changes, which accumulate as vertical position estimation errors (Section 10.2), weather changes such as wind and atmospheric temperature variations have no effect on the navigation accuracy.
- It focuses on GNSS-denied environments of a different nature than those experienced by other platforms. It can always be assumed that GNSS signals are present at the beginning of the flight; if they disappear, the likely cause is a technical error or intentional action, so the vehicle needs to be capable of flying for long periods of time in GNSS-denied conditions.
- The reliance on visual navigation imposes certain restrictions, such as the need for sufficient illumination, the lack of cloud cover below the aircraft, and the impossibility of navigating over large bodies of water. The use of infrared cameras, although out of the scope of this article, is a promising research area for the first two restrictions, but the lack of static features makes visual systems inadequate for navigation over water.
6. Proposed Visual Inertial Navigation Architecture
- Given the operating frequencies of the different sensors listed in Table 1, most navigation filter cycles (99 out of 100 for GNSS-based navigation, 9 out of 10 in GNSS-denied conditions) cannot rely on the position and ground velocity observations provided by either the GNSS receiver or the VVS. These cycles successively apply the EKF time update, measurement update, and reset equations (Section 8), noting that the filter observation system needs to be simplified by removing the ground velocity and position observations, which are not available (a sketch of the three cycle types follows this list).
- GNSS-denied filter cycles for which the VVS position and ground velocity observations are available at the required time rely on the scheme shown in Figure 3. Note that this case only occurs in 1 out of 10 executions in GNSS-denied conditions. The first part of the process is exactly the same as in the previous case, estimating first the a priori state and covariance, followed by the a posteriori ones, and finally the estimated state. Note that as these estimations do not employ the VVS measurements (that is, they do not employ the latest image), they are preliminary and hence denoted with the subscript PRE to avoid confusion. The preliminary estimation together with the image is then passed to the visual system [1] to obtain a visual state, which, as explained in Section 7, is equivalent to the VVS position and ground velocity outputs. Only then is it possible to apply the complete navigation filter measurement update and reset equations for a second time to obtain the final state estimation.
- GNSS-based filter cycles for which the GNSS receiver position and ground velocity observations are available at the required time. This case, which only occurs in 1 out of 100 executions when GNSS signals are available, is in fact very similar to the first one above, with the only difference being that the navigation filter observation system does not need any simplifications.
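A compact sketch of this cycle scheduling follows. The 100 Hz filter rate, 10 Hz VVS images, and 1 Hz GNSS fixes are inferred from the 9-out-of-10 and 99-out-of-100 ratios quoted above; the `filt`, `vvs`, and `gnss` objects and their methods are hypothetical placeholders standing in for the filter equations of Section 8.

```python
# Hypothetical scheduling of the three navigation filter cycle types.
# Assumed rates (from the ratios in the text): filter 100 Hz, VVS 10 Hz, GNSS 1 Hz.
def run_cycle(k, gnss_denied, filt, vvs, gnss):
    x_pred, P_pred = filt.time_update()                    # always executed
    if gnss_denied and k % 10 == 0:                        # VVS cycle (1 in 10)
        # Preliminary (PRE) estimate without position/velocity observations,
        # which is then fed to the visual system together with the image.
        x_pre = filt.measurement_update(x_pred, P_pred, obs=None)
        obs = vvs.process_image(x_pre)                     # visual state -> VVS observations
        x = filt.measurement_update(x_pred, P_pred, obs=obs)  # second, complete update
    elif not gnss_denied and k % 100 == 0:                 # GNSS cycle (1 in 100)
        x = filt.measurement_update(x_pred, P_pred, obs=gnss.read())
    else:                                                  # most cycles: no pos/vel observations
        x = filt.measurement_update(x_pred, P_pred, obs=None)
    return filt.reset(x)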
7. Virtual Vision Sensor
- To obtain the VVS velocity observations, it is necessary to first compute the time derivative of the geodetic coordinates (longitude λ, latitude φ, altitude h) based on the difference between their values corresponding to the last two images (2), followed by their transformation into the ground velocity per (3), in which M and N represent the WGS84 radii of curvature of meridian and prime vertical, respectively. Note that this derivative is very noisy given how it is obtained (a numerical sketch of this computation follows this list).
- With respect to the VVS geodetic coordinates, the sensed longitude and latitude can be obtained per (5) and (6) by propagating the previous inertial estimations (those corresponding to the time of the previous VVS reading) with their visually obtained time derivatives. To avoid drift, the geometric altitude is estimated from the barometer observations, assuming that the atmospheric pressure offset remains frozen at its value from the time the GNSS signals are lost [2].
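The sketch below reproduces the computation behind (2) and (3) numerically: finite differences of the geodetic coordinates of the last two images, mapped to NED ground velocity through the WGS84 radii of curvature. Variable names and the 0.1 s image interval (consistent with Table 1) are assumptions for illustration.

```python
# Numerical sketch of Eqs. (2)-(3): geodetic finite differences between the
# last two images, mapped to NED ground velocity with the WGS84 radii of
# curvature M (meridian) and N (prime vertical). Angles in radians.
import numpy as np

WGS84_A = 6378137.0          # semi-major axis [m]
WGS84_E2 = 6.69437999014e-3  # first eccentricity squared

def radii_of_curvature(lat):
    s2 = np.sin(lat) ** 2
    N = WGS84_A / np.sqrt(1.0 - WGS84_E2 * s2)                      # prime vertical
    M = WGS84_A * (1.0 - WGS84_E2) / (1.0 - WGS84_E2 * s2) ** 1.5   # meridian
    return M, N

def vvs_ground_velocity(lon0, lat0, h0, lon1, lat1, h1, dt=0.1):
    """Map the geodetic increment between two consecutive images to NED velocity."""
    lon_dot = (lon1 - lon0) / dt   # Eq. (2): noisy finite differences
    lat_dot = (lat1 - lat0) / dt
    h_dot = (h1 - h0) / dt
    M, N = radii_of_curvature(lat1)
    v_north = lat_dot * (M + h1)                 # Eq. (3)
    v_east = lon_dot * (N + h1) * np.cos(lat1)
    v_down = -h_dot
    return np.array([v_north, v_east, v_down])
```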
8. Proposed Navigation Filter
8.1. Time Update Equations
- The geodetic coordinates kinematics (14);
8.2. Measurement Update Equations
8.3. Covariances
8.4. Reset Equations
- Coupled with the observation covariances discussed in Section 7, lower geodetic coordinate system noise values are employed when GNSS signals are available, as the objective is for the solution to avoid position jumps by smoothly following the state equations, updating the position only slightly based on the GNSS observations so as to prevent any long-term position drift.
- When the position observations are instead supplied by the VVS, higher system noise values are employed so the EKF relies more on the observations and less on the integration. Note that the VVS velocity observations are very noisy because of (2), so it is better for the EKF to closely adhere to the position observations originating at the visual system, correcting in each step as necessary (a sketch of this noise switching follows this list).
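A minimal sketch of this system-noise switching follows; the numeric covariance values are placeholders chosen only to show the ordering, not the paper's tuned parameters.

```python
# Placeholder process-noise switching for the geodetic position states:
# small Q when GNSS fixes are available (trust the integration, nudge toward
# the fixes), larger Q when the VVS supplies the observations (lean on the
# visual positions, whose derived velocities are noisy). Values hypothetical.
import numpy as np

Q_POS_GNSS = np.diag([1e-8, 1e-8, 1e-2])  # [rad^2/s, rad^2/s, m^2/s], placeholder
Q_POS_VVS  = np.diag([1e-6, 1e-6, 1e-0])  # larger => EKF adheres to observations

def position_process_noise(gnss_available: bool) -> np.ndarray:
    return Q_POS_GNSS if gnss_available else Q_POS_VVS
```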
9. Testing: High-Fidelity Simulation and Scenarios
- The first, represented by the yellow blocks on the right of Figure 4, models the physics of flight and the interaction between the aircraft and its surroundings, which results in the real aircraft trajectory.
- The second, represented by the green blocks on the left, contains the aircraft systems in charge of ensuring that the resulting trajectory adheres as much as possible to the mission objectives. It includes the different sensors, whose output comprises the sensed trajectory; the navigation system in charge of filtering it to obtain the estimated trajectory; the guidance system that converts the reference objectives into the control targets; and the control system that adjusts the position of the throttle and aerodynamic control surfaces so the estimated trajectory is as close as possible to the reference objectives. Table 1 lists the working frequencies of the various blocks represented in Figure 4.
- Scenario #1 has been defined with the objective of adequately representing the challenges faced by an autonomous fixed-wing UAV that suddenly cannot rely on GNSS and hence changes course to reach a predefined recovery location situated approximately 1 h of flight time away. In the process, in addition to executing an altitude and airspeed adjustment, the autonomous aircraft faces significant weather and wind field changes that make its GNSS-denied navigation even more challenging. With respect to the mission, the stochastic parameters include the initial airspeed, pressure altitude, and bearing; their final values; and the time at which each of the three maneuvers is initiated (turns are executed with a fixed bank angle, altitude changes employ a fixed aerodynamic path angle, and airspeed modifications are automatically executed by the control system as set-point changes). The GNSS signals are lost early in the scenario. The wind field is also defined stochastically: its two parameters (speed and bearing) are constant both at the beginning and conclusion of the scenario, with a linear transition in between, and the specific times at which the wind change starts and concludes also vary stochastically among the different simulation runs. A similar linear transition occurs with the temperature and pressure offsets that define the atmospheric properties, which are constant both at the start and end of the flight. The turbulence remains strong throughout the whole scenario, but its specific values also vary stochastically from one execution to the next.
- Scenario #2 represents the challenges involved in continuing with the original mission upon the loss of the GNSS signals, executing a series of continuous turn maneuvers over a relatively short period of time with no atmospheric or wind variations. As in scenario #1, the GNSS signals are lost early in the flight, but the scenario duration is shorter. The initial airspeed and pressure altitude are defined stochastically and do not change throughout the whole scenario; the bearing, however, changes a total of eight times between its initial and final values, with all intermediate bearing values as well as the time for each turn varying stochastically from one execution to the next. Although the same turbulence is employed as in scenario #1, the wind and atmospheric parameters remain constant throughout scenario #2 (an illustrative parameter sampler follows this list).
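The sketch below shows one way such a stochastic scenario definition could be drawn. Every distribution and numeric range here is a hypothetical stand-in; the paper's actual parameter values are not reproduced.

```python
# Hypothetical sampler for a scenario-#1-style stochastic definition.
# All ranges are illustrative placeholders, not the paper's values.
import random

def sample_scenario_1(seed=None):
    """Draw one hypothetical realization of the scenario parameters."""
    rng = random.Random(seed)
    return {
        "airspeed_initial_mps": rng.uniform(25.0, 35.0),             # placeholder range
        "pressure_altitude_initial_m": rng.uniform(2000.0, 3000.0),  # placeholder range
        "bearing_initial_deg": rng.uniform(0.0, 360.0),
        "wind_speed_mps": (rng.uniform(0.0, 10.0), rng.uniform(0.0, 10.0)),      # (initial, final)
        "wind_bearing_deg": (rng.uniform(0.0, 360.0), rng.uniform(0.0, 360.0)),  # (initial, final)
        "maneuver_start_times_s": sorted(rng.uniform(300.0, 3000.0) for _ in range(3)),
    }

print(sample_scenario_1(seed=1))  # reproducible draw for one simulation run
```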
10. Results: Navigation System Error in GNSS-Denied Conditions
- A standalone inertial system specifically designed to lower the GNSS-denied horizontal position drift, for which the results obtained with the same two scenarios are described in [2]. The attitude estimation error does not drift and is bounded by the quality of the onboard sensors, ensuring that the aircraft can remain aloft for as long as there is fuel available. The vertical position and ground velocity estimation errors are also bounded by atmospheric physics and do not drift; they depend on the atmospheric pressure offset and wind field changes that occur after the GNSS signals are lost. On the other hand, the horizontal position drifts as a consequence of integrating the ground velocity without absolute observations. A standalone inertial system is hence capable of successfully estimating four of the six degrees of freedom (attitude plus altitude).
- A visual navigation system (which relies exclusively on the images generated by an onboard camera) aided by the attitude and altitude estimated by the above inertial system, for which the results are described in [1]. This system slowly drifts in all six degrees of freedom. Although its attitude and altitude estimation capabilities are qualitatively inferior to those of the inertial system, its horizontal position error is just a fraction of what can be achieved without the use of images.
10.1. Body Attitude Estimation
10.2. Vertical Position Estimation
10.3. Horizontal Position Estimation
11. Summary and Conclusions
- The body attitude estimation is qualitatively similar to that obtained by a standalone inertial filter without any visual aid [2]. The bounded estimations enable the aircraft to remain aloft in GNSS-denied conditions for as long as it has fuel. Quantitatively, the VVS observations and the associated more accurate filter equations result in significant accuracy improvements over the baseline of [2].
- The vertical position estimation is qualitatively and quantitatively similar to that of the standalone inertial filter [2]. In addition to ionospheric effects (which also apply when GNSS signals are available), the altitude error depends on the amount of pressure offset variation since entering GNSS-denied conditions, being unbiased (zero mean) and bounded by atmospheric physics.
- The horizontal position estimation exhibits drastic quantitative improvements over the baseline standalone inertial filter [2], although from a qualitative point of view, the estimation error is not bounded as the drift cannot be fully eliminated.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
BRIEF | binary robust independent elementary features |
DSO | direct sparse odometry |
ECEF | Earth centered Earth fixed |
EKF | extended Kalman filter |
FAST | features from accelerated segment test |
GNSS | Global Navigation Satellite System |
IMU | inertial measurement unit |
iSAM | incremental smoothing and mapping |
LSD | large-scale direct |
MAV | micro air vehicle |
MSCKF | multistate constraint Kalman filter |
MSF | multisensor fusion |
NED | north east down |
NSE | navigation system error |
OKVIS | open keyframe visual inertial SLAM |
ORB | oriented FAST and rotated BRIEF |
ROVIO | robust visual inertial odometry |
SLAM | simultaneous localization and mapping |
SE(3) | special Euclidean group of ℝ³ |
SO(3) | special orthogonal group of ℝ³ |
SVO | semidirect visual odometry |
UAV | unmanned aerial vehicle |
VINS | visual inertial navigation system |
VIO | visual inertial odometry |
VO | visual odometry |
VVS | virtual vision sensor |
WGS84 | World Geodetic System 1984 |
Appendix A. Required Jacobians
- The time derivative of the geodetic coordinates (longitude, latitude, and altitude) depends on the ground velocity per (A1), where M and N represent the WGS84 ellipsoid radii of curvature of meridian and prime vertical, respectively. The Jacobian with respect to the ground velocity, given by (A2) and employed in (20), is hence straightforward.
- The motion angular velocity represents the rotation experienced by any object that moves without modifying its attitude with respect to the Earth surface. It is caused by the curvature of the Earth, and its expression is given by (A3). Its Jacobian with respect to the ground velocity, provided by (A4) and employed in (22) and (33), is also straightforward.
- The Coriolis acceleration is twice the cross product between the Earth angular velocity (caused by its rotation around its axis at a constant rate) and the aircraft velocity. Its expression is provided by (A5), resulting in the Jacobian (A6) with respect to the ground velocity, which appears in (22).
- The first Lie Jacobian represents the derivative of the concatenation between the attitude and its local perturbation, taken with respect to the attitude, when the increments are viewed in their respective local tangent spaces. Interested readers should refer to [4,5] for the derivation of (A7), which involves the direct cosine matrix corresponding to a given rotation vector. This Jacobian is employed in (48).
- The second Lie Jacobian represents the derivative of the rotation of a vector according to the attitude, taken with respect to the attitude, when the attitude increment is viewed in its local tangent space and the increment of the resulting vector is viewed in its Euclidean space. A companion Jacobian is the derivative of the same function with respect to the unrotated vector, in which the increments of the unrotated vector are also viewed in the Euclidean space. Refer to [4,5] for the derivation of (A8) and (A9), which appear in (21) and (23), respectively, and where the direct cosine matrix is that equivalent to the unit quaternion attitude (the sketch below verifies both Jacobians numerically).
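The following sketch numerically verifies the (A8) and (A9) Jacobians of the rotation action under the local-perturbation convention of [4,5]. The helper functions are an assumed minimal re-implementation for illustration, not code from the paper.

```python
# Finite-difference check of the Lie Jacobians of g(R, v) = R v under a
# local (right) attitude perturbation: d(Rv)/d(theta) = -R [v]x  (A8)
# and d(Rv)/dv = R  (A9). Helpers are a minimal assumed re-implementation.
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def so3_exp(theta):
    a = np.linalg.norm(theta)
    if a < 1e-12:
        return np.eye(3) + skew(theta)
    K = skew(theta / a)
    return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

rng = np.random.default_rng(0)
R = so3_exp(rng.normal(size=3))   # random attitude
v = rng.normal(size=3)            # random vector to rotate

J_theta = -R @ skew(v)            # (A8): local attitude perturbation Jacobian
J_v = R                           # (A9): Euclidean vector perturbation Jacobian

eps = 1e-7
for i in range(3):                # compare each analytic column with finite differences
    d = np.zeros(3); d[i] = eps
    col_theta = (R @ so3_exp(d) @ v - R @ v) / eps   # perturb attitude locally
    assert np.allclose(col_theta, J_theta[:, i], atol=1e-5)
    col_v = (R @ (v + d) - R @ v) / eps              # perturb the unrotated vector
    assert np.allclose(col_v, J_v[:, i], atol=1e-5)
```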
References
- Gallo, E.; Barrientos, A. GNSS-Denied Semi Direct Visual Navigation for Autonomous UAVs Aided by PI-Inspired Priors. Aerospace 2023, 10, 220.
- Gallo, E.; Barrientos, A. Reduction of GNSS-Denied Inertial Navigation Errors for Fixed Wing Autonomous Unmanned Air Vehicles. Aerosp. Sci. Technol. 2021, 120, 107237.
- Sola, J. Quaternion Kinematics for the Error-State Kalman Filter. arXiv 2017, arXiv:1711.02508.
- Sola, J.; Deray, J.; Atchuthan, D. A Micro Lie Theory for State Estimation in Robotics. arXiv 2018, arXiv:1812.01537.
- Gallo, E. The SO(3) and SE(3) Lie Algebras of Rigid Body Rotations and Motions and their Application to Discrete Integration, Gradient Descent Optimization, and State Estimation. arXiv 2023, arXiv:2205.12572.
- Hassanalian, M.; Abdelkefi, A. Classifications, Applications, and Design Challenges of Drones: A Review. Prog. Aerosp. Sci. 2017, 91, 99–131.
- Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634.
- Bijjahalli, S.; Sabatini, R.; Gardi, A. Advances in Intelligent and Autonomous Navigation Systems for Small UAS. Prog. Aerosp. Sci. 2020, 115, 100617.
- Farrell, J.A. Aided Navigation: GPS with High Rate Sensors; Electronic Engineering Series; McGraw-Hill: New York, NY, USA, 2008.
- Groves, P.D. Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems; GNSS Technology and Application Series; Artech House: Norwood, MA, USA, 2008.
- Chatfield, A.B. Fundamentals of High Accuracy Inertial Navigation; Progress in Astronautics and Aeronautics, Volume 174; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 1997.
- Elbanhawi, M.; Mohamed, A.; Clothier, R.; Palmer, J.; Simic, M.; Watkins, S. Enabling Technologies for Autonomous MAV Operations. Prog. Aerosp. Sci. 2017, 91, 27–52.
- Sabatini, R.; Moore, T.; Ramasamy, S. Global Navigation Satellite Systems Performance Analysis and Augmentation Strategies in Aviation. Prog. Aerosp. Sci. 2017, 95, 45–98.
- Tippitt, C.; Schultz, A.; Procino, W. Vehicle Navigation: Autonomy Through GPS-Enabled and GPS-Denied Environments; State of the Art Report DSIAC-2020-1328; Defense Systems Information Analysis Center: Belcamp, MD, USA, 2020.
- Gyagenda, N.; Hatilima, J.V.; Roth, H.; Zhmud, V. A Review of GNSS Independent UAV Navigation Techniques. Robot. Auton. Syst. 2022, 152, 104069.
- Kapoor, R.; Ramasamy, S.; Gardi, A.; Sabatini, R. UAV Navigation using Signals of Opportunity in Urban Environments: A Review. Energy Procedia 2017, 110, 377–383.
- Coluccia, A.; Ricciato, F.; Ricci, G. Positioning Based on Signals of Opportunity. IEEE Commun. Lett. 2014, 18, 356–359.
- Goh, S.T.; Abdelkhalik, O.; Zekavat, S.A. A Weighted Measurement Fusion Kalman Filter Implementation for UAV Navigation. Aerosp. Sci. Technol. 2013, 28, 315–323.
- Couturier, A.; Akhloufi, M.A. A Review on Absolute Visual Localization for UAV. Robot. Auton. Syst. 2020, 135, 103666.
- Goforth, H.; Lucey, S. GPS-Denied UAV Localization using Pre-Existing Satellite Imagery. In Proceedings of the IEEE International Conference on Robotics and Automation, Montreal, QC, Canada, 20–24 May 2019.
- Ziaei, N. Geolocation of an Aircraft using Image Registration Coupling Modes for Autonomous Navigation. arXiv 2019, arXiv:1909.02875.
- Wang, T. Augmented UAS Navigation in GPS Denied Terrain Environments Using Synthetic Vision. Master's Thesis, Iowa State University, Ames, IA, USA, 2018.
- Kinnari, J.; Verdoja, F.; Kyrki, V. Season-Invariant GNSS-Denied Visual Localization for UAVs. IEEE Robot. Autom. Lett. 2022, 7, 10232–10239.
- Ren, Y.; Wang, Z. A Novel Scene Matching Algorithm via Deep Learning for Vision-Based UAV Absolute Localization. In Proceedings of the International Conference on Machine Learning, Cloud Computing and Intelligent Mining, Xiamen, China, 5–7 August 2022.
- Liu, K.; He, X.; Mao, J.; Zhang, L.; Zhou, W.; Qu, H.; Luo, K. Map Aided Visual Inertial Integrated Navigation for Long Range UAVs. In Proceedings of the International Conference on Guidance, Navigation, and Control, Sopot, Poland, 12–16 June 2023.
- Zhang, Q.; Zhang, H.; Lan, Z.; Chen, W.; Zhang, Z. A DNN-Based Optical Aided Autonomous Navigation System for UAV Under GNSS-Denied Environment. In Proceedings of the International Conference on Autonomous Unmanned Systems, Warsaw, Poland, 6–9 June 2023.
- Jurevicius, R.; Marcinkevicius, V.; Seibokas, J. Robust GNSS-Denied Localization for UAV using Particle Filter and Visual Odometry. Mach. Vis. Appl. 2019, 30, 1181–1190.
- Scaramuzza, D.; Fraundorfer, F. Visual Odometry Part 1: The First 30 Years and Fundamentals. IEEE Robot. Autom. Mag. 2011, 18, 80–92.
- Fraundorfer, F.; Scaramuzza, D. Visual Odometry Part 2: Matching, Robustness, Optimization, and Applications. IEEE Robot. Autom. Mag. 2012, 19, 78–90.
- Scaramuzza, D. Tutorial on Visual Odometry; Robotics & Perception Group, University of Zurich: Zurich, Switzerland, 2012.
- Scaramuzza, D. Visual Odometry and SLAM: Past, Present, and the Robust Perception Age; Robotics & Perception Group, University of Zurich: Zurich, Switzerland, 2017.
- Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, Present, and Future of Simultaneous Localization and Mapping: Towards the Robust Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332.
- Forster, C.; Pizzoli, M.; Scaramuzza, D. SVO: Fast Semi-Direct Monocular Visual Odometry. In Proceedings of the IEEE International Conference on Robotics and Automation, Hong Kong, China, 31 May–7 June 2014.
- Forster, C.; Zhang, Z.; Gassner, M.; Werlberger, M.; Scaramuzza, D. SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems. IEEE Trans. Robot. 2016, 33, 249–265.
- Engel, J.; Koltun, V.; Cremers, D. Direct Sparse Odometry. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 611–625.
- Engel, J.; Schops, T.; Cremers, D. LSD-SLAM: Large Scale Direct Monocular SLAM. Eur. Conf. Comput. Vis. 2014, 834–849.
- Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163.
- Mur-Artal, R.; Tardos, J.D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras. IEEE Trans. Robot. 2017, 33, 1255–1262.
- Mur-Artal, R. Real-Time Accurate Visual SLAM with Place Recognition. Ph.D. Thesis, University of Zaragoza, Zaragoza, Spain, 2017. Available online: http://zaguan.unizar.es/record/60871 (accessed on 25 May 2023).
- Crassidis, J.L.; Markley, F.L. Unscented Filtering for Spacecraft Attitude Estimation. J. Guid. Control Dyn. 2003, 26, 536–542.
- Grip, H.F.; Fossen, T.I.; Johansen, T.A.; Saberi, A. Attitude Estimation Using Biased Gyro and Vector Measurements with Time Varying Reference Vectors. IEEE Trans. Autom. Control 2012, 57, 1332–1338.
- Kottah, R.; Narkhede, P.; Kumar, V.; Karar, V.; Poddar, S. Multiple Model Adaptive Complementary Filter for Attitude Estimation. Aerosp. Sci. Technol. 2017, 69, 574–581.
- Hashim, H.A. Systematic Convergence of Nonlinear Stochastic Estimators on the Special Orthogonal Group SO(3). Int. J. Robust Nonlinear Control 2020, 30, 3848–3870.
- Hashim, H.A.; Brown, L.J.; McIsaac, K. Nonlinear Stochastic Attitude Filters on the Special Orthogonal Group SO(3): Ito and Stratonovich. IEEE Trans. Syst. Man Cybern. 2019, 49, 1853–1865.
- Batista, P.; Silvestre, C.; Oliveira, P. On the Observability of Linear Motion Quantities in Navigation Systems. Syst. Control Lett. 2011, 60, 101–110.
- Hashim, H.A.; Brown, L.J.; McIsaac, K. Nonlinear Pose Filters on the Special Euclidean Group SE(3) with Guaranteed Transient and Steady State Performance. IEEE Trans. Syst. Man Cybern. 2019, 51, 2949–2962.
- Hashim, H.A. GPS Denied Navigation: Attitude, Position, Linear Velocity, and Gravity Estimation with Nonlinear Stochastic Observer. In Proceedings of the 2021 American Control Conference, Online, 25–28 May 2021.
- Hua, M.D.; Allibert, G. Riccati Observer Design for Pose, Linear Velocity, and Gravity Direction Estimation Using Landmark Position and IMU Measurements. In Proceedings of the IEEE Conference on Control Technology and Applications, Copenhagen, Denmark, 21–24 August 2018.
- Barrau, A.; Bonnabel, S. The Invariant Extended Kalman Filter as a Stable Observer. IEEE Trans. Autom. Control 2017, 62, 1797–1812.
- Scaramuzza, D.; Zhang, Z. Visual-Inertial Odometry of Aerial Robots. arXiv 2019, arXiv:1906.03289.
- Huang, G. Visual-Inertial Navigation: A Concise Review. arXiv 2019, arXiv:1906.02650.
- von Stumberg, L.; Usenko, V.; Cremers, D. Chapter 7—A Review and Quantitative Evaluation of Direct Visual Inertial Odometry. In Multimodal Scene Understanding; Yang, M.Y., Rosenhahn, B., Murino, V., Eds.; Academic Press: Cambridge, MA, USA, 2019.
- Feng, X.; Jiang, Y.; Yang, X.; Du, M.; Li, X. Computer Vision Algorithms and Hardware Implementations: A Survey. Integr. VLSI J. 2019, 69, 309–320.
- Al-Kaff, A.; Martin, D.; Garcia, F.; de la Escalera, A.; Maria, J. Survey of Computer Vision Algorithms and Applications for Unmanned Aerial Vehicles. Expert Syst. Appl. 2017, 92, 447–463.
- Mourikis, A.I.; Roumeliotis, S.I. A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007.
- Leutenegger, S.; Furgale, P.; Rabaud, V.; Chli, M.; Konolige, K.; Siegwart, R. Keyframe Based Visual Inertial SLAM Using Nonlinear Optimization. Robot. Sci. Syst. 2013.
- Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe Based Visual Inertial SLAM Using Nonlinear Optimization. Int. J. Robot. Res. 2015, 34, 314–334.
- Bloesch, M.; Omari, S.; Hutter, M.; Siegwart, R. Robust Visual Inertial Odometry Using a Direct EKF Based Approach. In Proceedings of the International Conference on Intelligent Robot Systems, Hamburg, Germany, 28 September–3 October 2015.
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
- Lynen, S.; Achtelik, M.W.; Weiss, S.; Chli, M.; Siegwart, R. A Robust and Modular Multi Sensor Fusion Approach Applied to MAV Navigation. In Proceedings of the International Conference on Intelligent Robot Systems, Tokyo, Japan, 3–7 November 2013.
- Faessler, M.; Fontana, F.; Forster, C.; Mueggler, E.; Pizzoli, M.; Scaramuzza, D. Autonomous, Vision Based Flight and Live Dense 3D Mapping with a Quadrotor Micro Aerial Vehicle. J. Field Robot. 2015, 33, 431–450.
- Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On Manifold Pre-Integration for Real Time Visual Inertial Odometry. IEEE Trans. Robot. 2017, 33, 1–21.
- Kaess, M.; Johannsson, H.; Roberts, R.; Ila, V.; Leonard, J.; Dellaert, F. iSAM2: Incremental Smoothing and Mapping Using the Bayes Tree. Int. J. Robot. Res. 2012, 31, 216–235.
- Burri, M.; Nikolic, J.; Gohl, P.; Schneider, T.; Rehder, J.; Omari, S.; Achtelik, M.W.; Siegwart, R. The EuRoC MAV Datasets. Int. J. Robot. Res. 2016, 35, 1157–1163.
- Delmerico, J.; Scaramuzza, D. A Benchmark Comparison of Monocular Visual-Inertial Odometry Algorithms for Flying Robots. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, Australia, 21–25 May 2018.
- Mur-Artal, R.; Montiel, J.M.M. Visual Inertial Monocular SLAM with Map Reuse. IEEE Robot. Autom. Lett. 2017, 2, 796–803.
- Clark, R.; Wang, S.; Wen, H.; Markham, A.; Trigoni, N. VINet: Visual-Inertial Odometry as a Sequence-to-Sequence Learning Problem. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017. Available online: https://ojs.aaai.org/index.php/AAAI/article/view/11215 (accessed on 25 May 2023).
- Paul, M.K.; Wu, K.; Hesch, J.A.; Nerurkar, E.D.; Roumeliotis, S.I. A Comparative Analysis of Tightly Coupled Monocular, Binocular, and Stereo VINS. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017.
- Song, Y.; Nuske, S.; Scherer, S. A Multi Sensor Fusion MAV State Estimation from Long Range Stereo, IMU, GPS, and Barometric Sensors. Sensors 2017, 17, 11.
- Solin, A.; Cortes, S.; Rahtu, E.; Kannala, J. PIVO: Probabilistic Inertial Visual Odometry for Occlusion Robust Navigation. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision, Lake Tahoe, NV, USA, 12–15 March 2018.
- Houben, S.; Quenzel, J.; Krombach, N.; Behnke, S. Efficient Multi Camera Visual Inertial SLAM for Micro Aerial Vehicles. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Daejeon, Republic of Korea, 9–14 October 2016.
- Eckenhoff, K.; Geneva, P.; Huang, G. Direct Visual Inertial Navigation with Analytical Preintegration. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017.
- Negru, S.A.; Geragersian, P.; Petrunin, I.; Zolotas, A.; Grech, R. GNSS/INS/VO Fusion using Gated Recurrent Unit in GNSS-Denied Environments. AIAA SciTech Forum 2023.
- Geragersian, P.; Petrunin, I.; Guo, W.; Grech, R. An INS/GNSS Fusion Architecture in GNSS-Denied Environment using Gated Recurrent Unit. AIAA SciTech Forum 2022.
- Strasdat, H.; Montiel, J.M.M.; Davison, A.J. Real Time Monocular SLAM: Why Filter? In Proceedings of the IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010.
- Gallego, G.; Delbruck, T.; Orchard, G.; Bartolozzi, C.; Taba, B.; Censi, A.; Leutenegger, S.; Davison, A.; Conradt, J.; Daniilidis, K.; et al. Event Based Cameras: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 154–180.
- Mueggler, E.; Gallego, G.; Rebecq, H.; Scaramuzza, D. Continuous Time Visual Inertial Odometry for Event Cameras. IEEE Trans. Robot. 2018, 34, 1425–1440.
- Simon, D. Optimal State Estimation; John Wiley & Sons: Hoboken, NJ, USA, 2006; ISBN 0-471-70858-5.
- Blanco, J.L. A Tutorial on SE(3) Transformation Parameterizations and On-Manifold Optimization. arXiv 2020, arXiv:2103.15980.
- Gallo, E. High Fidelity Flight Simulation for an Autonomous Low SWaP Fixed Wing UAV in GNSS-Denied Conditions. C++ Open Source Code. 2020. Available online: https://github.com/edugallogithub/gnssdenied_flight_simulation (accessed on 25 May 2023).
System | Frequency
---|---
Flight physics | 500 Hz
Sensors | 100 Hz
Inertial navigation | 100 Hz
Guidance and control | 50 Hz
Visual navigation and camera | 10 Hz
GNSS receiver | 1 Hz
GNSS-Denied | Inertial | Visual
---|---|---
Attitude | Bounded by sensor quality; yaw worse than pitch and bank | Drifts; yaw better than pitch and bank
Vertical position | Bounded by atmospheric physics | Drifts
Horizontal position | Drifts | Drifts
Scenario | | Inertial | Visual | Proposed
---|---|---|---|---
#1 | mean | 0.158 | 0.218 | 0.100
 | std | 0.114 | 0.103 | 0.059
 | max | 0.611 | 0.606 | 0.328
#2 | mean | 0.128 | 0.221 | 0.107
 | std | 0.078 | 0.137 | 0.068
 | max | 0.369 | 0.788 | 0.377
Scenario | | Inertial | Visual | Proposed
---|---|---|---|---
#1 | mean | −4.18 | +22.86 | −3.97
 | std | 25.78 | 49.17 | 26.12
 | max | −70.49 | +175.76 | −70.47
#2 | mean | +0.76 | +3.59 | +0.74
 | std | 7.55 | 13.01 | 7.60
 | max | −19.86 | +71.64 | −18.86
Scenario | | Distance [m] | Inertial [m] | Inertial [%] | Visual [m] | Visual [%] | Proposed [m] | Proposed [%]
---|---|---|---|---|---|---|---|---
#1 | mean | 107,873 | 7276 | 7.10 | 488 | 0.46 | 207 | 0.19
 | std | 19,756 | 4880 | 5.69 | 350 | 0.31 | 185 | 0.15
 | max | 172,842 | 25,288 | 32.38 | 1957 | 1.48 | 1257 | 1.09
#2 | mean | 14,198 | 216 | 1.52 | 33 | 0.23 | 18 | 0.13
 | std | 1176 | 119 | 0.86 | 26 | 0.18 | 9 | 0.07
 | max | 18,253 | 586 | 4.38 | 130 | 0.98 | 43 | 0.33