# DANAE^{++}: A Smart Approach for Denoising Underwater Attitude Estimation


## Abstract


## 1. Introduction

## 2. Related Works

## 3. Theoretical Notions and Method

#### 3.1. Orientation Estimation

- $\varphi $ represents the rotation around the X axis, known as roll;
- $\theta $ defines the rotation around the Y axis, i.e., the pitch angle;
- $\psi $ denotes the yaw angle, i.e., the rotation around the Z axis.
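The three Euler angles above can be composed into a single rotation matrix. The sketch below assumes the common ZYX (yaw-pitch-roll) composition order; the convention actually used in the paper may differ.

```python
import numpy as np

def euler_to_rotation(phi, theta, psi):
    """Rotation matrix from roll (phi), pitch (theta), yaw (psi) in radians.

    Assumes the ZYX convention: R = Rz(psi) @ Ry(theta) @ Rx(phi).
    """
    cphi, sphi = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cpsi, spsi = np.cos(psi), np.sin(psi)

    Rx = np.array([[1, 0, 0], [0, cphi, -sphi], [0, sphi, cphi]])  # roll about X
    Ry = np.array([[cth, 0, sth], [0, 1, 0], [-sth, 0, cth]])      # pitch about Y
    Rz = np.array([[cpsi, -spsi, 0], [spsi, cpsi, 0], [0, 0, 1]])  # yaw about Z
    return Rz @ Ry @ Rx
```

Any such composition yields an orthonormal matrix with determinant 1, which is a quick sanity check when debugging attitude code.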

#### 3.2. Sensors’ Characteristics

- Noise errors arising from the sensors’ noisy measurements;
- Bias errors caused by incorrect or missing calibration procedures;
- Filter errors due to an incorrect or missing filter tuning procedure.

#### 3.3. Kalman Filtering Techniques
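As a minimal illustration of the Kalman filtering principle, the sketch below implements a scalar filter for a random-walk state. It is not the paper's attitude filter (the LKF/EKF used there operate on full orientation states); `q` and `r` are assumed process and measurement variances.

```python
import numpy as np

def scalar_kf(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter for a random-walk state.

    q = process noise variance, r = measurement noise variance,
    x0/p0 = initial state estimate and its variance.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: random-walk state, variance grows
        k = p / (p + r)           # Kalman gain: trust in the new measurement
        x = x + k * (z - x)       # update with the measurement residual
        p = (1.0 - k) * p         # posterior variance shrinks
        estimates.append(x)
    return np.array(estimates)
```

With small `q` relative to `r`, the filter behaves like a slow exponential average, strongly smoothing the measurement noise.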

#### 3.4. Denoising Autoencoders

#### 3.5. U-Net Architecture

#### 3.6. DANAE${}^{++}$

## 4. Experimental Setup

#### 4.1. Datasets

#### 4.2. Experiments

#### 4.2.1. Kalman Filters Initialization

#### 4.2.2. DANAE${}^{++}$ Setting

## 5. Results

## 6. Conclusions

- In addition to the estimates provided by the filter, it takes as input the intermediate attitude values analytically derived inside the filter loop from the sensors’ measurements; this solution was shown to increase the accuracy of the final results;
- Unlike its previous implementation, it denoises the three orientation angles at the same time, reducing the overall processing time by ${\scriptstyle \sim}$66%.

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## Abbreviations

| Abbreviation | Definition |
|---|---|
| AHRS | Attitude and Heading Reference System |
| CNN | Convolutional Neural Networks |
| DAE | Denoising Auto-Encoders |
| DANAE | Denoising AutoeNcoder for Attitude Estimation |
| DOF | Degree of Freedom |
| DVL | Doppler Velocity Logger |
| ECG | Electrocardiogram |
| EKF | Extended Kalman Filter |
| ENU | East-North-Up |
| NED | North-East-Down |
| GPS | Global Positioning System |
| GT | Ground Truth |
| IMU | Inertial Measurement Unit |
| KF | Kalman Filter |
| LSTM | Long Short-Term Memory |
| MEMS | Micro Electro Mechanical Systems |
| OxIOD | Oxford Inertial Odometry Dataset |
| RMSE | Root Mean Square Error |
| RNN | Recurrent Neural Networks |
| UCSD | Underwater Caves Sonar Dataset |
| UKF | Unscented Kalman Filter |
| VAE | Variational Autoencoder |

## References


**Figure 1.** DANAE${}^{++}$ architecture: the dilated convolutions (green blocks) form the encoder part of the model, while the transposed and standard convolutions (red blocks) constitute the decoder part.

**Figure 2.** Workflow of the experiments: the upper section summarizes the training phase, while the bottom section represents the corresponding testing phase.

**Figure 3.** OxIO Dataset: $roll$ angle estimation provided by the LKF (top, light brown) and DANAE (bottom, light blue) compared to the GT (dark red). This experiment was performed on a subsection of the slow walking set.

**Figure 4.** UCS Dataset: $\theta$ angle estimation provided by the LKF (top, light brown) and DANAE (bottom, light blue) compared to the GT (dark red).

**Figure 5.** OxIO Dataset: $roll$ angle estimation provided by the EKF (top, light brown) and DANAE${}^{++}$ (bottom, light blue) compared to the GT (dark red). This experiment was performed on a subsection of the slow walking set.

**Figure 6.** UCS Dataset: $\theta$ angle estimation provided by the EKF (top, light brown) and DANAE${}^{++}$ (bottom, light blue) compared to the GT (dark red).

**Figure 7.** OxIO Dataset: $roll$ angle estimation provided by DANAE${}^{++}$ (light blue) compared to the Butterworth (top) and the uniform1d (bottom) filters applied to the EKF outputs (light brown). The GT (dark red) is reported as a reference in both images.

| XSens MTi | | ADIS16480 | |
|---|---|---|---|
| Angular resolution | 0.05 deg | Static accuracy (roll/pitch) | 0.1 deg |
| Repeatability | 0.2 deg | Static accuracy (heading) | 0.3 deg |
| Static accuracy (roll/pitch) | 0.5 deg | Dynamic accuracy (roll/pitch) | 0.3 deg |
| Static accuracy (heading) | 1 deg | Dynamic accuracy (heading) | 0.5 deg |
| Dynamic accuracy | 2 deg RMS | | |

**Table 2.** OxIO Dataset: evaluation of the performance of the LKF, DANAE and DANAE${}^{++}$ w.r.t. the GT for the three Euler angles.

| | LKF $\mathbf{\varphi}$ | LKF $\mathbf{\theta}$ | LKF $\mathbf{\psi}$ | DANAE $\mathbf{\varphi}$ | DANAE $\mathbf{\theta}$ | DANAE $\mathbf{\psi}$ | DANAE${}^{++}$ $\mathbf{\varphi}$ | DANAE${}^{++}$ $\mathbf{\theta}$ | DANAE${}^{++}$ $\mathbf{\psi}$ |
|---|---|---|---|---|---|---|---|---|---|
| Mean dev. [rad] | 0.0661 | 0.0483 | 1.9518 | 0.0224 | 0.0157 | 0.7392 | 0.0237 | 0.0157 | 0.5756 |
| Max dev. [rad] | 0.2929 | 0.2134 | 2.7313 | 0.1382 | 0.1082 | 0.4925 | 0.1396 | 0.1064 | 0.1285 |
| RMSE | 0.0815 | 0.0600 | 2.4000 | 0.0282 | 0.0196 | 1.3194 | 0.0296 | 0.0197 | 1.0014 |

**Table 3.** OxIO Dataset: evaluation of the performance of the EKF, DANAE and DANAE${}^{++}$ w.r.t. the GT for the three Euler angles.

| | EKF $\mathbf{\varphi}$ | EKF $\mathbf{\theta}$ | EKF $\mathbf{\psi}$ | DANAE $\mathbf{\varphi}$ | DANAE $\mathbf{\theta}$ | DANAE $\mathbf{\psi}$ | DANAE${}^{++}$ $\mathbf{\varphi}$ | DANAE${}^{++}$ $\mathbf{\theta}$ | DANAE${}^{++}$ $\mathbf{\psi}$ |
|---|---|---|---|---|---|---|---|---|---|
| Mean dev. [rad] | 0.0614 | 0.0485 | 0.4535 | 0.0216 | 0.0150 | 0.3636 | 0.0240 | 0.0149 | 0.2790 |
| Max dev. [rad] | 0.2724 | 0.2113 | 0.0189 | 0.1198 | 0.1100 | 0.7921 | 0.1632 | 0.1014 | 0.2482 |
| RMSE | 0.0762 | 0.0601 | 1.0478 | 0.0270 | 0.0187 | 0.8218 | 0.0301 | 0.0188 | 0.7860 |

**Table 4.** UCS Dataset: evaluation of the performance of the LKF, DANAE and DANAE${}^{++}$ w.r.t. the GT. Since the GT values of $\psi $ are not reliable, the corresponding results are not reported here.

| | LKF $\mathbf{\varphi}$ | LKF $\mathbf{\theta}$ | DANAE $\mathbf{\varphi}$ | DANAE $\mathbf{\theta}$ | DANAE${}^{++}$ $\mathbf{\varphi}$ | DANAE${}^{++}$ $\mathbf{\theta}$ |
|---|---|---|---|---|---|---|
| Mean dev. [rad] | 0.0326 | 0.0328 | 0.0139 | 0.0147 | 0.0127 | 0.0142 |
| Max dev. [rad] | 0.1476 | 0.1751 | 0.0671 | 0.0769 | 0.0616 | 0.0712 |
| RMSE | 0.0410 | 0.0412 | 0.0177 | 0.0190 | 0.0162 | 0.0184 |

**Table 5.** UCS Dataset: evaluation of the performance of the EKF, DANAE and DANAE${}^{++}$ w.r.t. the GT. Since the GT values of $\psi $ are not reliable, the corresponding results are not reported here.

| | EKF $\mathbf{\varphi}$ | EKF $\mathbf{\theta}$ | DANAE $\mathbf{\varphi}$ | DANAE $\mathbf{\theta}$ | DANAE${}^{++}$ $\mathbf{\varphi}$ | DANAE${}^{++}$ $\mathbf{\theta}$ |
|---|---|---|---|---|---|---|
| Mean dev. [rad] | 0.0249 | 0.0341 | 0.0125 | 0.0141 | 0.0126 | 0.0140 |
| Max dev. [rad] | 0.1382 | 0.1578 | 0.0807 | 0.0882 | 0.0616 | 0.0824 |
| RMSE | 0.0427 | 0.0412 | 0.0163 | 0.0180 | 0.0160 | 0.0179 |

**Table 6.** OxIO Dataset: evaluation of the performance of DANAE${}^{++}$, Butterworth and uniform1d filters w.r.t. the GT for the three Euler angles.

| | DANAE${}^{++}$ $\mathbf{\varphi}$ | DANAE${}^{++}$ $\mathbf{\theta}$ | DANAE${}^{++}$ $\mathbf{\psi}$ | Butterworth $\mathbf{\varphi}$ | Butterworth $\mathbf{\theta}$ | Butterworth $\mathbf{\psi}$ | Uniform1d $\mathbf{\varphi}$ | Uniform1d $\mathbf{\theta}$ | Uniform1d $\mathbf{\psi}$ |
|---|---|---|---|---|---|---|---|---|---|
| Mean dev. [rad] | 0.0240 | 0.0149 | 0.2790 | 0.0470 | 0.0204 | 1.0113 | 0.0535 | 0.0456 | 0.4537 |
| Max dev. [rad] | 0.1632 | 0.1014 | 0.2482 | 0.1488 | 0.1012 | 2.1764 | 0.1841 | 0.1880 | 0.0926 |
| RMSE | 0.0301 | 0.0188 | 0.7860 | 0.0547 | 0.0254 | 1.3377 | 0.0640 | 0.0561 | 1.0039 |
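The statistics reported in the tables above can be reproduced with a short sketch. The exact definitions are assumed here to be the mean and maximum absolute deviation from the GT (in radians) and the standard RMSE.

```python
import numpy as np

def table_metrics(gt, est):
    """Per-angle error statistics w.r.t. ground truth, as in Tables 2-6.

    Assumed definitions: mean/max absolute deviation and root mean
    square error between the estimated and GT angle series.
    """
    err = np.abs(np.asarray(est, dtype=float) - np.asarray(gt, dtype=float))
    return {
        "mean_dev": float(err.mean()),
        "max_dev": float(err.max()),
        "rmse": float(np.sqrt(np.mean(err ** 2))),
    }
```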

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Russo, P.; Di Ciaccio, F.; Troisi, S.
DANAE^{++}: A Smart Approach for Denoising Underwater Attitude Estimation. *Sensors* **2021**, *21*, 1526.
https://doi.org/10.3390/s21041526
