Generic Diagnostic Framework for Anomaly Detection—Application in Satellite and Spacecraft Systems
Abstract
1. Introduction
- A robust and adaptive framework for automatically creating anomaly detection models is presented.
- The framework is applied in three case studies, including benchmark datasets for satellite and spacecraft systems and a real-life satellite dataset provided by the European Space Agency (ESA).
2. Literature Review and Background
2.1. Anomaly Detection
- point anomalies, which are punctual occurrences of anomalous data with respect to the remaining data;
- contextual anomalies, which are instances that show anomalous behaviour in a specific context; e.g., instances with relatively larger/smaller values in their context but not globally; and
- collective anomalies, which are sets of related data instances (e.g., occurring within a specific time range) that are anomalous with respect to the entire dataset.
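A tiny synthetic example illustrates why the distinction matters for detector choice (the series, thresholds, and variable names below are made up for illustration): a global statistical test catches the point anomaly, while the collective anomaly only emerges once neighbouring points are aggregated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical telemetry channel: Gaussian noise around zero.
signal = rng.normal(loc=0.0, scale=1.0, size=200)
signal[50] = 8.0        # point anomaly: one extreme value
signal[120:130] += 3.0  # collective anomaly: a shifted run of samples

# A global z-score test flags the point anomaly ...
z = np.abs((signal - signal.mean()) / signal.std())
point_flags = np.where(z > 4.0)[0]

# ... but the collective anomaly only becomes clear on aggregated windows,
# since its individual samples largely stay within the global noise band.
window_means = np.convolve(signal, np.ones(10) / 10, mode="valid")
collective_flags = np.where(np.abs(window_means) > 2.0)[0]

print(point_flags)       # includes index 50
print(collective_flags)  # indices around the 120-129 run
```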
2.1.1. Taxonomy of Anomaly Detection Methods
- proximity-based methods, which rely on the definition of a distance/similarity function between two data instances;
- ensemble-based methods, which use ensembles of AI algorithms for anomaly detection;
- domain-based methods, which define boundaries or domains to separate normal data from anomalies; and
- reconstruction-based methods, which embed data in a lower dimension to separate normal instances from anomalous ones.
2.1.2. Thresholding
2.2. Adaptive Anomaly Detection Methods
2.3. Adaptive Anomaly Detection Methods for Space Applications
3. Methodology
3.1. Metrics for Anomaly Detection
3.2. The Generic Diagnostic Framework
3.2.1. Multi-Objective Genetic Algorithm
- A population is initialised, composed of a set of individuals (i.e., solutions to the optimisation problem).
- The best-fitted individuals are selected based on a fitness metric which represents the objective.
- The selected individuals then undergo cross-over and mutation to produce children for a new generation of individuals.
- This process is repeated over a number of generations until the algorithm converges or a stopping criterion is met.
Algorithm 1: Genetic Algorithm
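The steps above can be sketched as a minimal single-objective genetic algorithm. This is an illustrative toy on bit strings with a one-max fitness, not the multi-objective NSGA-II variant the framework actually uses; the function name and all parameter values are assumptions for the sketch.

```python
import random

def genetic_algorithm(fitness, n_genes=8, pop_size=20, generations=50,
                      mutation_rate=0.1, seed=0):
    """Toy GA maximising `fitness` over fixed-length bit strings."""
    rng = random.Random(seed)
    # 1. Initialise a population of candidate solutions (individuals).
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        # 2. Select the best-fitted individuals (truncation selection here,
        #    which also gives elitism: the best are carried over unchanged).
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        # 3. Cross-over and mutation produce children for the next generation.
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes)          # one-point cross-over
            child = a[:cut] + b[cut:]
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]                 # bit-flip mutation
            children.append(child)
        # 4. Repeat until the generation budget (stopping criterion) is spent.
        pop = parents + children
    return max(pop, key=fitness)

# Maximise the number of ones ("one-max") as a stand-in objective.
best = genetic_algorithm(fitness=sum)
print(best)
```

NSGA-II extends this loop with non-dominated sorting and crowding-distance selection so that a whole Pareto front, rather than a single best individual, is maintained.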
3.2.2. Data Pre-Processing
3.2.3. Anomaly Detection
- k-Nearest Neighbours (KNN), as presented in [39], which measures the distance between data points and classifies the points with the largest distances to their nearest neighbours as anomalous.
- Isolation Forests (iF), as introduced by [40], which build tree structures to isolate data points; points that can be isolated with few splits are considered anomalies.
- Principal Component Analysis (PCA), which performs a linear dimensionality reduction into a lower-dimensional space and scores instances by how poorly they are represented there (e.g., by their reconstruction error).
- One-Class Support Vector Machines (OC-SVM), which estimate the support of a high-dimensional distribution and thereby define non-linear boundaries around the region of normal data (separating the remaining points as anomalies).
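As a rough illustration of how these four detector families assign outlier scores, the sketch below uses scikit-learn stand-ins on synthetic data (the framework itself is built on other tooling such as PyOD; the data, planted outliers, and variable names are hypothetical):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 5))
X[:5] *= 10.0  # plant five far-away outliers (rows 0-4)

# Proximity: distance to the k-th nearest neighbour as the outlier score.
knn_scores = NearestNeighbors(n_neighbors=8).fit(X).kneighbors(X)[0][:, -1]

# Ensemble/isolation: shorter average isolation paths mean more anomalous
# (score_samples is higher for inliers, so negate it).
if_scores = -IsolationForest(n_estimators=100, random_state=0).fit(X).score_samples(X)

# Reconstruction: error after projecting onto two principal components.
pca = PCA(n_components=2).fit(X)
pca_scores = np.linalg.norm(X - pca.inverse_transform(pca.transform(X)), axis=1)

# Domain: signed distance to the learned boundary (negative = outside).
ocsvm = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(X)
ocsvm_scores = -ocsvm.decision_function(X)

for name, s in [("KNN", knn_scores), ("iF", if_scores),
                ("PCA", pca_scores), ("OC-SVM", ocsvm_scores)]:
    print(name, np.argsort(s)[-5:])  # indices of the highest-scoring points
```

All four produce a continuous score per instance, which is exactly why the thresholding step discussed next is needed to obtain binary anomaly labels.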
3.2.4. Thresholding
- the Area Under Curve Percentage (AUCP);
- the Clustering-based method (CLUST);
- the Median Absolute Deviation (MAD);
- the Modified Thompson Tau Test (MTT); and
- the Z-Score (Z-Score).
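Thresholding methods such as these turn the continuous outlier scores of a detector into binary labels. A minimal sketch of two of them, Z-Score and MAD, is given below (the constant 1.4826 rescales the MAD to a standard-deviation equivalent under normality; the example score vector is made up):

```python
import numpy as np

def zscore_threshold(scores, z=3.0):
    """Flag scores more than z standard deviations above the mean."""
    return scores > scores.mean() + z * scores.std()

def mad_threshold(scores, k=3.0):
    """Median Absolute Deviation: a robust alternative to the z-score."""
    med = np.median(scores)
    mad = 1.4826 * np.median(np.abs(scores - med))
    return scores > med + k * mad

scores = np.array([0.1, 0.2, 0.15, 0.18, 0.12, 0.9])
print(zscore_threshold(scores))  # misses the 0.9 outlier
print(mad_threshold(scores))     # flags the 0.9 outlier
```

On this toy vector the z-score rule misses the obvious outlier because the outlier itself inflates the mean and standard deviation it is compared against, whereas the robust MAD rule flags it; this sensitivity to the score distribution is one reason to search over thresholding methods rather than fix a single one.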
4. Case Studies and Results
4.1. Application of the GDF to the Datasets
4.2. SMAP Dataset
4.2.1. Resulting Pareto Front Compared against the Baseline
4.2.2. Comparing Multi-Objective Optimisation with Single-Objective Optimisation
- When optimising towards the F1 score, the best individual has the following settings: normalisation, KNN, and a fixed threshold of 0.04, with an F1 score of 0.249.
- When optimising towards the F1pa score, the best individual has the following settings: normalisation, KNN, and Z-Score thresholding, with an F1pa score of 0.676.
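F1pa commonly denotes the point-adjusted F1 used with these spacecraft benchmarks: if any point inside a ground-truth anomaly segment is detected, the whole segment counts as detected, which is why F1pa is systematically higher than the plain F1. A sketch, assuming that convention (arrays and names are illustrative):

```python
import numpy as np

def point_adjust(pred, label):
    """Mark a whole ground-truth anomaly segment as detected
    if any point inside it was predicted (point-adjust convention)."""
    pred = pred.copy()
    in_seg, start = False, 0
    for i in range(len(label)):
        if label[i] == 1 and not in_seg:
            in_seg, start = True, i
        if in_seg and (label[i] == 0 or i == len(label) - 1):
            end = i if label[i] == 0 else i + 1
            if pred[start:end].any():
                pred[start:end] = 1
            in_seg = False
    return pred

def f1(pred, label):
    tp = np.sum((pred == 1) & (label == 1))
    fp = np.sum((pred == 1) & (label == 0))
    fn = np.sum((pred == 0) & (label == 1))
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

label = np.array([0, 1, 1, 1, 0, 0, 1, 1, 0])   # two true segments
pred  = np.array([0, 0, 1, 0, 0, 0, 0, 0, 0])   # one hit in the first segment

print(f1(pred, label))                       # plain F1: 1 TP, 4 FN -> 0.333
print(f1(point_adjust(pred, label), label))  # point-adjusted F1 -> 0.75
```

The single hit inside the first segment is expanded to the whole segment, so the point-adjusted score rewards detecting an event at all, while plain F1 rewards localising every anomalous point.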
4.2.3. The Effect of Including Thresholding Methods
4.3. MSL Dataset
4.3.1. Resulting Pareto Front Compared against the Baseline
4.3.2. Comparing Multi-Objective Optimisation with Single-Objective Optimisation
- When optimising towards an F1 score, the best individual has the following settings: normalisation, KNN, and MAD, with an F1 score of 0.259.
- When optimising towards an F1pa score, the best individual has the following settings: normalisation, PCA, and AUCP, with an F1pa score of 0.734.
4.3.3. The Effect of Including Thresholding Methods
4.4. Satellite Reaction Wheel Dataset
4.4.1. Resulting Pareto Front Compared against the Baseline
4.4.2. Comparing Multi-Objective Optimisation with Single-Objective Optimisation
- When optimising towards the F1 score, the best individual has the following settings: normalisation, KNN, and a fixed threshold of 0.14, with an F1 score of 0.621.
- When optimising towards the F1pa score, the best individual has the following settings: normalisation, KNN, and MAD thresholding, with an F1pa score of 1.0.
4.4.3. The Effect of Including Thresholding Methods
4.5. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Chen, J.; Pi, D.; Wu, Z.; Zhao, X.; Pan, Y.; Zhang, Q. Imbalanced satellite telemetry data anomaly detection model based on Bayesian LSTM. Acta Astronaut. 2021, 180, 232–242.
- Fuertes, S.; Picart, G.; Tourneret, J.Y.; Chaari, L.; Ferrari, A.; Richard, C. Improving spacecraft health monitoring with automatic anomaly detection techniques. In Proceedings of the 14th International Conference on Space Operations, Daejeon, Republic of Korea, 16–20 May 2016; pp. 1–16.
- Hundman, K.; Constantinou, V.; Laporte, C.; Colwell, I.; Soderstrom, T. Detecting spacecraft anomalies using LSTMs and nonparametric dynamic thresholding. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, London, UK, 19–23 August 2018; pp. 387–395.
- Zio, E. Prognostics and Health Management (PHM): Where are we and where do we (need to) go in theory and practice. Reliab. Eng. Syst. Saf. 2022, 218, 108119.
- Basora, L.; Olive, X.; Dubot, T. Recent advances in anomaly detection methods applied to aviation. Aerospace 2019, 6, 117.
- Zeng, Z.; Jin, G.; Xu, C.; Chen, S.; Zhang, L. Spacecraft Telemetry Anomaly Detection Based on Parametric Causality and Double-Criteria Drift Streaming Peaks over Threshold. Appl. Sci. 2022, 12, 1803.
- Chen, Z.; Zhou, D.; Zio, E.; Xia, T.; Pan, E. Adaptive transfer learning for multimode process monitoring and unsupervised anomaly detection in steam turbines. Reliab. Eng. Syst. Saf. 2023, 234, 109162.
- Shao, K.; He, Y.; Xing, Z.; Du, B. Detecting wind turbine anomalies using nonlinear dynamic parameters-assisted machine learning with normal samples. Reliab. Eng. Syst. Saf. 2023, 233, 109092.
- Fink, O.; Wang, Q.; Svensén, M.; Dersin, P.; Lee, W.J.; Ducoffe, M. Potential, challenges and future directions for deep learning in prognostics and health management applications. Eng. Appl. Artif. Intell. 2020, 92, 103678.
- Jardine, A.K.S.; Lin, D.; Banjevic, D. A review on machinery diagnostics and prognostics implementing condition-based maintenance. Mech. Syst. Signal Process. 2006, 20, 1483–1510.
- Choi, K.; Yi, J.; Park, C.; Yoon, S. Deep learning for anomaly detection in time-series data: Review, analysis, and guidelines. IEEE Access 2021, 9, 120043–120065.
- Nassif, A.B.; Talib, M.A.; Nasir, Q.; Dakalbab, F.M. Machine Learning for Anomaly Detection: A Systematic Review. IEEE Access 2021, 9, 78658–78700.
- Chandola, V.; Banerjee, A.; Kumar, V. Anomaly Detection: A Survey. ACM J. 2009, 41, 1–58.
- Khan, S.; Tsutsumi, S.; Yairi, T.; Nakasuka, S. Robustness of AI-based prognostic and systems health management. Annu. Rev. Control 2021, 51, 130–152.
- Basora, L.; Bry, P.; Olive, X.; Freeman, F. Aircraft Fleet Health Monitoring using Anomaly Detection Techniques. Aerospace 2021, 8, 103.
- Ren, K.; Yang, H.; Zhao, Y.; Chen, W.; Xue, M.; Miao, H.; Huang, S.; Liu, J. A Robust AUC maximization framework with simultaneous outlier detection and feature selection for positive-unlabeled classification. IEEE Trans. Neural Netw. Learn. Syst. 2019, 30, 3072–3083.
- Archana, N.; Pawar, S.S. Periodicity Detection of Outlier Sequences Using Constraint Based Pattern Tree with MAD. arXiv 2015, arXiv:1507.01685.
- Rengasamy, D.; Rothwell, B.C.; Figueredo, G.P. Towards a more reliable interpretation of machine learning outputs for safety-critical systems using feature importance fusion. Appl. Sci. 2021, 11, 1854.
- Xiao, Z.; Yan, Q.; Amit, Y. Likelihood regret: An out-of-distribution detection score for variational auto-encoder. Adv. Neural Inf. Process. Syst. 2020, 33, 20685–20696.
- Bagdonavičius, V.; Petkevičius, L. Multiple outlier detection tests for parametric models. Mathematics 2020, 8, 2156.
- Klawonn, F.; Rehm, F. Cluster Analysis for Outlier Detection. In Encyclopedia of Data Warehousing and Mining, 2nd ed.; IGI Global: Hershey, PA, USA, 2011; pp. 2006–2008.
- Zhao, C.; Shen, W. Adaptive open set domain generalization network: Learning to diagnose unknown faults under unknown working conditions. Reliab. Eng. Syst. Saf. 2022, 226, 108672.
- Alam, M.R.; Gerostathopoulos, I.; Prehofer, C.; Attanasi, A.; Bures, T. A framework for tunable anomaly detection. In Proceedings of the 2019 IEEE International Conference on Software Architecture, ICSA 2019, Hamburg, Germany, 25–29 March 2019; pp. 201–210.
- Calikus, E.; Nowaczyk, S.; Sant’Anna, A.; Dikmen, O. No free lunch but a cheaper supper: A general framework for streaming anomaly detection. Expert Syst. Appl. 2020, 155, 113453.
- Akiba, T.; Sano, S.; Yanase, T.; Ohta, T.; Koyama, M. Optuna: A Next-generation Hyperparameter Optimization Framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019.
- O’Meara, C.; Schlag, L.; Faltenbacher, L.; Wickler, M. ATHMoS: Automated telemetry health monitoring system at GSOC using outlier detection and supervised machine learning. In Proceedings of the SpaceOps 2016 Conference, Daejeon, Republic of Korea, 16–20 May 2016; pp. 1–17.
- O’Meara, C.; Schlag, L.; Wickler, M. Applications of deep learning neural networks to satellite telemetry monitoring. In Proceedings of the 15th International Conference on Space Operations, Marseille, France, 28 May–1 June 2018; pp. 1–16.
- Sun, W.; Paiva, A.R.C.; Xu, P.; Sundaram, A.; Braatz, R.D. Fault Detection and Identification using Bayesian Recurrent Neural Networks. Comput. Chem. Eng. 2019, 141, 106991.
- Freeman, C.; Merriman, J.; Beaver, I.; Mueen, A. Experimental Comparison and Survey of Twelve Time Series Anomaly Detection Algorithms (Extended Abstract). In Proceedings of the IJCAI International Joint Conference on Artificial Intelligence, Vienna, Austria, 23–29 July 2022; Volume 72, pp. 5737–5741.
- Bieber, M.; Verhagen, W.J. A Generic Framework for Prognostics of Complex Systems. Aerospace 2022, 9, 839.
- Kim, G.Y.; Lim, S.M.; Euom, I.C. A Study on Performance Metrics for Anomaly Detection Based on Industrial Control System Operation Data. Electronics 2022, 11, 1213.
- Kim, S.; Choi, K.; Choi, H.S.; Lee, B.; Yoon, S. Towards a Rigorous Evaluation of Time-Series Anomaly Detection. In Proceedings of the AAAI Conference on Artificial Intelligence, Online, 22 February–1 March 2022; Volume 36, pp. 7194–7201.
- Garg, A.; Zhang, W.; Samaran, J.; Savitha, R.; Foo, C.S. An evaluation of anomaly detection and diagnosis in multivariate time series. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 2508–2517.
- Holland, J.H. Adaptation in Natural and Artificial Systems; MIT Press: Cambridge, MA, USA, 1992.
- Stanovov, V.; Brester, C.; Kolehmainen, M.; Semenkina, O. Why don’t you use Evolutionary Algorithms in Big Data? IOP Conf. Ser. Mater. Sci. Eng. 2017, 173, 012020.
- Konak, A.; Coit, D.W.; Smith, A.E. Multi-objective optimization using genetic algorithms: A tutorial. Reliab. Eng. Syst. Saf. 2006, 91, 992–1007.
- Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197.
- Liu, R.; Yang, B.; Zio, E.; Chen, X. Artificial intelligence for fault diagnosis of rotating machinery: A review. Mech. Syst. Signal Process. 2018, 108, 33–47.
- Angiulli, F.; Pizzuti, C. Fast outlier detection in high dimensional spaces. In Proceedings of the Principles of Data Mining and Knowledge Discovery: 6th European Conference, PKDD 2002, Helsinki, Finland, 19–23 August 2002; Volume 2431, pp. 15–27.
- Liu, F.T.; Ting, K.M.; Zhou, Z.H. Isolation Forest. In Proceedings of the Eighth IEEE International Conference on Data Mining, Pisa, Italy, 15–19 December 2008; pp. 413–422.
- Zhao, Y.; Nasrullah, Z.; Li, Z. PyOD: A python toolbox for scalable outlier detection. J. Mach. Learn. Res. 2019, 20, 1–7.
- Lara, J.A.; Lizcano, D.; Rampérez, V.; Soriano, J. A method for outlier detection based on cluster analysis and visual expert criteria. Expert Syst. 2020, 37, e12473.
- Sonneveld, B. Using the Mollifier Method to Characterize Datasets and Models: The Case of the Universal Soil Loss Equation; Technical Report; ITC: Kaunas, Lithuania, 1997.
- Challu, C.; Jiang, P.; Wu, Y.N.; Callot, L. Deep Generative model with Hierarchical Latent Factors for Time Series Anomaly Detection. In Proceedings of the 25th International Conference on Artificial Intelligence and Statistics, Virtual, 28–30 March 2022; Volume 151.
Method | Hyperparameter | Description | Tested Values |
---|---|---|---|
Isolation Forest | max_samples | Number of samples to draw from X to train each base estimator (size of each tree) | 100, 300, 500, 700 |
 | n_estimators | Number of trees in the ensemble (default is 100 trees) | 100, 200, 300, 400, 500 |
 | max_features | Number of features to draw from X to train each base estimator (default value is 1.0) | 5, 10, 15 |
KNN | n_neighbors | Number of neighbours to use for k-neighbours queries | 1, 4, 8, 12, 16 |
 | p | Parameter for the Minkowski metric | 1, 2, 3 |
 | method | How the outlier score is derived from the k-neighbour distances (distance to the k-th neighbour, or the mean/median of all k distances) | ‘largest’, ‘mean’, ‘median’ |
 | algorithm | Algorithm used to compute the nearest neighbours | ‘auto’, ‘ball_tree’, ‘kd_tree’ |
PCA | n_components | Number of components to keep | np.arange(1, 20, 2) |
OC-SVM | kernel | Kernel type to be used in the algorithm | ‘rbf’, ‘poly’, ‘sigmoid’, ‘linear’ |
 | nu | Upper bound on the fraction of training errors and a lower bound on the fraction of support vectors | 0.1, 1, 10, 100, 1000 |
 | gamma | Kernel coefficient for ‘rbf’, ‘poly’ and ‘sigmoid’ | np.arange(0, 1, 0.2) |
Algorithm | Hyperparameter | Chosen Value |
---|---|---|
PCA | n_components | 5 |
iF | n_estimators | 100 |
max_samples | 100 | |
max_features | 10 | |
KNN | n_neighbors | 13 |
p | 1 | |
method | ‘median’ | |
algorithm | ‘auto’ | |
OC-SVM | nu | 0.1 |
gamma | 0.6 | |
kernel | ‘sigmoid’ |
Settings | F1 | F1pa | FC |
---|---|---|---|
Normalization KNN MAD | 0.213 | 0.588 | 0.319 |
Normalization KNN 0.04 | 0.249 | 0.582 | 0.34 |
Normalization KNN ZSCORE | 0.19 | 0.676 | 0.364 |
Standardization KNN MAD | 0.21 | 0.598 | 0.317 |
Standardization KNN 0.04 | 0.249 | 0.582 | 0.34 |
Algorithm | F1 | F1pa | FC |
---|---|---|---|
OC-SVM | 0.183 | 0.565 | 0.276 |
KNN | 0.239 | 0.427 | 0.301 |
iF | 0.095 | 0.457 | 0.175 |
PCA | 0.0 | 0.0 | 0.0 |
Baseline | GDF | |
---|---|---|
Settings | KNN | KNN 0.04 |
F1 | 0.239 | 0.249 |
Settings | OC-SVM | Normalization KNN ZSCORE |
F1pa | 0.565 | 0.676 |
Settings | OC-SVM | Normalization KNN ZSCORE |
FC | 0.276 | 0.364 |
Settings | F1 | F1pa | FC |
---|---|---|---|
Normalization PCA | 0.136 | 0.49 | 0.221 |
Normalization KNN | 0.242 | 0.427 | 0.302 |
Standardization KNN | 0.242 | 0.427 | 0.302 |
Standardization OC-SVM | 0.103 | 0.522 | 0.206 |
GDF No Thresholding | GDF Incl Thresholding | |
---|---|---|
Settings | Standardization/normalization KNN | KNN 0.04 |
F1 | 0.242 | 0.249 |
Settings | Standardization OC-SVM | Normalization KNN ZSCORE |
F1pa | 0.522 | 0.676 |
Settings | Standardization/normalization KNN | Normalization KNN ZSCORE |
FC | 0.302 | 0.364 |
Algo | Hyperparam | Chosen Value |
---|---|---|
PCA | n_components | 1 |
iF | n_estimators | 500 |
max_samples | 100 | |
max_features | 5 | |
KNN | n_neighbors | 13 |
p | 1 | |
method | ‘largest’ | |
algorithm | ‘auto’ | |
OC-SVM | nu | 0.1 |
gamma | 0 | |
kernel | ‘linear’ |
Settings | F1 | F1pa | FC |
---|---|---|---|
normalization PCA AUCP | 0.107 | 0.734 | 0.313 |
Normalization iF AUCP | 0.184 | 0.626 | 0.306 |
Normalization iF MAD | 0.184 | 0.626 | 0.306 |
Normalization iF 0.08 | 0.184 | 0.626 | 0.306 |
Normalization KNN CLUST | 0.233 | 0.620 | 0.36 |
Normalization KNN ZSCORE | 0.249 | 0.553 | 0.346 |
Normalization KNN MAD | 0.259 | 0.524 | 0.338 |
Algorithm | F1 | F1pa | FC |
---|---|---|---|
OC-SVM | 0.208 | 0.53 | 0.324 |
KNN | 0.251 | 0.488 | 0.324 |
iF | 0.144 | 0.559 | 0.238 |
PCA | 0.166 | 0.554 | 0.261 |
Baseline | GDF | |
---|---|---|
Settings | KNN | Normalization KNN MAD |
F1 | 0.251 | 0.259 |
Settings | iF | Normalization PCA AUCP |
F1pa | 0.559 | 0.734 |
Settings | OC-SVM and KNN | Normalization KNN CLUST/MAD/ZSCORE |
FC | 0.324 | 0.36 |
Settings | F1 | F1pa | FC |
---|---|---|---|
Normalization iF | 0.184 | 0.597 | 0.283 |
Normalization KNN | 0.255 | 0.503 | 0.325 |
Standardization iF | 0.181 | 0.587 | 0.292 |
GDF No Thresholding | GDF Incl Thresholding | |
---|---|---|
Settings | Normalization KNN | Normalization KNN MAD |
F1 | 0.255 | 0.259 |
Settings | Normalization iF | Normalization PCA AUCP |
F1pa | 0.597 | 0.734 |
Settings | Normalization KNN | Normalization KNN CLUST/MAD/ZSCORE |
FC | 0.325 | 0.36 |
Algorithm | Hyperparameter | Chosen Value |
---|---|---|
PCA | n_components | 1 |
iF | n_estimators | 100 |
max_samples | 400 | |
max_features | 10 | |
KNN | n_neighbors | 5 |
p | 1 | |
method | ‘mean’ | |
algorithm | ‘auto’ | |
OC-SVM | nu | 0.1 |
gamma | 0.8 | |
kernel | ‘rbf’ |
Settings | F1 | F1pa | FC |
---|---|---|---|
Normalization PCA 0.02 | 0.459 | 0.971 | 0.841 |
Normalization PCA 0.04 | 0.489 | 0.949 | 0.8 |
Normalization PCA ZSCORE | 0.113 | 0.983 | 0.903 |
Normalization iF 0.02 | 0.476 | 0.939 | 0.817 |
Normalization KNN MAD | 0.031 | 1.0 | 1.0 |
Normalization KNN ZSCORE | 0.079 | 0.983 | 0.921 |
Normalization KNN 0.06 | 0.607 | 0.839 | 0.794 |
Normalization KNN 0.08 | 0.616 | 0.827 | 0.78 |
Normalization KNN 0.14 | 0.621 | 0.791 | 0.741 |
Standardization PCA 0.04 | 0.489 | 0.949 | 0.8 |
Standardization PCA ZSCORE | 0.113 | 0.983 | 0.903 |
Standardization KNN 0.06 | 0.607 | 0.837 | 0.79 |
Standardization KNN 0.08 | 0.617 | 0.826 | 0.776 |
Standardization KNN 0.12 | 0.619 | 0.804 | 0.754 |
Standardization KNN 0.18 | 0.623 | 0.77 | 0.724 |
Standardization KNN ZSCORE | 0.059 | 1.0 | 0.994 |
Standardization OC-SVM MAD | 0.531 | 0.933 | 0.897 |
Standardization OC-SVM CLUST | 0.601 | 0.907 | 0.841 |
Algorithm | F1 | F1pa | FC |
---|---|---|---|
OC-SVM | 0.54 | 0.849 | 0.736 |
KNN | 0.613 | 0.814 | 0.766 |
iF | 0.539 | 0.841 | 0.737 |
PCA | 0.336 | 0.597 | 0.499 |
Baseline | GDF | |
---|---|---|
Settings | KNN | Standardization KNN 0.18 |
F1 | 0.613 | 0.623 |
Settings | OC-SVM | KNN MAD/ZSCORE |
F1pa | 0.849 | 1.0 |
Settings | KNN | KNN MAD |
FC | 0.766 | 1.0 |
Settings | F1 | F1pa | FC |
---|---|---|---|
Normalization PCA | 0.536 | 0.879 | 0.744 |
Normalization iF | 0.547 | 0.839 | 0.747 |
Normalization KNN | 0.617 | 0.816 | 0.769 |
Normalization OC-SVM | 0.54 | 0.841 | 0.74 |
Standardization PCA | 0.536 | 0.879 | 0.744 |
Standardization iF | 0.547 | 0.839 | 0.747 |
GDF No Thresholding | GDF Incl Thresholding | |
---|---|---|
Settings | Normalization KNN | Standardization KNN 0.18 |
F1 | 0.617 | 0.623 |
Settings | PCA | KNN MAD/ZSCORE |
F1pa | 0.879 | 1.0 |
Settings | Normalization KNN | KNN MAD |
FC | 0.768 | 1.0 |
Share and Cite
Bieber, M.; Verhagen, W.J.C.; Cosson, F.; Santos, B.F. Generic Diagnostic Framework for Anomaly Detection—Application in Satellite and Spacecraft Systems. Aerospace 2023, 10, 673. https://doi.org/10.3390/aerospace10080673