# Deep Generative Models-Based Anomaly Detection for Spacecraft Control Systems


## Abstract


## 1. Introduction

## 2. Problem and Data Description

#### 2.1. Problem Description

#### 2.2. LUNASIM

#### 2.3. Data Schema

## 3. Preliminaries

#### 3.1. Variational Autoencoder (VAE)

#### 3.2. GANomaly

#### 3.3. Bayesian Optimization

Expected improvement (EI) balances the best function value f(p^+) = max_i f(p_i) among the points (f(p_1), f(p_2), …, f(p_n)) investigated so far against the difference between a candidate's function value and f(p^+). The expression of EI when using GP is summarized as follows:

EI(p) = (μ(p) − f(p^+)) Φ(Z) + σ(p) φ(Z), with Z = (μ(p) − f(p^+)) / σ(p) when σ(p) > 0, and EI(p) = 0 when σ(p) = 0,

where μ(p) and σ(p) are the GP posterior mean and standard deviation at p, and Φ and φ denote the standard normal cumulative distribution and density functions, respectively.
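As a dependency-free illustration, the closed-form EI for a candidate point with GP posterior mean and standard deviation can be written in a few lines (a sketch; the function and variable names are ours, not from the paper):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """EI (maximization form) at a candidate point, given the GP posterior
    mean `mu`, posterior standard deviation `sigma`, and the best observed
    objective value `f_best`."""
    if sigma <= 0.0:
        return 0.0  # no posterior uncertainty: no expected improvement
    z = (mu - f_best) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # phi(z)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # Phi(z)
    return (mu - f_best) * cdf + sigma * pdf
```

At μ(p) = f(p^+) the EI reduces to σ(p)·φ(0), so candidates with large posterior uncertainty are still explored.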

## 4. Method

#### 4.1. Data Preprocessing

Each variable x was scaled to the range [0, 1] by min–max normalization, x′ = (x − x_min)/(x_max − x_min), where x_min is the minimum value encountered for a given variable over the entire data set, and x_max is the maximum value encountered for that variable over the entire data set. However, min–max normalization may be inaccurate in the presence of outliers. The data set used in this paper did not have any outliers in the simulated data; therefore, we chose the min–max normalization method.
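A minimal pure-Python sketch of this scheme (function names are illustrative, not from the paper): the per-variable minimum and maximum are computed once over the data set, then applied to each sample.

```python
def fit_min_max(rows):
    """Per-variable min and max over a data set given as rows of equal length."""
    cols = list(zip(*rows))
    return [min(c) for c in cols], [max(c) for c in cols]

def min_max_normalize(row, x_min, x_max):
    """Scale each variable to [0, 1] via x' = (x - x_min) / (x_max - x_min).
    A constant variable (x_max == x_min) is mapped to 0.0 to avoid division
    by zero."""
    return [(x - lo) / (hi - lo) if hi > lo else 0.0
            for x, lo, hi in zip(row, x_min, x_max)]
```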

#### 4.2. Algorithms

#### 4.3. Hyperparameter Optimization

The search ranges of the learning rate, λ, and the warm-up period were set to [10^{−5}, 10^{−3}], [0.2, 0.8], and [30, 500], respectively. Using the optimization method, the optimized parameters for the two types of generative models, and the performance obtained when they are applied, can be seen in Table 10.
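The search-loop skeleton over these ranges can be sketched as follows. The paper uses Bayesian optimization with EI; in this dependency-free sketch a uniform sampler stands in for the acquisition step, and the objective passed in is a stand-in for training a model and returning its validation AUROC.

```python
import math
import random

random.seed(0)

# Search ranges from the text: learning rate in [1e-5, 1e-3] (searched on a
# log10 scale here), lambda in [0.2, 0.8], warm-up period in [30, 500].
SPACE = {"log10_lr": (-5.0, -3.0), "lam": (0.2, 0.8), "warmup": (30.0, 500.0)}

def sample_config():
    """Draw one hyperparameter configuration uniformly from the search space."""
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SPACE.items()}

def search(objective, n_trials=25):
    """Return the sampled configuration maximizing `objective` (e.g., a
    validation AUROC). A Bayesian optimizer would replace the uniform
    sampler with an EI-guided proposal; random sampling keeps the sketch
    dependency-free."""
    best_cfg, best_val = None, -math.inf
    for _ in range(n_trials):
        cfg = sample_config()
        val = objective(cfg)
        if val > best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val
```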

## 5. Results and Discussion

#### 5.1. Learning Models and Operation Modes

#### 5.2. Assistive Tools for AD

#### 5.3. Efficient Monitoring Tool According to Purpose

#### 5.3.1. AD in Operation Scenario

#### 5.3.2. AD in Operation Mode

#### 5.3.3. Learning Model Update

## 6. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## References

- Wang, P.; Li, Y.; Reddy, C.K. Machine learning for survival analysis: A survey. ACM Comput. Surv. **2019**, 51, 110.
- Chandola, V.; Banerjee, A.; Kumar, V. Anomaly detection: A survey. ACM Comput. Surv. **2009**, 41, 15.
- Yairi, T.; Kawahara, Y.; Fujimaki, R.; Sato, Y.; Machida, K. Telemetry-mining: A machine learning approach to anomaly detection and fault diagnosis for space systems. In Proceedings of the 2nd IEEE International Conference on Space Mission Challenges for Information Technology (SMC-IT'06), Pasadena, CA, USA, 17–20 July 2006.
- Ibrahim, S.K.; Ahmed, A.; Zeidan, M.A.E.; Ziedan, I.E. Machine Learning Techniques for Satellite Fault Diagnosis. Ain Shams Eng. J. **2019**.
- Galvan, D.A.; Hemenway, B.; Welser, I.; Baiocchi, D. Satellite Anomalies: Benefits of a Centralized Anomaly Database and Methods for Securely Sharing Information among Satellite Operators; Rand National Defense Research Institute: Santa Monica, CA, USA, 2014.
- Siahpush, A.; Gleave, J. A brief survey of attitude control systems for small satellites using momentum concepts. In Proceedings of the 2nd Annual AIAA/USU Conference on Small Satellites, Logan, UT, USA, 18–21 September 1988.
- Chatfield, C. The Analysis of Time Series: An Introduction; Chapman and Hall/CRC: London, UK, 2016.
- Cryer, J.D.; Kellet, N. Time Series Analysis; Springer: New York, NY, USA, 1991.
- Brillinger, D.R. Time Series: Data Analysis and Theory; SIAM: Philadelphia, PA, USA, 1981; Volume 36.
- Anderson, T.W. The Statistical Analysis of Time Series; John Wiley & Sons: Hoboken, NJ, USA, 2011; Volume 19.
- Tipping, M.E.; Bishop, C.M. Mixtures of probabilistic principal component analyzers. Neural Comput. **1999**, 11, 443–482.
- Yairi, T.; Takeishi, N.; Oda, T.; Nakajima, Y.; Nishimura, N.; Takata, N. A data-driven health monitoring method for satellite housekeeping data based on probabilistic clustering and dimensionality reduction. IEEE Trans. Aerosp. Electron. Syst. **2017**, 53, 1384–1401.
- Gamboa, J.C.B. Deep learning for time-series analysis. arXiv **2017**, arXiv:1701.01887.
- Ahn, H.; Choi, H.-L.; Kang, M.; Moon, S. Learning-Based Anomaly Detection and Monitoring for Swarm Drone Flights. Appl. Sci. **2019**, 9, 5477.
- Längkvist, M. Modeling Time-Series with Deep Networks; Örebro University: Örebro, Sweden, 2014.
- O'Meara, C.; Schlag, L.; Wickler, M. Applications of Deep Learning Neural Networks to Satellite Telemetry Monitoring. In Proceedings of the 2018 SpaceOps Conference, Marseille, France, 28 May–1 June 2018; p. 2558.
- Wei, W.; Wu, H.; Ma, H. An autoencoder and LSTM-based traffic flow prediction method. Sensors **2019**, 19, 2946.
- Park, P.; Marco, P.D.; Shin, H.; Bang, J. Fault Detection and Diagnosis Using Combined Autoencoder and Long Short-Term Memory Network. Sensors **2019**, 19, 4612.
- Liu, X.; Zhou, Q.; Zhao, J.; Shen, H.; Xiong, X. Fault Diagnosis of Rotating Machinery under Noisy Environment Conditions Based on a 1-D Convolutional Autoencoder and 1-D Convolutional Neural Network. Sensors **2019**, 19, 972.
- Chen, K.; Mao, Z.; Zhao, H.; Jiang, Z.; Zhang, J. A Variational Stacked Autoencoder with Harmony Search Optimizer for Valve Train Fault Diagnosis of Diesel Engine. Sensors **2020**, 20, 223.
- Jung, D.; Kwon, J.W.; Baek, K.; Ahn, H.W. Attitude Control Simulator for the Korea Pathfinder Lunar Orbiter. In Asia-Pacific International Symposium on Aerospace Technology; Springer: Singapore, 2018; pp. 2521–2532.
- Jung, D.; Kwon, J.W.; Seo, H.H.; Yim, J.R. New Concepts for the Korea Pathfinder Lunar Orbiter Attitude Control System Simulator. In Proceedings of the 2017 Asia-Pacific International Symposium on Aerospace Technology, Seoul, Korea, 16–18 October 2017; pp. 637–638.
- Kingma, D.P.; Welling, M. Auto-encoding variational bayes. arXiv **2013**, arXiv:1312.6114.
- Akcay, S.; Atapour-Abarghouei, A.; Breckon, T.P. GANomaly: Semi-supervised anomaly detection via adversarial training. In Asian Conference on Computer Vision; Springer: Cham, Switzerland, 2018; pp. 622–637.
- Bourlard, H.; Kamp, Y. Auto-association by multilayer perceptrons and singular value decomposition. Biol. Cybern. **1988**, 59, 291–294.
- Škvára, V.; Pevný, T.; Šmídl, V. Are generative deep models for novelty detection truly better? arXiv **2018**, arXiv:1807.05027.
- Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. **1951**, 22, 79–86.
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial nets. In Proceedings of the Neural Information Processing Systems, Montreal, QC, Canada, 8–13 December 2014; pp. 2672–2680.
- Schlegl, T.; Seeböck, P.; Waldstein, S.M.; Schmidt-Erfurth, U.; Langs, G. Unsupervised anomaly detection with generative adversarial networks to guide marker discovery. In Proceedings of the International Conference on Information Processing in Medical Imaging, Boone, NC, USA, 25–30 June 2017; pp. 146–157.
- Snoek, J.; Larochelle, H.; Adams, R.P. Practical bayesian optimization of machine learning algorithms. In Proceedings of the Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–8 December 2012; pp. 2951–2959.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. **2011**, 12, 2825–2830.
- Sola, J.; Sevilla, J. Importance of input data normalization for the application of neural networks to complex industrial problems. IEEE Trans. Nucl. Sci. **1997**, 44, 1464–1468.
- Kiranyaz, S.; Avci, O.; Abdeljaber, O.; Ince, T.; Gabbouj, M.; Inman, D.J. 1D Convolutional Neural Networks and Applications: A Survey. arXiv **2019**, arXiv:1905.03554.
- Abdeljaber, O.; Avci, O.; Kiranyaz, S.; Gabbouj, M.; Inman, D.J. Real-time vibration-based structural damage detection using one-dimensional convolutional neural networks. J. Sound Vib. **2017**, 388, 154–170.
- Ince, T.; Kiranyaz, S.; Eren, L.; Askar, M.; Gabbouj, M. Real-time motor fault detection by 1-D convolutional neural networks. IEEE Trans. Ind. Electron. **2016**, 63, 7067–7075.
- Sønderby, C.K.; Raiko, T.; Maaløe, L.; Sønderby, S.K.; Winther, O. How to train deep variational autoencoders and probabilistic ladder networks. In Proceedings of the 33rd International Conference on Machine Learning (ICML 2016), New York, NY, USA, 19–24 June 2016.

**Figure 1.** Algorithm architectures: (**a**) anomaly detection (AD) algorithm architecture using the variational autoencoder (VAE) structure; (**b**) AD algorithm architecture using the generative adversarial network AD (GANomaly) structure.

**Figure 2.** Relationship between the warm-up period and the area under the receiver operating characteristic curve (AUROC) (GANomaly, Sun Pointing (SP) mode).

**Figure 3.** An anomaly example in the LUNar Attitude and Orbit Control System SIMulator (LUNASIM): (**a**) a model example; (**b**) a fault in LUNASIM's gyroscope model; (**c**) nonlinear changes in the modeled spacecraft control error; and (**d**) nonlinear changes in integrated angular rate.

**Figure 4.** Results of AD using the deep-learning models: (**a**) the result of the variational autoencoder (VAE) model; (**b**) the result of the GANomaly model.

**Figure 5.** Anomaly continuing after the injection point: (**a**) with an anomaly injected; (**b**) under normal conditions.

**Figure 8.** Failure types of each mode in the MT8 scenario: (**a**) the result of the GANomaly model; (**b**) the result of the VAE model.

**Figure 10.** Gyro failure type of each scenario in Wheel Off Loading (WOL) mode: (**a**) the results of the MT8 and MT9 scenarios in the event of gyroscope failure in TP mode; (**b**) knowledge error in the MT8 scenario; (**c**) knowledge error in the MT9 scenario.

**Figure 11.** Reaction Wheel (RWA) and gyro failure type of each scenario in Target Pointing (TP) mode: (**a**) failures in the MT2, MT9, TP1, and TP3 scenarios when RWA and gyro failures are mixed in TP mode; (**b**) anomalies of the gyroscope rate in the MT2 scenario; (**c**) anomalies of the wheel speed in the MT2 scenario.

Units | Components | Main Output Signal
---|---|---
Sensors | Star sensor (STA) | Attitude (3-axis)
Sensors | Gyroscope (GRA) | Angular rate
Sensors | Coarse Sun sensor (CSSA) | Attitude (2-axis)
Actuators | Reaction wheel (RWA) | Torque

Mode Name | Sensor(s) | Actuator(s) | Description
---|---|---|---
LAM | Star Trackers (STA), Gyros (GRA) | Reaction Wheels (RWA) | Large Angle Maneuver (LAM) to set up Del-V attitude
TP | STA, GRA | RWA | Target Pointing (TP) mode for imaging
SP | STA, GRA | RWA | Sun-Pointing (SP) mode for solar array charging and Earth communications maintenance
WOL | STA, GRA | Attitude Control Thrusters (ACT) | Wheel Off Loading (WOL) mode for dumping RWA momentum using thrusters
TSH | CSSA, GRA | ACT | Thruster-based Safe-Hold (TSH) mode just after launch vehicle separation and critical spacecraft failures

Scenario Name | Modes/Transition Times (sec) | Length (sec)
---|---|---
LAM1 | TP (199) → LAM (699) → SP (2500) | 3664
MT1 | TSH (2) → SP (1200) → TP (3999) | 6000
MT2 | TP (2) → SP (1799) → TP (3000) | 4000
MT3 | SP (2) → LAM (2100) → TP (3900) | 4500
MT4 | SP (2) → LAM (2100) → SP (3900) | 4500
MT8 | SP (2) → WOL (2199) → SP (2800) | 4000
MT9 | TP (2) → WOL (2199) → TP (2800) | 4000
SP1 | SP (2) | 14,399
SP2 | SP (2) | 7200
TP1 | TP (9) | 4800
TP3 | TP (9) | 4800
TSH1 | TSH (2) | 14,399
TSH2 | TSH (2) | 7200
WOL1 | WOL (9) → TP (999) | 3000
WOL2 | WOL (9) → SP (999) | 3000

Variable | Column Description | Units | Type
---|---|---|---
Current_time | Simulation Time | sec | Continuous
(W_body(23)-pL->ACS_swwsc(24))*R2D | Estimated Rate Error between Model and FSW | deg | Continuous
(W_body(1)-pL->ACS_swwsc(1))*R2D | Estimated Rate Error between Model and FSW | deg | Continuous
(W_body(2)-pL->ACS_swwsc(2))*R2D | Estimated Rate Error between Model and FSW | deg | Continuous
AD_error(0)*R2D | Knowledge Error | deg | Continuous
AD_error(1)*R2D | Knowledge Error | deg | Continuous
AD_error(2)*R2D | Knowledge Error | deg | Continuous

Scenario Name | LAM1 | MT1 | MT3 | MT4 | MT8 | MT9 | SP1 | SP2 | TP1 | TP3 | TSH1 | TSH2 | WOL1 | WOL2
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
RWA 1,2,3,4 failure | 4 | - | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | - | - | - | -
RWA 1 wheel speed spike | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | - | - | 4 | 4
Gyro rate spike | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4
CSSA 1,4 count spike | - | 1 | - | - | - | - | - | - | - | - | 4 | 4 | - | -
Combined RWA 1 wheel speed + Gyro rate spike | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | - | - | 1 | 1
Combined CSSA spike + RWA 1 wheel speed | - | 1 | - | - | - | - | - | - | - | - | - | - | - | -
Combined Gyro rate + CSSA spike | - | - | - | - | - | - | - | - | - | - | 1 | 1 | - | -
Combined CSSA spike + RWA 1 wheel speed + Gyro rate spike | - | 1 | - | - | - | - | - | - | - | - | - | - | - | -

Layer | Input → Output | Operation
---|---|---
Input | (n, 1063, 8) * → (n, 1063, 8) | Min/Max Normalization
Hidden | (n, 1063, 8) → (n, 531, 4) | 1D Convolution, 1D Batch Normalization, Activation Function (ReLU)
Hidden | (n, 531, 4) → (n, 265, 2) | 1D Convolution, 1D Batch Normalization, Activation Function (ReLU)
Hidden | (n, 265, 2) → (n, 132, 1) | 1D Convolution, 1D Batch Normalization, Activation Function (ReLU)
Flatten | (n, 132, 1) → (n, 132) | Flatten
Hidden | (n, 132) → (n, 100) | Linear, Batch Normalization, Activation Function (ReLU)
Output (Mean Layer) | (n, 100) → (n, 32) | Linear
Output (S.D. Layer) | (n, 100) → (n, 32) | Linear

Layer | Input → Output | Operation
---|---|---
Input | (n, 32) * → (n, 32) | Parameterization
Hidden | (n, 32) → (n, 100) | Linear, Batch Normalization, Activation Function (ReLU)
Hidden | (n, 100) → (n, 264) | Linear, Batch Normalization, Activation Function (ReLU)
Hidden | (n, 264) → (n, 132, 2) | Reshape
Hidden | (n, 132, 2) → (n, 265, 4) | 1D Transposed Convolution, 1D Batch Normalization, Activation Function (ReLU)
Hidden | (n, 265, 4) → (n, 531, 5) | 1D Transposed Convolution, 1D Batch Normalization, Activation Function (ReLU)
Output | (n, 531, 5) → (n, 1063, 8) | 1D Transposed Convolution
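The encoder's mean and S.D. layers (n, 32) feed the decoder's "Parameterization" input via the reparameterization trick of Kingma and Welling. A per-sample, pure-Python sketch follows; it assumes the S.D. layer is trained as a log-variance, a common convention that the tables do not state explicitly.

```python
import math
import random

def reparameterize(mu, log_var, rng=random):
    """Sample z = mu + sigma * eps with eps ~ N(0, 1) per latent dimension;
    this produces the decoder's (n, 32) 'Parameterization' input."""
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

def kl_divergence(mu, log_var):
    """KL(q(z|x) || N(0, I)) for a diagonal Gaussian: the regularization
    term added to the reconstruction error in the VAE loss."""
    return -0.5 * sum(1.0 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, log_var))
```

With μ = 0 and log-variance 0 the posterior equals the prior, so the KL term vanishes.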

Layer | Encoder Number | Input → Output | Operation
---|---|---|---
Input | Ⅰ | (n, 1063, 8) * → (n, 1063, 8) | Min/Max Normalization
Input | Ⅱ, Ⅲ | (n, 1063, 8) | -
Hidden | Ⅰ, Ⅱ, Ⅲ | (n, 1063, 8) → (n, 531, 4) | 1D Convolution, 1D Batch Normalization, Activation Function (ReLU)
Hidden | Ⅰ, Ⅱ, Ⅲ | (n, 531, 4) → (n, 265, 2) | 1D Convolution, 1D Batch Normalization, Activation Function (ReLU)
Hidden | Ⅰ, Ⅱ, Ⅲ | (n, 265, 2) → (n, 132, 1) | 1D Convolution, 1D Batch Normalization, Activation Function (ReLU)
Flatten | Ⅰ, Ⅱ, Ⅲ | (n, 132, 1) → (n, 132) | Flatten
Hidden | Ⅲ | (n, 132) → (n, 100) | Linear
Output | Ⅰ | (n, 132) → (n, 100) | Linear
Output | Ⅱ | (n, 132) → (n, 100) | Linear
Output | Ⅲ | (n, 100) → (n, 1) | Linear, Sigmoid

Layer | Input → Output | Operation
---|---|---
Input | (n, 100) * | -
Hidden | (n, 100) → (n, 264) | Linear, Batch Normalization, Activation Function (ReLU)
Hidden | (n, 264) → (n, 132, 2) | Reshape
Hidden | (n, 132, 2) → (n, 265, 4) | 1D Transposed Convolution, 1D Batch Normalization, Activation Function (ReLU)
Hidden | (n, 265, 4) → (n, 531, 5) | 1D Transposed Convolution, 1D Batch Normalization, Activation Function (ReLU)
Output | (n, 531, 5) → (n, 1063, 8) | 1D Transposed Convolution

Mode | VAE Warm Up Period | VAE L_{r} | VAE λ | VAE AUROC | GANomaly Warm Up Period | GANomaly L_{r} | GANomaly AUROC
---|---|---|---|---|---|---|---
LAM | 153.8 | 10^{−3.022} | 0.208 | 1.000 | 197.3 | 10^{−3.305} | 1.000
SP | 300 | 10^{−5.000} | 0.200 | 0.686 | 50 | 10^{−4.649} | 0.851
TP | 100 | 10^{−5.000} | 0.800 | 0.916 | 50 | 10^{−5.000} | 0.977
TSH | 300 | 10^{−4.892} | 0.208 | 1.000 | 200 | 10^{−5.000} | 1.000
WOL | 244.5 | 10^{−0.300} | 0.200 | 0.997 | 27.12 | 10^{−5.000} | 0.887
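AUROC values like those reported for each mode can be computed directly from per-sample anomaly scores using the rank (Mann–Whitney U) formulation; the helper below is illustrative, not the paper's code.

```python
def auroc(scores, labels):
    """Area under the ROC curve: the probability that a randomly chosen
    anomaly (label 1) scores higher than a randomly chosen normal sample
    (label 0), counting ties as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

A score of 1.0 (as in the LAM and TSH rows) means every anomalous sample outranks every normal one.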

SP Mode | VAE Warm Up Period | VAE L_{r} | VAE λ | VAE AUROC | GANomaly Warm Up Period | GANomaly L_{r} | GANomaly AUROC
---|---|---|---|---|---|---|---
8 Hz | 300 | 10^{−5.000} | 0.200 | 0.686 | 50 | 10^{−4.649} | 0.851
4 Hz | 200 | 10^{−4.999} | 0.203 | 0.764 | 30 | 10^{−2.805} | 0.906
1 Hz | 271.6 | 10^{−5.000} | 0.800 | 0.917 | 142.8 | 10^{−1.000} | 0.931

Model | Warm Up Period | L_{r} | λ | AUROC
---|---|---|---|---
GANomaly | 100.0 | 10^{−5.0} | - | 1.0
VAE | 279.9 | 10^{−5.0} | 0.2 | 1.0

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ahn, H.; Jung, D.; Choi, H.-L.
Deep Generative Models-Based Anomaly Detection for Spacecraft Control Systems. *Sensors* **2020**, *20*, 1991.
https://doi.org/10.3390/s20071991
