Figure 1.
Recall heatmaps summarizing classifier performance for the Discovery detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 2.
Recall heatmaps summarizing classifier performance for the Credential Access detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 3.
Recall heatmaps summarizing classifier performance for the Privilege Escalation detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 4.
Recall heatmaps summarizing classifier performance for the Exfiltration detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 5.
Recall heatmaps summarizing classifier performance for the Lateral Movement detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 6.
Recall heatmaps summarizing classifier performance for the Resource Development detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 7.
Recall heatmaps summarizing classifier performance for the Defense Evasion detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 8.
Recall heatmaps summarizing classifier performance for the Initial Access detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Figure 9.
Recall heatmaps summarizing classifier performance for the Persistence detection task across GAN variants and experimental settings. Rows correspond to classifiers, and columns correspond to GAN variants. Panels show the four evaluated configurations: (a) r = 0.25, 400 epochs; (b) r = 0.50, 400 epochs; (c) r = 0.25, 800 epochs; (d) r = 0.50, 800 epochs.
Table 1.
Comparison of related work on class-imbalance handling in intrusion detection.
| Study | Year | Technique | Dataset | Key Contribution |
|---|---|---|---|---|
| Chawla et al. [1] | 2002 | SMOTE | Various | Introduced synthetic oversampling using interpolation between minority samples |
| He et al. [2] | 2008 | ADASYN | Various | Adaptive oversampling focusing on difficult minority regions |
| Han et al. [30] | 2005 | Borderline-SMOTE | Various | Generates samples near decision boundaries to improve minority-class recognition |
| Insan et al. [26] | 2023 | SMOTE-LOF, Borderline-SMOTE | Various | Combines oversampling with outlier detection to improve sample quality |
| Zhao et al. [34] | 2024 | Vanilla GAN, WGAN, cGAN | CIC-IDS2017 | GAN-based data augmentation to improve intrusion detection performance |
| Agrawal et al. [35] | 2024 | GAN-based generative learning (survey) | NSL-KDD, UNSW-NB15, CIC-IDS2017 | Survey of GAN-based synthetic attack data generation |
| Ndayipfukamiye et al. [36] | 2025 | WGAN-GP, cGAN, Hybrid GANs | Various | Systematic review of GAN-based adversarial defense techniques |
| This work | 2026 | Vanilla GAN, cGAN, WGAN, WGAN-GP | UWF-ZeekData22 | Systematic ablation analysis of GAN architectures and augmentation strategies |
Table 2.
Architecture and hyperparameters used in the GAN ablation study.
| Category | Parameter | Value |
|---|---|---|
| GAN variants | Models evaluated | Vanilla GAN, cGAN, WGAN, WGAN-GP |
| Network architecture | Type | Multilayer perceptron |
| Network architecture | Hidden layers | 2 |
| Network architecture | Hidden units per layer | 128 |
| Latent space | Dimension (z) | 32 |
| Latent space | Noise type | Normal |
| Data preprocessing | Feature scaling | [−1, 1] |
| Regularization | Generator dropout | 0.3 |
| Regularization | Discriminator dropout | 0.3 |
| Training | Batch size | 64 |
| Training | Epochs | 400, 800 |
| Augmentation | Ratios | 0.25, 0.50 |
| Evaluation | Cross-validation | Stratified 5-fold |
| WGAN specific | Critic updates per generator update | 5 |
| WGAN specific | Weight clipping | 0.01 |
| WGAN-GP specific | Gradient penalty coefficient | 10 |
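The generator architecture implied by Table 2 can be sketched as follows. This is an illustrative, dependency-free reconstruction under stated assumptions (the helper names and the 10-feature example are hypothetical), not the authors' implementation: a 32-dimensional normal latent vector passes through two 128-unit hidden layers, and a final tanh keeps outputs inside the [−1, 1] feature-scaling range.

```python
import math
import random

# Hyperparameters taken from Table 2.
Z_DIM, HIDDEN, N_LAYERS = 32, 128, 2

def generator_layer_sizes(n_features):
    """Return (in, out) shapes for each dense layer of the generator MLP."""
    dims = [Z_DIM] + [HIDDEN] * N_LAYERS + [n_features]
    return list(zip(dims[:-1], dims[1:]))

def forward(x, weights):
    """Plain-Python dense forward pass; tanh at every layer is a
    simplification, but it matches the bounded [-1, 1] output."""
    for w in weights:  # w: one weight row per output neuron
        x = [math.tanh(sum(wi * xi for wi, xi in zip(row, x))) for row in w]
    return x

# Hypothetical example with a 10-feature flow record.
shapes = generator_layer_sizes(10)   # [(32, 128), (128, 128), (128, 10)]
rng = random.Random(0)
weights = [[[rng.gauss(0, 0.05) for _ in range(i)] for _ in range(o)]
           for i, o in shapes]
z = [rng.gauss(0, 1) for _ in range(Z_DIM)]  # normal latent noise (Table 2)
sample = forward(z, weights)                 # one synthetic minority record
```

In a real run the discriminator (or WGAN critic) would mirror this shape with the listed 0.3 dropout, and the synthetic records would be appended to the minority class at the 0.25 or 0.50 augmentation ratio before classifier training.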
Table 3.
Discovery. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Discovery | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 4.
Discovery. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Discovery | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 5.
Discovery. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Discovery | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 6.
Discovery. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Discovery | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 7.
Credential Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Credential Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 8.
Credential Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Credential Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 9.
Credential Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Credential Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 10.
Credential Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Credential Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 11.
Privilege Escalation. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Privilege Escalation | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 12.
Privilege Escalation. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Privilege Escalation | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 13.
Privilege Escalation. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Privilege Escalation | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 14.
Privilege Escalation. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Privilege Escalation | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 15.
Exfiltration. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Exfiltration | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 16.
Exfiltration. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Exfiltration | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 17.
Exfiltration. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Exfiltration | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 18.
Exfiltration. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Exfiltration | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 19.
Lateral Movement. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Lateral Movement | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 20.
Lateral Movement. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Lateral Movement | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 21.
Lateral Movement. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Lateral Movement | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 22.
Lateral Movement. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Lateral Movement | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 23.
Resource Development. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Resource Development | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 24.
Resource Development. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Resource Development | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 25.
Resource Development. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Resource Development | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 26.
Resource Development. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Resource Development | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 27.
Defense Evasion. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Defense Evasion | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 28.
Defense Evasion. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Defense Evasion | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 29.
Defense Evasion. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Defense Evasion | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 30.
Defense Evasion. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Defense Evasion | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 31.
Initial Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Initial Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 32.
Initial Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Initial Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 33.
Initial Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Initial Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 34.
Initial Access. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Initial Access | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 35.
Persistence. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Persistence | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 36.
Persistence. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 400 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Persistence | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 37.
Persistence. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.25.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Persistence | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 38.
Persistence. Feature scaling: [−1, 1] | Latent dimension (z): 32 | Noise type: normal | Epochs: 800 | Batch size: 64 | G dropout: 0.3 | D dropout: 0.3 | Augmentation ratio: 0.50.
| | | Logistic Regression | SVM | KNN | Decision Tree | Random Forest |
|---|---|---|---|---|---|---|
| Persistence | GAN | | | | | |
| | cGAN | | | | | |
| | WGAN | | | | | |
| | WGAN-GP | | | | | |
Table 39.
Ablation training time per variant.
| Variant | Time (min) * |
|---|---|
| GAN | 26 |
| cGAN | 35 |
| WGAN | 29 |
| WGAN-GP | 29 |