Article

Consistency-Regularized Hybrid Deep Learning with Entropy-Weighted Attention and Branch Dropout for Intrusion Detection in IoT Networks

ENSIAS, Mohammed V University, Rabat BP 713, Morocco
*
Author to whom correspondence should be addressed.
Future Internet 2026, 18(5), 262; https://doi.org/10.3390/fi18050262
Submission received: 11 February 2026 / Revised: 23 March 2026 / Accepted: 7 April 2026 / Published: 15 May 2026
(This article belongs to the Topic Applications of IoT in Multidisciplinary Areas)

Abstract

Securing IoT networks presents fundamental challenges rooted in hardware constraints: firmware is often non-upgradeable, and every security boundary is fixed at manufacture. Machine learning-based intrusion detection offers a scalable response, yet nearly all published systems assume clean training data and clean inference conditions. Production IoT environments satisfy neither assumption: sensors degrade, packets drop, and adversaries deliberately corrupt telemetry streams to evade detection. The framework described here is built around that reality and is distinguished from prior work by four design decisions. First, three encoding branches (a residual DNN, a 1D-CNN, and a BiLSTM) run in parallel and are fused by concatenation, each capturing structural patterns in tabular traffic data that the others miss. Second, a dual-view consistency loss trains the model under simultaneous feature masking and Gaussian noise, penalizing prediction divergence between two independently corrupted views of the same sample. Third, we introduce entropy-weighted attention: rather than using fixed learned weights, per-feature importance is adjusted dynamically from information entropy measured across training batches, giving higher-entropy features stronger influence because they carry more discriminative variation. Fourth, branch-dropout regularization randomly silences entire branches during training, forcing each to develop independently useful representations instead of co-adapting. Class imbalance is handled through severity-aware loss weighting, which scales contributions by the operational cost of missing each attack category rather than purely by inverse frequency. On UNSW-NB15, the full model achieves 99.99% accuracy, 100% precision, 99.97% recall, and a false-negative rate of 2.65 × 10⁻⁴, the lowest across all compared architectures.
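The dual-view consistency idea described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the corruption parameters, the toy linear model standing in for the hybrid network, and the use of symmetric KL divergence as the divergence penalty are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(x, mask_prob=0.1, noise_std=0.05, rng=rng):
    """One corrupted view: random feature masking plus additive Gaussian noise."""
    mask = rng.random(x.shape) >= mask_prob      # zero out ~10% of features
    return x * mask + rng.normal(0.0, noise_std, x.shape)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def consistency_loss(p1, p2, eps=1e-12):
    """Symmetric KL divergence between predictions on the two views."""
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)), axis=1)
    return float(np.mean(0.5 * (kl(p1, p2) + kl(p2, p1))))

# Toy linear "model" standing in for the three-branch hybrid network.
W = rng.normal(size=(8, 3))
x = rng.normal(size=(4, 8))          # batch of 4 samples, 8 features
p1 = softmax(corrupt(x) @ W)         # view 1: masked + noisy
p2 = softmax(corrupt(x) @ W)         # view 2: independently corrupted
loss = consistency_loss(p1, p2)      # penalized alongside the supervised loss
```

In training, this term would be added to the classification loss so the model is rewarded for making the same prediction regardless of which corruption it sees.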
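Entropy-weighted attention, as summarized above, derives per-feature weights from the information entropy of each feature across a batch. A minimal sketch, assuming a simple histogram-based entropy estimate and sum-to-one normalization (both illustrative choices, not necessarily the paper's):

```python
import numpy as np

def entropy_weights(batch, n_bins=10, eps=1e-12):
    """Per-feature Shannon entropy estimated from a histogram over the batch;
    higher-entropy features receive proportionally larger weights."""
    ents = []
    for col in batch.T:
        hist, _ = np.histogram(col, bins=n_bins)
        p = hist / hist.sum()
        ents.append(-np.sum(p * np.log(p + eps)))
    ents = np.array(ents)
    return ents / (ents.sum() + eps)     # normalize weights to sum to 1

rng = np.random.default_rng(1)
batch = np.column_stack([
    rng.normal(size=256),                # high-variation feature
    np.full(256, 3.0),                   # constant feature: near-zero entropy
])
w = entropy_weights(batch)
attended = batch * w                     # scale each feature by its weight
```

The constant feature carries no discriminative variation, so its entropy, and hence its attention weight, collapses toward zero, while the varying feature dominates.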
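Branch-dropout regularization, the fourth design decision above, can be sketched as zeroing whole branch embeddings before fusion. The drop probability, the keep-at-least-one rule, and the zero-fill strategy below are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(2)

def branch_dropout(branch_outputs, drop_prob=0.3, training=True, rng=rng):
    """Randomly silence entire branch outputs during training (keeping at
    least one branch active), then concatenate the results for fusion."""
    if not training:
        return np.concatenate(branch_outputs, axis=1)
    keep = rng.random(len(branch_outputs)) >= drop_prob
    if not keep.any():
        keep[rng.integers(len(branch_outputs))] = True  # never drop all
    kept = [out if k else np.zeros_like(out)
            for out, k in zip(branch_outputs, keep)]
    return np.concatenate(kept, axis=1)

# Stand-ins for the residual-DNN, 1D-CNN, and BiLSTM branch embeddings.
dnn, cnn, lstm = (rng.normal(size=(4, 16)) for _ in range(3))
fused = branch_dropout([dnn, cnn, lstm])        # training-time fusion
```

Because a branch may vanish on any step, each branch must learn features that are useful on their own, which is the stated goal of preventing co-adaptation.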
Keywords: hybrid IDS; deep learning; CNN; DNN; BiLSTM; entropy-weighted attention; consistency regularization; branch-dropout; IoT security; UNSW-NB15

Share and Cite

MDPI and ACS Style

Ayyoub, E.H.; Mohammed, M.; Mohamed, L. Consistency-Regularized Hybrid Deep Learning with Entropy-Weighted Attention and Branch Dropout for Intrusion Detection in IoT Networks. Future Internet 2026, 18, 262. https://doi.org/10.3390/fi18050262

AMA Style

Ayyoub EH, Mohammed M, Mohamed L. Consistency-Regularized Hybrid Deep Learning with Entropy-Weighted Attention and Branch Dropout for Intrusion Detection in IoT Networks. Future Internet. 2026; 18(5):262. https://doi.org/10.3390/fi18050262

Chicago/Turabian Style

Ayyoub, El Hariri, Mouiti Mohammed, and Lazaar Mohamed. 2026. "Consistency-Regularized Hybrid Deep Learning with Entropy-Weighted Attention and Branch Dropout for Intrusion Detection in IoT Networks" Future Internet 18, no. 5: 262. https://doi.org/10.3390/fi18050262

APA Style

Ayyoub, E. H., Mohammed, M., & Mohamed, L. (2026). Consistency-Regularized Hybrid Deep Learning with Entropy-Weighted Attention and Branch Dropout for Intrusion Detection in IoT Networks. Future Internet, 18(5), 262. https://doi.org/10.3390/fi18050262

