A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity
Abstract
Citation: Chechkin, A.; Pleshakova, E.; Gataullin, S. A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity. Technologies 2025, 13(6), 223. https://doi.org/10.3390/technologies13060223