Open Access Article
Towards a Gated Graph Neural Network with an Attention Mechanism for Audio Features with a Situation Awareness Application
by Jieli Chen 1,2, Kah Phooi Seng 3,4,*, Li Minn Ang 4, Jeremy Smith 2 and Hanyue Xu 1,2
1 School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou 215123, China
2 Department of Electrical Engineering & Electronics, University of Liverpool, Liverpool L69 3BX, UK
3 Faculty of AI and Frontier Technology, UNITAR International University, Petaling Jaya 47301, Malaysia
4 School of Science, Technology and Engineering, University of Sunshine Coast, Petrie, QLD 4502, Australia
* Author to whom correspondence should be addressed.
Electronics 2025, 14(13), 2621; https://doi.org/10.3390/electronics14132621
Submission received: 13 May 2025 / Revised: 11 June 2025 / Accepted: 20 June 2025 / Published: 28 June 2025
Abstract
Situation awareness (SA) involves analyzing sensory data, such as audio signals, to identify anomalies. While acoustic features are widely used in audio analysis, existing methods face a critical limitation: they often overlook the relevance of SA audio segments and fail to capture the complex relational patterns in audio data that are essential for SA. In this study, we first propose a graph neural network (GNN) with an attention mechanism that models SA audio features through graph structures, capturing both node attributes and their relationships for richer representations than traditional methods. Our analysis identifies suitable audio feature combinations and graph constructions for SA tasks. Building on this, we introduce a situation awareness gated-attention GNN (SAGA-GNN), which dynamically filters irrelevant nodes through max-relevance neighbor sampling to reduce redundant connections and employs a learnable edge gated-attention mechanism that suppresses noise while amplifying critical events. The proposed method uses sigmoid-activated attention weights conditioned on both node features and temporal relationships, enabling adaptive node emphasis in different acoustic environments. Experiments show that the proposed graph-based audio features have superior representation capacity compared to traditional methods, and that both proposed graph-based methods outperform existing approaches. In particular, owing to the combination of graph-based audio features and gated-attention-based dynamic selection of audio nodes, SAGA-GNN achieved superior results on two real datasets. This work underscores the importance and potential value of graph-based audio features and attention-based GNNs, particularly in situation awareness applications.
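The edge gated-attention idea in the abstract can be pictured with a short sketch. The PyTorch snippet below is a minimal, illustrative layer, assuming segment-level audio features as node inputs, a binary adjacency matrix produced by a neighbor-sampling step, and a pairwise temporal-gap matrix; the class and parameter names (GatedAttentionLayer, proj, gate) are our own assumptions and not the authors' implementation.

```python
# Hedged sketch of an edge gated-attention GNN layer in the spirit of SAGA-GNN.
# Assumptions: nodes are audio segments, adjacency comes from neighbor sampling,
# and a pairwise temporal-gap matrix encodes the temporal relationship.
import torch
import torch.nn as nn

class GatedAttentionLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)      # node feature projection
        self.gate = nn.Linear(2 * in_dim + 1, 1)    # edge gate from node pair + temporal gap

    def forward(self, x, adj, time_gap):
        # x:        (N, in_dim) node (audio segment) features
        # adj:      (N, N) binary adjacency from neighbor sampling
        # time_gap: (N, N) pairwise temporal distance between segments
        n = x.size(0)
        xi = x.unsqueeze(1).expand(n, n, -1)         # sender features
        xj = x.unsqueeze(0).expand(n, n, -1)         # receiver features
        edge_in = torch.cat([xi, xj, time_gap.unsqueeze(-1)], dim=-1)
        gate = torch.sigmoid(self.gate(edge_in)).squeeze(-1)  # (N, N) weights in (0, 1)
        weights = gate * adj                         # zero out non-neighbors
        weights = weights / weights.sum(dim=-1, keepdim=True).clamp(min=1e-8)
        return torch.relu(self.proj(weights @ x))    # gated aggregation + update

# Toy usage with random features for 6 audio-segment nodes.
x = torch.randn(6, 16)
adj = (torch.rand(6, 6) > 0.5).float()
time_gap = torch.cdist(torch.arange(6.).unsqueeze(1), torch.arange(6.).unsqueeze(1))
layer = GatedAttentionLayer(16, 32)
out = layer(x, adj, time_gap)                        # (6, 32) updated node embeddings
```

In this reading, the sigmoid gate plays the role of the learnable edge weight: edges whose endpoint features and temporal gap look uninformative are pushed toward zero before aggregation, which is one way to realize the noise-suppressing, event-amplifying behavior the abstract describes.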