DyTSSAM: A Dynamic Dependency Analysis Model Based on DAST
Abstract
1. Introduction
- Fine-grained dependency modeling. We propose a change-driven subtree segmentation and feature aggregation strategy based on DAST, which captures fine-grained structural evolution at both subtree and node levels. By incorporating a dynamic syntactic attention mechanism, our model adaptively aggregates subtree structural information, effectively identifying dependency variations caused by local syntactic changes such as expression replacement and variable renaming.
- Improved predictive accuracy. We design the Dynamic Temporal Syntax-Semantics Aggregation Model (DyTSSAM), which leverages a dual-channel fusion mechanism of structural and temporal features. This enhances the model’s expressive power for discriminating complex dependencies. Experimental results show that DyTSSAM consistently outperforms existing models in terms of AUC and AP, achieving higher predictive accuracy at both node- and subtree-level granularity.
- Support for dependency evolution modeling and analysis. By adopting a code-change–driven DAST segmentation strategy and integrating a temporal semantic attention mechanism, our model captures the evolutionary trends of dependencies throughout code development. This enables fine-grained characterization of dependency evolution without relying on repetitive full-scale static analysis.
2. Related Work
2.1. Program Dependence Analysis Based on Static Program Data
2.2. Temporal Graph Evolution Analysis
2.3. Dynamic Spatiotemporal Dependency Modeling
- RQ1: Can the proposed dynamic syntactic and temporal semantic attention mechanisms effectively enhance the modeling of subtree-level dependency variations, thereby enabling finer-grained dependency analysis?
- RQ2: Can DyTSSAM maintain high predictive accuracy when modeling complex program dependency relationships?
- RQ3: Can DyTSSAM accurately capture dependency evolution trends during code modifications and demonstrate its advantages for dynamic dependency modeling?
3. Preliminary
3.1. Definition of DAST and Dependencies
3.2. Limitations of Traditional Dynamic Graph Neural Networks on DAST
3.3. Problem Definition
4. Methodology
- (1) Change-Based Temporal Subtree Segmentation of DAST: The original DAST is segmented into a sequence of subtrees according to code modifications, where each change corresponds to an independent subgraph. This step ensures that dependency modeling remains aligned with the code evolution process and provides fine-grained temporal units for subsequent analysis.
- (2) Dynamic Syntax Structure Aggregation (DSAM): The segmented subtrees are integrated at the syntactic-structural level to generate structural features that distinguish different syntactic patterns. The objective of this step is to capture structural variations at both the subtree and node levels, thereby providing a fine-grained foundation for dependency prediction.
- (3) Dependency Source Analysis and Temporal Semantic Aggregation (LGDAM and TDAM): Heterogeneous dependency information from multiple sources is integrated, while historical temporal context is incorporated to model the evolving patterns of dependencies. The purpose of this step is to maintain contextual coherence and enhance predictive capability in handling complex dependency relationships.
- (4) Contrastive Learning-Based Dependency Analysis (CLM): Based on the aggregated features, the model outputs potential dependency edges along with their types. The goal of this step is to improve both the accuracy and stability of dependency prediction throughout the code evolution process.
4.1. Change-Based Temporal Subtree Segmentation of DAST
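The segmentation step outlined above can be sketched with Python's `ast` and `difflib` modules: diff two code versions, then cut out the smallest top-level statements (AST subtrees) whose line spans overlap a change. The line-diff heuristic, function names, and top-level granularity are our illustrative assumptions, not the paper's algorithm.

```python
# Hypothetical sketch of change-driven subtree segmentation, assuming a
# simple line-diff heuristic; not the paper's implementation.
import ast
import difflib

def changed_lines(old_src: str, new_src: str) -> set[int]:
    """Line numbers (1-based, in new_src) touched by the edit."""
    lines = set()
    matcher = difflib.SequenceMatcher(a=old_src.splitlines(), b=new_src.splitlines())
    for tag, _i1, _i2, j1, j2 in matcher.get_opcodes():
        if tag != "equal":
            lines.update(range(j1 + 1, j2 + 1))
    return lines

def segment_subtrees(new_src: str, touched: set[int]) -> list[ast.stmt]:
    """Top-level statements of new_src whose line span overlaps a change."""
    tree = ast.parse(new_src)
    out = []
    for stmt in tree.body:
        span = range(stmt.lineno, (stmt.end_lineno or stmt.lineno) + 1)
        if touched.intersection(span):
            out.append(stmt)
    return out

old = "x = 1\ny = x + 1\nprint(y)\n"
new = "x = 1\ny = x * 2\nprint(y)\n"
subtrees = segment_subtrees(new, changed_lines(old, new))
# Only the modified assignment `y = x * 2` forms a change subtree.
```

Each such subtree would then serve as one temporal unit for the downstream aggregation modules.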
4.2. Aggregating Dynamic Syntactic Structural Features
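One way to realize syntax-constrained subtree aggregation is recursive, attention-weighted pooling over children, where the parent's own feature scores its children. The scoring rule and residual-style fusion below are illustrative assumptions standing in for DSAM, not its actual definition.

```python
# Minimal sketch of syntax-aware subtree aggregation: each AST node carries a
# feature vector; a parent representation is the attention-weighted sum of its
# (recursively aggregated) children. All names are illustrative assumptions.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def aggregate(node):
    """node = (feature_vector, [children]); returns the subtree representation."""
    feat, children = node
    if not children:
        return feat
    child_reprs = [aggregate(c) for c in children]
    weights = softmax([dot(feat, h) for h in child_reprs])
    pooled = [sum(w * h[i] for w, h in zip(weights, child_reprs))
              for i in range(len(feat))]
    # Residual-style fusion of the node's own feature with its children.
    return [f + p for f, p in zip(feat, pooled)]

# A two-level subtree: e.g. an Assign node with a Name and a BinOp child.
leaf_a = ([1.0, 0.0], [])
leaf_b = ([0.0, 1.0], [])
root = ([0.5, 0.5], [leaf_a, leaf_b])
rep = aggregate(root)
```

Because aggregation recurses bottom-up, local edits such as expression replacement change only the representations along the path from the edited leaf to the root.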
4.3. Dependency Source Analysis and Temporal Semantic Feature Aggregation
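The temporal side of this step can be illustrated with two ingredients that match the symbol table later in the paper: a functional time encoding of the interval between code changes, and a decay-weighted fusion of a node's stored (memory) representation with its freshly computed feature. The frequencies and the exponential fusion rule are our assumptions for illustration; LGDAM's source fusion is not sketched here.

```python
# Hedged sketch of temporal semantic aggregation: sinusoidal time encoding
# plus exponential decay fusion. Concrete forms are illustrative assumptions.
import math

def time_encode(delta_t: float, dim: int = 4):
    """Functional time encoding: cosine of the interval at geometric frequencies."""
    freqs = [1.0 / (10 ** i) for i in range(dim)]
    return [math.cos(w * delta_t) for w in freqs]

def decay_fuse(memory, fresh, delta_t, decay=0.1):
    """Older memory contributes less: its weight falls off exponentially with the interval."""
    alpha = math.exp(-decay * delta_t)
    return [alpha * m + (1 - alpha) * f for m, f in zip(memory, fresh)]

enc = time_encode(0.0)                                   # zero interval: all cosines are 1
fused = decay_fuse([1.0, 1.0], [0.0, 0.0], delta_t=5.0)  # memory weight = exp(-0.5)
```

Interval-aware weighting of this kind is what lets outdated dependency edges fade without being deleted outright.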
4.4. Contrastive Learning and Loss Function
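The contrastive component can be illustrated with a standard InfoNCE-style loss over true versus corrupted dependency pairs: the true (source, target) edge should score above randomly corrupted targets. The dot-product scorer and temperature are our assumptions, standing in for the paper's CLM.

```python
# Sketch of an InfoNCE-style contrastive objective over dependency edges.
# Plain-Python stand-in; scoring function and temperature are illustrative.
import math

def score(u, v):
    return sum(a * b for a, b in zip(u, v))

def info_nce(anchor, positive, negatives, temperature=1.0):
    logits = [score(anchor, positive) / temperature] + \
             [score(anchor, n) / temperature for n in negatives]
    m = max(logits)
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(logits[0] - log_z)   # negative log-softmax of the positive pair

anchor = [1.0, 0.0]
pos = [1.0, 0.0]                   # true dependency target: aligned with anchor
negs = [[0.0, 1.0], [-1.0, 0.0]]   # corrupted targets
loss = info_nce(anchor, pos, negs)
```

Swapping the positive with a corrupted target raises the loss, which is exactly the gradient signal that pushes true dependency edges above spurious ones.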
5. Experiments and Analysis
5.1. Dataset and Data Preprocessing
5.2. Evaluation Metrics
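For the threshold-free ranking metrics reported in the result tables (AUC, AP, MRR), pure-Python stand-ins make the definitions concrete; Dep_Acc is paper-specific and not reproduced here.

```python
# Illustrative definitions of two link-prediction metrics: AUC as the
# probability that a random positive outranks a random negative (ties = 0.5),
# and MRR from the 1-based rank of the true target among candidates.
def auc(pos_scores, neg_scores):
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def mrr(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

auc_val = auc([0.9, 0.8], [0.3, 0.85])  # 3 of 4 pairs ordered correctly
mrr_val = mrr([1, 2, 4])                # true edge ranked 1st, 2nd, 4th
```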
5.3. Baselines
- CD-GCN [21]: Combines Graph Convolutional Networks (GCN) and Long Short-Term Memory (LSTM), extracting structural features via GCN and modeling temporal sequences with LSTM.
- DySAT [10]: Employs structural and temporal self-attention mechanisms to jointly model graph structure and temporal dynamics, combining flexible model capacity with computational efficiency.
- EvolveGCN [12]: Evolves GCN parameters over time using a recurrent neural network, effectively modeling structural changes.
- TGAT [22]: Integrates self-attention with functional time encoding (via Bochner’s theorem), aggregating temporal neighbors to capture both dynamic topology and temporal interactions.
- DyGNN [23]: Incorporates update and propagation modules, where updates modify node features upon edge arrivals and propagations diffuse the updates across neighbors.
- HGNN+ [24]: Constructs hyperedge groups to capture high-order correlations among modalities or types, and uses adaptive fusion of hyperedge groups to integrate heterogeneous relational information.
- JLineVD+ [11]: A recent code-specific graph neural network designed for vulnerability detection in Java. It enhances subgraph construction through semantic-aware partitioning and integrates pretrained code representations from CodeBERT to strengthen code-level feature extraction and relational reasoning.
5.4. Experimental Setting
5.5. Experimental Results
- Dynamic Syntax Attention Layer (DSAM)—By recursively aggregating subtree information with syntactic constraints, DSAM effectively captures node types and hierarchical features. Compared to direct neighbor aggregation models (e.g., CD-GCN), DyTSSAM achieves an AUC improvement of +8.86 percentage points, confirming DSAM’s superior ability to encode fine-grained syntactic dependencies.
- Temporal Dependency Attention Layer (TDAM)—With learnable temporal encodings and interval-aware weighting, TDAM dynamically captures dependency evolution. This design substantially improves Dep_Acc, with DyTSSAM outperforming TGAT by +6.99 percentage points, indicating its capacity to model long-term dependency shifts and reduce noise from outdated edges.
- Local–Global Dependency Analysis Module (LGDAM)—By jointly modeling global structural dependencies and local contextual information, LGDAM enhances representational completeness. DyTSSAM achieves a Recall improvement of +6.38 percentage points over DyGCN, showing that LGDAM effectively improves dependency coverage and robustness, especially in large, evolving codebases.
5.6. Case Study: Fine-Grained Dependency Evolution Analysis
5.7. Ablation Study
- DyTSSAM-V1: Removes both the dynamic syntax module and temporal semantic module, retaining only conventional GCN and GRU components. This serves as a baseline to evaluate performance without the core innovations.
- DyTSSAM-V2: Retains the temporal semantic module but removes the dynamic syntax module, relying on standard GCN for structural aggregation. This variant isolates the effect of temporal semantics.
- DyTSSAM-V3: Retains the dynamic syntax module but removes the temporal semantic module, replacing it with GRU-based temporal aggregation. This variant isolates the effect of dynamic syntax modeling.
- DyTSSAM-V4: The complete model, serving as the benchmark for comparison.
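The four variants above amount to a 2×2 grid over the two core modules; a hypothetical configuration table makes the ablation design explicit (flag names are ours).

```python
# The ablation variants differ only in which core modules are enabled.
VARIANTS = {
    "V1": {"dynamic_syntax": False, "temporal_semantic": False},  # GCN + GRU baseline
    "V2": {"dynamic_syntax": False, "temporal_semantic": True},   # isolates temporal semantics
    "V3": {"dynamic_syntax": True,  "temporal_semantic": False},  # isolates dynamic syntax
    "V4": {"dynamic_syntax": True,  "temporal_semantic": True},   # full DyTSSAM
}
```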
6. Threats to Validity
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Fagan, M. Design and code inspections to reduce errors in program development. In Software Pioneers: Contributions to Software Engineering; Springer: Berlin/Heidelberg, Germany, 2011; pp. 575–607.
- Jin, Z.; Liu, F.; Li, G. Program comprehension: Present and future. Ruan Jian Xue Bao/J. Softw. 2019, 30, 110–126. (In Chinese). Available online: http://www.jos.org.cn/1000-9825/5643.htm (accessed on 21 July 2025).
- Deng, W.T.; Cheng, C.; He, P.; Chen, M.Y.; Li, B. Interaction prediction of multigranularity software system based on graph neural network. J. Softw. 2025, 36, 2043–2063. Available online: http://www.jos.org.cn/1000-9825/7207.htm (accessed on 21 July 2025).
- Zhang, Y.; Hu, Y.; Chen, X. Context and multi-features-based vulnerability detection: A vulnerability detection frame based on context slicing and multi-features. Sensors 2024, 24, 1351.
- Gu, S.; Chen, W. Function level code vulnerability detection method of graph neural network based on extended AST. Comput. Sci. 2023, 50, 283–290.
- Yao, W.; Jiang, Y.; Yang, Y. The metric for automatic code generation based on dynamic abstract syntax tree. Int. J. Digit. Crime Forensics 2023, 15, 20.
- Agarwal, S.; Agrawal, A.P. An empirical study of control dependency and data dependency for large software systems. In Proceedings of the 2014 5th International Conference—Confluence: The Next Generation Information Technology Summit, Noida, India, 25–26 September 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 877–879.
- Kalhauge, C.G.; Palsberg, J. Binary reduction of dependency graphs. In Proceedings of the 2019 27th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering, Tallinn, Estonia, 26–30 August 2019; ACM: New York, NY, USA, 2019; pp. 556–566.
- Guo, H.; Chen, X.; Huang, Y.; Wang, Y.; Ding, X.; Zheng, Z.; Zhou, X.; Dai, H. Snippet comment generation based on code context expansion. ACM Trans. Softw. Eng. Methodol. 2023, 33, 24.
- Roy, J.; Patel, R.; Simon, S. Dynamic Syntax Tree Model for Enhanced Source Code Representation. J. Softw. Eng. Res. Dev. 2023, preprint.
- Lekeufack Foulefack, R.Z.; Marchetto, A. Enhanced Graph Neural Networks for Vulnerability Detection in Java via Advanced Subgraph Construction. In Proceedings of the IFIP International Conference on Testing Software and Systems, London, UK, 30 October–1 November 2024; Springer Nature: Cham, Switzerland, 2024; pp. 131–148.
- Sankar, A.; Wu, Y.; Gou, L.; Zhang, W.; Yang, H. DySAT: Deep neural representation learning on dynamic graphs via self-attention networks. In Proceedings of the 13th International Conference on Web Search and Data Mining, Houston, TX, USA, 3–7 February 2020; ACM: New York, NY, USA, 2020; pp. 519–527.
- Pareja, A.; Domeniconi, G.; Chen, J.; Ma, T.; Suzumura, T.; Kanezashi, H.; Kaler, T.; Schardl, T.; Leiserson, C. EvolveGCN: Evolving graph convolutional networks for dynamic graphs. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; AAAI Press: Palo Alto, CA, USA, 2020; Volume 34, pp. 5363–5370.
- Cui, Z.; Li, Z.; Wu, S.; Zhang, X.; Liu, Q.; Wang, L.; Ai, M. DyGCN: Efficient dynamic graph embedding with graph convolutional network. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 4635–4646.
- Li, Z.L.; Zhang, G.W.; Yu, J.; Xu, L.Y. Dynamic graph structure learning for multivariate time series forecasting. Pattern Recognit. 2023, 138, 109423.
- Mu, Z.; Zhuang, Y.; Tang, S. Contrastive Hawkes graph neural networks with dynamic sampling for event prediction. Neurocomputing 2024, 575, 127265.
- Xia, Z.; Zhang, Y.; Yang, J.; Xie, L. Dynamic spatial–temporal graph convolutional recurrent networks for traffic flow forecasting. Expert Syst. Appl. 2024, 240, 122381.
- Jiang, Y.; Huang, P.; Gu, J. Analysis of the impact scope of code changes based on DAST and GCN. J. Kunming Univ. Sci. Technol. (Nat. Sci.) 2024, 49, 118–127.
- Martínez, V.; Berzal, F.; Cubero, J.C. A survey of link prediction in complex networks. ACM Comput. Surv. 2016, 49, 1–33.
- Zhou, T. Discriminating abilities of threshold-free evaluation metrics in link prediction. Phys. A Stat. Mech. Its Appl. 2025, 615, 128529.
- Manessi, F.; Rozza, A.; Manzo, M. Dynamic graph convolutional networks. Pattern Recognit. 2020, 97, 107000.
- Xu, D.; Ruan, C.; Korpeoglu, E.; Kumar, S.; Achan, K. Inductive representation learning on temporal graphs. arXiv 2020, arXiv:2002.07962.
- Ma, Y.; Guo, Z.; Ren, Z.; Tang, J.; Yin, D. Streaming graph neural networks. In Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual, 25–30 July 2020; ACM: New York, NY, USA, 2020; pp. 719–728.
- Gao, Y.; Feng, Y.; Ji, J.R. HGNN+: General hypergraph neural networks. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 3181–3199.
| Dependency Type | Parent Node Type | Semantic Description |
|---|---|---|
| Assign_VD | Assign → Assign, Call, Return, etc. | Variable dependency: a variable references another variable; the calling variable points to the referenced variable. |
| For_VD | For → Assign, Call, Return, etc. | Loop variable dependency: a statement accesses variables defined in the for loop header; the statement points to the loop variable. |
| VC | If, While, For → Assign, Call, Return, etc. | Control-flow dependency: statements inside a control block point to the controlling statement. |
| Arg | FunctionDef → Assign, Call, Return, etc. | Argument dependency: a statement in a function body references a parameter defined in the function header. |
| Call | Call → FunctionDef | Function call dependency: a statement calls a function; the Call node points to the corresponding FunctionDef node. |
| Ret | FunctionDef → Return | Return dependency: the return statement connects back to the function definition to represent the return path. |
| Glo | Assign → Global | Global declaration dependency: a global statement declares a local variable as global, pointing from the declaration to its defining statement. |
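As an illustration of the first row of the table (Assign_VD), a minimal Python `ast` walk can recover variable-to-variable dependencies from assignments: the assigned variable points to each variable its right-hand side reads. The function name and edge representation are ours, not the paper's extractor, and only simple single-target assignments are handled.

```python
# Illustrative extraction of Assign_VD edges (user variable -> used variable)
# from simple assignments, using only the standard-library ast module.
import ast

def assign_vd_edges(src: str):
    """Yield (user, used) variable pairs for simple single-target assignments."""
    edges = []
    for node in ast.walk(ast.parse(src)):
        if isinstance(node, ast.Assign) and isinstance(node.targets[0], ast.Name):
            user = node.targets[0].id
            for ref in ast.walk(node.value):
                if isinstance(ref, ast.Name) and isinstance(ref.ctx, ast.Load):
                    edges.append((user, ref.id))
    return edges

edges = assign_vd_edges("a = 1\nb = a + 1\nc = a + b\n")
# → [('b', 'a'), ('c', 'a'), ('c', 'b')]
```

The other dependency types in the table (For_VD, VC, Arg, Call, Ret, Glo) follow the same pattern with different parent node types.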
| Symbol | Definition |
|---|---|
| | Node feature matrix at time step t |
| | Set of syntactic edges in the AST at time t |
| | Set of dependency edges in the DAST at time t |
| | Feature matrix of syntactic edges at time t |
| | Memory bank storing node representations up to time t |
| | Updated representation of node v at time t |
| | Temporarily computed node feature before decay fusion |
| | Time interval between consecutive code changes |
| | Learnable temporal decay weight for node v |
| | Linear transformation matrices at layer l |
| | Query, Key, and Value vectors at layer l |
| | Enhanced dependency feature after LGDAM |
| | Encoded temporal feature by TimeEncoder |
| | Query, Key, and Value representations in TDAM |
| h | Output feature for dependency prediction |
| Model | AUC | Recall | AP | Dep_Acc | MRR |
|---|---|---|---|---|---|
| DyTSSAM | |||||
| CD-GCN |||||
| DySAT | |||||
| EvolveGCN | |||||
| DyGCN | |||||
| TGAT | |||||
| DyGNN | |||||
| HGNN+ |||||
| JLineVD+ |
| Model | AUC | AP | Recall | Dep_Acc | MRR |
|---|---|---|---|---|---|
| DyTSSAM-V1 | 0.6361 | 0.5593 | 0.5870 | 0.6746 | 0.4856 |
| DyTSSAM-V2 | 0.7457 | 0.7425 | 0.6400 | 0.7745 | 0.7223 |
| DyTSSAM-V3 | 0.8833 | 0.8733 | 0.7618 | 0.8724 | 0.8869 |
| DyTSSAM-V4 | 0.9540 | 0.9720 | 0.8500 | 0.9400 | 0.9215 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhao, Y.; Jiang, Y.; Huang, P. DyTSSAM: A Dynamic Dependency Analysis Model Based on DAST. Electronics 2025, 14, 4443. https://doi.org/10.3390/electronics14224443