High-Order Topology-Enhanced Graph Convolutional Networks for Dynamic Graphs
Abstract
1. Introduction
- We highlight the importance of group behaviors induced by high-order structures in graph dynamics. To model the distinct mechanisms of pairwise influence and group effects, we adopt a two-branch graph neural network, with one branch operating on the skeleton network and the other on the high-order structures.
- Our proposed method achieves the best results on the link prediction task across three real-world network datasets, with a maximum improvement of 633%. The ablation study further verifies the effectiveness of introducing high-order information.
- We use learnable parameters to tune the relative strength of high-order and pairwise relationships, and further compare various schemes for combining high- and low-order information. We find that a simple linear combination of high- and low-order information achieves satisfactory results.
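The two-branch design with a learnable linear combination can be sketched as below. This is a minimal NumPy illustration under assumptions of my own (the scalar `alpha`, the weight shapes, and symmetric normalization), not the authors' exact implementation:

```python
import numpy as np

def normalize_adj(adj):
    """Symmetrically normalize an adjacency matrix with self-loops:
    D^{-1/2} (A + I) D^{-1/2}, as in standard GCN propagation."""
    a = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def two_branch_layer(x, adj_pair, adj_high, w_pair, w_high, alpha):
    """One forward pass of a two-branch layer: each branch runs a GCN-style
    propagation (one on the pairwise skeleton, one on the high-order
    subgraph), and the outputs are mixed by a learnable scalar alpha."""
    h_pair = normalize_adj(adj_pair) @ x @ w_pair  # skeleton (pairwise) branch
    h_high = normalize_adj(adj_high) @ x @ w_high  # high-order branch
    return alpha * h_pair + (1.0 - alpha) * h_high  # linear ("Add") fusion
```

In practice `alpha` (or a per-dimension weight) would be a trainable parameter optimized jointly with `w_pair` and `w_high`.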
2. Related Work
2.1. Static Graph Representation Learning
2.2. Dynamic Graph Representation Learning
3. Method
3.1. Definitions
3.2. High-Order Subgraph
3.3. Model
4. Experiments
4.1. Datasets
4.2. Baselines
- GCN [8]. GCN is designed for static graph modeling. In the learning process, a single GCN is applied to each graph snapshot, and the loss is accumulated over time. When applied to dynamic graphs, this method uses only the information from the immediately preceding time step to predict the next one.
- GCN-GRU. A single GCN model is co-trained with a recurrent model using gated recurrent units (GRU), where the GCN models each time step and the GRU learns from the sequence of node embeddings.
- dyngraph2vec [20]. Three variations of the unsupervised graph representation model dyngraph2vec: (1) dyngraph2vecAE extends the autoencoders to the dynamic setting, (2) dyngraph2vecRNN utilizes sparsely connected long short-term memory (LSTM) networks, and (3) dyngraph2vecAERNN combines both autoencoders and LSTM networks.
- EvolveGCN [21]. It captures the mechanism of a dynamic graph by using an RNN to evolve the GCN parameters. Two different architectures, EvolveGCN-H and EvolveGCN-O, are designed to cope with different scenarios. The H version incorporates additional node embedding in the recurrent network, while the O version focuses on the structural change of the graph.
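The recurrent part of the GCN-GRU baseline can be sketched as a standard GRU update applied to per-snapshot node embeddings. This is a minimal NumPy sketch with assumed, illustrative weight names (`Wz`, `Uz`, etc.), not the baseline's actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h_prev, x, params):
    """One GRU update on node embeddings: x is the GCN output for the
    current snapshot, h_prev carries history from earlier snapshots."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h_prev @ Uz)              # update gate
    r = sigmoid(x @ Wr + h_prev @ Ur)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh)  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde
```

Iterating `gru_step` over the snapshot sequence yields temporally smoothed node embeddings for the final link prediction head.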
4.3. Tasks
4.4. Experimental Results
5. Discussion
5.1. Enhanced Models with Different Schemes
- Concat. This scheme concatenates the two representations, as is commonly done, and pairs the result with one MLP to learn the final node representations.
- Gated. The gated scheme employs a reset gate to determine how high-order information is combined with low-order information, while an update gate defines how much low-order information is retained at the current time step.
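The two fusion schemes above can be sketched as follows. The exact gate equations are not spelled out in the text, so the gated variant below is my GRU-style reading of the description (reset gate mixing high- into low-order information, update gate controlling how much low-order information is kept); all weight names are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def concat_scheme(h_low, h_high, w_mlp):
    """Concat: stack both representations and map back with one
    MLP layer (ReLU activation assumed here)."""
    return np.maximum(np.concatenate([h_low, h_high], axis=-1) @ w_mlp, 0.0)

def gated_scheme(h_low, h_high, Wr, Ur, Wu, Uu):
    """Gated: a reset gate controls how low-order information enters the
    candidate built from high-order information; an update gate controls
    how much raw low-order information survives."""
    r = sigmoid(h_low @ Wr + h_high @ Ur)   # reset gate
    u = sigmoid(h_low @ Wu + h_high @ Uu)   # update gate
    candidate = np.tanh(h_high + r * h_low)
    return u * h_low + (1.0 - u) * candidate
```

Both take per-node embeddings of shape `(num_nodes, dim)`; the concat MLP weight has shape `(2 * dim, dim)`.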
5.2. The Effectiveness of Introducing High-Order Structures
5.3. The Stability Brought by Introducing High-Order Structures
5.4. The Dataset
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Hamilton, W.L.; Ying, R.; Leskovec, J. Representation learning on graphs: Methods and applications. arXiv 2017, arXiv:1709.05584. [Google Scholar]
- Cai, H.; Zheng, V.W.; Chang, K.C.C. A Comprehensive Survey of Graph Embedding: Problems, Techniques, and Applications. IEEE Trans. Knowl. Data Eng. 2018, 30, 1616–1637. [Google Scholar] [CrossRef] [Green Version]
- Roweis, S.T.; Saul, L.K. Nonlinear dimensionality reduction by locally linear embedding. Science 2000, 290, 2323–2326. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Perozzi, B.; Al-Rfou, R.; Skiena, S. Deepwalk: Online learning of social representations. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, NY, USA, 24–27 August 2014; pp. 701–710. [Google Scholar]
- Grover, A.; Leskovec, J. node2vec: Scalable feature learning for networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 855–864. [Google Scholar]
- Wang, D.; Cui, P.; Zhu, W. Structural deep network embedding. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 1225–1234. [Google Scholar]
- Bruna, J.; Zaremba, W.; Szlam, A.; LeCun, Y. Spectral networks and locally connected networks on graphs. arXiv 2013, arXiv:1312.6203. [Google Scholar]
- Kipf, T.N.; Welling, M. Semi-Supervised Classification with Graph Convolutional Networks. In Proceedings of the International Conference on Learning Representations (ICLR), Toulon, France, 24–26 April 2017. [Google Scholar]
- Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Lio, P.; Bengio, Y. Graph attention networks. arXiv 2017, arXiv:1710.10903. [Google Scholar]
- Hamilton, W.L.; Ying, R.; Leskovec, J. Inductive representation learning on large graphs. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; pp. 1025–1035. [Google Scholar]
- Aggarwal, C.; Subbian, K. Evolutionary network analysis: A survey. ACM Comput. Surv. (CSUR) 2014, 47, 1–36. [Google Scholar] [CrossRef]
- Šiljak, D. Dynamic graphs. Nonlinear Anal. Hybrid Syst. 2008, 2, 544–567. [Google Scholar] [CrossRef]
- Zaki, A.; Attia, M.; Hegazy, D.; Amin, S. Comprehensive survey on dynamic graph models. Int. J. Adv. Comput. Sci. Appl. 2016, 7, 573–582. [Google Scholar] [CrossRef] [Green Version]
- Chen, B.; Fan, W.; Liu, J.; Wu, F.X. Identifying protein complexes and functional modules—From static PPI networks to dynamic PPI networks. Briefings Bioinform. 2014, 15, 177–194. [Google Scholar] [CrossRef] [Green Version]
- Berger-Wolf, T.Y.; Saia, J. A framework for analysis of dynamic social networks. In Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Philadelphia, PA, USA, 20–23 August 2006; pp. 523–528. [Google Scholar]
- Holm, A.N.; Plank, B.; Wright, D.; Augenstein, I. Longitudinal citation prediction using temporal graph neural networks. arXiv 2020, arXiv:2012.05742. [Google Scholar]
- Skarding, J.; Gabrys, B.; Musial, K. Foundations and modelling of dynamic networks using dynamic graph neural networks: A survey. arXiv 2020, arXiv:2005.07496. [Google Scholar]
- Zhu, L.; Guo, D.; Yin, J.; Ver Steeg, G.; Galstyan, A. Scalable temporal latent space inference for link prediction in dynamic social networks. IEEE Trans. Knowl. Data Eng. 2016, 28, 2765–2777. [Google Scholar] [CrossRef] [Green Version]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
- Goyal, P.; Chhetri, S.R.; Canedo, A. dyngraph2vec: Capturing network dynamics using dynamic graph representation learning. Knowl.-Based Syst. 2020, 187, 104816. [Google Scholar] [CrossRef]
- Pareja, A.; Domeniconi, G.; Chen, J.; Ma, T.; Suzumura, T.; Kanezashi, H.; Kaler, T.; Schardl, T.B.; Leiserson, C.E. EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs. In Proceedings of the AAAI, New York, NY, USA, 7–12 February 2020; pp. 5363–5370. [Google Scholar]
- Lu, Z.; Wahlström, J.; Nehorai, A. Community detection in complex networks via clique conductance. Sci. Rep. 2018, 8, 5982. [Google Scholar] [CrossRef] [PubMed]
- Soundarajan, S.; Hopcroft, J. Using community information to improve the precision of link prediction methods. In Proceedings of the 21st International Conference on World Wide Web, Lyon, France, 16–20 April 2012; pp. 607–608. [Google Scholar]
- Cong, W.; Wu, Y.; Tian, Y.; Gu, M.; Xia, Y.; Mahdavi, M.; Chen, C.C.J. Dynamic Graph Representation Learning via Graph Transformer Networks. arXiv 2021, arXiv:2111.10447. [Google Scholar]
- Pardalos, P.M.; Xue, J. The maximum clique problem. J. Glob. Optim. 1994, 4, 301–328. [Google Scholar] [CrossRef]
- Belkin, M.; Niyogi, P. Laplacian eigenmaps and spectral techniques for embedding and clustering. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 9–14 December 2002; pp. 585–591. [Google Scholar]
- Tenenbaum, J.B.; De Silva, V.; Langford, J.C. A global geometric framework for nonlinear dimensionality reduction. Science 2000, 290, 2319–2323. [Google Scholar] [CrossRef]
- Tang, J.; Qu, M.; Wang, M.; Zhang, M.; Yan, J.; Mei, Q. Line: Large-scale information network embedding. In Proceedings of the 24th International Conference on World Wide Web, Florence, Italy, 18–22 May 2015; pp. 1067–1077. [Google Scholar]
- Ribeiro, L.F.; Saverese, P.H.; Figueiredo, D.R. struc2vec: Learning node representations from structural identity. In Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Halifax, NS, Canada, 13–17 August 2017; pp. 385–394. [Google Scholar]
- Niepert, M.; Ahmed, M.; Kutzkov, K. Learning convolutional neural networks for graphs. In Proceedings of the International Conference on Machine Learning, New York, NY, USA, 19–24 June 2016; pp. 2014–2023. [Google Scholar]
- Zhou, J.; Cui, G.; Hu, S.; Zhang, Z.; Yang, C.; Liu, Z.; Wang, L.; Li, C.; Sun, M. Graph neural networks: A review of methods and applications. AI Open 2020, 1, 57–81. [Google Scholar] [CrossRef]
- Zhang, Z.; Cui, P.; Pei, J.; Wang, X.; Zhu, W. Timers: Error-bounded svd restart on dynamic networks. In Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018. [Google Scholar]
- Goyal, P.; Kamra, N.; He, X.; Liu, Y. Dyngem: Deep embedding method for dynamic graphs. arXiv 2018, arXiv:1805.11273. [Google Scholar]
- Zhou, L.; Yang, Y.; Ren, X.; Wu, F.; Zhuang, Y. Dynamic network embedding by modeling triadic closure process. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018; Volume 32. [Google Scholar]
- Seo, Y.; Defferrard, M.; Vandergheynst, P.; Bresson, X. Structured sequence modeling with graph convolutional recurrent networks. In Proceedings of the International Conference on Neural Information Processing, Siem Reap, Cambodia, 13–16 December 2018; Springer: Cham, Switzerland, 2018; pp. 362–373. [Google Scholar]
- Manessi, F.; Rozza, A.; Manzo, M. Dynamic graph convolutional networks. Pattern Recognit. 2020, 97, 107000. [Google Scholar] [CrossRef]
- Narayan, A.; Roe, P.H. Learning graph dynamics using deep neural networks. IFAC-PapersOnLine 2018, 51, 433–438. [Google Scholar] [CrossRef]
- Kazemi, S.M.; Goel, R.; Jain, K.; Kobyzev, I.; Sethi, A.; Forsyth, P.; Poupart, P. Representation Learning for Dynamic Graphs: A Survey. J. Mach. Learn. Res. 2020, 21, 1–73. [Google Scholar]
- Wasserman, S.; Faust, K. Social Network Analysis: Methods and Applications; Cambridge University Press: Cambridge, UK, 1994; p. 8. [Google Scholar]
- Bron, C.; Kerbosch, J. Algorithm 457: Finding all cliques of an undirected graph. Commun. ACM 1973, 16, 575–577. [Google Scholar] [CrossRef]
- Eppstein, D.; Löffler, M.; Strash, D. Listing all maximal cliques in sparse graphs in near-optimal time. In Proceedings of the International Symposium on Algorithms and Computation, Jeju Island, Korea, 15–17 December 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 403–414. [Google Scholar]
- Eppstein, D.; Löffler, M.; Strash, D. Listing all maximal cliques in large sparse real-world graphs. J. Exp. Algorithmics (JEA) 2013, 18, 3.1–3.21. [Google Scholar] [CrossRef]
- Srinivas, V.; Mitra, P. Link Prediction in Social Networks: Role of Power Law Distribution; Springer: New York, NY, USA, 2016. [Google Scholar]
Dataset | #Nodes | #Edges | #Edges/#Nodes | #Time Steps
---|---|---|---|---
UCI | 1899 | 119,670 | 63.02 | 88
HEP | 2885 | 1,277,556 | 442.83 | 54
DBLP | 11,919 | 30,330 | 2.545 | 27
Model | UCI | HEP | DBLP
---|---|---|---
dynAE | 0.0092 | 0.0120 | 0.0003
dynAERNN | 0.0040 | 0.0448 | 0.0020 **
dynRNN | 0.0007 | 0.0698 | OOM
GCN | 1.90 | 0.0230 | 2.54
GCN+GRU | 0.0050 | 0.1043 ** | 3.72
Egcn-H | 0.0126 | 0.0328 | 0.0009
Egcn-O | 0.0270 ** | 0.0409 | 0.0001
GCN(Add) | 4.10 | 0.0175 | 2.79
GCN+GRU(Add) | 0.0060 | 0.1143 * | 2.91
Egcn-H(Add) | 0.0229 | 0.0769 | 0.0066 *
Egcn-O(Add) | 0.0414 * | 0.0156 | 0.0006
Model | 0% | 0.5% | 2% | 5% | 10%
---|---|---|---|---|---
Egcn-O | 0.027 | 0.0251 | 0.0179 | 0.0124 | 0.0063
Egcn-O (Add) | 0.0414 | 0.0382 | 0.0221 | 0.0227 | 0.0186
Egcn-H | 0.0126 | 0.0107 | 0.0099 | 0.0108 | 0.0042
Egcn-H (Add) | 0.0229 | 0.0212 | 0.0110 | 0.0143 | 0.0084
Dataset | Size of the Maximum Clique |
---|---
UCI | 5 |
DBLP | 12 |
HEP | 47 |
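Maximum clique sizes like those in the table above can be obtained by enumerating maximal cliques with the Bron–Kerbosch algorithm cited in the references. A minimal pure-Python sketch (without the pivoting optimizations used for large sparse graphs):

```python
def bron_kerbosch(r, p, x, adj, cliques):
    """Classic Bron-Kerbosch enumeration of all maximal cliques.
    r: current clique, p: candidate vertices, x: excluded vertices;
    adj maps each node to its set of neighbors."""
    if not p and not x:
        cliques.append(set(r))
        return
    for v in list(p):
        bron_kerbosch(r | {v}, p & adj[v], x & adj[v], adj, cliques)
        p = p - {v}
        x = x | {v}

def max_clique_size(edges):
    """Size of the maximum clique in an undirected edge list,
    found by enumerating all maximal cliques."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    cliques = []
    bron_kerbosch(set(), set(adj), set(), adj, cliques)
    return max(len(c) for c in cliques)
```

For graphs the size of HEP, the near-optimal variants for sparse graphs (Eppstein et al., cited above) are the practical choice.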
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhu, J.; Li, B.; Zhang, Z.; Zhao, L.; Li, H. High-Order Topology-Enhanced Graph Convolutional Networks for Dynamic Graphs. Symmetry 2022, 14, 2218. https://doi.org/10.3390/sym14102218