CKGAT: Collaborative Knowledge-Aware Graph Attention Network for Top-N Recommendation
Abstract
1. Introduction
- We propose a novel method called collaborative knowledge-aware graph attention network (CKGAT) for top-N recommendation. This method can learn refined ripple set embeddings, thereby generating accurate user embeddings and item embeddings, so as to accurately capture users’ potential interests in items. To the best of our knowledge, it is the first method that uses the knowledge-aware graph attention network to learn the refined ripple set embeddings;
- We use the attention aggregator to generate accurate user embeddings and item embeddings for top-N recommendation;
- Extensive experiments on four real-world datasets demonstrate the superiority of our CKGAT in comparison with state-of-the-art methods.
2. Related Work
2.1. Embedding-Based Recommendation Methods
2.2. Connection-Based Recommendation Methods
2.3. Propagation-Based Recommendation Methods
2.3.1. User Representation-Refinement Approaches
2.3.2. Item Representation-Refinement Approaches
2.3.3. Both User and Item Representation-Refinement Approaches
3. Problem Formulation
4. Proposed CKGAT Method
4.1. Overall Framework of CKGAT
- Heterogeneous propagation layer. This layer is composed of the collaboration propagation module and the knowledge graph propagation module. The first module propagates collaborative signals through the user-item interactions to obtain the user initial entity set and the item initial entity set. On the basis of the two initial entity sets, the second module propagates knowledge associations along the links in the knowledge graph to obtain the user’s multi-hop ripple sets and the item’s multi-hop ripple sets. Then, this layer outputs these sets to the next layer;
- Knowledge-aware GAT-based attentive embedding layer. For each input ripple set of the user/item, this layer uses a knowledge-aware graph attention network to capture the topological proximity structures of the entities (i.e., items) in the ripple set to learn the high-order entity representations, thereby generating the ripple set embedding (i.e., vector). This layer also generates the user initial entity set embedding, the item initial entity set embedding, and the original representation (i.e., vector) of the item. Finally, all these embeddings (vectors) are output to the next layer;
- User-item interaction probability prediction layer. For the input embeddings, this layer uses the attention aggregator to learn the weight of each embedding and performs a weighted aggregation of these embeddings to generate the user embedding and the item embedding. The two embeddings are then used to calculate the predicted probability that the user will interact with the item. A simplified sketch of this three-layer pipeline is given after this list.
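To make the pipeline above concrete, the following is a minimal PyTorch-style sketch of the information flow through the three layers. It is written from the descriptions in this section rather than from the authors' released code: the helper `build_ripple_sets`, the class name `CKGATSketch`, the single-head additive attention, and all dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_ripple_sets(initial_entities, kg_triples, n_hops):
    """Heterogeneous propagation (sketch): starting from the initial entity set
    obtained from the user-item interactions, expand hop by hop along the
    knowledge graph triples (head, relation, tail)."""
    ripple_sets, frontier = [], set(initial_entities)
    for _ in range(n_hops):
        hop = [(h, r, t) for (h, r, t) in kg_triples if h in frontier]
        ripple_sets.append(hop)
        frontier = {t for (_, _, t) in hop}
    return ripple_sets


class CKGATSketch(nn.Module):
    """Sketch of the knowledge-aware GAT-based attentive embedding layer and
    the attention aggregator; single-head additive attention is a simplification."""

    def __init__(self, n_entities, n_relations, dim=64):
        super().__init__()
        self.entity_emb = nn.Embedding(n_entities, dim)
        self.relation_emb = nn.Embedding(n_relations, dim)
        self.gat_att = nn.Linear(3 * dim, 1)   # scores one (h, r, t) triple
        self.agg_att = nn.Linear(dim, 1)       # scores one set embedding

    def ripple_set_embedding(self, heads, relations, tails):
        """Attentive embedding of one ripple set (LongTensors of triple ids)."""
        h, r, t = self.entity_emb(heads), self.relation_emb(relations), self.entity_emb(tails)
        score = self.gat_att(torch.cat([h, r, t], dim=-1))       # (n_triples, 1)
        alpha = F.softmax(F.leaky_relu(score), dim=0)             # attention over triples
        return (alpha * t).sum(dim=0)                              # (dim,)

    def aggregate(self, set_embeddings):
        """Attention aggregator: weighted sum of the set embeddings. In the full
        model this list also contains the initial entity set embeddings and the
        item's original representation."""
        stacked = torch.stack(set_embeddings, dim=0)               # (m, dim)
        weights = F.softmax(self.agg_att(stacked), dim=0)          # (m, 1)
        return (weights * stacked).sum(dim=0)

    def predict(self, user_ripple_sets, item_ripple_sets):
        """user_/item_ripple_sets: lists of (heads, relations, tails) tensors,
        one triple batch per hop, produced by the heterogeneous propagation layer."""
        user_emb = self.aggregate([self.ripple_set_embedding(*s) for s in user_ripple_sets])
        item_emb = self.aggregate([self.ripple_set_embedding(*s) for s in item_ripple_sets])
        return torch.sigmoid((user_emb * item_emb).sum())          # interaction probability
```

The sketch only shows the information flow; per-hop transformation matrices, normalization, and training details of the actual model are omitted.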
4.2. Heterogeneous Propagation Layer
4.2.1. Collaboration Propagation Module
4.2.2. Knowledge Graph Propagation Module
4.3. Knowledge-Aware GAT-Based Attentive Embedding Layer
4.4. User-Item Interaction Probability Prediction Layer
4.5. Model Learning and Recommendation Generation
5. Experiments
5.1. Experimental Datasets
- Last.FM [39]. This dataset contains social networking, tagging, and music artist listening information from a set of 2000 users from the Last.fm online music system;
- Book-Crossing [40]. This dataset collects explicit ratings (ranging from 0 to 10) from different readers about various books in the book-crossing community;
- MovieLens 20M [41]. This dataset is a widely used benchmark dataset in movie recommendation, which contains approximately 20 million explicit user ratings for movies (ranging from one to five) on the MovieLens website;
- Dianping-Food [42]. This dataset is provided by Dianping.com and contains about 10 million interaction records (such as clicks and purchases) between approximately 2 million users and 1000 restaurants.
5.2. Comparison Methods
- Following the practice of selecting experimental comparison methods in existing research [2,3,4,5,6,7,8,15,21,23,24,28,29,30,33,34] in the field, we chose one classical collaborative filtering method, one typical embedding-based recommendation method, and six representative propagation-based recommendation methods (the mainstream approaches to knowledge graph-based recommendation);
- Unlike the above-mentioned works, which did not include a connection-based recommendation method in their comparisons, we additionally chose KPRN, a typical connection-based recommendation method, as a comparison method;
- BPRMF [45]. This method is a matrix factorization (MF) model optimized with Bayesian personalized ranking (BPR), obtained by applying the LearnBPR algorithm to MF (a minimal sketch of this pairwise objective is given after this list).
- CKE [2]. This method is a typical knowledge graph embedding-based recommendation method that combines the structural knowledge, textual knowledge and visual knowledge of items to learn item representations.
- KPRN [3]. This method is a typical connection-based recommendation method that generates path representations by combining the semantics of entities and relations and distinguishes the importance of different paths, thereby capturing user preferences.
- RippleNet [23]. This method is a classical propagation-based recommendation method that enhances user representations by propagating users’ potential preferences in the knowledge graph.
- CKAN [4]. This method is a propagation-based recommendation method that uses a heterogeneous propagation strategy and an attention network to learn ripple set embeddings, thereby generating user embeddings and item embeddings.
- KGCN [5]. This method is a propagation-based recommendation method that applies the graph convolutional network to the knowledge graph, aggregating neighborhood information to refine item representations.
- KGNN-LS [7]. This method is a propagation-based recommendation method that adds a label-smoothness mechanism to the KGCN framework to propagate user-interaction labels, so as to provide effective recommendations.
- KGAT [8]. This method is a propagation-based recommendation method that applies the graph attention network to the collaborative knowledge graph to learn user representations and item representations.
- KGIN [6]. This method is currently the state-of-the-art propagation-based recommendation method. It uses auxiliary item knowledge to explore the users’ intention behind the user-item interactions, thus refining the representations of users and items.
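For reference, the pairwise objective that BPRMF optimizes can be sketched as follows. The embedding size, regularization weight, and class name are illustrative assumptions; LearnBPR itself additionally prescribes bootstrap sampling of (user, positive item, negative item) triples.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BPRMF(nn.Module):
    """Matrix factorization scored by a dot product, trained with the BPR loss."""

    def __init__(self, n_users, n_items, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)

    def score(self, users, items):
        # predicted preference of each user for each item
        return (self.user_emb(users) * self.item_emb(items)).sum(dim=-1)

    def bpr_loss(self, users, pos_items, neg_items, reg=1e-5):
        # BPR requires each observed (positive) item to be ranked above a
        # sampled unobserved (negative) item for the same user.
        pos = self.score(users, pos_items)
        neg = self.score(users, neg_items)
        loss = -F.logsigmoid(pos - neg).mean()
        # L2 regularization on the embeddings involved in this batch
        reg_term = (self.user_emb(users).pow(2).sum()
                    + self.item_emb(pos_items).pow(2).sum()
                    + self.item_emb(neg_items).pow(2).sum())
        return loss + reg * reg_term
```

A training step would sample such triples from the implicit feedback, compute `bpr_loss`, and update the embeddings with stochastic gradient descent.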
5.3. Hyperparameter Settings
5.4. Experimental Results
5.4.1. Recommendation Accuracy (RQ1)
- CKGAT achieves the best recommendation accuracy across all evaluation metrics on the four datasets, with the exception of the precision metric on the MovieLens 20M dataset. This result shows that on the basis of the heterogeneous propagation strategy, the user embeddings and the item embeddings generated by the knowledge-aware GAT-based attentive embedding layer and the attention aggregator enable CKGAT to accurately capture the users’ potential interests and improve the accuracy of personalized recommendation.
- In terms of the precision metric on the MovieLens 20M dataset, CKGAT is slightly inferior to KGNN-LS, but is comparable with KGNN-LS in terms of Precision@20. Since N is usually set to 20–50 [54] in most practical application scenarios, CKGAT can still be considered to have good recommendation accuracy on the MovieLens 20M dataset. The likely reason for this exception is that the average number of user-item interactions per user is so large that propagating knowledge in the knowledge graph is almost ineffective. (A sketch of how these top-N metrics are computed is given after this list.)
- The recommendation accuracy of CKGAT on all the datasets is significantly better than CKAN. This result indicates that the topological proximity structures of entities in multi-hop ripple sets can effectively enrich the ripple set embeddings, thereby generating the refined user embeddings and item embeddings.
- We observed that the recommendation accuracy of CKGAT on all the datasets is overall better than that of all the comparison methods, and the recommendation accuracy of CKAN on all the datasets is overall better than that of the other comparison methods except KGIN. These results show that both CKGAT and CKAN, which adopt the heterogeneous propagation strategy, can enhance the user embeddings and the item embeddings by effectively combining the collaborative signals in the user-item interactions and the knowledge associations in the knowledge graph, thereby improving the recommendation performance.
- CKGAT, KGIN, CKAN, and KGAT outperform KGNN-LS, KGCN, and RippleNet in terms of most evaluation metrics on all the datasets. This result shows that CKGAT, KGIN, CKAN, and KGAT can use both the first-order user-item interaction information in the user-item interaction matrix and the knowledge associations in the knowledge graph to mine accurate user preferences.
- The seven propagation-based recommendation methods, including CKGAT, are significantly better than the baseline methods BPRMF, CKE and KPRN across all evaluation metrics on all the datasets. This result shows that the propagation-based methods can effectively exploit the high-order relations in the knowledge graph to more accurately capture the users’ potential interests in items.
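For clarity, the following is a small sketch of the standard top-N accuracy metrics reported here (Precision@N, Recall@N, F1-measure@N, NDCG@N) for a single user; the paper's exact evaluation protocol (candidate item sets, averaging over users) is not reproduced.

```python
import math


def topn_metrics(ranked_items, relevant_items, n=10):
    """Standard top-N metrics for one user.

    ranked_items: items sorted by predicted score, best first.
    relevant_items: set of items the user actually interacted with in the test split.
    """
    top = ranked_items[:n]
    hits = [1 if item in relevant_items else 0 for item in top]

    precision = sum(hits) / n
    recall = sum(hits) / max(len(relevant_items), 1)
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall > 0 else 0.0)

    # NDCG@N: discounted gain of the hits, normalized by the ideal ranking
    dcg = sum(h / math.log2(i + 2) for i, h in enumerate(hits))
    ideal_hits = min(len(relevant_items), n)
    idcg = sum(1 / math.log2(i + 2) for i in range(ideal_hits))
    ndcg = dcg / idcg if idcg > 0 else 0.0

    return precision, recall, f1, ndcg
```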
5.4.2. Recommendation Diversity (RQ2)
- The diversity of CKGAT’s recommendation results is significantly better than that of KGNN-LS, KGAT, CKAN, and KGIN, even though the recommendation accuracy of CKGAT on the MovieLens 20M dataset is only slightly better than that of these four methods.
- The diversity of KGIN’s recommendation results is slightly better than that of CKAN, KGAT and KGNN-LS. This may be due to the fact that KGIN handles user intent at a more fine-grained level, and considers multiple aspects of user intent.
- The diversity of CKAN’s recommendation results is slightly better than that of KGNN-LS and KGAT. This may be due to the fact that CKAN can extract a variety of user preferences and various item features in the process of heterogeneous propagation.
- The diversity of KGAT’s recommendation results is better than that of KGNN-LS. This may be because KGAT uses a collaborative knowledge graph containing user-item interaction information, which contains a wealth of user preference information.
5.4.3. Different Components’ Influences (RQ3)
- The first variant is obtained by removing the attention aggregator from CKGAT; its purpose is to verify the influence of the knowledge-aware GAT-based attentive embedding layer on recommendation accuracy.
- The second variant is obtained by removing the knowledge-aware GAT-based attentive embedding layer from CKGAT; its purpose is to verify the influence of the attention aggregator on recommendation accuracy.
- CKGAT significantly outperforms the two variants across all evaluation metrics on all the datasets, with the exception of the NDCG@10 metric on the Book-Crossing dataset. This shows that CKGAT can more effectively capture users’ preferences and improve recommendation accuracy by integrating the knowledge-aware GAT-based attentive embedding layer and the attention aggregator to learn user embeddings and item embeddings.
- The two variant methods are superior to CKAN (the basic method of CKGAT) across all the evaluation metrics on all the datasets, with the exception of the NDCG@10 metric on the MovieLens 20M dataset. This shows that using the knowledge-aware GAT-based attentive embedding layer to capture the topological proximity structures of entities in multi-hop ripple sets as well as using the attention aggregator to distinguish the importance of ripple set embeddings can refine the representations of users and the representations of items.
- The variant that retains the knowledge-aware GAT-based attentive embedding layer (i.e., the one without the attention aggregator) achieves better recommendation accuracy than the other variant in terms of most evaluation metrics on all the datasets. This shows that the topological proximity structures of the entities in multi-hop ripple sets play a more important role in learning the user representations and item representations.
5.4.4. Hyperparameter Sensitivity (RQ4)
- CKGAT achieves the best recommendation performance when L = 3 and L = 2 on the Last.FM and Book-Crossing datasets, respectively, and achieves the best recommendation performance when L = 1 on both the MovieLens 20M and Dianping-Food datasets.
- When L increases to four, the recommendation performance of CKGAT on the four datasets is the worst. This may be because when the maximum hop number is large, CKGAT introduces entities with lower relevance to users/items and generates inaccurate user/item embeddings, thus limiting the recommendation performance.
- CKGAT achieves the best recommendation performance when K = 3 and K = 4 on the Last.FM and Book-Crossing datasets, respectively, and achieves the best recommendation performance when K = 5 on both the MovieLens 20M and Dianping-Food datasets.
- When K decreases to two, the recommendation performance of CKGAT on the four datasets is the worst. This may be because when the number of neighbors is small, the high-order entity representations learned by CKGAT from the topological proximity structures are insufficient to support the generation of accurate user/item embeddings, thus limiting the recommendation performance.
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Guo, Q.; Zhuang, F.; Qin, C.; Zhu, H.; Xie, X.; Xiong, H.; He, Q. A Survey on Knowledge Graph-Based Recommender Systems. IEEE Trans. Knowl. Data Eng. 2020. Early Access.
- Zhang, F.; Yuan, N.J.; Lian, D.; Xie, X.; Ma, W.-Y. Collaborative Knowledge Base Embedding for Recommender Systems. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 353–362.
- Wang, X.; Wang, D.; Xu, C.; He, X.; Cao, Y.; Chua, T.-S. Explainable Reasoning over Knowledge Graphs for Recommendation. In Proceedings of the 33rd AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 5329–5336.
- Wang, Z.; Lin, G.; Tan, H.; Chen, Q.; Liu, X. CKAN: Collaborative Knowledge-aware Attentive Network for Recommender Systems. In Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual Event, 25–30 July 2020; pp. 219–228.
- Wang, H.; Zhao, M.; Xie, X.; Li, W.; Guo, M. Knowledge Graph Convolutional Networks for Recommender Systems. In Proceedings of the 28th World Wide Web Conference, WWW 2019, San Francisco, CA, USA, 13–17 May 2019; pp. 3307–3313.
- Wang, X.; Huang, T.; Wang, D.; Yuan, Y.; Liu, Z.; He, X.; Chua, T.-S. Learning Intents behind Interactions with Knowledge Graph for Recommendation. In Proceedings of the Web Conference 2021, Ljubljana, Slovenia, 19–23 April 2021; pp. 878–887.
- Wang, H.; Zhang, F.; Zhang, M.; Leskovec, J.; Zhao, M.; Li, W.; Wang, Z. Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 968–977.
- Wang, X.; He, X.; Cao, Y.; Liu, M.; Chua, T.-S. KGAT: Knowledge Graph Attention Network for Recommendation. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 950–958.
- Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Yu, P.S. A Comprehensive Survey on Graph Neural Networks. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 4–24.
- Yang, J.-H.; Chen, C.-M.; Wang, C.-J.; Tsai, M.-F. HOP-rec: High-order proximity for implicit recommendation. In Proceedings of the 12th ACM Conference on Recommender Systems, Vancouver, BC, Canada, 2–7 October 2018; pp. 140–144.
- Ai, Q.; Azizi, V.; Chen, X.; Zhang, Y. Learning Heterogeneous Knowledge Base Embeddings for Explainable Recommendation. Algorithms 2018, 11, 137.
- Wang, H.; Zhang, F.; Xie, X.; Guo, M. DKN: Deep Knowledge-Aware Network for News Recommendation. In Proceedings of the 2018 World Wide Web Conference on World Wide Web, Lyon, France, 23–27 April 2018; pp. 1835–1844.
- Goodfellow, I.J.; Bengio, Y.; Courville, A.C. Convolutional Networks. In Deep Learning; MIT Press: Cambridge, MA, USA, 2016; pp. 330–372. Available online: http://www.deeplearningbook.org/contents/convnets.html (accessed on 14 December 2021).
- Ji, G.; He, S.; Xu, L.; Liu, K.; Zhao, J. Knowledge Graph Embedding via Dynamic Mapping Matrix. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Beijing, China, 26–31 July 2015; pp. 687–696.
- Cao, Y.; Wang, X.; He, X.; Hu, Z.; Chua, T.-S. Unifying Knowledge Graph Learning and Recommendation: Towards a Better Understanding of User Preferences. In Proceedings of the 28th World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019; pp. 151–161.
- Hu, B.; Shi, C.; Zhao, W.X.; Yu, P.S. Leveraging Meta-path based Context for Top-N Recommendation with A Neural Co-Attention Model. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, London, UK, 19–23 August 2018; pp. 1531–1540.
- Yu, X.; Ren, X.; Sun, Y.; Sturt, B.; Khandelwal, U.; Gu, Q.; Norick, B.; Han, J. Recommendation in heterogeneous information networks with implicit user feedback. In Proceedings of the 7th ACM Conference on Recommender Systems, Hong Kong, China, 12–16 October 2013; pp. 347–350.
- Zhao, H.; Yao, Q.; Li, J.; Song, Y.; Lee, D.L. Meta-Graph Based Recommendation Fusion over Heterogeneous Information Networks. In Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Halifax, NS, Canada, 13–17 August 2017; pp. 635–644.
- Luo, C.; Pang, W.; Wang, Z.; Lin, C. Hete-CF: Social-Based Collaborative Filtering Recommendation Using Heterogeneous Relations. In Proceedings of the 2014 IEEE International Conference on Data Mining, Shenzhen, China, 14–17 December 2014; pp. 917–922.
- Huang, X.; Fang, Q.; Qian, S.; Sang, J.; Li, Y.; Xu, C. Explainable Interaction-driven User Modeling over Knowledge Graph for Sequential Recommendation. In Proceedings of the 27th ACM International Conference on Multimedia, Nice, France, 21–25 October 2019; pp. 548–556.
- Sun, Z.; Yang, J.; Zhang, J.; Bozzon, A.; Huang, L.-K.; Xu, C. Recurrent knowledge graph embedding for effective recommendation. In Proceedings of the 12th ACM Conference on Recommender Systems, Vancouver, BC, Canada, 2–7 October 2018; pp. 297–305.
- Goodfellow, I.J.; Bengio, Y.; Courville, A.C. Sequence Modeling: Recurrent and Recursive Nets. In Deep Learning; MIT Press: Cambridge, MA, USA, 2016. Available online: http://www.deeplearningbook.org/contents/rnn.html (accessed on 14 December 2021).
- Wang, H.; Zhang, F.; Wang, J.; Zhao, M.; Li, W.; Xie, X.; Guo, M. RippleNet: Propagating User Preferences on the Knowledge Graph for Recommender Systems. In Proceedings of the 27th ACM International Conference on Information and Knowledge Management, Torino, Italy, 22–26 October 2018; pp. 417–426.
- He, M.; Zhang, H.; Wen, H. RE-KGR: Relation-Enhanced Knowledge Graph Reasoning for Recommendation. In Proceedings of the Database Systems for Advanced Applications—26th International Conference, Taipei, Taiwan, 11–14 April 2021; pp. 297–305.
- Hamilton, W.L.; Ying, Z.; Leskovec, J. Inductive Representation Learning on Large Graphs. In Proceedings of the 31st Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; pp. 1024–1034. Available online: https://papers.nips.cc/paper/6703-inductive-representation-learning-on-large-graphs (accessed on 14 December 2021).
- Schlichtkrull, M.S.; Kipf, T.N.; Bloem, P.; Berg, R.; Titov, I.; Welling, M. Modeling Relational Data with Graph Convolutional Networks. In Proceedings of the Semantic Web—15th International Conference, Heraklion, Crete, Greece, 3–7 June 2018; pp. 593–607.
- Niepert, M.; Ahmed, M.; Kutzkov, K. Learning Convolutional Neural Networks for Graphs. In Proceedings of the 33rd International Conference on Machine Learning, New York, NY, USA, 19–24 June 2016; pp. 2014–2023. Available online: https://proceedings.mlr.press/v48/niepert16.html (accessed on 14 December 2021).
- Mu, N.; Zha, D.; Gong, R. Gated Knowledge Graph Neural Networks for Top-N Recommendation System. In Proceedings of the 24th IEEE International Conference on Computer Supported Cooperative Work in Design, Dalian, China, 5–7 May 2021; pp. 1111–1116.
- Tu, K.; Cui, P.; Wang, D.; Zhang, Z.; Zhou, J.; Qi, Y.; Zhu, W. Conditional Graph Attention Networks for Distilling and Refining Knowledge Graphs in Recommendation. In Proceedings of the 30th ACM International Conference on Information and Knowledge Management, Virtual Event, Gold Coast, QLD, Australia, 1–5 November 2021; pp. 1834–1843.
- Wang, Y.; Liu, Z.; Fan, Z.; Sun, L.; Yu, P.S. DSKReG: Differentiable Sampling on Knowledge Graph for Recommendation with Relational GNN. In Proceedings of the 30th ACM International Conference on Information and Knowledge Management, Gold Coast, QLD, Australia, 1–5 November 2021; pp. 3513–3517.
- Lin, Y.; Liu, Z.; Sun, M.; Liu, Y.; Zhu, X. Learning Entity and Relation Embeddings for Knowledge Graph Completion. In Proceedings of the 29th AAAI Conference on Artificial Intelligence, Austin, TX, USA, 25–30 January 2015; pp. 2181–2187. Available online: https://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/view/9571 (accessed on 14 December 2021).
- Zhao, J.; Zhou, Z.; Guan, Z.; Zhao, W.; Ning, W.; Qiu, G.; He, X. IntentGC: A Scalable Graph Convolution Framework Fusing Heterogeneous Information for Recommendation. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 2347–2357.
- Huang, R.; Han, C.; Cui, L. Entity-aware Collaborative Relation Network with Knowledge Graph for Recommendation. In Proceedings of the 30th ACM International Conference on Information and Knowledge Management, Gold Coast, QLD, Australia, 1–5 November 2021; pp. 3098–3102.
- Lo, K.; Ishigaki, T. X-2ch: Quad-Channel Collaborative Graph Network over Knowledge-Embedded Edges. In Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual Event, 11–15 July 2021; pp. 2076–2080.
- Velickovic, P.; Cucurull, G.; Casanova, A.; Romero, A.; Liò, P.; Bengio, Y. Graph Attention Networks. In Proceedings of the 6th International Conference on Learning Representations, Vancouver, BC, Canada, 30 April–3 May 2018. Available online: https://openreview.net/forum?id=rJXMpikCZ (accessed on 14 December 2021).
- Hu, D. An Introductory Survey on Attention Mechanisms in NLP Problems. In Proceedings of the 2019 Intelligent Systems Conference, London, UK, 5–6 September 2019; pp. 432–448.
- Yang, Z.; Yang, D.; Dyer, C.; He, X.; Smola, A.J.; Hovy, E.H. Hierarchical Attention Networks for Document Classification. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, CA, USA, 12–17 June 2016; pp. 1480–1489.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention is All you Need. In Proceedings of the Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, Long Beach, CA, USA, 4–9 December 2017; pp. 5998–6008. Available online: http://papers.nips.cc/paper/7181-attention-is-all-you-need (accessed on 14 December 2021).
- The Last.FM Dataset. Available online: https://grouplens.org/datasets/hetrec-2011/ (accessed on 14 December 2021).
- The Book-Crossing Dataset. Available online: https://grouplens.org/datasets/book-crossing/ (accessed on 14 December 2021).
- The MovieLens 20M Dataset. Available online: https://grouplens.org/datasets/movielens/20m/ (accessed on 14 December 2021).
- The Dianping-Food Dataset. Available online: https://www.dianping.com/ (accessed on 14 December 2021).
- The Preprocessed Last.FM, Book-Crossing and MovieLens 20M Datasets and Their Corresponding Knowledge Graphs. Available online: https://github.com/weberrr/CKAN/tree/master/data (accessed on 14 December 2021).
- The Preprocessed Dianping-Food Dataset and its Corresponding Knowledge Graph. Available online: https://github.com/hwwang55/KGNN-LS/tree/master/data/restaurant (accessed on 14 December 2021).
- Rendle, S.; Freudenthaler, C.; Gantner, Z.; Schmidt-Thieme, L. BPR: Bayesian Personalized Ranking from Implicit Feedback. In Proceedings of the 25th Conference on Uncertainty in Artificial Intelligence, Montreal, QC, Canada, 18–21 June 2009; pp. 452–461.
- Source Code of BPRMF, CKE and KGAT. Available online: https://github.com/xiangwang1223/knowledge_graph_attention_network/tree/master/Model (accessed on 14 December 2021).
- Source Code of KPRN. Available online: https://github.com/xiangwang1223/KPRN (accessed on 14 December 2021).
- Source Code of RippleNet. Available online: https://github.com/hwwang55/RippleNet (accessed on 14 December 2021).
- Source Code of CKAN. Available online: https://github.com/weberrr/CKAN (accessed on 14 December 2021).
- Source Code of KGCN. Available online: https://github.com/hwwang55/KGCN (accessed on 14 December 2021).
- Source Code of KGNN-LS. Available online: https://github.com/hwwang55/KGNN-LS (accessed on 14 December 2021).
- Source Code of KGIN. Available online: https://github.com/huangtinglin/Knowledge_Graph_based_intent_Network (accessed on 14 December 2021).
- Aggarwal, C.C. Evaluating Recommender Systems. In Recommender Systems, 1st ed.; Springer: Cham, Switzerland, 2016; pp. 225–254.
- Herlocker, J.L.; Konstan, J.A.; Riedl, J. An empirical analysis of design choices in neighborhood-based collaborative filtering algorithms. Inf. Retr. 2002, 5, 287–310.
- Hu, L.; Cao, L.; Wang, S.; Xu, G.; Cao, J.; Gu, Z. Diversifying Personalized Recommendation with User-session Context. In Proceedings of the 26th International Joint Conference on Artificial Intelligence, Melbourne, Australia, 19–25 August 2017; pp. 1858–1864.
| Datasets | | Last.FM | Book-Crossing | MovieLens 20M | Dianping-Food |
|---|---|---|---|---|---|
| User-item interactions | #users | 1872 | 17,860 | 138,159 | 2,298,698 |
| | #items | 3846 | 14,967 | 16,954 | 1362 |
| | #interactions | 42,346 | 139,746 | 13,501,622 | 23,416,418 |
| Knowledge graph | #entities | 9366 | 77,903 | 102,569 | 28,115 |
| | #relations | 60 | 25 | 32 | 7 |
| | #triples | 15,518 | 151,500 | 499,474 | 160,519 |
| Datasets | Metrics | CKAN | CKGAT Variant 1 | CKGAT Variant 2 | CKGAT |
|---|---|---|---|---|---|
| Last.FM | precision@10 | 0.0310 | 0.0353 | 0.0338 | 0.0390 |
| | recall@10 | 0.1215 | 0.1383 | 0.1345 | 0.1481 |
| | F1-measure@10 | 0.0469 | 0.0562 | 0.0530 | 0.0587 |
| | NDCG@10 | 0.0893 | 0.0935 | 0.0928 | 0.0984 |
| Book-Crossing | precision@10 | 0.0149 | 0.0152 | 0.0154 | 0.0157 |
| | recall@10 | 0.0537 | 0.0562 | 0.0553 | 0.0603 |
| | F1-measure@10 | 0.0183 | 0.0185 | 0.0186 | 0.0193 |
| | NDCG@10 | 0.0623 | 0.0628 | 0.0640 | 0.0635 |
| MovieLens 20M | precision@10 | 0.0610 | 0.0622 | 0.0628 | 0.0640 |
| | recall@10 | 0.1273 | 0.1282 | 0.1280 | 0.1319 |
| | F1-measure@10 | 0.0687 | 0.0692 | 0.0694 | 0.0719 |
| | NDCG@10 | 0.0962 | 0.0923 | 0.0935 | 0.0935 |
| Dianping-Food | precision@10 | 0.0360 | 0.0430 | 0.0406 | 0.0450 |
| | recall@10 | 0.1254 | 0.1545 | 0.1432 | 0.1855 |
| | F1-measure@10 | 0.0500 | 0.0566 | 0.0542 | 0.0632 |
| | NDCG@10 | 0.0883 | 0.0993 | 0.0965 | 0.1168 |
L | 1 | 2 | 3 | 4 |
---|---|---|---|---|
Last.FM | 0.1042 | 0.1269 | 0.1481 | 0.1203 |
Book-Crossing | 0.0371 | 0.0603 | 0.0417 | 0.0201 |
MovieLens 20M | 0.1319 | 0.1072 | 0.0808 | 0.0742 |
Dianping-Food | 0.1855 | 0.1409 | 0.1125 | 0.1061 |
K | 2 | 3 | 4 | 5 |
---|---|---|---|---|
Last.FM | 0.1173 | 0.1481 | 0.1071 | 0.0762 |
Book-Crossing | 0.0208 | 0.0298 | 0.0603 | 0.0450 |
MovieLens 20M | 0.0477 | 0.0672 | 0.1014 | 0.1319 |
Dianping-Food | 0.0570 | 0.1257 | 0.1345 | 0.1855 |