A Hyperbolic Graph Neural Network Model with Contrastive Learning for Rating–Review Recommendation
Abstract
1. Introduction
- (1) We propose a dual-graph strategy: a review-aware graph, in which users and items constitute nodes and review texts serve as edges, captures fine-grained user opinions and item descriptions, whereas a user–item graph, with rating scores as edges, encodes macro-level interaction strengths. By jointly learning over these complementary views, the model explicitly disentangles and fuses textual semantics with numerical preferences, thereby learning effective user and item representations from complex user–item interaction patterns.
- (2) We introduce a hyperbolic graph neural network framework that unifies hyperbolic embeddings and graph neural networks within a single representation space. By jointly performing hierarchical learning and structured feature extraction in hyperbolic space, the framework markedly enhances the model’s adaptability to the global properties of interaction data and substantially improves its capacity to represent complex graph structures.
- (3) We introduce a contrastive learning framework to optimize the representations of users and items across the two views. By performing contrastive learning tasks in hyperbolic space, we enhance the discriminative power of user and item representations. This process refines the quality of representations in the hyperbolic graph space, enabling the model to capture more nuanced and complex patterns in the data (an illustrative sketch of the hyperbolic projection and cross-view contrastive objective follows this list).
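To make contributions (2) and (3) more concrete, the sketch below shows one plausible realization: Euclidean embeddings are projected onto the Poincaré ball via the exponential map at the origin, and a cross-view InfoNCE-style contrastive loss uses the negative hyperbolic distance as the similarity score. This is a minimal sketch under assumed choices (curvature −1, origin-based exponential map, temperature `tau`, PyTorch); the actual formulation in Section 3 may differ in manifold, curvature, or loss details.

```python
import torch
import torch.nn.functional as F

EPS = 1e-7

def expmap0(v: torch.Tensor) -> torch.Tensor:
    """Exponential map at the origin of the Poincare ball (curvature -1):
    maps Euclidean (tangent-space) vectors onto points inside the unit ball."""
    norm = v.norm(dim=-1, keepdim=True).clamp_min(EPS)
    return torch.tanh(norm) * v / norm

def poincare_dist(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Pairwise Poincare distance between the rows of x and the rows of y."""
    sq = torch.cdist(x, y) ** 2                                # ||x - y||^2
    den = (1 - x.pow(2).sum(-1, keepdim=True)) * \
          (1 - y.pow(2).sum(-1)).unsqueeze(0)                  # (1-||x||^2)(1-||y||^2)
    return torch.acosh(1 + 2 * sq / den.clamp_min(EPS))

def hyperbolic_infonce(z_a: torch.Tensor, z_b: torch.Tensor, tau: float = 0.2) -> torch.Tensor:
    """InfoNCE-style contrastive loss where similarity is the negative
    hyperbolic distance between the two views of the same user/item."""
    logits = -poincare_dist(expmap0(z_a), expmap0(z_b)) / tau
    labels = torch.arange(z_a.size(0), device=z_a.device)      # i-th row matches i-th column
    return F.cross_entropy(logits, labels)

# toy usage: 8 users, 16-dim tangent-space embeddings from two graph views
ua = torch.randn(8, 16) * 0.1
ub = torch.randn(8, 16) * 0.1
print(hyperbolic_infonce(ua, ub).item())
```

In the intended setting, the two inputs would be the user (or item) embeddings produced by the user–item graph and the review-aware graph, respectively, so that the loss pulls the two views of the same node together and pushes different nodes apart.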
2. Related Work
2.1. Review-Based Recommendation Models
2.2. Graph Neural Network-Based Recommendation Models
2.3. Hyperbolic Recommendation Models
3. Methods and Models
3.1. Review-Aware Graph and User–Item Graph
3.1.1. Initialize the User–Item Graph
3.1.2. Initialize the Review-Aware Graph
3.2. Hyperbolic Embedding
Interactive Graph Generation and Representation
3.3. User–Item Graph Representation Learning
3.3.1. Hyperbolic Graph Convolutional Layer
3.3.2. Multi-Layer Embedding Aggregation
3.4. Review-Aware Graph Representation Learning
3.4.1. Review-Aware Message Passing
3.4.2. Review-Aware Graph Weighted Aggregation
3.4.3. Multi-Layer Embedding Aggregation
3.5. Cross-View Contrastive Learning
3.6. Interaction Representation Modeling
3.7. Rating Prediction
3.8. Model Training and Loss Function
3.9. Model Optimization
4. Experiments
4.1. Datasets
4.2. Evaluation Metrics
4.3. Comparison Models
- SVD [41] breaks down the user–item rating matrix into simpler low-rank matrices to uncover hidden semantic features, addressing issues of sparsity, scalability, and overfitting in recommendation systems (a toy sketch of a biased matrix-factorization predictor in this spirit follows this list).
- NCF [42] introduces a neural network architecture to learn the nonlinear interactions between users and items, addressing the limitations of traditional collaborative filtering methods in capturing complex user preferences and item features.
- DeepCoNN [43] integrates review text information to address a limitation of traditional recommendation systems that rely solely on rating data and therefore struggle to capture user preferences and item characteristics.
- NARRE [44] introduces an attention mechanism to model user reviews, automatically identifying the most influential review segments for rating prediction.
- DAML [45] uses rating information to guide review feature extraction and enhances rating prediction with review information, addressing the issue of insufficient integration of rating and review information in traditional recommendation systems.
- SDNet [37] incorporates external knowledge into recommendation systems efficiently through adversarial training, addressing the inefficiency and difficulty in integrating large-scale external knowledge in traditional recommendation systems.
- TransNets [46] learns the dynamic transformation relationships of user and item features, addressing the issue of overly static user preference and item characteristic representations in traditional recommendation systems that fail to capture dynamic interactions.
- GC-MC [47] casts rating prediction as matrix completion on the bipartite user–item interaction graph, using a graph convolutional auto-encoder to predict the missing ratings.
- RMG [48] constructs a heterogeneous user–item review graph and uses a hierarchical attention mechanism to model both internal review information and multiple reviews for users/items.
- SSG [49] models user review information from three different perspectives—sets, sequences, and graphs—to address the insufficient utilization of review information and the difficulty in capturing complex semantic relationships between users and items in traditional recommendation systems.
- RGCL [12] constructs a heterogeneous user–item review graph and enhances user and item representations using contrastive learning methods, addressing the underutilization of review information and the challenge of capturing deep semantic relationships between users and items in traditional recommendation systems.
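For readers less familiar with the baseline families above, the toy sketch below implements a biased matrix-factorization rating predictor in the spirit of SVD-style models [41], trained with the MSE objective used throughout Section 4. The class name `BiasedMF` and all hyperparameters (embedding size, learning rate, regularization, global mean) are illustrative placeholders, not the settings of the cited papers.

```python
import torch

class BiasedMF(torch.nn.Module):
    """Minimal biased matrix factorization: r_hat = mu + b_u + b_i + p_u . q_i."""
    def __init__(self, n_users: int, n_items: int, dim: int = 32, mu: float = 0.0):
        super().__init__()
        self.P = torch.nn.Embedding(n_users, dim)   # user latent factors
        self.Q = torch.nn.Embedding(n_items, dim)   # item latent factors
        self.bu = torch.nn.Embedding(n_users, 1)    # user bias
        self.bi = torch.nn.Embedding(n_items, 1)    # item bias
        self.mu = mu                                # global rating mean
        torch.nn.init.normal_(self.P.weight, std=0.01)
        torch.nn.init.normal_(self.Q.weight, std=0.01)
        torch.nn.init.zeros_(self.bu.weight)
        torch.nn.init.zeros_(self.bi.weight)

    def forward(self, u: torch.Tensor, i: torch.Tensor) -> torch.Tensor:
        dot = (self.P(u) * self.Q(i)).sum(-1)
        return self.mu + self.bu(u).squeeze(-1) + self.bi(i).squeeze(-1) + dot

# toy training step with the MSE loss reported as the evaluation metric
model = BiasedMF(n_users=100, n_items=50, mu=3.5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-5)
u = torch.randint(0, 100, (64,))
i = torch.randint(0, 50, (64,))
r = torch.randint(1, 6, (64,)).float()
loss = torch.nn.functional.mse_loss(model(u, i), r)
opt.zero_grad(); loss.backward(); opt.step()
```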
4.4. Performance Analysis and Discussion
4.5. Parameter Settings and Experimental Results Analysis
4.5.1. Parameter Settings
4.5.2. Analysis of Graph Aggregator Models
- Simple Datasets (Digital_Music, Toys_and_Games): The optimal configuration uses a depth of two layers for both graph aggregators (see the layer-setting table reported below). This suggests that shallow networks suffice to achieve peak performance, indicating that the graph structures of these datasets are relatively simple, and node relationships can be adequately captured through two-layer feature aggregation.
- Moderate-Complexity Datasets (Clothing, CDs_and_Vinyl): The optimal configuration shifts to a deeper setting, with one graph aggregator at three layers and the other at two. The deeper graph encoding layers likely facilitate modeling more intricate item-association patterns. For instance, in the Clothing dataset, user interactions may involve brand, style, or outfit compatibility, while higher-order relationships in the CDs_and_Vinyl dataset (e.g., artist affiliations, genres, or user preferences) necessitate additional graph convolution layers to enhance feature representation.
- High-Complexity Dataset (Yelp): The deepest network (three layers for both aggregators) is required, attributable to its more complex and heterogeneous interaction patterns. Yelp data typically encompasses multi-dimensional user reviews (e.g., ratings, textual feedback) alongside auxiliary information such as social relationships and geographical proximity. Thus, deeper networks are essential to integrate these higher-order interactive features effectively (a layer-stacking sketch follows this list).
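As a reference point for the depth analysis above, the following Euclidean simplification shows how a configurable number of graph-propagation layers can be stacked and their outputs averaged (multi-layer embedding aggregation in the spirit of Section 3.3.2). It is a LightGCN-style sketch for illustration only; the paper’s aggregators operate in hyperbolic space, and the toy adjacency below is an assumed example, not a real dataset.

```python
import torch

def propagate(adj_norm: torch.Tensor, emb0: torch.Tensor, n_layers: int) -> torch.Tensor:
    """Stack n_layers of neighborhood smoothing over a symmetrically normalized
    user-item adjacency, then average the outputs of all layers."""
    layers = [emb0]
    for _ in range(n_layers):
        layers.append(torch.sparse.mm(adj_norm, layers[-1]))
    return torch.stack(layers).mean(dim=0)          # multi-layer embedding aggregation

# toy bipartite graph: 3 users (nodes 0-2) and 2 items (nodes 3-4)
idx = torch.tensor([[0, 3], [1, 3], [2, 4], [3, 0], [3, 1], [4, 2]]).t()
deg = torch.tensor([1., 1., 1., 2., 1.])
val = 1.0 / (deg[idx[0]] * deg[idx[1]]).sqrt()      # D^{-1/2} A D^{-1/2} entries
adj = torch.sparse_coo_tensor(idx, val, (5, 5)).coalesce()
emb = torch.randn(5, 8)
for depth in (2, 3):                                 # the depths compared in Section 4.5.2
    print(depth, propagate(adj, emb, depth).shape)
```

Increasing `n_layers` lets each node absorb information from higher-order neighbors, which is why the more complex datasets in the analysis above favor deeper configurations.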
4.6. Ablation Analysis Experiments
- w/o T (No Text Feature Module): Removes the text encoder, retaining only the hyperbolic graph structure and contrastive learning components, so this variant relies solely on structured data (e.g., item IDs, user behavior sequences). The full model (Ours) achieves the best performance across all datasets, validating its holistic design, and removing the text module leads to the most significant performance drop (e.g., MSE rises from 0.7450 to 0.8654 on CDs_and_Vinyl), underscoring the indispensable role of textual information in item recommendation.
- w/o H (No Hyperbolic Projection): Replaces the hyperbolic space with traditional Euclidean GNNs while preserving other components. Its absence notably degrades performance on dense-interaction datasets (e.g., MSE rises from 0.7150 to 0.7511 on Digital_Music, the densest dataset), confirming hyperbolic space’s superiority in modeling hierarchical relations.
- w/o CL (No Contrastive Learning): Excludes the contrastive loss, relying exclusively on supervised learning signals without self-supervised representation enhancement. This component is critical for sparse interaction signals such as those in Yelp, where its removal increases MSE from 1.0500 to 1.1121 (roughly 5.9%), demonstrating its efficacy in mitigating data sparsity (the script after this list recomputes these relative changes from the ablation table).
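The relative MSE changes quoted above can be recomputed directly from the ablation results table reported below; the short script that follows does exactly that, using only the values in that table.

```python
# Relative MSE increase of each ablated variant over the full model,
# computed from the ablation table values.
full = {"Digital_Music": 0.7150, "Toys_and_Games": 0.7150, "Clothing": 0.9680,
        "CDs_and_Vinyl": 0.7450, "Yelp": 1.0500}
ablated = {
    "w/o T":  {"Digital_Music": 0.8326, "Toys_and_Games": 0.7995, "Clothing": 1.0558,
               "CDs_and_Vinyl": 0.8654, "Yelp": 1.1607},
    "w/o H":  {"Digital_Music": 0.7511, "Toys_and_Games": 0.7429, "Clothing": 1.0456,
               "CDs_and_Vinyl": 0.7787, "Yelp": 1.1203},
    "w/o CL": {"Digital_Music": 0.7441, "Toys_and_Games": 0.7404, "Clothing": 1.0576,
               "CDs_and_Vinyl": 0.7910, "Yelp": 1.1121},
}
for variant, row in ablated.items():
    deltas = {d: 100 * (row[d] - full[d]) / full[d] for d in full}
    print(variant, {d: f"{v:+.1f}%" for d, v in deltas.items()})
```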
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Musto, C.; Rossiello, G.; de Gemmis, M.; Lops, P.; Semeraro, G. Combining text summarization and aspect-based sentiment analysis of users’ reviews to justify recommendations. In Proceedings of the RecSys’19: Thirteenth ACM Conference on Recommender Systems, Copenhagen, Denmark, 16–20 September 2019; pp. 383–387.
- Wei, T.; Chow, T.W.; Ma, J.; Zhao, M. ExpGCN: Review-aware Graph Convolution Network for explainable recommendation. Neural Netw. 2023, 157, 202–215.
- Chen, J.; Cai, C.; Cai, Y.; Yan, F.; Li, J. Recommendation system based on improved graph neural networks. Int. J. Wire. Mob. Comput. 2024, 27, 290–296.
- Malitesta, D. Graph Neural Networks for Recommendation Leveraging Multimodal Information. SIGIR Forum 2024, 58, 1–2.
- Zhang, Y.; Zuo, W.; Shi, Z.; Adhikari, B.K. Integrating reviews and ratings into graph neural networks for rating prediction. J. Ambient Intell. Humaniz. Comput. 2023, 14, 8703–8723.
- Liu, Y.; Kertkeidkachorn, N.; Miyazaki, J.; Ichise, R. Review-enhanced contrastive learning on knowledge graphs for recommendation. Expert Syst. Appl. 2025, 277, 127250.
- Zou, D.; Wei, W.; Mao, X.L.; Wang, Z.; Qiu, M.; Zhu, F.; Cao, X. Multi-level Cross-view Contrastive Learning for Knowledge-aware Recommender System. In Proceedings of the SIGIR’22: The 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, Madrid, Spain, 11–15 July 2022; pp. 1358–1368.
- Zhang, X.; Xu, B.; Ma, F.; Wang, Z.; Yang, L.; Lin, H. Rethinking contrastive learning in session-based recommendation. Pattern Recognit. 2026, 169, 111924.
- Jiang, Y.; Li, C.; Chen, G.; Li, P.; Zhang, Q.; Lin, J.; Jiang, P.; Sun, F.; Zhang, W. MMGCL: Meta Knowledge-Enhanced Multi-view Graph Contrastive Learning for Recommendations. In Proceedings of the RecSys’24: 18th ACM Conference on Recommender Systems, Bari, Italy, 14–18 October 2024; pp. 538–548.
- Lu, Y.; Dong, R.; Smyth, B. Coevolutionary Recommendation Model: Mutual Learning between Ratings and Reviews. In Proceedings of the WWW’18: The Web Conference 2018, Lyon, France, 23–27 April 2018; pp. 773–782.
- Ren, Y.; Zhang, H.; Li, Q.; Fu, L.; Wang, X.; Zhou, C. Self-supervised graph disentangled networks for review-based recommendation. In Proceedings of the IJCAI’23: Thirty-Second International Joint Conference on Artificial Intelligence, Macao, China, 19–25 August 2023.
- Shuai, J.; Zhang, K.; Wu, L.; Sun, P.; Hong, R.; Wang, M.; Li, Y. A Review-aware Graph Contrastive Learning Framework for Recommendation. In Proceedings of the SIGIR’22: The 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, Madrid, Spain, 11–15 July 2022; pp. 1283–1293.
- Wang, Q.; Cao, X.; Wang, J.; Zhang, W. Knowledge-Aware Collaborative Filtering With Pre-Trained Language Model for Personalized Review-Based Rating Prediction. IEEE Trans. Knowl. Data Eng. 2024, 36, 1170–1182.
- Kong, B.; Jia, C. A Graph Based Approach Towards Exploiting Reviews for Recommendation. In Proceedings of the ICIAI 2022: The 6th International Conference on Innovation in Artificial Intelligence, Guangzhou, China, 4–6 March 2022; pp. 221–227.
- Liu, J.; Li, T.; Wu, D.; Tang, Z.; Fang, Y.; Yang, Z. An Aspect Performance-aware Hypergraph Neural Network for Review-based Recommendation. In Proceedings of the WSDM’25: The Eighteenth ACM International Conference on Web Search and Data Mining, Hannover, Germany, 10–14 March 2025; pp. 503–511.
- Kim, J.; Kim, E.; Yeo, K.; Jeon, Y.; Kim, C.; Lee, S.; Lee, J. Content-based Graph Reconstruction for Cold-start Item Recommendation. In Proceedings of the SIGIR 2024: The 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, Washington, DC, USA, 14–18 July 2024; pp. 1263–1273.
- Li, C.; Niu, X.; Luo, X.; Chen, Z.; Quan, C. A Review-Driven Neural Model for Sequential Recommendation. In Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, Macao, China, 10–16 August 2019; pp. 2866–2872.
- Zhang, X.; Xu, B.; Wu, Y.; Zhong, Y.; Lin, H.; Ma, F. FineRec: Exploring Fine-grained Sequential Recommendation. In Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, Washington, DC, USA, 14–18 July 2024; Yang, G.H., Wang, H., Han, S., Hauff, C., Zuccon, G., Zhang, Y., Eds.; ACM: New York, NY, USA, 2024; pp. 1599–1608.
- Scarselli, F.; Gori, M.; Tsoi, A.C.; Hagenbuchner, M.; Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. 2009, 20, 61–80.
- Ponzi, V.; Napoli, C. Graph Neural Networks: Architectures, Applications, and Future Directions. IEEE Access 2025, 13, 62870–62891.
- Ding, Y.; Zhang, Z.; Wang, B. Category-integrated Dual-Task Graph Neural Networks for session-based recommendation. Expert Syst. Appl. 2025, 263, 125784.
- Zhao, Y.; Ju, J.; Gong, J.; Zhao, J.; Chen, M.; Chen, L.; Feng, X.; Peng, J. Cross-domain recommendation via adaptive bi-directional transfer graph neural networks. Knowl. Inf. Syst. 2024, 67, 579–602.
- Li, Z.; Wang, J.; Chen, Z.; Wu, K.; Wei, Y.; Huang, H. Adaptive Graph Neural Networks for Cold-Start Multimedia Recommendation. In Proceedings of the 2024 IEEE International Conference on Data Mining (ICDM), Abu Dhabi, United Arab Emirates, 9–12 December 2024; pp. 201–210.
- Liu, Y.; Xia, L.; Huang, C. SelfGNN: Self-Supervised Graph Neural Networks for Sequential Recommendation. In Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, Washington, DC, USA, 14–18 July 2024.
- He, X.; Deng, K.; Wang, X.; Li, Y.; Zhang, Y.; Wang, M. LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation. In Proceedings of the SIGIR’20: The 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual, China, 25–30 July 2020; pp. 639–648.
- Yang, F.; Du, H.; Zhang, X.; Yang, Y.; Wang, Y. Self-supervised category-enhanced graph neural networks for recommendation. Knowl.-Based Syst. 2025, 311, 113109.
- Lin, R.; Tang, F.; Yuan, C.; Zhong, H.; Li, W.; Tang, Y. DeHier: Decoupled and hierarchical graph neural networks for multi-interest session-based recommendation. World Wide Web 2024, 28, 1.
- Yang, P.; Xiao, Y.; Zheng, W.; Liu, Y.; Hsu, C.H. A graph attention network with contrastive learning for temporal review-based recommendations. Appl. Soft Comput. 2024, 159, 111652.
- Kannikaklang, N.; Thamviset, W.; Wongthanavasu, S. BiGCAN: A novel SRS-based bidirectional graph Convolution Attention Network for dynamic user preference and next-item recommendation. Expert Syst. Appl. 2025, 265, 126016.
- Li, A.; Yang, B.; Huo, H.; Chen, H.; Xu, G.; Wang, Z. Hyperbolic Neural Collaborative Recommender. IEEE Trans. Knowl. Data Eng. 2023, 35, 9114–9127.
- Ma, Q.; Yang, M.; Ju, M.; Zhao, T.; Shah, N.; Ying, R. HARec: Hyperbolic Graph-LLM Alignment for Exploration and Exploitation in Recommender Systems. arXiv 2024.
- Zhang, C.; Zhang, A.; Zhang, L.; Yu, Y.; Zhao, W.; Geng, H. A Graph Neural Networks-Based Learning Framework With Hyperbolic Embedding for Personalized Tag Recommendation. IEEE Access 2024, 12, 339–350.
- Choi, Y.; Choi, J.; Ko, T.; Kim, C.K. Review-Based Hyperbolic Cross-Domain Recommendation. In Proceedings of the WSDM’25: The Eighteenth ACM International Conference on Web Search and Data Mining, Hannover, Germany, 10–14 March 2025; pp. 146–155.
- Hu, H.; He, C.; Chen, X.; Guan, Q. HCKGL: Hyperbolic collaborative knowledge graph learning for recommendation. Neurocomputing 2025, 634, 129808.
- Huang, J.; Tang, D.; Zhong, W.; Lu, S.; Shou, L.; Gong, M.; Jiang, D.; Duan, N. WhiteningBERT: An Easy Unsupervised Sentence Embedding Approach. In Findings of the Association for Computational Linguistics: EMNLP 2021; Association for Computational Linguistics: Stroudsburg, PA, USA, 2021; pp. 238–244.
- Rendle, S.; Freudenthaler, C.; Gantner, Z.; Schmidt-Thieme, L. BPR: Bayesian Personalized Ranking from Implicit Feedback. arXiv 2009.
- Chen, X.; Zhang, Y.; Xu, H.; Qin, Z.; Zha, H. Adversarial Distillation for Efficient Recommendation with External Knowledge. ACM Trans. Inf. Syst. 2018, 37, 1–28.
- Wu, L.; Quan, C.; Li, C.; Wang, Q.; Zheng, B.; Luo, X. A Context-Aware User-Item Representation Learning for Item Recommendation. ACM Trans. Inf. Syst. 2019, 37, 1–29.
- Li, Z.; Cheng, W.; Kshetramade, R.; Houser, J.; Chen, H.; Wang, W. Recommend for a Reason: Unlocking the Power of Unsupervised Aspect-Sentiment Co-Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2021; Association for Computational Linguistics: Stroudsburg, PA, USA, 2021; pp. 763–778.
- Mikolov, T.; Sutskever, I.; Chen, K.; Corrado, G.; Dean, J. Distributed representations of words and phrases and their compositionality. In Proceedings of the NIPS’13: 27th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 5–10 December 2013; Curran Associates Inc.: Red Hook, NY, USA, 2013; pp. 3111–3119.
- Koren, Y.; Bell, R.; Volinsky, C. Matrix Factorization Techniques for Recommender Systems. Computer 2009, 42, 30–37.
- He, X.; Liao, L.; Zhang, H.; Nie, L.; Hu, X.; Chua, T.S. Neural Collaborative Filtering. In Proceedings of the WWW’17: 26th International World Wide Web Conference, Perth, Australia, 3–7 April 2017; pp. 173–182.
- Zheng, L.; Noroozi, V.; Yu, P.S. Joint Deep Modeling of Users and Items Using Reviews for Recommendation. In Proceedings of the WSDM 2017: Tenth ACM International Conference on Web Search and Data Mining, Cambridge, UK, 6–10 February 2017; pp. 425–434.
- Chen, C.; Zhang, M.; Liu, Y.; Ma, S. Neural Attentional Rating Regression with Review-level Explanations. In Proceedings of the WWW’18: The Web Conference 2018, Lyon, France, 23–27 April 2018; pp. 1583–1592.
- Liu, D.; Li, J.; Du, B.; Chang, J.; Gao, R. DAML: Dual Attention Mutual Learning between Ratings and Reviews for Item Recommendation. In Proceedings of the KDD’19: The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 344–352.
- Catherine, R.; Cohen, W. TransNets: Learning to Transform for Recommendation. In Proceedings of the RecSys’17: Eleventh ACM Conference on Recommender Systems, Como, Italy, 27–31 August 2017; pp. 288–296.
- van den Berg, R.; Kipf, T.N.; Welling, M. Graph Convolutional Matrix Completion. arXiv 2017.
- Wu, C.; Wu, F.; Qi, T.; Ge, S.; Huang, Y.; Xie, X. Reviews Meet Graphs: Enhancing User and Item Representations for Recommendation with Hierarchical Attentive Graph Neural Network. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), Hong Kong, China, 3–7 November 2019; Association for Computational Linguistics: Stroudsburg, PA, USA, 2019; pp. 4886–4895.
- Gao, J.; Lin, Y.; Wang, Y.; Wang, X.; Yang, Z.; He, Y.; Chu, X. Set-Sequence-Graph: A Multi-View Approach Towards Exploiting Reviews for Recommendation. In Proceedings of the CIKM’20: The 29th ACM International Conference on Information and Knowledge Management, Virtual, 19–23 October 2020; pp. 395–404.
- Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the International Conference on Artificial Intelligence and Statistics, Sardinia, Italy, 13–15 May 2010.
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980.
Dataset | Users | Items | Ratings | Density (%) |
---|---|---|---|---|
Digital_Music | 5541 | 3568 | 64,706 | 0.330 |
Toys_and_Games | 19,412 | 11,924 | 167,597 | 0.072 |
Clothing | 39,387 | 23,033 | 278,677 | 0.031 |
CDs_and_Vinyl | 75,258 | 64,443 | 1,097,592 | 0.023 |
Yelp | 8423 | 3742 | 88,647 | 0.281 |
Method | Digital_Music | Toys_and_Games | Clothing | CDs_and_Vinyl | Yelp
---|---|---|---|---|---
SVD | 0.8523 ± 4 × | 0.8086 ± 1 × | 1.1167 ± 1 × | 0.8662 ± 2 × | 1.1939 ± 1 × |
NCF | 0.8403 ± 5 × | 0.8078 ± 2 × | 1.1094 ± 1 × | 0.8781 ± 1 × | 1.1896 ± 4 × |
DeepCoNN | 0.8378 ± 1 × | 0.8028 ± 7 × | 1.1184 ± 2 × | 0.8621 ± 1 × | 1.1877 ± 1 × |
NARRE | 0.8172 ± 1 × | 0.7962 ± 1 × | 1.1064 ± 1 × | 0.8495 ± 1 × | 1.1862 ± 1 × |
DAML | 0.8237 ± 2 × | 0.7936 ± 4 × | 1.1065 ± 2 × | 0.8483 ± 1 × | 1.1793 ± 1 × |
SDNet | 0.8331 ± 3 × | 0.8006 ± 1 × | 1.1080 ± 1 × | 0.8654 ± 5 × | 1.1837 ± 3 × |
TransNets | 0.8273 ± 5 × | 0.7980 ± 1 × | 1.1141 ± 5 × | 0.8440 ± 1 × | 1.1855 ± 2 × |
GC-MC | 0.8090 ± 1 × | 0.7986 ± 5 × | 1.1088 ± 1 × | 0.8404 ± 1 × | 1.1737 ± 1 × |
RMG | 0.8074 ± 1 × | 0.7901 ± 1 × | 1.1064 ± 2 × | 0.8425 ± 8 × | 1.1705 ± 1 × |
SSG | 0.8218 ± 2 × | 0.8064 ± 1 × | 1.1228 ± 1 × | 0.8458 ± 1 × | 1.1807 ± 1 × |
RGCL | 0.7735 ± 4 × | 0.7771 ± 1 × | 1.0858 ± 1 × | 0.8180 ± 7 × | 1.1609 ± 8 × |
Ours | 0.7150 ± 3 × | 0.7150 ± 1 × | 0.9680 ± 1 × | 0.7450 ± 6 × | 1.0500 ± 7 × |
Improvement | (8.0%) | (8.0%) | (10.8%) | (11.0%) | (8.7%)
Dataset | Digital_Music | Toys_and_Games | Clothing | CDs_and_Vinyl | Yelp
---|---|---|---|---|---
Number of graph layers (first setting) | 2 | 2 | 3 | 3 | 3
Number of graph layers (second setting) | 2 | 2 | 2 | 2 | 3
Method | Digital_Music | Toys_and_Games | Clothing | CDs_and_Vinyl | Yelp |
---|---|---|---|---|---|
w/o T | 0.8326 | 0.7995 | 1.0558 | 0.8654 | 1.1607 |
w/o H | 0.7511 | 0.7429 | 1.0456 | 0.7787 | 1.1203 |
w/o CL | 0.7441 | 0.7404 | 1.0576 | 0.7910 | 1.1121 |
Ours | 0.7150 | 0.7150 | 0.9680 | 0.7450 | 1.0500 |