Research on Graph Neural Networks and Knowledge Graph

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "E1: Mathematics and Computer Science".

Deadline for manuscript submissions: 30 September 2025

Special Issue Editors


Prof. Dr. Zhaowei Liu
Guest Editor
School of Computer and Control Engineering, Yantai University, Yantai 264005, China
Interests: graph neural networks; graph convolutional networks; intelligent control; multi-agent systems; machine learning

Prof. Dr. Yanwei Yu
Guest Editor
Department of Computer Science and Technology, Ocean University of China, Qingdao 266005, China
Interests: data mining; machine learning; database systems

Prof. Dr. Senzhang Wang
Guest Editor
School of Computer Science and Engineering, Central South University, Changsha 410083, China
Interests: spatiotemporal data mining; graph data mining; deep learning; urban computing

Special Issue Information

Dear Colleagues,

Graph Neural Networks (GNNs) and Knowledge Graphs have emerged as powerful tools in artificial intelligence and data science. GNNs excel at processing graph-structured data, while Knowledge Graphs effectively represent and organize complex relationships between entities. These technologies have found applications across numerous domains, including recommendation systems, drug discovery, social network analysis, and natural language processing. Advances in these fields have enabled more sophisticated ways to capture and exploit the structural information in data.
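To make the aggregation idea behind GNNs concrete: most architectures compute node embeddings by repeatedly combining each node's features with those of its neighbors. Below is a minimal, illustrative sketch of one mean-aggregation layer (in the spirit of a graph convolutional layer); the function name, toy graph, and dimensions are our own assumptions, not drawn from any paper in this issue.

    import numpy as np

    def gnn_layer(adj, features, weight):
        """One illustrative message-passing layer: each node averages its
        neighbors' features (plus its own), then applies a learned linear map."""
        adj_hat = adj + np.eye(adj.shape[0])         # add self-loops
        deg = adj_hat.sum(axis=1, keepdims=True)     # neighborhood sizes
        aggregated = (adj_hat @ features) / deg      # mean over each neighborhood
        return np.maximum(aggregated @ weight, 0.0)  # linear map + ReLU

    # Toy graph: 3 nodes, edges 0-1 and 1-2; 4-dim inputs, 2-dim outputs.
    adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
    features = np.random.randn(3, 4)
    weight = np.random.randn(4, 2)
    print(gnn_layer(adj, features, weight))  # (3, 2) node embeddings

Stacking several such layers lets information propagate along longer paths in the graph, which is what makes GNNs effective on relational data.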

This Special Issue focuses on innovative approaches and applications in GNNs and Knowledge Graphs. It provides a platform for researchers to present their novel work in areas such as graph representation learning, knowledge graph completion, reasoning over graphs, and their real-world applications. This will help advance our understanding of graph-based deep learning and knowledge representation.
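As a concrete example of one topic in scope, knowledge graph completion is commonly framed as scoring candidate (head, relation, tail) triples over learned embeddings. The sketch below uses a TransE-style translation score purely for illustration (TransE is a classic baseline, not a method from this issue; the entity names and random embeddings are toy assumptions).

    import numpy as np

    def transe_score(head, relation, tail):
        """TransE-style plausibility: a triple (h, r, t) is plausible when
        the translation h + r lands near t (higher score = more plausible)."""
        return -np.linalg.norm(head + relation - tail)

    # Toy embeddings (dimension 8, random for illustration only).
    rng = np.random.default_rng(1)
    emb = {name: rng.standard_normal(8) for name in ("berlin", "germany", "paris")}
    capital_of = rng.standard_normal(8)

    # Rank candidate tails for the incomplete triple ("berlin", capital_of, ?).
    for tail in ("germany", "paris"):
        print(tail, transe_score(emb["berlin"], capital_of, emb[tail]))

In a trained model, the embeddings would be learned so that true triples score higher than corrupted ones; here the random values only demonstrate the scoring interface.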

Prof. Dr. Zhaowei Liu
Prof. Dr. Yanwei Yu
Prof. Dr. Senzhang Wang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • graph neural networks
  • knowledge graph representation
  • deep graph learning

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (1 paper)

Research

Article, 18 pages
Aggregation and Pruning for Continuous Incremental Multi-Task Inference
by Lining Li, Fenglin Cen, Quan Feng and Ji Xu
Mathematics 2025, 13(9), 1414; https://doi.org/10.3390/math13091414 - 25 Apr 2025
Abstract
In resource-constrained mobile systems, efficiently handling incrementally added tasks under dynamically evolving requirements is a critical challenge. To address this, we propose aggregate pruning (AP), a framework that combines pruning with filter aggregation to optimize deep neural networks for continuous incremental multi-task learning (MTL). The approach reduces redundancy by dynamically pruning and aggregating similar filters across tasks, ensuring efficient use of computational resources while maintaining high task-specific performance. The aggregation strategy enables effective filter sharing across tasks, significantly reducing model complexity. Additionally, an adaptive mechanism is incorporated into AP to adjust filter sharing based on task similarity, further enhancing efficiency. Experiments on different backbone networks, including LeNet, VGG, and ResNet, show that AP achieves substantial parameter reduction and computational savings with minimal accuracy loss, outperforming existing pruning methods and even surpassing non-pruning MTL techniques. The architecture-agnostic design of AP also enables potential extensions to complex architectures such as graph neural networks (GNNs), offering a promising solution for incremental multi-task GNNs.
(This article belongs to the Special Issue Research on Graph Neural Networks and Knowledge Graph)
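The paper's full AP algorithm is not reproduced on this page, but the core idea the abstract describes, pruning by aggregating near-duplicate filters, can be sketched as follows. This is a minimal illustration under our own assumptions (cosine similarity between flattened filters, greedy grouping with an arbitrary 0.95 threshold, group-mean merging); the function name and threshold are hypothetical and not taken from the paper.

    import numpy as np

    def aggregate_prune(filters, sim_threshold=0.95):
        """Greedily group convolution filters whose flattened weights are
        nearly parallel (high cosine similarity), then average each group.
        filters: array of shape (num_filters, channels, kh, kw)."""
        flat = filters.reshape(len(filters), -1)
        norms = np.linalg.norm(flat, axis=1, keepdims=True)
        unit = flat / np.clip(norms, 1e-12, None)
        reps, groups = [], []                        # group representatives, members
        for i, f in enumerate(unit):
            for g, rep in enumerate(reps):
                if float(rep @ f) >= sim_threshold:  # similar enough to merge
                    groups[g].append(i)
                    break
            else:                                    # no similar group found
                reps.append(f)
                groups.append([i])
        # Each surviving filter is the mean of its group (the "aggregation").
        merged = np.stack([filters[g].mean(axis=0) for g in groups])
        return merged, groups

    # Toy example: 8 random 3x3 filters, two of them near-duplicates.
    rng = np.random.default_rng(0)
    filters = rng.standard_normal((8, 3, 3, 3))
    filters[1] = filters[0] + 0.01 * rng.standard_normal((3, 3, 3))
    merged, groups = aggregate_prune(filters)
    print(len(filters), "->", len(merged), "filters; groups:", groups)

In the framework described in the abstract, merging of this kind would be applied per layer and coordinated with the adaptive task-similarity mechanism, so that filters are shared across tasks rather than merely deduplicated within one network.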