Advances and Applications in Graph Neural Networks (GNNs)

A special issue of AI (ISSN 2673-2688). This special issue belongs to the section "AI Systems: Theory and Applications".

Deadline for manuscript submissions: 10 November 2026 | Viewed by 2287

Special Issue Editors


Guest Editor
Machine Learning and Data Science Department, lastminute.com Group, Chiasso, Switzerland
Interests: pattern recognition; machine learning; deep learning; graph neural networks and their applications

Guest Editor
Machine Learning and Data Science Department, lastminute.com Group, Chiasso, Switzerland
Interests: computer vision; machine learning; GNN

Special Issue Information

Dear Colleagues,

Graph Neural Networks (GNNs) have garnered significant attention in machine learning research, enabling novel approaches to modeling and reasoning over complex relational data. This Special Issue seeks to explore both methodological advances in, and real‐world deployments of, GNNs, highlighting cutting‐edge innovations that are pushing the boundaries of what graph‐based AI can achieve.

We invite original contributions addressing topics that include, but are not limited to, the following:

Core GNN architectures and optimization:

  • Novel GNN architectures;
  • Techniques to mitigate over-smoothing and graph bottlenecks;
  • Scalable training strategies for billion-node graphs.

Transfer and few-shot learning on graphs:

  • Domain adaptation across heterogeneous graph domains;
  • Meta-learning frameworks for rapid adaptation to new topologies.

Dynamic, temporal, and spatio-temporal graphs:

  • Modeling evolving network structures and streaming edge updates;
  • Forecasting on spatio-temporal graphs (e.g., traffic, epidemic spread);
  • Online learning and real-time inference in rapidly changing environments.

Graph explainability, interpretability, and trustworthiness:

  • Counterfactual explanations and attribution methods for graph predictions;
  • Uncertainty quantification, robustness to adversarial attacks, and certification.

Applications across domains:

  • Bioinformatics and Cheminformatics: Protein–protein interactions, drug discovery, and genomics;
  • Social Networks and Recommendation Systems: Influence maximization and user profiling;
  • Computer Vision and Graphics: Scene graphs, point cloud analysis, and 3D shape generation;
  • Natural Language Processing: Semantic dependency parsing and document‐level graphs;
  • Finance and Economics: Fraud detection, knowledge graph risk analysis, and market modeling;
  • Energy and Transportation: Power‐grid stability, traffic flow optimization, and route planning. 
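Most of the architectures and applications listed above build on a common message-passing template: each layer aggregates features from a node's neighbors and transforms the result with a learned map. A minimal, illustrative sketch in NumPy (the function name, toy graph, and weight matrix are hypothetical, chosen for demonstration only) might look like this:

```python
import numpy as np

def message_passing_layer(A, X, W):
    """One generic GNN message-passing step: mean-aggregate neighbor
    features (with a self-loop so each node keeps its own signal),
    then apply a learned linear map W and a ReLU nonlinearity."""
    A_hat = A + np.eye(A.shape[0])           # adjacency with self-loops
    deg = A_hat.sum(axis=1, keepdims=True)   # per-node degree
    H = (A_hat / deg) @ X                    # mean aggregation over neighbors
    return np.maximum(H @ W, 0.0)            # linear transform + ReLU

# Toy graph: three nodes in a path 0 - 1 - 2
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)                                # one-hot node features
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))              # maps 3-dim features to 2-dim

H = message_passing_layer(A, X, W)
print(H.shape)  # (3, 2): one 2-dim embedding per node
```

Stacking such layers is what gives rise to the over-smoothing and bottleneck phenomena mentioned above, since node embeddings mix information from progressively larger neighborhoods.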

Submissions that combine rigorous theoretical insights with compelling empirical validations—especially on large-scale real-world datasets—are highly encouraged. We also welcome comprehensive surveys that map the evolving landscape of graph neural methods.

Dr. Alessandro Rozza
Dr. Marco Leonardi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, authors can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. AI is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • graph neural networks
  • transfer learning in GNNs
  • GNN applications

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found on the journal website.

Published Papers (1 paper)


Review
50 pages, 986 KB
A Survey and Taxonomy of Loss Functions in Machine Learning
by Lorenzo Ciampiconi, Adam Elwood, Marco Leonardi, Ashraf Mohamed and Alessandro Rozza
AI 2026, 7(4), 128; https://doi.org/10.3390/ai7040128 - 1 Apr 2026
Viewed by 1577
Abstract
Most state-of-the-art machine learning techniques revolve around the optimization of loss functions, making the choice of an objective critical to model performance and reliability. Although recent reviews discuss loss functions in specific domains or in deep learning settings, there is still no single reference that presents widely used losses across major task families within a unified formal setting and with consistent optimization-relevant property annotations. In this survey, we compile and systematize the most widely adopted loss functions for regression, classification, generative modeling, ranking, energy-based modeling, and relational learning. Our selection procedure combines seeding from foundational textbooks and prior surveys with cross-checking of highly cited literature and common implementations in mainstream machine learning frameworks. We introduce 52 loss functions and organize them into an intuitive taxonomy, summarizing their theoretical motivation, key mathematical properties, and typical application contexts, with compact appendix tables for quick lookup. This survey is intended as a resource for undergraduate, graduate, and Ph.D. students, as well as researchers seeking a structured reference for selecting and comparing loss functions.
(This article belongs to the Special Issue Advances and Applications in Graph Neural Networks (GNNs))
