Metaheuristics and Artificial Intelligence: Latest Advances and Prospects

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "E1: Mathematics and Computer Science".

Deadline for manuscript submissions: 20 December 2025

Special Issue Editors


Prof. Dr. Piotr A. Kowalski
Guest Editor
1. Faculty of Physics and Applied Computer Science, AGH University of Science and Technology, 30-059 Krakow, Poland
2. Systems Research Institute, Polish Academy of Sciences, 01-447 Warsaw, Poland
Interests: data science; artificial neural networks; metaheuristics; swarm intelligence; evolutionary computation; fuzzy logic; machine learning; deep learning; explainable artificial intelligence

Special Issue Information

Dear Colleagues,

Metaheuristics and artificial intelligence have become increasingly important for solving complex optimization problems in a wide range of fields. Metaheuristics are powerful search algorithms that can find near-optimal solutions to problems that are difficult or impossible to solve with traditional exact methods. In parallel, artificial intelligence (AI) techniques, including machine learning, deep learning, and neural networks, have shown remarkable success on many real-world problems.
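As a minimal illustration of the kind of metaheuristic search this Special Issue covers, the sketch below implements a bare-bones differential evolution loop (DE/rand/1 mutation with binomial crossover) on a toy sphere objective. It is a didactic sketch only; the function names, parameter values, and objective are illustrative assumptions, not taken from any contributed paper.

```python
# Minimal differential-evolution sketch (illustrative only): minimize a toy
# objective by mutating and recombining a small population of candidate vectors.
import numpy as np

def sphere(x):
    """Toy objective: sum of squares, minimized at the origin."""
    return float(np.sum(x ** 2))

def differential_evolution(objective, dim=5, pop_size=20, F=0.8, CR=0.9,
                           generations=200, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    low, high = bounds
    pop = rng.uniform(low, high, size=(pop_size, dim))
    fitness = np.array([objective(ind) for ind in pop])

    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1 mutation: combine three distinct individuals (none equal to i).
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), low, high)

            # Binomial crossover: mix mutant and current individual gene by gene.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True  # guarantee at least one mutant gene
            trial = np.where(cross, mutant, pop[i])

            # Greedy selection: keep the trial only if it is at least as good.
            f_trial = objective(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial

    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

if __name__ == "__main__":
    x_best, f_best = differential_evolution(sphere)
    print("best solution:", x_best, "objective value:", f_best)
```

Swapping the toy sphere function for a domain-specific objective and tuning F, CR, and the population size is the usual starting point when adapting such a loop to a real optimization problem.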

This Special Issue of Mathematics aims to provide an overview of the latest advances and prospects in metaheuristics and artificial intelligence. It features contributions from leading experts who present new and innovative methods and techniques developed to tackle complex optimization problems.

The topics covered in this Special Issue include, but are not limited to, optimization algorithms, metaheuristics, swarm intelligence, genetic algorithms, evolutionary computation, artificial neural networks, fuzzy logic, machine learning, deep learning, and hybrid systems. In addition, the Special Issue emphasizes explainable artificial intelligence (XAI) techniques, which aim to provide interpretable and transparent models and algorithms and thereby enhance the understanding and trustworthiness of decision-making in AI systems. Together, the contributions offer a comprehensive overview of the latest research in the field and highlight the ability of these techniques to solve a wide range of problems in diverse areas such as engineering, finance, healthcare, and transportation.

Prof. Dr. Piotr A. Kowalski
Dr. Rafal Scherer
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • metaheuristics
  • artificial intelligence
  • optimization
  • swarm intelligence
  • genetic algorithms
  • evolutionary computation
  • artificial neural networks
  • fuzzy logic
  • machine learning
  • deep learning
  • hybrid systems
  • explainable artificial intelligence

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies is available on the journal website.

Published Papers (2 papers)


Research

21 pages, 308 KiB  
Article
Comparative Study of Modern Differential Evolution Algorithms: Perspectives on Mechanisms and Performance
by Janez Brest and Mirjam Sepesy Maučec
Mathematics 2025, 13(10), 1556; https://doi.org/10.3390/math13101556 - 9 May 2025
Abstract
Since the introduction of the Differential Evolution algorithm, new and improved versions have continuously emerged. In this paper, we review selected Differential Evolution-based algorithms proposed in recent years, examine the mechanisms integrated into them, and compare their performance. Statistical comparisons are used because they enable reliable conclusions to be drawn about the algorithms' performance: the Wilcoxon signed-rank test for pairwise comparisons, the Friedman test for multiple comparisons, and, subsequently, the Mann–Whitney U test. We not only conducted a cumulative analysis of the algorithms but also examined their performance by function family (i.e., unimodal, multimodal, hybrid, and composition functions). Experimental results were obtained on the problems defined for the CEC'24 Special Session and Competition on Single Objective Real Parameter Numerical Optimization, with problem dimensions of 10, 30, 50, and 100. Based on this study of the selected algorithms, we highlight promising mechanisms for further development and improvement.
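For readers unfamiliar with the statistical protocol named in the abstract, the snippet below shows how pairwise (Wilcoxon signed-rank), multiple-algorithm (Friedman), and independent-sample (Mann–Whitney U) comparisons can be run with SciPy. The per-function scores are randomly generated placeholders, not data from the paper; the snippet is only a sketch of the general methodology.

```python
# Sketch of the comparison protocol mentioned in the abstract, applied to
# placeholder per-function error values (not results from the paper).
import numpy as np
from scipy.stats import wilcoxon, friedmanchisquare, mannwhitneyu

rng = np.random.default_rng(42)
n_functions = 29  # e.g., one mean error per benchmark function

# Placeholder mean errors of three hypothetical DE variants on each function.
alg_a = rng.lognormal(mean=0.0, sigma=1.0, size=n_functions)
alg_b = alg_a * rng.uniform(0.8, 1.2, size=n_functions)   # similar to A
alg_c = alg_a * rng.uniform(1.1, 1.6, size=n_functions)   # consistently worse

# Pairwise comparison of two algorithms over the same set of functions.
stat, p_pair = wilcoxon(alg_a, alg_c)
print(f"Wilcoxon A vs C: statistic={stat:.2f}, p={p_pair:.4f}")

# Multiple comparison of all algorithms at once.
stat, p_multi = friedmanchisquare(alg_a, alg_b, alg_c)
print(f"Friedman A/B/C: statistic={stat:.2f}, p={p_multi:.4f}")

# Mann-Whitney U for two independent samples (e.g., per-run errors on one function).
runs_a = rng.normal(1.0, 0.1, size=30)
runs_c = rng.normal(1.2, 0.1, size=30)
stat, p_u = mannwhitneyu(runs_a, runs_c)
print(f"Mann-Whitney U on one function: statistic={stat:.1f}, p={p_u:.4f}")
```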

36 pages, 6443 KiB  
Article
A Model for Learning-Curve Estimation in Efficient Neural Architecture Search and Its Application in Predictive Health Maintenance
by David Solís-Martín, Juan Galán-Páez and Joaquín Borrego-Díaz
Mathematics 2025, 13(4), 555; https://doi.org/10.3390/math13040555 - 7 Feb 2025
Cited by 1
Abstract
A persistent challenge in machine learning is the computational inefficiency of neural architecture search (NAS), particularly in resource-constrained domains like predictive maintenance. This work introduces a novel learning-curve estimation framework that reduces NAS computational costs by over 50% while maintaining model performance, addressing a critical bottleneck in automated machine learning design. By developing a data-driven estimator trained on 62 different predictive maintenance datasets, we demonstrate a generalized approach to early-stopping trials during neural network optimization. Our methodology not only reduces computational resources but also provides a transferable technique for efficient neural network architecture exploration across complex industrial monitoring tasks. The proposed approach achieves a remarkable balance between computational efficiency and model performance, with only a 2% performance degradation, showcasing a significant advancement in automated neural architecture optimization strategies.
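As a rough, hypothetical sketch of the general idea behind learning-curve-based early stopping in NAS, the snippet below fits a simple power-law model to the first epochs of a trial's validation-loss curve and extrapolates it to the full training budget; trials whose extrapolated final loss is unlikely to beat the current best are stopped. The curve model, thresholds, and synthetic data are assumptions for illustration and do not reproduce the data-driven estimator proposed in the paper.

```python
# Illustrative learning-curve extrapolation for early stopping (not the paper's
# estimator): fit loss(t) ~ a * t**(-b) + c to the observed prefix of a trial's
# validation-loss curve and extrapolate it to the full training budget.
import numpy as np
from scipy.optimize import curve_fit

def power_law(t, a, b, c):
    return a * np.power(t, -b) + c

def predict_final_loss(partial_losses, total_epochs):
    """Fit the observed prefix of the learning curve and extrapolate its end value."""
    t_obs = np.arange(1, len(partial_losses) + 1, dtype=float)
    params, _ = curve_fit(power_law, t_obs, partial_losses,
                          p0=(1.0, 0.5, 0.1), maxfev=10_000)
    return power_law(float(total_epochs), *params)

def should_stop_early(partial_losses, total_epochs, best_loss_so_far, margin=0.02):
    """Stop the trial if its extrapolated final loss is unlikely to beat the best one."""
    predicted = predict_final_loss(partial_losses, total_epochs)
    return predicted > best_loss_so_far + margin

if __name__ == "__main__":
    # Synthetic trial: a noisy, slowly improving validation-loss curve.
    rng = np.random.default_rng(0)
    epochs_seen = 10
    losses = np.arange(1, epochs_seen + 1, dtype=float) ** -0.3 + 0.2
    losses += rng.normal(0, 0.01, size=epochs_seen)

    print("predicted final loss:", round(predict_final_loss(losses, 100), 3))
    print("stop this trial early:",
          should_stop_early(losses, total_epochs=100, best_loss_so_far=0.25))
```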
