Numerical Optimization and Algorithms: 4th Edition

A special issue of Algorithms (ISSN 1999-4893). This special issue belongs to the section "Algorithms for Multidisciplinary Applications".

Deadline for manuscript submissions: closed (30 November 2025) | Viewed by 4107

Special Issue Editors

Special Issue Information

Dear Colleagues,

Numerical algorithms and optimization are widely used across science and engineering, in fields such as physics, environmental science, mechanics, biology, data science, economics, and finance. The problems arising in these fields are complex, highly nonlinear, and difficult to predict. Over the last decade, computational approaches to such problems have attracted much attention, driven by improved computer performance, new computing methods, and the rapid development of data science. These developments have also raised various challenges, such as high nonlinearity, the curse of dimensionality, uncertainty, and complexity. Addressing these challenges urgently requires new numerical algorithms and methods drawing on graph theory, optimization, algebra, uncertainty quantification, data science and analysis, the numerical solution of differential equations, and probability and statistics.

This Special Issue deals with various numerical algorithms in the fields of both science and engineering.

Prof. Dr. Dunhui Xiao
Prof. Dr. Shuai Li
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and a short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • graph theory
  • optimization
  • algebra
  • uncertainty
  • data science
  • differential equations
  • probability and statistics
  • numerical algorithms

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.


Published Papers (4 papers)


Research

17 pages, 674 KB  
Article
Parallel Dynamic Programming for the Exact Computation of Density of State for 2D Spin-Crossover Nanomaterials
by Thomas Dufaud, Jorge Linares and Devan Sohier
Algorithms 2026, 19(2), 111; https://doi.org/10.3390/a19020111 - 1 Feb 2026
Viewed by 677
Abstract
We discuss the design, the analysis and the parallel implementation of a dynamic programming approach for the computation of the density of state in the simulation of spin-crossover nanoparticles. The motivation is the computation of a Hamiltonian, which is usually approximated using Monte Carlo techniques. However, physicists need better control of the accuracy of this approximation. An exact counting algorithm allows this error to be controlled, and also measures the impact on accuracy for the entire simulation. We propose an exact parallel counting algorithm and its two-level parallel implementation to tackle nanoscale problems on HPC architecture. We discuss its scalability and feasibility for 2D grids of n molecules. The new algorithm enables the exact computation for a three-variable density of state at nanoscale, which is seen as intractable. A comparison between the expectation of the model and implementation is proposed. The parallel complexity achieved is O(n^5 2^(2n)) and the results allow the prediction of never-before-seen phenomena.
(This article belongs to the Special Issue Numerical Optimization and Algorithms: 4th Edition)
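To illustrate the idea of exact density-of-states counting via dynamic programming (as opposed to Monte Carlo estimation), here is a minimal serial sketch for a 1D chain of two-state (low-spin/high-spin) molecules, counted over two variables; the paper itself targets 2D grids, a three-variable density of state, and a two-level parallel implementation, none of which this toy code attempts, and all function names are mine.

```python
from collections import defaultdict
from itertools import product

def density_of_states_1d(n):
    """Exact density of states for a 1D chain of n two-state (LS/HS)
    molecules, counted by (m, s): m = number of high-spin sites,
    s = number of like-spin nearest-neighbour pairs.
    Dynamic programming over sites; the DP state is the spin of the
    last site, so the work grows polynomially in n, not as 2^n."""
    # dp[(last_spin, m, s)] = number of partial configurations
    dp = {(0, 0, 0): 1, (1, 1, 0): 1}
    for _ in range(n - 1):
        nxt = defaultdict(int)
        for (last, m, s), cnt in dp.items():
            for spin in (0, 1):
                nxt[(spin, m + spin, s + (spin == last))] += cnt
        dp = nxt
    g = defaultdict(int)
    for (_, m, s), cnt in dp.items():
        g[(m, s)] += cnt
    return dict(g)

def density_of_states_brute(n):
    """Brute-force check: enumerate all 2^n configurations."""
    g = defaultdict(int)
    for conf in product((0, 1), repeat=n):
        m = sum(conf)
        s = sum(a == b for a, b in zip(conf, conf[1:]))
        g[(m, s)] += 1
    return dict(g)
```

For n = 8 both routines agree and the counts sum to 2^8 = 256; the DP version, however, only ever tracks O(n^2) (m, s) pairs per site, which is what makes exact counting tractable where enumeration is not.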

19 pages, 554 KB  
Article
Bias Reduction in Robust Mean–Geometric Mean Linking via SIMEX
by Alexander Robitzsch
Algorithms 2026, 19(1), 59; https://doi.org/10.3390/a19010059 - 9 Jan 2026
Viewed by 431
Abstract
Robust mean–geometric mean (MGM) linking is a method for comparing the performance of two groups on a test involving dichotomous items and is particularly suited to settings with fixed and sparse differential item functioning (DIF). However, robust MGM linking has been shown to yield biased estimates in finite samples because the estimated item parameters are affected by sampling error, which in turn induces bias in the estimated linking parameters. To address this issue, the simulation extrapolation (SIMEX) method is applied to robust MGM linking to reduce bias in the linking parameter estimates. Results from a simulation study demonstrate that SIMEX reduces bias in robust MGM linking. Moreover, SIMEX with a linear extrapolation function also reduces the variance of the parameter estimates in the absence of DIF effects. These findings indicate that the application of SIMEX in robust MGM linking methods can be generally recommended for empirical research aimed at removing DIF items from group comparisons.
(This article belongs to the Special Issue Numerical Optimization and Algorithms: 4th Edition)
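The SIMEX idea itself is simple to demonstrate outside the item-response-theory setting of the paper: deliberately inflate the known measurement error in stages, watch how the biased estimate degrades, and extrapolate the trend back to zero error. The sketch below applies it to the classical errors-in-variables regression slope (not to MGM linking); the model, sample sizes, and extrapolation degree are illustrative choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# True model: y = beta * x + noise, but we only observe w = x + u,
# where u is measurement error with known variance tau2. The naive
# OLS slope on w is attenuated toward zero by a factor 1/(1 + tau2).
beta, tau2, n = 1.0, 0.25, 50_000
x = rng.normal(size=n)
y = beta * x + 0.1 * rng.normal(size=n)
w = x + np.sqrt(tau2) * rng.normal(size=n)

def ols_slope(w, y):
    return np.cov(w, y, bias=True)[0, 1] / np.var(w)

naive = ols_slope(w, y)  # attenuated, roughly beta / 1.25 = 0.8

# SIMEX: add extra error scaled by lam, so total error variance is
# (1 + lam) * tau2; average the slope over simulations, fit a
# quadratic in lam, and extrapolate back to lam = -1 (no error).
lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
slopes = []
for lam in lams:
    sims = [ols_slope(w + np.sqrt(lam * tau2) * rng.normal(size=n), y)
            for _ in range(20)]
    slopes.append(np.mean(sims))
coef = np.polyfit(lams, slopes, deg=2)
simex = np.polyval(coef, -1.0)  # bias-reduced estimate, close to beta
```

With this seed, the naive slope sits near 0.8 while the SIMEX extrapolation recovers a value much closer to the true slope of 1.0, mirroring the bias reduction the paper reports (the paper additionally finds a linear extrapolant can reduce variance in the no-DIF case).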

38 pages, 4891 KB  
Article
Thermonuclear Fusion Based Quantum-Inspired Algorithm for Solving Multiobjective Optimization Problems
by Liliya Demidova and Vladimir Maslennikov
Algorithms 2025, 18(12), 793; https://doi.org/10.3390/a18120793 - 15 Dec 2025
Viewed by 748
Abstract
This paper introduces a novel quantum-inspired algorithm for numerical multiobjective optimization, uniquely integrating the multilevel structure of qudits with principles of controlled thermonuclear fusion. Moving beyond conventional qubit-based approaches, the algorithm leverages the qudit’s higher-dimensional state space to enhance search capabilities. Fusion-inspired dynamics—modeling particle interaction, energy release, and plasma cooling—provide a powerful metaheuristic framework for navigating complex, high-dimensional Pareto fronts. A hybrid quantum-classical version of the algorithm is presented, designed to exploit the complementary strengths of both computational paradigms for improved efficiency in solving dynamic multiobjective problems. Experimental evaluation on standard dynamic multiobjective benchmarks demonstrates clear performance advantages. Both the quantum-inspired and hybrid variants consistently outperform leading classical algorithms such as NSGA-III, MOEA/D and GDE3, as well as the quantum-inspired NSGA-III, in key metrics: identifying a greater number of unique non-dominated solutions, ensuring superior uniformity along the Pareto front, maintaining stable convergence across generations, and achieving higher accuracy in approximating the ideal solution.
(This article belongs to the Special Issue Numerical Optimization and Algorithms: 4th Edition)
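The key metric mentioned above, the number of unique non-dominated solutions, rests on the standard notion of Pareto dominance. A minimal sketch of extracting the non-dominated set (nothing specific to the qudit or fusion-inspired machinery of the paper; the function name is mine):

```python
def non_dominated(points):
    """Return the non-dominated subset of a list of objective vectors,
    assuming all objectives are minimized. p dominates q if p <= q in
    every component and p < q in at least one component."""
    def dominates(p, q):
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

For example, among the bi-objective points (1, 5), (2, 2), (5, 1), (3, 3), (4, 4), the first three form the Pareto front and the last two are dominated by (2, 2). This quadratic-time filter is fine for small populations; evolutionary algorithms such as NSGA-III use faster non-dominated sorting internally.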

41 pages, 762 KB  
Article
MCMC Methods: From Theory to Distributed Hamiltonian Monte Carlo over PySpark
by Christos Karras, Leonidas Theodorakopoulos, Aristeidis Karras, George A. Krimpas, Charalampos-Panagiotis Bakalis and Alexandra Theodoropoulou
Algorithms 2025, 18(10), 661; https://doi.org/10.3390/a18100661 - 17 Oct 2025
Cited by 2 | Viewed by 1770
Abstract
The Hamiltonian Monte Carlo (HMC) method is effective for Bayesian inference but suffers from synchronization overhead in distributed settings. We propose two variants: a distributed HMC (DHMC) baseline with synchronized, globally exact gradient evaluations and a communication-avoiding leapfrog HMC (CALF-HMC) method that interleaves local surrogate micro-steps with a single global Metropolis–Hastings correction per trajectory. Implemented on Apache Spark/PySpark and evaluated on a large synthetic logistic regression (N = 10^7, d = 100, workers J ∈ {4, 8, 16, 32}), DHMC attained an average acceptance of 0.986, mean ESS of 1200, and wall-clock of 64.1 s per evaluation run, yielding 18.7 ESS/s; CALF-HMC achieved an acceptance of 0.942, mean ESS of 5.1, and 14.8 s, i.e., ≈0.34 ESS/s under the tested surrogate configuration. While DHMC delivered higher ESS/s due to robust mixing under conservative integration, CALF-HMC reduced the per-trajectory runtime and exhibited more favorable scaling as inter-worker latency increased. The study contributes (i) a systems-oriented communication cost model for distributed HMC, (ii) an exact, communication-avoiding leapfrog variant, and (iii) practical guidance for ESS/s-optimized tuning on clusters.
(This article belongs to the Special Issue Numerical Optimization and Algorithms: 4th Edition)
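The building blocks that DHMC and CALF-HMC distribute, a leapfrog integrator plus a Metropolis–Hastings correction, can be sketched in a few lines on a single machine. This is generic textbook HMC, not the paper's Spark implementation; the function name and default step sizes are my own illustrative choices.

```python
import numpy as np

def hmc_sample(grad_logp, logp, x0, n_samples, eps=0.1, n_leapfrog=20, seed=0):
    """Minimal HMC: leapfrog integration of Hamiltonian dynamics followed
    by a Metropolis-Hastings accept/reject on the joint energy."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples, accepted = [], 0
    for _ in range(n_samples):
        p = rng.normal(size=x.shape)            # resample momentum
        x_new, p_new = x.copy(), p.copy()
        p_new += 0.5 * eps * grad_logp(x_new)   # initial half step in momentum
        for _ in range(n_leapfrog - 1):
            x_new += eps * p_new                # full step in position
            p_new += eps * grad_logp(x_new)     # full step in momentum
        x_new += eps * p_new
        p_new += 0.5 * eps * grad_logp(x_new)   # final half step in momentum
        # MH correction on the joint (position, momentum) energy
        log_alpha = ((logp(x_new) - 0.5 * p_new @ p_new)
                     - (logp(x) - 0.5 * p @ p))
        if np.log(rng.uniform()) < log_alpha:
            x, accepted = x_new, accepted + 1
        samples.append(x.copy())
    return np.array(samples), accepted / n_samples
```

Run on a 2D standard normal target (logp(x) = -x·x/2, gradient -x), the sampler accepts nearly every proposal and recovers the target's moments; the distributed variants in the paper differ in where the gradient evaluations inside the leapfrog loop are computed and how often workers must synchronize.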
