Special Issue "Evolutionary Computation and Mathematical Programming 2020"

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Mathematics and Computer Science".

Deadline for manuscript submissions: 31 March 2021.

Special Issue Editors

Prof. Dr. Marjan Mernik
Website
Guest Editor
Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000 Maribor, Slovenia
Interests: concepts and implementation of programming languages; formal language definition; attribute grammars; compiler generators; domain-specific languages; grammar-based systems; grammatical inference; meta-heuristics; single- and multi-objective optimization
Special Issues and Collections in MDPI journals
Dr. Foad Nazari
Website
Guest Editor
Villanova Center for Analytics of Dynamic Systems (VCADS), Faculty of Engineering, Villanova University, Villanova, PA 19085, USA
Interests: artificial intelligence; machine learning; meta-heuristics; single- and multi-objective optimization; condition monitoring; vibration analysis; biomedical diagnosis
Special Issues and Collections in MDPI journals

Special Issue Information

Evolutionary computation abstracts the theory of biological evolution into techniques and methodologies for obtaining highly optimized solutions to a wide range of complex optimization problems. In many areas of human activity, computer programs based on probabilistic and mathematical models must choose the best alternative from a set of available options and determine the most efficient way to allocate scarce resources. This is where the concept of mathematical programming is employed.

This Special Issue, entitled “Evolutionary Computation and Mathematical Programming 2020”, intends to collect recent advances and studies associated with the theoretical development of this field, as well as its new applications in engineering and science. All researchers are invited to contribute to this Special Issue with their original research articles, review papers, and short communications. Submitted manuscripts should clearly state the contribution of the work and meet high quality standards.

Contributions related to the advancement of metaheuristic optimization algorithms are particularly appreciated.

Prof. Marjan Mernik
Dr. Foad Nazari
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Linear, nonlinear, stochastic, integer, and combinatorial optimization
  • Mathematical programming
  • Evolutionary algorithms
  • Swarm intelligence
  • Memetic algorithms

Published Papers (7 papers)


Research

Open Access Article
Solving the Capacitated Vertex K-Center Problem through the Minimum Capacitated Dominating Set Problem
Mathematics 2020, 8(9), 1551; https://doi.org/10.3390/math8091551 - 10 Sep 2020
Abstract
The capacitated vertex k-center problem receives as input a complete weighted graph and a set of capacity constraints. Its goal is to find a set of k centers and an assignment of vertices that does not violate the capacity constraints. Furthermore, the distance from the farthest vertex to its assigned center has to be minimized. The capacitated vertex k-center problem models real situations where a maximum number of clients must be assigned to centers and the travel time or distance from the clients to their assigned center has to be minimized. These centers might be hospitals, schools, or police stations, among many others. The goal of this paper is to state explicitly how the capacitated vertex k-center problem and the minimum capacitated dominating set problem are related. We present an exact algorithm that solves a series of integer programming formulations equivalent to the minimum capacitated dominating set problem over the bottleneck input graph. Lastly, we present an empirical evaluation of the proposed algorithm using off-the-shelf optimization software.
(This article belongs to the Special Issue Evolutionary Computation and Mathematical Programming 2020)
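The bottleneck idea behind this algorithm can be sketched in a few lines: the optimum radius is the smallest distance threshold for which k centers can dominate all vertices, within their capacities, in the graph that keeps only edges of that weight or less. The brute-force sketch below is illustrative only; the function name, the backtracking feasibility check, and the assumption that each center serves itself are ours, not the authors' integer-programming formulation.

```python
from itertools import combinations

def bottleneck_radius(dist, cap, k):
    """Tiny-instance sketch of the bottleneck technique for the capacitated
    vertex k-center problem: try the sorted distinct distances as thresholds
    and return the smallest one for which a feasible set of k centers exists.
    Not the paper's ILP-based algorithm -- an exhaustive illustration."""
    n = len(dist)

    def feasible(r, centers):
        # Backtracking assignment of every non-center vertex to an adjacent
        # center with remaining capacity (assumes a center serves itself).
        remaining = {c: cap[c] - 1 for c in centers}
        if any(v < 0 for v in remaining.values()):
            return False
        others = [v for v in range(n) if v not in centers]

        def assign(i):
            if i == len(others):
                return True
            v = others[i]
            for c in centers:
                if dist[v][c] <= r and remaining[c] > 0:
                    remaining[c] -= 1
                    if assign(i + 1):
                        return True
                    remaining[c] += 1
            return False

        return assign(0)

    radii = sorted({dist[i][j] for i in range(n) for j in range(i + 1, n)})
    for r in radii:
        for centers in combinations(range(n), k):
            if feasible(r, centers):
                return r
    return None   # no capacity-respecting cover exists
```

For four vertices on a line with capacity 2 each, two centers suffice within radius 1; with capacity 1 each, no two centers can cover all four vertices.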
Open Access Article
A Hybrid Particle Swarm Optimization Algorithm Enhanced with Nonlinear Inertial Weight and Gaussian Mutation for Job Shop Scheduling Problems
Mathematics 2020, 8(8), 1355; https://doi.org/10.3390/math8081355 - 13 Aug 2020
Abstract
The job shop scheduling problem (JSSP) has high theoretical significance in academia and high practical significance in manufacturing. It has therefore attracted scholars from many different fields, and many meta-heuristic algorithms have been proposed to solve it. As one such meta-heuristic, particle swarm optimization (PSO) has been used to optimize many practical problems in industrial manufacturing. This paper proposes a hybrid PSO enhanced with a nonlinear inertia weight and Gaussian mutation (NGPSO) to solve the JSSP. The nonlinear inertia weight improves the local search capability of PSO, while the Gaussian mutation strategy improves the global search ability of NGPSO, helping the population maintain diversity and reducing the probability of the algorithm falling into a local optimum. The proposed NGPSO algorithm was applied to 62 benchmark instances of the JSSP, and the experimental results were compared with those of other algorithms. The analysis of the experimental data shows that the algorithm outperforms the comparison algorithms in solving the JSSP.
(This article belongs to the Special Issue Evolutionary Computation and Mathematical Programming 2020)
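The two enhancements named in the abstract can be illustrated with a generic continuous PSO sketch. The quadratic inertia-weight decay and the Gaussian mutation of the global best below are assumed forms chosen for illustration; the paper's exact schedule, mutation operator, and JSSP encoding may differ.

```python
import random

def ngpso(f, dim, bounds, swarm=20, iters=200, seed=0):
    """Generic PSO with a nonlinear (quadratic) inertia-weight decay and
    Gaussian mutation of the global best -- the flavour of enhancement the
    NGPSO paper describes, not the authors' algorithm."""
    rng = random.Random(seed)
    lo, hi = bounds
    w_max, w_min, c1, c2 = 0.9, 0.4, 2.0, 2.0
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    v = [[0.0] * dim for _ in range(swarm)]
    pbest = [xi[:] for xi in x]
    pval = [f(xi) for xi in x]
    g = min(range(swarm), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for t in range(iters):
        # nonlinear inertia weight: decays quadratically from w_max to w_min
        w = w_max - (w_max - w_min) * (t / iters) ** 2
        for i in range(swarm):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            val = f(x[i])
            if val < pval[i]:
                pbest[i], pval[i] = x[i][:], val
                if val < gval:
                    gbest, gval = x[i][:], val
        # Gaussian mutation of the global best helps escape local optima
        cand = [min(hi, max(lo, gd + rng.gauss(0.0, 0.1 * (hi - lo))))
                for gd in gbest]
        cval = f(cand)
        if cval < gval:
            gbest, gval = cand, cval
    return gbest, gval
```

On a simple continuous test function such as the sphere, this sketch converges close to the optimum within a few hundred iterations.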

Open Access Article
Towards a Better Basis Search through a Surrogate Model-Based Epistasis Minimization for Pseudo-Boolean Optimization
Mathematics 2020, 8(8), 1287; https://doi.org/10.3390/math8081287 - 04 Aug 2020
Abstract
Epistasis, which indicates the difficulty of a problem, can be used to evaluate the basis of the space in which the problem lies. However, calculating epistasis can be challenging, as it requires searching all solutions. In this study, a method for constructing a surrogate model, based on deep neural networks, that estimates epistasis is proposed for basis evaluation. The proposed method is applied to the Variant-OneMax problem and the NK-landscape problem. The method makes estimations at a level similar to basis evaluation based on the actual epistasis while significantly reducing the computation time, and it proves more efficient than the epistasis-based basis evaluation.
(This article belongs to the Special Issue Evolutionary Computation and Mathematical Programming 2020)
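The exhaustively computed epistasis referenced above is the quantity that becomes intractable for realistic problem sizes and that the paper's deep-network surrogate estimates. The helper below is an illustrative Davidor-style sketch for tiny pseudo-Boolean functions, not the authors' code: it measures how far a fitness function deviates from its best first-order (gene-by-gene) approximation.

```python
from itertools import product

def epistasis(f, n):
    """Davidor-style epistasis of a pseudo-Boolean function f over n bits,
    computed exhaustively over all 2^n strings (tiny n only): the mean
    squared residual of the best linear (per-bit average) fitness model."""
    xs = list(product((0, 1), repeat=n))
    fbar = sum(f(x) for x in xs) / len(xs)
    # average fitness over all strings with bit d fixed to value b
    avg = {(d, b): sum(f(x) for x in xs if x[d] == b) / (len(xs) // 2)
           for d in range(n) for b in (0, 1)}
    total = 0.0
    for x in xs:
        linear = fbar + sum(avg[(d, x[d])] - fbar for d in range(n))
        total += (f(x) - linear) ** 2
    return total / len(xs)
```

A separable function such as OneMax has zero epistasis, while a parity (XOR) function is maximally non-linear and scores positively.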

Open Access Article
The Use of Evolutionary Methods for the Determination of a DC Motor and Drive Parameters Based on the Current and Angular Speed Response
Mathematics 2020, 8(8), 1269; https://doi.org/10.3390/math8081269 - 03 Aug 2020
Abstract
Determination of the seven parameters of a Direct Current (DC) motor and drive is presented, based on the speed and current step responses. The method is extended to motor and drive parameter determination in the case of a controlled drive. The influence of a speed controller on the responses is considered in the motor model with the use of the measured voltage. The current limitation of the supply unit is also considered in the DC motor model. For parameter determination, a motor model is used, which is described by two coupled differential equations. Euler's first-order and Runge–Kutta fourth-order methods are used for the motor model simulations. For parameter determination, evolutionary methods are used and compared to each other. The methods used are the Genetic Algorithm, Differential Evolution with two strategies, Teaching–Learning-Based Optimization, and the Artificial Bee Colony. To improve the results, division of the motor model simulation time is used, and Memory Assistance with three different approaches is analyzed to shorten the calculation time. The tests showed that Differential Evolution (DE)/rand/1/exp is the most appropriate for the presented problem. The division of the motor model simulation time improves the results. For the presented problem, short-term memory assistance can be suggested for calculation time reduction.
(This article belongs to the Special Issue Evolutionary Computation and Mathematical Programming 2020)
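The two coupled differential equations of a standard DC motor, integrated with Euler's first-order method as mentioned in the abstract, can be sketched as follows. The parameter values here are illustrative placeholders, not the seven values identified in the article.

```python
def simulate_dc_motor(u, R=1.0, L=0.5, K=0.05, J=0.01, B=0.001,
                      t_end=1.0, dt=1e-4):
    """Euler integration of the standard two-equation DC-motor model
    (electrical and mechanical circuits) for a constant supply voltage u:
        di/dt     = (u - R*i - K*omega) / L
        domega/dt = (K*i - B*omega) / J
    Returns the current and angular-speed step responses as lists.
    Illustrative parameter values only -- not the identified ones."""
    i = omega = 0.0
    current, speed = [], []
    for _ in range(int(t_end / dt)):
        di = (u - R * i - K * omega) / L
        domega = (K * i - B * omega) / J
        i += dt * di
        omega += dt * domega
        current.append(i)
        speed.append(omega)
    return current, speed
```

With these placeholder values, the responses settle toward the analytical steady state i = B·u·K/(R·B + K²)/K and ω = u·K/(R·B + K²), which is a quick sanity check for the integrator.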

Open Access Article
Estimating the Strain-Rate-Dependent Parameters of the Johnson-Cook Material Model Using Optimisation Algorithms Combined with a Response Surface
Mathematics 2020, 8(7), 1105; https://doi.org/10.3390/math8071105 - 05 Jul 2020
Abstract
Under conditions where a product is subjected to extreme mechanical loading over a very short time period, the strain rate has considerable influence on the behaviour of the product's material. To simulate the behaviour of the material accurately under these loading conditions, the appropriate strain-rate parameters for the selected material model should be used. The aim of this paper is to present a quick method for easily determining the appropriate strain-rate-dependent parameter values of the selected material model. The optimisation procedure described in the article combines the design-of-experiment (DoE) technique, finite-element simulations, response-surface modelling and an evolutionary algorithm. First, a non-standard dynamic experiment was designed to study the behaviour of thin, flat metal sheets during an impact. The experimental data from this dynamic experiment and from conventional tensile experiments for mild steel were the basis for the determination of the Johnson-Cook material model parameters. The paper provides a comparison of two optimisation processes with different DoE techniques and with three different optimisation algorithms (one traditional and two metaheuristic). The performances of the presented methods are compared, and the engineering applicability of the results is discussed. The identified parameter values, which were estimated with the presented procedure, are very similar to those from the literature. The paper shows how the application of a properly designed plan of simulations can significantly reduce the simulation time, with only a minor influence on the estimated parameters. Furthermore, it can be concluded that in some cases the traditional optimisation method is as good as the two metaheuristic methods. Finally, it was proven that experiments with different strain rates must be carried out when estimating the corresponding material parameters.
(This article belongs to the Special Issue Evolutionary Computation and Mathematical Programming 2020)
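The Johnson-Cook flow-stress model whose strain-rate-dependent parameters the paper estimates has a standard closed form, sketched below. The mild-steel parameter values are textbook-style placeholders, not the values identified in the article.

```python
import math

def johnson_cook_stress(strain, strain_rate, T,
                        A=217.0, B=234.0, n=0.643, C=0.076, m=1.0,
                        ref_rate=1.0, T_room=293.0, T_melt=1793.0):
    """Standard Johnson-Cook flow stress (in MPa for these placeholder
    mild-steel constants):
        sigma = (A + B*eps^n) * (1 + C*ln(epsdot/epsdot0)) * (1 - T*^m)
    where T* = (T - T_room) / (T_melt - T_room).  The strain-rate term
    (1 + C*ln(...)) carries the parameters the article identifies."""
    T_star = (T - T_room) / (T_melt - T_room)
    return ((A + B * strain ** n)
            * (1.0 + C * math.log(strain_rate / ref_rate))
            * (1.0 - T_star ** m))
```

At zero plastic strain, the reference strain rate and room temperature, the expression reduces to the yield constant A, and raising the strain rate above the reference rate raises the predicted stress, which is the rate sensitivity the paper's experiments probe.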

Open Access Article
Optimizing the Estimation of a Histogram-Bin Width—Application to the Multivariate Mixture-Model Estimation
Mathematics 2020, 8(7), 1090; https://doi.org/10.3390/math8071090 - 03 Jul 2020
Abstract
A maximum-likelihood estimation of a multivariate mixture model's parameters is a difficult problem. One approach is to combine the REBMIX and EM algorithms. However, the REBMIX algorithm requires the use of histogram estimation, which is the most rudimentary approach to empirical density estimation and has many drawbacks. Nevertheless, because of its simplicity, it is still one of the most commonly used techniques. The main problem is to estimate the optimum histogram-bin width, which is usually set by the number of non-overlapping, regularly spaced bins. For univariate problems, it is usually given by an integer value, i.e., the number of bins. However, for multivariate problems, a regular grid must be formed to obtain a histogram estimation. Thus, to obtain the optimum histogram estimation, an integer-optimization problem must be solved. The aim is therefore the estimation of optimum histogram binning, alone and in application to mixture-model parameter estimation with the REBMIX&EM strategy. As an estimator, the Knuth rule was used. For the optimization, an algorithm based on coordinate-descent optimization was composed. These proposals yielded promising results. The optimization algorithm was efficient and the results were accurate. When applied to multivariate Gaussian-mixture-model parameter estimation, the results were competitive. All the improvements were implemented in the rebmix R package.
(This article belongs to the Special Issue Evolutionary Computation and Mathematical Programming 2020)
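Knuth's rule, the estimator used in this paper, scores a regular m-bin histogram with a Bayesian log-posterior and picks the m that maximises it. The one-dimensional sketch below uses only the Python standard library; the article's multivariate grid version, optimised with a coordinate-descent scheme, is more involved, and the helper names here are ours.

```python
import math

def knuth_log_posterior(data, m):
    """Knuth's Bayesian log-posterior for an m-bin regular histogram:
        N*ln(m) + lnG(m/2) - m*lnG(1/2) - lnG(N + m/2) + sum_k lnG(n_k + 1/2)
    where lnG is the log-gamma function and n_k are the bin counts.
    One-dimensional sketch of the estimator the paper builds on."""
    n = len(data)
    lo, hi = min(data), max(data)
    width = (hi - lo) / m
    counts = [0] * m
    for x in data:
        k = min(int((x - lo) / width), m - 1)   # clamp the right edge
        counts[k] += 1
    return (n * math.log(m)
            + math.lgamma(m / 2.0)
            - m * math.lgamma(0.5)
            - math.lgamma(n + m / 2.0)
            + sum(math.lgamma(c + 0.5) for c in counts))

def knuth_bins(data, m_max=50):
    """Pick the bin count that maximises Knuth's posterior."""
    return max(range(1, m_max + 1), key=lambda m: knuth_log_posterior(data, m))
```

Evenly spread data is best described by a single bin, while clearly clustered data pushes the posterior towards a finer binning.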

Open Access Article
From Grammar Inference to Semantic Inference—An Evolutionary Approach
Mathematics 2020, 8(5), 816; https://doi.org/10.3390/math8050816 - 18 May 2020
Abstract
This paper describes research on Semantic Inference, which can be regarded as an extension of Grammar Inference. The main task of Grammar Inference is to induce a grammatical structure from a set of positive samples (programs), which can sometimes also be accompanied by a set of negative samples. Successfully applying Grammar Inference results only in identifying the correct syntax of a language. Semantic Inference takes a further step, namely, towards inducing language semantics. When both syntax and semantics can be inferred, a complete compiler/interpreter can be generated solely from samples. In this work, Evolutionary Computation was employed to explore and exploit the enormous search space that appears in Semantic Inference. For the purpose of this research, the tool LISA.SI was developed on top of the compiler/interpreter generator tool LISA. The first results are encouraging, since we were able to infer the semantics solely from samples and their associated meanings for several simple languages, including the Robot language.
