Special Issue "Unconventional Methods for Particle Swarm Optimization"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: 31 January 2020.

Special Issue Editor

Guest Editor
Prof. Dr. Leonardo Vanneschi
NOVA Information Management School (NOVA IMS), Universidade Nova de Lisboa, Campus de Campolide, 1070-312 Lisboa, Portugal
Interests: machine learning; genetic programming; particle swarm optimization

Special Issue Information

Dear Colleagues,

Particle swarm optimization (PSO) is a population-based optimization metaheuristic inspired by the collective dynamics of groups of animals, such as insects, birds, and fish. Recent research trends have demonstrated the potential of the approach and its considerable room for improvement. By “unconventional methods for PSO” we mean modifications of the standard PSO aimed at improving its performance or endowing it with particular properties: for instance, new methods for choosing the inertia weight, constriction factor, and cognitive and social weights; parallelizing PSO in several different ways; defining hybrid algorithms in which PSO is integrated with other types of metaheuristic optimization methods; entropy-based PSO; etc. The study of unconventional methods for PSO is a lively and active research field, and the objective of this Special Issue is to collect contributions in this recent and exciting area, with a particular focus on entropic, information-theoretic, or probability-theoretic techniques.
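For readers less familiar with the parameters named above, the following is a minimal sketch of the canonical global-best PSO update; `w` is the inertia weight and `c1`/`c2` are the cognitive and social acceleration coefficients. The function name, default parameter values, and the sphere test objective are illustrative choices, not prescriptions from this call for papers.

```python
import random

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, seed=0):
    """Canonical global-best PSO minimizing `objective`.

    w  -- inertia weight (scales the previous velocity)
    c1 -- cognitive coefficient (pull toward the particle's own best)
    c2 -- social coefficient (pull toward the swarm's best)
    """
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                 # each particle's best position
    pbest_f = [objective(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # swarm-wide best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            f = objective(xs[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = xs[i][:], f
    return gbest, gbest_f

# Example: minimize the sphere function f(x) = sum(x_i^2)
best, best_f = pso(lambda x: sum(xi * xi for xi in x), dim=5)
```

The unconventional methods solicited here typically intervene precisely in this update rule, e.g. by adapting `w`, `c1`, and `c2` over time or per particle rather than keeping them fixed.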

Prof. Dr. Leonardo Vanneschi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Entropy-based PSO
  • Information Theory for PSO
  • Probability Theory for PSO
  • Theoretically motivated hybrid PSO systems
  • Theoretically motivated parallelizations of PSO
  • Theoretically motivated niching
  • New acceleration strategies
  • Automatic static and/or dynamic parameter setting
  • Improvements and/or specializations of particle movements
  • PSO for the optimization/improvement of machine learning methods
  • Real-life applications using theoretically motivated unconventional PSO systems

Published Papers (3 papers)


Research

Open Access Article
A Self-Adaptive Discrete PSO Algorithm with Heterogeneous Parameter Values for Dynamic TSP
Entropy 2019, 21(8), 738; https://doi.org/10.3390/e21080738 - 27 Jul 2019
Abstract
This paper presents a discrete particle swarm optimization (DPSO) algorithm with heterogeneous (non-uniform) parameter values for solving the dynamic traveling salesman problem (DTSP). The DTSP can be modeled as a sequence of static sub-problems, each of which is an instance of the TSP. In the proposed DPSO algorithm, the information gathered while solving a sub-problem is retained in the form of a pheromone matrix and used by the algorithm while solving the next sub-problem. We present a method for automatically setting the values of the key DPSO parameters (except for the parameters directly related to the computation time and size of a problem). We show that the diversity of parameter values has a positive effect on the quality of the generated results. Furthermore, the population in the proposed algorithm has a higher level of entropy. We compare the performance of the proposed heterogeneous DPSO with two ant colony optimization (ACO) algorithms. The proposed algorithm outperforms the base DPSO and is competitive with the ACO.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)

Open Access Article
User-Oriented Summaries Using a PSO Based Scoring Optimization Method
Entropy 2019, 21(6), 617; https://doi.org/10.3390/e21060617 - 22 Jun 2019
Abstract
Automatic text summarization tools have a great impact on many fields, such as medicine, law, and scientific research in general. As information overload increases, automatic summaries allow handling the growing volume of documents, usually by assigning weights to the extracted phrases based on their significance in the expected summary. Obtaining the main contents of any given document in less time than it would take to do so manually remains an issue of interest. In this article, a new method is presented that allows automatically generating extractive summaries from documents by adequately weighting sentence scoring features using Particle Swarm Optimization. The key feature of the proposed method is the identification of those features that are closest to the criterion used by the individual when summarizing. The proposed method combines a binary representation and a continuous one, using an original variation of the technique developed by the authors of this paper. Our paper shows that using user-labeled information in the training set helps to find better metrics and weights. The empirical results yield an improved accuracy compared to previous methods used in this field.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)

Open Access Article
Competitive Particle Swarm Optimization for Multi-Category Text Feature Selection
Entropy 2019, 21(6), 602; https://doi.org/10.3390/e21060602 - 18 Jun 2019
Abstract
Multi-label feature selection is an important task for text categorization. This is because it enables learning algorithms to focus on essential features that foreshadow relevant categories, thereby improving the accuracy of text categorization. Recent studies have considered the hybridization of evolutionary feature wrappers and filters to enhance the evolutionary search process. However, the relative effectiveness of feature subset searches of evolutionary and feature filter operators has not been considered. This results in degenerate final feature subsets. In this paper, we propose a novel hybridization approach based on competition between the operators. This enables the proposed algorithm to apply each operator selectively and modify the feature subset according to its relative effectiveness, unlike conventional methods. The experimental results on 16 text datasets verify that the proposed method is superior to conventional methods.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)
