Special Issue "Entropy-based Data Mining"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: 31 January 2018

Special Issue Editors

Guest Editor
Dr. Massimiliano Zanin

Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, Spain
Interests: complex systems; complex networks; network science; data mining
Guest Editor
Dr. Ernestina Menasalvas

Centro de Tecnología Biomédica, Universidad Politécnica de Madrid, Spain
Interests: big data; predictive analytics; data mining; data stream mining

Special Issue Information

Dear Colleagues,

Entropy and data mining are not as distant concepts as they may initially appear. They share a common idea: the information contained in data presents certain regularities, or structures, which we ought to understand in order to better understand the system under study. While entropy aims at assessing the presence of these structures, data mining goes one step further, by extracting them and making them explicit for further use; it is clear, however, that the former is a first and necessary step for the latter.

Not surprisingly, entropy and data mining have had an intermingled history. Specifically, entropy has been used extensively to define and support data mining algorithms. Examples include the use of entropy metrics as splitting and pruning criteria in decision trees; as a means to weight distances in high-dimensional k-means clustering algorithms; to select feature subsets in classification ensembles; and as a criterion to combine multiple classifiers. Entropy has also buttressed the creation of data mining models, as in maximum entropy classifiers, implementations of the multinomial logistic regression concept, and in outlier detection. On the other hand, entropy has also been used as a way to create new features from data, in order to feed standard data mining algorithms. For instance, different types of entropies have been used to describe time series, e.g., to distinguish between normal and ictal brain dynamics, or to assess heart rate complexity; to describe symbolic sequences, which can then be compared, as in DNA analysis and in the identification of protein coding and non-coding sequences; or to assess the complexity of graphs and networks, in order to then distinguish and classify them.
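
As an informal illustration of the first family of uses mentioned above, the following minimal Python sketch computes Shannon entropy and the entropy-based information gain commonly used as a splitting criterion in decision trees. The function names and the toy dataset are illustrative assumptions, not taken from any specific contribution to this Special Issue.

from collections import Counter
from math import log2

def shannon_entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Reduction in label entropy obtained by splitting on a discrete feature."""
    n = len(labels)
    remainder = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        remainder += (len(subset) / n) * shannon_entropy(subset)
    return shannon_entropy(labels) - remainder

# Toy example: the feature perfectly separates the two classes,
# so the information gain equals the full entropy of the labels (1 bit).
labels = ["A", "A", "B", "B"]
feature = [0, 0, 1, 1]
print(shannon_entropy(labels))            # 1.0
print(information_gain(feature, labels))  # 1.0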

This Special Issue seeks contributions clarifying and strengthening the relationship between these two research fields, with a special focus on, but not limited to, the improvement of data-mining algorithms through the entropy concept, and on the application of entropy in real-world data-mining tasks. We welcome theoretical as well as experimental works, including both original research and review papers.

Dr. Massimiliano Zanin
Dr. Ernestina Menasalvas
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Data mining algorithms
  • Classification
  • Clustering
  • Feature selection
  • Time series analysis
  • Network entropy

Published Papers (1 paper)


Research

Open Access Article: Cross Entropy Method Based Hybridization of Dynamic Group Optimization Algorithm
Entropy 2017, 19(10), 533; doi:10.3390/e19100533
Received: 7 August 2017 / Revised: 21 September 2017 / Accepted: 29 September 2017 / Published: 9 October 2017
Abstract
Recently, a new algorithm named dynamic group optimization (DGO) has been proposed, which lends itself strongly to exploration and exploitation. Although DGO has demonstrated its efficacy in comparison to other classical optimization algorithms, it has two computational drawbacks. The first is related to the two mutation operators of DGO, which may decrease the diversity of the population and limit the search ability. The second is the homogeneity of the updated population information, which is selected only from companions in the same group; this may result in premature convergence and deteriorate the mutation operators. In order to deal with these two problems, this paper proposes a new hybridized algorithm that combines the dynamic group optimization algorithm with the cross entropy method. The cross entropy method samples the problem space by generating candidate solutions from a distribution, and then updates the distribution based on the better candidate solutions discovered. The cross entropy operator not only enlarges the promising search area, but also guarantees that the new solution takes all the surrounding useful information into consideration. The proposed algorithm is tested on 23 up-to-date benchmark functions; the experimental results verify that it is more effective and efficient than other contemporary population-based swarm algorithms.
(This article belongs to the Special Issue Entropy-based Data Mining)
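
For readers unfamiliar with the cross entropy method mentioned in the abstract, the following minimal Python sketch illustrates its generic sample-then-update loop on a toy continuous problem. It is not the paper's hybrid DGO/CE algorithm; the Gaussian sampling model, the elite fraction, and the sphere objective are illustrative assumptions.

import numpy as np

def cross_entropy_method(objective, dim, n_samples=100, elite_frac=0.1,
                         n_iter=50, seed=0):
    """Generic cross entropy optimization: sample, select elites, re-fit."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim) * 5.0  # initial sampling distribution
    n_elite = max(1, int(n_samples * elite_frac))
    for _ in range(n_iter):
        # 1) Sample candidate solutions from the current distribution.
        samples = rng.normal(mu, sigma, size=(n_samples, dim))
        # 2) Keep the best (elite) candidates according to the objective.
        scores = np.apply_along_axis(objective, 1, samples)
        elite = samples[np.argsort(scores)[:n_elite]]
        # 3) Re-fit the distribution to the elite set.
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-8
    return mu

# Example: minimizing the sphere function converges towards the origin.
sphere = lambda x: float(np.sum(x ** 2))
print(cross_entropy_method(sphere, dim=5))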

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18