
Information Theoretic Learning

This Special Issue belongs to the section “Information Theory, Probability and Statistics”.

Special Issue Information

Dear Colleague,

In past decades, and especially in recent years, entropy and related information theoretic measures (e.g., mutual information) have been applied successfully in machine learning (supervised and unsupervised) and signal processing. Information theoretic quantities can capture higher-order statistics and thus offer potentially significant performance improvements in machine learning applications. In information theoretic learning (ITL), measures from information theory (entropy, mutual information, divergences, etc.) are used as optimization costs in place of conventional second-order statistical measures such as variance and covariance. In supervised learning, for example, a regression problem can be formulated as minimizing the entropy of the error between the model output and the desired response; in ITL this optimization criterion is called the minimum error entropy (MEE) criterion. Information theoretic learning also links information theory, nonparametric estimators, and reproducing kernel Hilbert spaces (RKHS) in a simple and unconventional way. In particular, correntropy, a nonlinear similarity measure in kernel space, has its roots in Renyi's entropy. Since correntropy (especially with a small kernel bandwidth) is insensitive to outliers, it is a naturally robust cost for machine learning. The correntropy induced metric (CIM), an approximation of the l0 norm, can also be used as a sparsity penalty in sparse learning.
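To make the robustness argument concrete, the following is a minimal sketch (not part of the call itself) of a sample correntropy estimator with a Gaussian kernel and the derived CIM; the function names and the bandwidth parameter `sigma` are illustrative choices, not a prescribed implementation.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(X, Y) with a Gaussian kernel.

    V(X, Y) = E[k_sigma(X - Y)] is estimated as the mean of Gaussian
    kernel evaluations on the elementwise errors e_i = x_i - y_i.
    The kernel is bounded, so a single huge error contributes at most
    one near-zero term instead of dominating the cost (unlike MSE).
    """
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-e**2 / (2.0 * sigma**2))))

def cim(x, y, sigma=1.0):
    """Correntropy induced metric: CIM(X, Y) = sqrt(k(0) - V(X, Y)).

    With the unnormalized Gaussian kernel, k(0) = 1, so CIM is zero
    when x == y and saturates for large errors, which is why it can
    serve as a smooth approximation of the l0 norm in sparse learning.
    """
    return float(np.sqrt(1.0 - correntropy(x, y, sigma)))
```

With a small `sigma`, one large outlier in `y` barely moves the correntropy estimate, whereas it would dominate a mean-squared-error cost; this is the robustness property referred to above.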

In this Special Issue, we seek contributions that apply information theoretic quantities (entropy, mutual information, divergences, etc.) and related measures, such as correntropy, to machine learning problems. The scope of the contributions is broad, encompassing theoretical research as well as practical applications to regression, classification, clustering, graph and kernel learning, deep learning, and so on.

Prof. Dr. Badong Chen
Prof. Dr. Jose C. Principe
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.


Keywords

  • entropy
  • correntropy
  • information theoretic learning
  • kernel methods
  • machine learning

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.



Entropy - ISSN 1099-4300