
Mutual Information in Statistical Learning

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 31 March 2026

Special Issue Editors


Dr. Philip E. Cheng
Guest Editor
Institute of Statistical Science, Academia Sinica, Academia Rd, No. 128, Section 2, Taipei 115, Taiwan
Interests: categorical data analysis; educational statistics; missing data analysis; nonparametric regression; psychometrics; statistics for functional magnetic resonance imaging

Prof. Dr. Michelle Liou
Guest Editor
Institute of Statistical Science, Academia Sinica, Academia Rd, No. 128, Section 2, Taipei 115, Taiwan
Interests: psychometric methods; mutual information analysis; neuroimaging; EEG signal processing

Special Issue Information

Dear Colleagues,

Emerging from coding and information theory in classical engineering and the physical sciences, Shannon entropy and mutual information (MI) have played crucial roles in data analysis across the biomedical, computer, and psychological sciences. Indeed, MI connects classical maximum likelihood estimation to the principle of maximum entropy, a link that has rarely been acknowledged in contemporary statistical inference. In recent decades, MI has been used to develop a variety of feature selection (FS) methods, which significantly enhance regression and classification analysis in the machine learning literature.

While the identification of ‘relevant’ features and the removal of ‘redundant’ ones have been thoroughly investigated through the classical concept of ‘interaction information’, as explored by McGill (1954), Fano (1961), and Cover and Thomas (1991), a formal derivation of interaction information for four or more features is notably absent from the literature. Likewise, an orthogonal decomposition of the MI of a random vector appears to be missing or overlooked. In practice, it is also widely recognized that estimating empirical MI for continuous and discrete variables using smoothing windows poses significant challenges, particularly for high-dimensional data. These observations suggest that further analysis of MI with respect to relevant and redundant features, together with improved empirical MI estimation, should be pursued to advance FS and model selection for data interpretation, classification, and prediction in statistical learning.

This Special Issue invites novel research and review articles on FS criteria based on MI analysis, together with empirical and simulation studies. Theoretical and experimental learning methods developed through MI analysis across a wide range of research disciplines are most welcome.
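As a point of reference for prospective contributors: under one common sign convention (McGill, 1954), the interaction information among three variables is I(X;Y;Z) = I(X;Y|Z) − I(X;Y), the change in the shared information of X and Y upon conditioning on Z. The sketch below illustrates the simplest MI-based FS criterion, ranking features by their estimated MI with the response. It assumes scikit-learn is available and uses its nearest-neighbor MI estimator (Kraskov et al., 2004), one common answer to the smoothing-window difficulties noted above; it is an illustration, not a method prescribed by this Special Issue.

```python
# A minimal sketch of MI-based feature ranking, assuming scikit-learn
# (an illustrative choice, not an endorsement of a particular estimator).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

# Synthetic data: 20 features, of which only 5 carry signal.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

# Estimate I(X_j; Y) for each feature j with the k-nearest-neighbor
# estimator; k trades bias against variance, much like a smoothing window.
mi = mutual_info_classif(X, y, n_neighbors=3, random_state=0)

# Rank features by estimated MI and keep the top five.
top = np.argsort(mi)[::-1][:5]
print("Selected features:", top.tolist())
print("Estimated MI (nats):", np.round(mi[top], 3).tolist())
```

Note that ranking by marginal MI ignores redundancy among features; criteria such as mRMR penalize pairwise MI among selected features, and conditional-MI extensions to larger feature sets lead directly to the open questions on interaction information raised above.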

Dr. Philip E. Cheng
Prof. Dr. Michelle Liou
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • conditional mutual information
  • discrete data analysis
  • entropy
  • feature selection
  • mutual information

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers

This special issue is now open for submission.