Advances in Information-Theoretic Methods for Representation Learning and Data Science

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 1 February 2027

Special Issue Editor


Dr. Luis Gonzalo Sánchez Giraldo
Guest Editor
Department of Electrical and Computer Engineering, University of Kentucky, Lexington, KY 40506, USA
Interests: machine learning for signal processing; information-theoretic learning; representation learning; computer vision; computational neuroscience

Special Issue Information

Dear Colleagues,

Information theory lies at the core of many machine learning and data science algorithms and methodologies. Quantities such as entropy, divergence, and mutual information provide a powerful framework for quantifying information content and the relations among random variables. In representation learning, information-theoretic quantities can be used to evaluate and guide feature learning; in data science, they can be used to derive insight from data while making minimal assumptions.

Recent advances in information-theoretic methods for machine learning include new dependence measures and divergences for high-dimensional spaces, information-theoretic objectives for self-supervised and reinforcement learning, and regularization frameworks based on entropy and the information bottleneck. Moreover, recent research provides empirical evidence that quantities such as representation entropy, or the effective rank of a model's internal representations, correlate with performance on various tasks (a brief sketch of the effective-rank computation follows the topic list below), further motivating the use of information-theoretic quantities for understanding the capabilities of learned representations in foundation models.

This Special Issue provides a venue to gather the latest research on information theory for representation learning and data science. Topics of interest include the following:

  • Novel estimators of information-theoretic quantities in high-dimensional spaces as well as alternatives to Shannon’s entropy, Kullback–Leibler divergence, and mutual information.
  • Theoretical and empirical analysis of representation learning algorithms and models using information-theoretic quantities.
  • Information theory for causal representation learning.
  • Information theory for interpretable models.

While this list illustrates some of the relevant topics, it is by no means comprehensive, and articles covering other aspects of information theory for representation learning and data science are welcome.
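To make one of the quantities mentioned above concrete: the effective rank of a representation matrix is defined as the exponential of the Shannon entropy of its normalized singular values (Roy and Vetterli, 2007). The following is a minimal Python sketch, not part of this call; the matrix Z and its dimensions are illustrative placeholders.

  import numpy as np

  def effective_rank(Z: np.ndarray, eps: float = 1e-12) -> float:
      """Effective rank of an (n_samples, n_features) representation matrix:
      exp of the Shannon entropy of the normalized singular values."""
      s = np.linalg.svd(Z, compute_uv=False)   # singular values of Z
      p = s / (s.sum() + eps)                  # normalize to a distribution
      h = -np.sum(p * np.log(p + eps))         # Shannon entropy in nats
      return float(np.exp(h))                  # exp(H) = effective rank

  # Illustrative use with placeholder features from a hypothetical encoder
  Z = np.random.randn(1024, 256)
  print(f"effective rank: {effective_rank(Z):.1f}")

A larger effective rank indicates that the variance of the representation is spread across more directions, which is one way such quantities have been related to downstream performance.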

Dr. Luis Gonzalo Sánchez Giraldo
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information-theoretic learning
  • entropy regularization
  • information noise contrastive estimation
  • neural scaling laws
  • information bottleneck
  • information-theoretic attribution methods
  • estimators of entropy, divergence and mutual information
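Two of these keywords admit a compact illustration: information noise contrastive estimation (InfoNCE) yields a variational lower bound on mutual information, I(X; Y) >= log N - L_InfoNCE (van den Oord et al., 2018). The sketch below uses a bilinear critic; the critic form, tensor shapes, and variable names are assumptions for illustration only, not prescriptions of this call.

  import torch
  import torch.nn.functional as F

  def infonce_bound(x: torch.Tensor, y: torch.Tensor, W: torch.Tensor) -> torch.Tensor:
      """InfoNCE lower bound on I(X; Y) from N paired samples (x_i, y_i).
      x: (N, d), y: (N, d), W: (d, d) bilinear critic parameters."""
      scores = x @ W @ y.T                     # (N, N) critic scores f(x_i, y_j)
      n = scores.shape[0]
      labels = torch.arange(n)                 # positive pairs on the diagonal
      loss = F.cross_entropy(scores, labels)   # softmax over negatives per row
      return torch.log(torch.tensor(float(n))) - loss   # bound on I(X; Y)

  # Illustrative use: for independent x and y the bound is non-positive in expectation
  x, y = torch.randn(256, 64), torch.randn(256, 64)
  W = torch.randn(64, 64)
  print(infonce_bound(x, y, W).item())

The bound saturates at log N, which is one reason large batch sizes are used when estimating large mutual information values with this estimator.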

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers

This Special Issue is now open for submission.