
Information-Theoretic Approaches for Machine Learning and AI

This Special Issue belongs to the section “Information Theory, Probability and Statistics”.

Special Issue Information

Dear Colleagues,

With the rapid development of artificial intelligence (AI) technology, especially large language models, the ways in which information is acquired, processed, and transmitted are undergoing revolutionary changes. In this context, Shannon entropy and information theory, as fundamental theories for understanding and measuring information, play a crucial role.

As the complexity of deep learning models continues to increase, their internal mechanisms often become a “black box”, posing challenges to the credibility and application of these models. By introducing methods from information theory, we can explore how to quantify the uncertainty and information flow within models, thereby revealing their decision-making processes. This not only aids in understanding the internal workings of the models but also provides effective guidance for model optimization and downstream tasks, such as multimodal compression and knowledge editing. At the same time, quantum entropy and quantum information theory offer entirely new perspectives and tools that are expected to advance the frontier of AI in computational capability, algorithm design, and secure communication. Coding theory also plays a critical role in machine learning by improving the efficiency, privacy, and security of data processing through information encoding and error correction.
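
As a concrete illustration of the kind of measurement discussed above, the following minimal sketch estimates the Shannon entropy of a classifier's predictive distribution (its uncertainty on a given input) and a plug-in mutual information between a discrete input feature and the predicted label. It assumes only NumPy; the data values and function names are hypothetical and serve purely as an example, not as a prescribed method.

    import numpy as np

    def entropy(p, eps=1e-12):
        """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
        p = np.asarray(p, dtype=float)
        p = p / p.sum()
        return float(-np.sum(p * np.log(p + eps)))

    def mutual_information(joint_counts, eps=1e-12):
        """Plug-in estimate I(X; Y) = H(X) + H(Y) - H(X, Y) from a joint count table."""
        joint = np.asarray(joint_counts, dtype=float)
        joint = joint / joint.sum()
        px = joint.sum(axis=1)   # marginal over X (rows)
        py = joint.sum(axis=0)   # marginal over Y (columns)
        return entropy(px, eps) + entropy(py, eps) - entropy(joint.ravel(), eps)

    # Hypothetical data: a 4-class predictive distribution for one input, and a
    # joint count table between a binary input feature and the predicted class.
    predictive = [0.70, 0.15, 0.10, 0.05]
    joint_counts = [[30, 5, 3, 2],
                    [4, 25, 6, 5]]

    print(f"Predictive entropy (uncertainty): {entropy(predictive):.3f} nats")
    print(f"Feature-prediction mutual information: {mutual_information(joint_counts):.3f} nats")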

The aim of this Special Issue is to attract research investigations, from an information-theoretic perspective, that address current challenges in the theory and applications of machine learning. Prospective authors are invited to submit original research contributions that leverage information theory and quantum information theory to solve problems in areas including (but not limited to) the following:

  • Model interpretability;
  • Reinforcement learning;
  • Data compression and semantic communication;
  • Federated learning;
  • Large language models;
  • Optimization;
  • Sustainable AI;
  • Security and privacy;
  • Unbiasedness and fairness in AI.

Prof. Dr. Songze Li
Prof. Dr. Linqi Song
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory
  • coding theory
  • data compression
  • quantum computing
  • semantic information theory
  • statistical learning theory
  • reinforcement learning
  • large language models
  • federated learning
  • security and privacy
  • unbiasedness and fairness

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.
