Information Theory and Data Compression

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 15 October 2025 | Viewed by 805

Special Issue Editor


Prof. Dr. Tsachy Weissman
Guest Editor
Department of Electrical Engineering, Stanford University, Stanford, CA 94305, USA
Interests: information theory; entropy; channel coding; data compression; statistical signal processing; digital communications

Special Issue Information

Dear Colleagues,

Data compression is more critical than ever to the technologies we rely on. This Special Issue is devoted to key advances in the area, with an emphasis on bridging theory and practice.

Topics of interest include, but are not limited to, the following:

  • New distortion criteria tailored to tasks like perceptual coding and machine learning; 
  • Emerging data types such as graphs and point clouds;
  • Tradeoffs between compression, distortion, and complexity;
  • The interplay between compression and other information processing tasks. 

Prof. Dr. Tsachy Weissman
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory
  • data compression (lossless and lossy)
  • compression complexity
  • source coding
  • coding theory
  • distortion
  • point clouds

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

33 pages, 1024 KiB  
Article
Graph-Theoretic Limits of Distributed Computation: Entropy, Eigenvalues, and Chromatic Numbers
by Mohammad Reza Deylam Salehi and Derya Malak
Entropy 2025, 27(7), 757; https://doi.org/10.3390/e27070757 - 15 Jul 2025
Viewed by 207
Abstract
We address the problem of the distributed computation of arbitrary functions of two correlated sources, X1 and X2, residing in two distributed source nodes, respectively. We exploit the structure of a computation task by coding source characteristic graphs (and multiple instances using the n-fold OR product of this graph with itself). For regular graphs and general graphs, we establish bounds on the optimal rate—characterized by the chromatic entropy for the n-fold graph products—that allows a receiver for asymptotically lossless computation of arbitrary functions over finite fields. For the special class of cycle graphs (i.e., 2-regular graphs), we establish an exact characterization of chromatic numbers and derive bounds on the required rates. Next, focusing on the more general class of d-regular graphs, we establish connections between d-regular graphs and expansion rates for n-fold graph products using graph spectra. Finally, for general graphs, we leverage the Gershgorin Circle Theorem (GCT) to provide a characterization of the spectra, which allows us to derive new bounds on the optimal rate. Our codes leverage the spectra of the computation and provide a graph expansion-based characterization to succinctly capture the computation structure, providing new insights into the problem of distributed computation of arbitrary functions.
(This article belongs to the Special Issue Information Theory and Data Compression)
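Two of the abstract's ingredients are easy to check numerically. As an illustrative sketch only (not the paper's coding scheme), the snippet below verifies, for a small cycle graph, the Gershgorin Circle Theorem bound on adjacency eigenvalues (for a d-regular graph, all eigenvalues satisfy |λ| ≤ d) and the classical chromatic number of a cycle (2 for even length, 3 for odd). All names here are illustrative.

```python
import numpy as np

# Adjacency matrix of the 5-cycle C5 (a 2-regular graph).
n = 5
A = np.zeros((n, n), dtype=int)
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

# Gershgorin Circle Theorem: every eigenvalue of A lies in a disc
# centered at A[i, i] with radius sum_{j != i} |A[i, j]|.
# The diagonal is 0 and each row sums to d, so |lambda| <= d.
d = A.sum(axis=1).max()           # d = 2 for any cycle
eigs = np.linalg.eigvalsh(A)
assert np.all(np.abs(eigs) <= d + 1e-9)

# Chromatic number of a cycle: 2 if n is even, 3 if n is odd.
chi = 2 if n % 2 == 0 else 3
print(d, chi, round(eigs.max(), 4))
```

For C5 the spectrum is {2·cos(2πk/5)}, so the largest eigenvalue is exactly d = 2, and the GCT bound is tight at the top of the spectrum.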

14 pages, 262 KiB  
Article
Universal Encryption of Individual Sequences Under Maximal Information Leakage
by Neri Merhav
Entropy 2025, 27(6), 551; https://doi.org/10.3390/e27060551 - 24 May 2025
Viewed by 292
Abstract
We consider the Shannon cipher system in the framework of individual sequences and finite-state encrypters under the metric of maximal information leakage. A lower bound and an asymptotically matching upper bound on the leakage are derived, which lead to the conclusion that asymptotically minimum leakage can be attained by Lempel–Ziv compression followed by one-time pad encryption of the compressed bitstream.
(This article belongs to the Special Issue Information Theory and Data Compression)
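The compress-then-pad construction in the abstract can be sketched in a few lines. This is a toy illustration, not the paper's scheme: zlib's DEFLATE is LZ77-based rather than the finite-state/Lempel–Ziv setting analyzed there, and the function names are hypothetical. The point it demonstrates is that the one-time pad key only needs to be as long as the compressed stream, not the original sequence.

```python
import os
import zlib

def encrypt(message, pad=None):
    # LZ-style compression first (zlib's DEFLATE uses LZ77),
    # then a one-time pad over the compressed bitstream.
    compressed = zlib.compress(message)
    if pad is None:
        pad = os.urandom(len(compressed))  # key length = compressed length
    cipher = bytes(c ^ k for c, k in zip(compressed, pad))
    return cipher, pad

def decrypt(cipher, pad):
    compressed = bytes(c ^ k for c, k in zip(cipher, pad))
    return zlib.decompress(compressed)

msg = b"abracadabra " * 50
cipher, pad = encrypt(msg)
assert decrypt(cipher, pad) == msg
assert len(cipher) < len(msg)  # pad cost scales with the compressed length
```

On a highly compressible sequence like the one above, the ciphertext (and hence the pad) is a small fraction of the plaintext length, which is the intuition behind attaining minimum leakage with compression followed by a one-time pad.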