Editorial

Special Issue on Rate Distortion Theory and Information Theory

Jerry Gibson
Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA 93106, USA
Entropy 2018, 20(11), 825; https://doi.org/10.3390/e20110825
Submission received: 24 October 2018 / Accepted: 25 October 2018 / Published: 27 October 2018
(This article belongs to the Special Issue Rate-Distortion Theory and Information Theory)
Shannon introduced the fields of information theory and rate distortion theory in his landmark 1948 paper [1], where he defined “The Rate for a Source Relative to a Fidelity Evaluation.” He officially coined the term “rate distortion function” in his seminal 1959 contribution [2]. The 1950s, 1960s, and 1970s saw considerable activity in deriving rate distortion functions for various source models and distortion measures and in developing methods for computing the rate distortion function. Entering the 1980s, attention turned to multiterminal rate distortion problems, such as successive refinement of information and multiple descriptions, along with efforts to evaluate rate distortion regions for multiterminal problems. In 1998, Berger and Gibson [3] provided an overview of the first 50 years of rate distortion theory research and its impact on source codec designs. In the subsequent 20 years, research contributions to rate distortion theory have seemingly declined, which is somewhat perplexing, since lossy source compression applications have proliferated over those same decades. Speech, image, audio, and video codecs are ubiquitous in our lives today, and biomedical applications are now garnering substantial lossy source compression attention.
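As a point of reference for the papers that follow, Shannon’s rate distortion function can be stated in its standard single-letter form: for a source $X$, reproduction $\hat{X}$, and distortion measure $d$,
\[
R(D) = \min_{p(\hat{x} \mid x) \,:\, \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X}),
\]
the minimum mutual information between the source and its reproduction over all test channels satisfying the average distortion constraint $D$.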
As was stated in the invitation to the Special Issue, “It is the goal of this Special Issue to reemphasize the critical accomplishments of rate distortion theory and to highlight new directions in rate distortion theoretic and information theoretic research.” Toward this end, we have collected 10 papers that examine source models relevant to real-world sources [4,5,6,7], consider new distortion measures [8], extend prior work in multiterminal source coding [9,10], examine source model identification and coding [11], study rate-distortion optimization for 3D video coding [12], and present a review of the field of information theory and cognition [13]. These papers not only make significant contributions to the rate distortion theory literature but also highlight new research directions and indicate that there is much more to be done to harness the full promise of rate distortion theory as defined by Shannon.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  2. Shannon, C.E. Coding Theorems for a Discrete Source with a Fidelity Criterion. IRE Conv. Rec. 1959, 7, 142–163.
  3. Berger, T.; Gibson, J.D. Lossy Source Coding. IEEE Trans. Inf. Theory 1998, 44, 2693–2723.
  4. Gibson, J. Rate Distortion Functions and Rate Distortion Function Lower Bounds for Real-World Sources. Entropy 2017, 19, 604.
  5. Gutiérrez-Gutiérrez, J.; Zárraga-Rodríguez, M.; Insausti, X. Upper Bounds for the Rate Distortion Function of Finite-Length Data Blocks of Gaussian WSS Sources. Entropy 2017, 19, 554.
  6. Gutiérrez-Gutiérrez, J.; Zárraga-Rodríguez, M.; Villar-Rosety, F.; Insausti, X. Rate-Distortion Function Upper Bounds for Gaussian Vectors and Their Applications in Coding AR Sources. Entropy 2018, 20, 399.
  7. Gibson, J. Entropy Power, Autoregressive Models, and Mutual Information. Entropy 2018, 20, 750.
  8. Shkel, Y.; Verdú, S. A Coding Theorem for f-Separable Distortion Measures. Entropy 2018, 20, 111.
  9. Benammar, M.; Zaidi, A. Rate-Distortion Region of a Gray-Wyner Model with Side Information. Entropy 2018, 20, 2.
  10. Oohama, Y. Exponential Strong Converses for Source Coding with Side Information at the Decoder. Entropy 2018, 20, 352.
  11. Silva, J.; Derpich, M. Fixed-Rate Universal Lossy Source Coding and Model Identification: Connection with Zero-Rate Density Estimation and the Skeleton Estimator. Entropy 2018, 20, 640.
  12. Chen, Y.; Liu, G. Content Adaptive Lagrange Multiplier Selection for Rate-Distortion Optimization in 3-D Wavelet-Based Scalable Video Coding. Entropy 2018, 20, 181.
  13. Sayood, K. Information Theory and Cognition: A Review. Entropy 2018, 20, 706.
