Article

Entropy Power, Autoregressive Models, and Mutual Information

Jerry Gibson
Department of Electrical and Computer Engineering, University of California, Santa Barbara, Santa Barbara, CA 93106, USA
Entropy 2018, 20(10), 750; https://doi.org/10.3390/e20100750
Received: 7 August 2018 / Revised: 10 September 2018 / Accepted: 17 September 2018 / Published: 30 September 2018
(This article belongs to the Special Issue Rate-Distortion Theory and Information Theory)
Abstract: Autoregressive processes play a major role in speech processing (linear prediction), seismic signal processing, biological signal processing, and many other applications. We consider the quantity defined by Shannon in 1948, the entropy rate power, and show that the log ratio of entropy powers equals the difference in the differential entropies of the two processes. Furthermore, we use the log ratio of entropy powers to analyze the change in mutual information as the model order is increased for autoregressive processes. We examine when the minimum mean squared prediction error can be substituted for the entropy power in the log ratio of entropy powers; this substitution greatly simplifies the calculation of the differential entropy and the change in mutual information, and thereby increases the utility of the approach. Applications to speech processing and coding are given, and potential applications to seismic signal processing, EEG classification, and ECG classification are described.
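The abstract's central identity can be sketched directly from Shannon's 1948 definition; the notation sigma_m^2 for the order-m minimum mean squared prediction error is introduced here for illustration and is not necessarily the paper's. The entropy power of a process X with differential entropy rate h(X), in nats, is

    Q_X = \frac{1}{2\pi e}\, e^{2 h(X)}, \qquad \text{so that} \qquad \frac{1}{2}\ln\frac{Q_1}{Q_2} = h_1 - h_2 .

For a Gaussian stationary source, the entropy rate power equals the one-step minimum mean squared prediction error, so the change in mutual information when the autoregressive model order is increased from m to m+1 can be written

    \Delta I = \frac{1}{2}\ln\frac{\sigma_m^2}{\sigma_{m+1}^2} .

A short numerical sketch of this substitution is given below, assuming a synthetic Gaussian AR(2) source; the Levinson-Durbin recursion supplies the prediction error at every model order, and the names (levinson_durbin, delta_I) are illustrative, not taken from the paper.

    import numpy as np

    def levinson_durbin(r, order):
        # MMSE one-step prediction error at each order 0..order,
        # given autocorrelation values r[0], ..., r[order].
        err = np.empty(order + 1)
        err[0] = r[0]
        a = np.zeros(order + 1)
        for m in range(1, order + 1):
            k = (r[m] - np.dot(a[1:m], r[m - 1:0:-1])) / err[m - 1]
            a_prev = a.copy()
            a[m] = k
            a[1:m] = a_prev[1:m] - k * a_prev[m - 1:0:-1]
            err[m] = err[m - 1] * (1.0 - k * k)
        return err

    # Synthetic Gaussian AR(2) source (hypothetical example, not from the paper).
    rng = np.random.default_rng(0)
    x = np.zeros(10000)
    for n in range(2, x.size):
        x[n] = 0.75 * x[n - 1] - 0.5 * x[n - 2] + rng.standard_normal()

    # Biased autocorrelation estimates r[0..5].
    r = np.array([x[: x.size - k] @ x[k:] for k in range(6)]) / x.size
    err = levinson_durbin(r, 5)

    # Change in mutual information (nats) from order m to m+1:
    # Delta I = 0.5 * ln(err[m] / err[m+1]).
    delta_I = 0.5 * np.log(err[:-1] / err[1:])
    print(delta_I)  # gains fall to roughly zero beyond the true order (2)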
Keywords: autoregressive models; entropy rate power; mutual information

MDPI and ACS Style

Gibson, J. Entropy Power, Autoregressive Models, and Mutual Information. Entropy 2018, 20, 750. https://doi.org/10.3390/e20100750

AMA Style

Gibson J. Entropy Power, Autoregressive Models, and Mutual Information. Entropy. 2018; 20(10):750. https://doi.org/10.3390/e20100750

Chicago/Turabian Style

Gibson, Jerry. 2018. "Entropy Power, Autoregressive Models, and Mutual Information." Entropy 20, no. 10: 750. https://doi.org/10.3390/e20100750

