Open Access Article
Entropy 2017, 19(9), 496; doi:10.3390/e19090496

Log Likelihood Spectral Distance, Entropy Rate Power, and Mutual Information with Applications to Speech Coding

J.D. Gibson * and P. Mahadevan
Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA 93106, USA
* Author to whom correspondence should be addressed.
Received: 22 August 2017 / Revised: 9 September 2017 / Accepted: 10 September 2017 / Published: 14 September 2017
(This article belongs to the Special Issue Entropy in Signal Analysis)

Abstract

We provide a new derivation of the log likelihood spectral distance measure for signal processing using the logarithm of the ratio of entropy rate powers. Using this interpretation, we show that the log likelihood ratio is equivalent to the difference of two differential entropies, and further that it can be written as the difference of two mutual informations. These latter two expressions allow the analysis of signals via the log likelihood ratio to be extended beyond spectral matching to the study of their statistical quantities of differential entropy and mutual information. Examples from speech coding are presented to illustrate the utility of these new results, which make the log likelihood ratio of interest in applications beyond spectral matching for speech.
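As a brief sketch of the central identity the abstract describes (using Shannon's standard definition of entropy power; the notation here is illustrative and may differ from the paper's), the entropy rate power of a stationary process X with differential entropy rate \bar{h}(X) (in nats) is

    Q_X = \frac{1}{2\pi e}\, e^{2\bar{h}(X)},

so the logarithm of a ratio of entropy rate powers reduces to a difference of differential entropy rates:

    \ln\frac{Q_X}{Q_Y} = 2\bigl(\bar{h}(X) - \bar{h}(Y)\bigr).

This is the step that lets a purely spectral distance be reinterpreted through information-theoretic quantities.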
Keywords: log likelihood ratio; spectral distance; differential entropy; mutual information; speech codec design
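For a concrete feel for the spectral distance involved, below is a minimal Python sketch of the classical Itakura log likelihood ratio between two speech frames, computed from their LPC models. This is the standard textbook form, not necessarily the exact formulation used in the paper, and all function names are illustrative.

    import numpy as np

    def autocorr(x, p):
        """Biased autocorrelation estimates r[0..p] of a frame x."""
        n = len(x)
        return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])

    def lpc(x, order):
        """Levinson-Durbin recursion; returns the LPC polynomial a = [1, a1, ..., ap]."""
        r = autocorr(x, order)
        a = np.zeros(order + 1)
        a[0] = 1.0
        err = r[0]
        for i in range(1, order + 1):
            acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
            k = -acc / err
            a_next = a.copy()
            a_next[1:i] = a[1:i] + k * a[i - 1:0:-1]
            a_next[i] = k
            a = a_next
            err *= 1.0 - k * k
        return a

    def llr_distance(x_ref, x_test, order=10):
        """Itakura log likelihood ratio: log of the ratio of the residual
        energies of the test and reference LPC filters on the reference frame."""
        a_ref = lpc(x_ref, order)
        a_test = lpc(x_test, order)
        r = autocorr(x_ref, order)
        # Toeplitz autocorrelation matrix of the reference frame: R[i, j] = r[|i - j|]
        idx = np.arange(order + 1)
        R = r[np.abs(idx[:, None] - idx[None, :])]
        return np.log((a_test @ R @ a_test) / (a_ref @ R @ a_ref))

    # Usage: distance between a frame and a mildly distorted copy (synthetic data)
    rng = np.random.default_rng(0)
    x = rng.standard_normal(240)             # "reference" frame
    y = x + 0.1 * rng.standard_normal(240)   # perturbed copy
    print(llr_distance(x, y, order=10))      # small, near-zero distance expected

Because a_ref minimizes the residual energy on its own frame among monic filters of that order, the ratio inside the logarithm is at least 1, so the distance is nonnegative and zero only when the two LPC spectra match.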
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Gibson, J.D.; Mahadevan, P. Log Likelihood Spectral Distance, Entropy Rate Power, and Mutual Information with Applications to Speech Coding. Entropy 2017, 19, 496.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
