Entropy 2012, 14(3), 480-490; doi:10.3390/e14030480
Article

Interval Entropy and Informative Distance

Fakhroddin Misagh 1,* and Gholamhossein Yari 2
Received: 20 December 2011; in revised form: 4 February 2012 / Accepted: 7 February 2012 / Published: 2 March 2012
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
Abstract: The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for doubly truncated random variables. In this paper, we show that the interval entropy uniquely determines the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on Kullback-Leibler discrimination information. We study various properties of this measure, including its connection with the residual and past measures of discrepancy and with interval entropy, and we obtain its upper and lower bounds.
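For orientation, the quantities named in the abstract can be sketched as follows; these are the standard definitions from the doubly truncated reliability literature, with the notation (t_1, t_2), f, F, g, G assumed here rather than quoted from the paper itself. The interval entropy of a lifetime X given t_1 < X < t_2, and the corresponding Kullback-Leibler discrepancy between lifetimes X and Y on the same interval, take the form

% Presumed definitions (assumed notation, not quoted from the paper):
% f, F are the density and distribution function of X; g, G those of Y.
\[
  H(X; t_1, t_2)
  = - \int_{t_1}^{t_2}
      \frac{f(x)}{F(t_2) - F(t_1)}
      \log \frac{f(x)}{F(t_2) - F(t_1)} \, dx ,
\]
\[
  D(X, Y; t_1, t_2)
  = \int_{t_1}^{t_2}
      \frac{f(x)}{F(t_2) - F(t_1)}
      \log \frac{f(x) / \bigl( F(t_2) - F(t_1) \bigr)}
                {g(x) / \bigl( G(t_2) - G(t_1) \bigr)} \, dx .
\]

Under these definitions, letting t_1 = 0 (respectively t_2 tend to infinity) recovers the past (respectively residual) versions of the measures that the abstract mentions.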
Keywords: uncertainty; discrepancy; characterization
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Misagh, F.; Yari, G. Interval Entropy and Informative Distance. Entropy 2012, 14, 480-490.

AMA Style

Misagh F, Yari G. Interval Entropy and Informative Distance. Entropy. 2012; 14(3):480-490.

Chicago/Turabian Style

Misagh, Fakhroddin, and Gholamhossein Yari. 2012. "Interval Entropy and Informative Distance." Entropy 14, no. 3: 480-490.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland