A Utility-Based Approach to Some Information Measures
Abstract: We review a decision-theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem – probability estimation – in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in a sense of robust absolute performance.
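As a quick reference for the baseline quantities the paper generalizes, here is a minimal sketch of Shannon entropy and Kullback-Leibler relative entropy for discrete distributions (function names are illustrative, not from the paper; it assumes `q` is strictly positive wherever `p` is):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), natural log."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL relative entropy D(p||q) = sum_i p_i * log(p_i / q_i).

    Assumes q_i > 0 whenever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(entropy(p))           # ln 2, the maximum for a two-point distribution
print(kl_divergence(p, q))  # nonnegative; zero iff p == q
```

The utility-based quantities studied in the paper reduce to these when the investor's utility is logarithmic; the code above is only the classical special case.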
Friedman, C.; Huang, J.; Sandow, S. A Utility-Based Approach to Some Information Measures. Entropy 2007, 9, 1-26.