# Information Loss in Binomial Data Due to Data Compression


## Abstract


## 1. Introduction

## 2. Information Decomposition under Data Compression

## 3. Entropy Decomposition under Data Compression

## 4. Data Compression and Information Loss

## 5. Discussion

## Acknowledgments

## Author Contributions

## Conflicts of Interest


© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Hodge, S.E.; Vieland, V.J.
Information Loss in Binomial Data Due to Data Compression. *Entropy* **2017**, *19*, 75.
https://doi.org/10.3390/e19020075

**AMA Style**

Hodge SE, Vieland VJ.
Information Loss in Binomial Data Due to Data Compression. *Entropy*. 2017; 19(2):75.
https://doi.org/10.3390/e19020075

**Chicago/Turabian Style**

Hodge, Susan E., and Veronica J. Vieland.
2017. "Information Loss in Binomial Data Due to Data Compression." *Entropy* 19, no. 2: 75.
https://doi.org/10.3390/e19020075