# The Entropy of a Discrete Real Variable

## Abstract


## 1. Introduction

## 2. Derivation of η

## 3. General Characteristics of η

… ${\kappa}_{0}$ defined over ${\mathcal{K}}_{0}$. Furthermore, $\mathrm{log}({\kappa}_{0})$ is the maximal differential Shannon entropy that could be associated with ${\mathcal{K}}_{0}$. Equation (48) therefore implies … $\eta ({\mathbf{p}}_{e},\mathbf{x})$ measures the intrinsic indeterminacy in the spatial configuration of $\mathbf{x}$. It is therefore fitting, as explicitly evident in Equation (30), that τ is sensitive to each of the $n(n-1)/2$ uniquely defined positive intervals ${x}_{l}-{x}_{j}$ within a given $\mathbf{x}$.
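The interval count can be made concrete with a short sketch. The helper `positive_intervals` below is hypothetical (it is not part of the paper); it simply enumerates, for a strictly increasing one-dimensional $\mathbf{x}$ with n terms, the $n(n-1)/2$ positive intervals ${x}_{l}-{x}_{j}$ with $l>j$:

```python
from itertools import combinations

def positive_intervals(x):
    """Enumerate the n(n-1)/2 positive intervals x_l - x_j (l > j)
    of a strictly increasing one-dimensional sequence x."""
    return [x_l - x_j for x_j, x_l in combinations(x, 2)]

x = [1.0, 2.5, 4.0, 7.0]        # n = 4 illustrative terms
gaps = positive_intervals(x)     # n(n-1)/2 = 6 intervals
print(len(gaps), sorted(gaps))   # 6 intervals, all positive
```

Note that "uniquely defined" refers to the index pairs $(j,l)$: two different pairs may still produce the same numerical interval, as happens twice in this example.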

## 4. Quantitative Study

… ${10}^{5}\lfloor {\kappa}_{0}/{\kappa}_{*}\rfloor$ samples, where ${\kappa}_{*}=2\pi /({x}_{n}-{x}_{1})$ represents the finest periodicity in τ. Note that ${x}_{n}-{x}_{1}$ is always the largest positive interval within a one-dimensional $\mathbf{x}$. All computations were precise to 15 significant figures. The accuracy, in comparison to the “true” value of η produced by an infinite number of samples, is always better than ${10}^{-6}\eta$.
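The sampling resolution above can be sketched as follows. The value of ${\kappa}_{0}$ and the terms of $\mathbf{x}$ here are purely illustrative assumptions, and the helper names are hypothetical; only the two formulas, ${\kappa}_{*}=2\pi/({x}_{n}-{x}_{1})$ and the sample count ${10}^{5}\lfloor {\kappa}_{0}/{\kappa}_{*}\rfloor$, come from the text:

```python
import math

def finest_periodicity(x):
    """kappa_* = 2*pi / (x_n - x_1): the finest periodicity in tau,
    set by the largest positive interval x_n - x_1."""
    return 2.0 * math.pi / (x[-1] - x[0])

def num_samples(kappa_0, x):
    """Sample count 10^5 * floor(kappa_0 / kappa_*) used in the
    numerical evaluation (kappa_0 is the cutoff defining K_0)."""
    return 10**5 * math.floor(kappa_0 / finest_periodicity(x))

x = [1.0, 2.0, 3.5, 10.0]          # illustrative one-dimensional x
k_star = finest_periodicity(x)      # 2*pi / 9
print(k_star, num_samples(100.0, x))
```

Because ${\kappa}_{*}$ shrinks as the span ${x}_{n}-{x}_{1}$ grows, widely spread configurations demand proportionally more samples to resolve the finest oscillation in τ.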

Figure 1 shows $\eta ({\mathbf{p}}_{e},{\mathbf{x}}_{e})$, $\eta ({\mathbf{p}}_{a},{\mathbf{x}}_{e})$ and $\eta ({\mathbf{p}}_{r},{\mathbf{x}}_{e})$ plotted as functions of n. Recall that $\eta ({\mathbf{p}}_{r},{\mathbf{x}}_{e})$ and $\eta ({\mathbf{p}}_{a},{\mathbf{x}}_{e})$ should behave respectively as quantities of entropy attributed to the variables ${X}_{re}$ and ${X}_{ae}$ examined in Section 1. We find that $\eta ({\mathbf{p}}_{r},{\mathbf{x}}_{e})$ is appreciably greater than $\eta ({\mathbf{p}}_{a},{\mathbf{x}}_{e})$ for all but trivially small n, which is consistent with the expected result. Because the only difference between each ${\mathbf{p}}_{a}$ and ${\mathbf{p}}_{r}$ is the order of their respective terms, the differences between the associated quantities of entropy are entirely due to the contribution from Ξ.

**Figure 1.** Plots of $\eta ({\mathbf{p}}_{e},{\mathbf{x}}_{e})$, $\eta ({\mathbf{p}}_{a},{\mathbf{x}}_{e})$ and $\eta ({\mathbf{p}}_{r},{\mathbf{x}}_{e})$ using the respective symbols “•”, “◃” and “+”.

**Figure 2.** Plots of $\eta ({\mathbf{p}}_{e},{\mathbf{x}}_{r})$, $\eta ({\mathbf{p}}_{a},{\mathbf{x}}_{r})$ and $\eta ({\mathbf{p}}_{r},{\mathbf{x}}_{r})$, using the respective symbols “×”, “⋄” and “*”. The solid line follows a plot of $\mathrm{log}(n)$ for reference.

The terms of ${\mathbf{x}}_{p}$ consist of the first n prime numbers. It follows from the Prime Number Theorem (PNT) that the local average gap in the vicinity of a given ${x}_{j}\in {\mathbf{x}}_{p}$ varies as $\mathrm{log}(j)$ for sufficiently large j. The term gap is used throughout in reference to a positive interval between consecutive terms in any given one-dimensional $\mathbf{x}$ and is defined such that the j-th gap is ${x}_{j+1}-{x}_{j}$.
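The PNT heuristic for the gaps is easy to check numerically. The sketch below is not from the paper: `first_n_primes` is a hypothetical helper using plain trial division, and the window size and index are arbitrary choices. It compares a local average of consecutive prime gaps against $\mathrm{log}({x}_{j})$, which agrees with $\mathrm{log}(j)$ up to lower-order terms for large j:

```python
import math

def first_n_primes(n):
    """Return the first n primes by trial division (adequate for small n)."""
    primes = []
    candidate = 2
    while len(primes) < n:
        is_prime = True
        for p in primes:
            if p * p > candidate:
                break          # no divisor up to sqrt(candidate): prime
            if candidate % p == 0:
                is_prime = False
                break
        if is_prime:
            primes.append(candidate)
        candidate += 1
    return primes

x_p = first_n_primes(2000)
gaps = [x_p[j + 1] - x_p[j] for j in range(len(x_p) - 1)]

# Local average gap near x_j versus the PNT prediction log(x_j)
j = 1500
window = gaps[j - 50 : j + 50]
print(sum(window) / len(window), math.log(x_p[j]))
```

The two printed values track each other closely, consistent with the local average gap growing logarithmically.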

**Figure 4.** Plots of $\eta ({\mathbf{p}}_{e},{\mathbf{x}}_{p})$, $\eta ({\mathbf{p}}_{e},{\acute{\mathbf{x}}}_{p})$, $\eta ({\mathbf{p}}_{e},{\breve{\mathbf{x}}}_{p})$ and $\eta ({\mathbf{p}}_{e},{\tilde{\mathbf{x}}}_{p})$, shown with “•”, “◃”, “⋄” and “*”, respectively, measuring intrinsic structure in the primes ${\mathbf{x}}_{p}$ and in three variants constructed by changing the order of the first $n-2$ even prime gaps.

## 5. Summary and Conclusions

## Acknowledgments

## References

- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. **1948**, 27, 379.
- Bialynicki-Birula, I.; Mycielski, J. Uncertainty relations for information entropy in wave mechanics. Commun. Math. Phys. **1975**, 44, 129–132.
- Gadre, S.R.; Bendale, R.D. Rigorous relationships among quantum-mechanical kinetic energy and atomic information entropies: Upper and lower bounds. Phys. Rev. A **1987**, 36, 1932–1935.
- Lalazissis, G.A.; Massen, S.E.; Panos, C.P.; Dimitrova, S.S. Information entropy as a measure of the quality of a nuclear density distribution. Int. J. Mod. Phys. E **1998**, 7, 485–494.
- Guevara, N.L.; Sagar, R.P.; Esquivel, R.O. Shannon-information entropy sum as a correlation measure in atomic systems. Phys. Rev. A **2003**, 67, 012507.
- Massen, S.E. Application of information entropy to nuclei. Phys. Rev. C **2003**, 67, 014314.
- Shi, Q.; Kais, S. Discontinuity of Shannon information entropy for two-electron atoms. J. Chem. Phys. **2004**, 309, 127–131.
- Bialynicki-Birula, I.; Rudnicki, L. Entropic uncertainty relations in quantum physics. In Statistical Complexity; Sen, K.D., Ed.; Springer: Berlin/Heidelberg, Germany, 2007; pp. 1–34.
- Ghosh, S.K.; Berkowitz, M.; Parr, R.G. Transcription of ground-state density-functional theory into a local thermodynamics. Proc. Natl. Acad. Sci. USA **1984**, 81, 8028.
- Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. **1951**, 22, 79–86.
- Feynman, R.P. Theory of Fundamental Processes; W. A. Benjamin, Inc.: New York, NY, USA, 1962; pp. 1–6.
- Cartwright, C.; Funkhouser, S.; Sengupta, D.; Williams, B. Periodicity in the intervals between primes. Am. J. Comput. Math. **2012**, submitted for publication.
- Chiribella, G.; D’Ariano, G.M.; Perinotti, P. Informational derivation of quantum theory. Phys. Rev. A **2011**, 84, 012311.

© 2012 by the author; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

## Share and Cite

**MDPI and ACS Style**

Funkhouser, S.
The Entropy of a Discrete Real Variable. *Entropy* **2012**, *14*, 1522-1538.
https://doi.org/10.3390/e14081522
