Entropy 2009, 11(4), 949-958; https://doi.org/10.3390/e11040949

Article
Calculation of Entropy for a Sinusoid with Beta-Distributed Phase
1 U.S. Naval Research Laboratory, Optical Science Division, Washington DC 20375, USA
2 NRC Postdoctoral Research Associate, Naval Research Laboratory, Optical Science Division, Washington DC 20375, USA
* Author to whom correspondence should be addressed.
# Permanent Address: Global Strategies Group, Crofton, MD 21114, USA.
Received: 17 September 2009 / Accepted: 13 November 2009 / Published: 2 December 2009

Abstract

In this paper, an analytical expression is developed for the differential entropy of a sinusoid with a Beta-distributed phase angle. This particular signal model is prevalent in optical communications; however, an expression for the associated differential entropy does not currently exist. The expression we derive is approximate, as it relies on a series expansion for one of the key terms needed in the derivation. However, we are able to show that the approximation is accurate (error < 5%) for a wide variety of Beta parameter choices.
Keywords:
differential entropy; sine wave; Beta distribution; phase noise

1. Introduction

The concept of entropy as a measure of information was introduced by Shannon [1] and has since been discussed in many books (e.g., see [2], [3]). Entropy has played a central role in the field of communications, providing limits on both data compression and channel capacity [2]. Entropy-based signal processing techniques and analysis have also enjoyed a great deal of success in a diverse set of applications, ranging from ecological system monitoring [4] to crystallography [5]. Successful application of entropy-based approaches is often predicated on having analytical expressions for the entropy of a given signal model, particularly in the communications field (e.g., see [6]). For some probability distributions, expressions for differential entropy are well known (e.g., see Table 17.1 in reference [2]), while for others such expressions have not yet been derived.
This work derives the differential entropy for harmonic signals with non-uniform (Beta-distributed) phase. In a number of applications the signal under study is harmonic, and the primary uncertainty mechanism giving rise to the probability distribution is the phase noise, which can be distributed non-uniformly. Optical communications strategies, for example, often involve the phase modulation of a carrier signal and subsequent demodulation at the receiver. The measured photocurrent at the receiver takes the general form [7]
\[ I(t) = I_1\left[1 - \cos\!\left(\Delta\theta(t) + \Delta\phi(t)\right)\right] \tag{1} \]
where $I_1$ is a scalar amplitude, $\Delta\theta(t)$ contains the desired information, and $\Delta\phi(t)$ represents the noise. In differential phase shift keying, the probability density function associated with $\Delta\phi(t)$ has been modeled both as a Gaussian and as a uni-modal (but not Gaussian) distribution [7], [8]. Specifically, Ho [9] found the distribution to be a convolution of a Gaussian and a non-central chi-squared distribution with two degrees of freedom.
An additional situation in which the signal model (1) appears is in interferometric measurement or detection systems [10]. In these applications the goal is the same as in communications: to recover the parameter $\Delta\theta$ given phase noise $\Delta\phi$. The work of Arie et al. considered a Gaussian phase model [10], while Freitas [11] considered a non-Gaussian phase-noise model. However, a calculation of the differential entropy for the case of a sinusoid with a non-uniform phase distribution does not appear in the literature.
This paper therefore derives the differential entropy of a sine wave whose phase angle is Beta-distributed. The Beta distribution is an extremely flexible distribution with finite support and can be made, via appropriate choice of parameters, to approximate a number of other well-known distributions; the uniform distribution, for example, is a special case. It will be shown, in fact, that for an appropriate choice of parameters the differential entropy for the Beta-distributed phase case reduces to the well-known expression for the differential entropy associated with a uniformly distributed phase.

2. Mathematical Development and Results

The following notation will be utilized. The differential entropy for a continuous random variable, X, with probability density function p ( x ) is given by
\[ h(X) = -\int_S p(x)\,\log[p(x)]\,dx \]
where $S = \{x \mid p(x) > 0\}$ is the support set of X. Here $\log$ means $\log_2$ unless otherwise noted; when the logarithm is taken to base $e$, the notation $h_e(X)$ is used. Consider the sinusoidal signal written as:
\[ Y = A\sin\theta \]
where $\theta$ is uniformly distributed on $[-\pi, \pi]$ (a similar discussion applies if $Y = A\sin(\omega\theta)$ or $Y = A\sin(\theta + \phi)$, where $\omega$ and $\phi$ are constants). It is well known that this signal has a probability density function given by
\[ p(y) = \frac{1}{\pi\sqrt{A^2 - y^2}}, \qquad -A < y < A \]
and possesses zero mean and variance $\sigma^2 = A^2/2$ [12]. For this distribution the associated differential entropy can be computed by introducing the transformation $z = \frac{y}{2A} + \frac12$, which yields the probability density function
\[ p(z) = \frac{1}{\pi\sqrt{z(1-z)}} = \frac{\Gamma(1)}{\Gamma\!\left(\frac12\right)\Gamma\!\left(\frac12\right)}\, z^{-\frac12}(1-z)^{-\frac12} \tag{4} \]
where Γ ( · ) is the Gamma function. Equation (4) defines the Beta distribution, with differential entropy given by
\[ h_e(Z) = \ln B\!\left(\tfrac12,\tfrac12\right) + \Psi\!\left(\tfrac12\right) - \Psi(1) = \ln(\pi) - \gamma - 2\ln(2) + \gamma = \ln\frac{\pi}{4} \]
where $B(\cdot,\cdot)$ is the Beta function and $\Psi(\cdot)$ is the Digamma function (see Cover & Thomas [2], p. 486), so
\[ h(Z) = \log_2\frac{\pi}{4} \ \text{bits}. \]
Consequently
\[ h(Y) = h(2AZ - A) = h(2AZ) = h(Z) + \log_2|2A| = \log_2\frac{\pi A}{2} \tag{7} \]
Note that $h(Y)$ can be positive or negative, depending on the value of A. This is a well-known result (e.g., see [6]); indeed, the entropy of an arbitrary Beta distribution is known [13]. Next it will be shown that the result given by Equation (7) is a special case of the differential entropy of a sine wave with a Beta-distributed phase angle.
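Both the value $\ln(\pi/4)$ and the amplitude dependence of $h(Y)$ are easy to confirm numerically. The sketch below (illustrative code, not from the paper) integrates the arcsine entropy via the substitution $z = \sin^2 t$, which tames the endpoint singularities of the density, and then evaluates Equation (7) for two amplitudes:

```python
import math

# Check h_e(Z) = ln(pi/4) for Z ~ Beta(1/2, 1/2): substituting z = sin^2(t)
# turns -integral p(z) ln p(z) dz into (2/pi) * int_0^{pi/2} ln(pi sin t cos t) dt.
n = 100_000
dt = (math.pi / 2) / n
h_e_Z = 0.0
for i in range(n):
    t = (i + 0.5) * dt  # midpoint rule
    h_e_Z += (2.0 / math.pi) * math.log(math.pi * math.sin(t) * math.cos(t)) * dt

# Equation (7): h(Y) = log2(pi*A/2) bits, whose sign depends on A.
h_bits = lambda A: math.log2(math.pi * A / 2)
h_large, h_small = h_bits(2.0), h_bits(0.5)
```

The quadrature value should approach $\ln(\pi/4) \approx -0.2416$ nats, while the two amplitudes bracket the sign change of $h(Y)$.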
The probability density function for a Beta-distributed random variable on [0,1] is expressed as:
\[ f(x) = \frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)}\, x^{\gamma-1}(1-x)^{\eta-1}, \qquad 0 \le x \le 1 \]
where $\eta > 0$, $\gamma > 0$.
The transformation $\theta = \pi\left(x - \frac12\right)$ provides
\[ f(\theta) = \frac{1}{\pi}\,\frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)} \left(\frac12 + \frac{\theta}{\pi}\right)^{\gamma-1} \left(\frac12 - \frac{\theta}{\pi}\right)^{\eta-1}, \qquad -\frac{\pi}{2} \le \theta \le \frac{\pi}{2} \]
as the Beta distribution on the interval $[-\frac{\pi}{2}, \frac{\pi}{2}]$. Consequently, the sine wave $Z = A\sin\theta$, where $\theta$ is Beta-distributed on $[-\frac{\pi}{2}, \frac{\pi}{2}]$, has probability density function:
\[ p(z) = \frac{1}{\pi}\,\frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)}\, \frac{\left(\frac12 + \frac{\theta}{\pi}\right)^{\gamma-1}\left(\frac12 - \frac{\theta}{\pi}\right)^{\eta-1}}{A\cos\theta} = \frac{1}{\pi}\,\frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)}\, \frac{\left(\frac12 + \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\gamma-1}\left(\frac12 - \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\eta-1}}{\sqrt{A^2 - z^2}}, \qquad -A < z < A \]
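As a quick sanity check on this density (an illustrative sketch, not part of the paper's derivation), the substitution $z = A\sin w$ reduces the normalization of $p(z)$ to integrating the Beta density of the phase over $[-\frac{\pi}{2}, \frac{\pi}{2}]$, avoiding the edge singularity at $z = \pm A$:

```python
import math

# Verify that p(z) integrates to 1 for arbitrary test parameters by
# substituting z = A sin(w), which cancels the 1/sqrt(A^2 - z^2) factor.
eta, gam, A = 3.0, 1.5, 2.0  # arbitrary illustrative choices
B = math.exp(math.lgamma(eta) + math.lgamma(gam) - math.lgamma(eta + gam))
n = 100_000
dw = math.pi / n
total = 0.0
for i in range(n):
    w = -math.pi / 2 + (i + 0.5) * dw  # midpoint rule over [-pi/2, pi/2]
    total += (1.0 / (math.pi * B)) * (0.5 + w / math.pi) ** (gam - 1) \
             * (0.5 - w / math.pi) ** (eta - 1) * dw
# total should be very close to 1
```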
The differential entropy for Z, in nats, is calculated as follows:
\[ h_e(Z) = -\int_{-A}^{A} \frac{1}{\pi}\,\frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)}\, \frac{\left(\frac12 + \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\gamma-1}\left(\frac12 - \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\eta-1}}{\sqrt{A^2 - z^2}}\; \ln\!\left[\frac{1}{\pi}\,\frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)}\, \frac{\left(\frac12 + \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\gamma-1}\left(\frac12 - \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\eta-1}}{\sqrt{A^2 - z^2}}\right] dz \]
Letting $w = \sin^{-1}\frac{z}{A}$, so that $z = A\sin w$ and $dz = A\cos w\,dw$, gives:
\[ h_e(Z) = -\int_{-\pi/2}^{\pi/2} \frac{1}{\pi}\,\frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)}\, \frac{\left(\frac12 + \frac{w}{\pi}\right)^{\gamma-1}\left(\frac12 - \frac{w}{\pi}\right)^{\eta-1}}{A\cos w}\, \left[\ln\!\left(\frac{1}{\pi}\,\frac{\Gamma(\eta+\gamma)}{\Gamma(\eta)\Gamma(\gamma)}\right) + (\gamma-1)\ln\!\left(\frac12 + \frac{w}{\pi}\right) + (\eta-1)\ln\!\left(\frac12 - \frac{w}{\pi}\right) - \ln(A\cos w)\right] A\cos w\,dw \]
so
\[ h_e(Z) = \ln[\pi B(\eta,\gamma)] - \frac{1}{\pi B(\eta,\gamma)}\Bigg[ (\gamma-1)\int_{-\pi/2}^{\pi/2}\left(\frac12+\frac{w}{\pi}\right)^{\gamma-1}\left(\frac12-\frac{w}{\pi}\right)^{\eta-1}\ln\!\left(\frac12+\frac{w}{\pi}\right)dw + (\eta-1)\int_{-\pi/2}^{\pi/2}\left(\frac12+\frac{w}{\pi}\right)^{\gamma-1}\left(\frac12-\frac{w}{\pi}\right)^{\eta-1}\ln\!\left(\frac12-\frac{w}{\pi}\right)dw - \int_{-\pi/2}^{\pi/2}\left(\frac12+\frac{w}{\pi}\right)^{\gamma-1}\left(\frac12-\frac{w}{\pi}\right)^{\eta-1}\ln(A\cos w)\,dw \Bigg] \tag{13} \]
where we have made the substitution for the Beta function $B(\eta,\gamma) = \frac{\Gamma(\eta)\Gamma(\gamma)}{\Gamma(\eta+\gamma)}$.
Setting $u = \frac12 + \frac{w}{\pi}$ in the first integral of Equation (13) leads to:
\[ \pi\int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\ln u\,du \]
which is equal to
\[ \pi B(\eta,\gamma)\left[\Psi(\gamma) - \Psi(\gamma+\eta)\right] \]
from formula 4.253(1) of Gradshteyn and Ryzhik [14]. Similarly, setting $v = \frac12 - \frac{w}{\pi}$ in the second integral gives:
\[ \pi B(\eta,\gamma)\left[\Psi(\eta) - \Psi(\gamma+\eta)\right] \]
Finally, setting $u = \frac12 + \frac{w}{\pi}$ in the third integral and noting that $\sin\pi u = \sin\!\left(w + \frac{\pi}{2}\right) = \cos w$, we obtain
\[
\begin{aligned}
\int_{-\pi/2}^{\pi/2}\left(\frac12+\frac{w}{\pi}\right)^{\gamma-1}\left(\frac12-\frac{w}{\pi}\right)^{\eta-1}\ln(A\cos w)\,dw
&= \pi\int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\ln(A\sin\pi u)\,du \\
&= \pi\ln A\int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\,du + \pi\int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\ln(\sin\pi u)\,du \\
&= \pi B(\eta,\gamma)\ln A + \pi\int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\ln(\sin\pi u)\,du
\end{aligned}
\]
Collecting terms gives
\[ h_e(Z) = \ln[\pi A\,B(\eta,\gamma)] - (\gamma-1)\Psi(\gamma) - (\eta-1)\Psi(\eta) + (\eta+\gamma-2)\Psi(\gamma+\eta) + \frac{1}{B(\eta,\gamma)}\int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\ln(\sin\pi u)\,du \tag{17} \]
The last term is the average of the function $\ln(\sin\pi u)$ over the Beta distribution. Unfortunately, there does not appear to be an analytic solution for this integral.
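Equation (17) is nevertheless straightforward to evaluate once the remaining integral is computed numerically. The sketch below (illustrative code; `h_e_sine_beta_phase` is a hypothetical helper restricted to integer parameters so that the digamma values reduce to harmonic numbers) reproduces the $\eta = \gamma = 2$ entry of Table 2:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def digamma_int(n):
    """Psi(n) = -gamma + H_{n-1}, valid for positive integers only."""
    return -EULER_GAMMA + sum(1.0 / k for k in range(1, n))

def beta_fn(a, b):
    """Beta function via log-gamma for numerical stability."""
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

def h_e_sine_beta_phase(eta, gam, A=1.0, n=200_000):
    """Differential entropy (nats) of Z = A sin(theta), theta Beta-distributed
    on [-pi/2, pi/2]; the ln(sin pi u) term is integrated by midpoint rule."""
    du = 1.0 / n
    integral = 0.0
    for i in range(n):
        u = (i + 0.5) * du
        integral += (u ** (gam - 1)) * ((1 - u) ** (eta - 1)) \
                    * math.log(math.sin(math.pi * u)) * du
    B = beta_fn(eta, gam)
    return (math.log(math.pi * A * B)
            - (gam - 1) * digamma_int(gam)
            - (eta - 1) * digamma_int(eta)
            + (eta + gam - 2) * digamma_int(eta + gam)
            + integral / B)

h_22 = h_e_sine_beta_phase(2, 2)  # Table 2 lists 0.6919 nats for this case
```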
A variety of Beta distributions (taken from Hahn and Shapiro [15]) are shown in Figure 1, along with the corresponding differential entropy, h, for a sine wave of amplitude A = 1. These entropies are expressed in bits, so the values of $h_e$ obtained from Equation (17) are divided by $\ln 2$. The last term in (17) is calculated by standard numerical integration techniques.
Figure 1. Beta distributions with different parameter values and the associated entropy values, h, (in bits).
We now derive an analytic approximation for the integral term in Equation (17) which is valid when $\eta \ge 1$ and $\gamma \ge 1$. This technique is based on the following integral found in Gradshteyn and Ryzhik ([14], formula 3.768(11)):
\[ \int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\sin(au)\,du = -\frac{i}{2}\,B(\eta,\gamma)\left[{}_1F_1(\gamma;\gamma+\eta;ia) - {}_1F_1(\gamma;\gamma+\eta;-ia)\right] \tag{18} \]
and its companion (formula 3.768(12)):
\[ \int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\cos(au)\,du = \frac12\,B(\eta,\gamma)\left[{}_1F_1(\gamma;\gamma+\eta;ia) + {}_1F_1(\gamma;\gamma+\eta;-ia)\right] \tag{19} \]
both expressed in terms of generalized hypergeometric series. In fact, ${}_1F_1$ is defined by:
\[ {}_1F_1(\alpha;\beta;\zeta) = \sum_{n=0}^{\infty}\frac{(\alpha)_n}{(\beta)_n}\,\frac{\zeta^n}{n!} \]
where $(\alpha)_n$ denotes the product $\alpha(\alpha+1)(\alpha+2)\cdots(\alpha+n-1)$ for $n > 0$ and $(\alpha)_0 = 1$; $(\beta)_n$ is similarly defined. Note that Mathematica also gives expressions for the integrals in (18) and (19); these formulas are in terms of the ${}_2F_3$ generalized hypergeometric series and, with a bit of manipulation, can be shown to be equivalent to the expressions above.
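A truncated version of this series is easy to implement and can be sanity-checked against the identity ${}_1F_1(\alpha;\alpha;\zeta) = e^{\zeta}$, which holds because the Pochhammer symbols cancel (illustrative sketch; the 80-term cutoff is an arbitrary choice adequate for moderate $|\zeta|$):

```python
import math

def hyp1f1(alpha, beta, z, terms=80):
    """Truncated series for 1F1(alpha; beta; z): sum of (alpha)_n/(beta)_n z^n/n!."""
    total = 0.0
    term = 1.0  # n = 0 term; (alpha)_0 = (beta)_0 = 1
    for n in range(terms):
        total += term
        # Build the (n+1)-th term from the n-th via the Pochhammer ratios.
        term *= (alpha + n) / (beta + n) * z / (n + 1)
    return total

# When alpha = beta, the series reduces to the exponential series.
val = hyp1f1(1.0, 1.0, 1.5)
```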
We make use of the power series expansion:
\[ \ln(1-x) = -\left(x + \frac{x^2}{2} + \frac{x^3}{3} + \frac{x^4}{4} + \cdots\right) \]
which converges for $|x| < 1$, and apply it to $x = 1 - \sin\pi u$, since $|1 - \sin\pi u| < 1$ for $0 < u < 1$. This gives
\[ \ln(\sin\pi u) = -\left[(1-\sin\pi u) + \frac{(1-\sin\pi u)^2}{2} + \frac{(1-\sin\pi u)^3}{3} + \frac{(1-\sin\pi u)^4}{4} + \cdots\right] \]
We wish to choose N such that the first N terms of this series approximate $\ln(\sin\pi u)$ closely enough for purposes of calculating the integral in Equation (17). We have found that this approximation is valid when $\eta \ge 1$ and $\gamma \ge 1$. If either $\eta$ or $\gamma$ is less than unity, then the contribution to the integral near the endpoints at $u = 0$ and $u = 1$ (where $1 - \sin\pi u$ approaches unity and the truncated series converges slowly) will exceed the precision of the expansion.
Accordingly,
\[ \ln(\sin\pi u) \approx -\Bigg[(1-\sin\pi u) + \frac12\left(1 - 2\sin\pi u + \sin^2\pi u\right) + \frac13\left(1 - 3\sin\pi u + 3\sin^2\pi u - \sin^3\pi u\right) + \cdots + \frac{1}{N}\sum_{i=0}^{N}\binom{N}{i}(-1)^i(\sin\pi u)^i\Bigg] \]
Collecting the coefficients of like powers of sin π u , we can write ln ( sin π u ) as
\[ \ln(\sin\pi u) \approx -\sum_{j=0}^{N} d_j(N)\,(\sin\pi u)^j \tag{21} \]
where
\[ d_0(N) = \sum_{k=1}^{N}\frac{1}{k} \]
and
\[ d_j(N) = (-1)^j\sum_{k=j}^{N}\frac{1}{k}\binom{k}{j}, \qquad j = 1, 2, \ldots, N. \]
This is an unusual approximation, since the coefficients clearly diverge as $N \to \infty$. However, the coefficients have alternating signs, and our calculations show that the approximation improves with increasing N up to about $N = 50$, at which point we begin losing numerical precision. For example, consider the approximation obtained with $N = 24$ terms. In this case the coefficients are given by the values shown in Table 1.
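The coefficients are simple to generate directly from their definition; the sketch below (illustrative, not from the paper) reproduces several Table 1 entries for $N = 24$ and checks the truncated expansion against $\ln(\sin\pi u)$ at an interior point:

```python
import math
from math import comb

def d_coeffs(N):
    """d_0(N) = H_N; d_j(N) = (-1)^j * sum_{k=j}^{N} C(k, j)/k for j >= 1."""
    d = [sum(1.0 / k for k in range(1, N + 1))]
    for j in range(1, N + 1):
        d.append((-1) ** j * sum(comb(k, j) / k for k in range(j, N + 1)))
    return d

d = d_coeffs(24)
# d[0] = H_24 ~ 3.776, d[1] = -24, d[2] = 138, d[24] = 1/24 ~ 0.0417

# Truncated expansion -sum_j d_j s^j versus ln(sin(pi*u)) at u = 0.3:
s = math.sin(math.pi * 0.3)
approx = -sum(dj * s ** j for j, dj in enumerate(d))
```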
Table 1. Coefficients for the approximation of $\ln(\sin(\pi u))$ using N = 24 terms.

| j  | $d_j(N)$ | j  | $d_j(N)$ |
|----|----------|----|----------|
| 0  | 3.776    | 13 | -192011  |
| 1  | -24      | 14 | 140090   |
| 2  | 138      | 15 | -87167   |
| 3  | -674.67  | 16 | 45967    |
| 4  | 2656.5   | 17 | -20359   |
| 5  | -8500.8  | 18 | 7477.6   |
| 6  | 22433    | 19 | -2237.1  |
| 7  | -49443   | 20 | 531.30   |
| 8  | 91934    | 21 | -96.38   |
| 9  | -145278  | 22 | 12.55    |
| 10 | 196126   | 23 | -1.043   |
| 11 | -226922  | 24 | 0.0417   |
| 12 | 225346   |    |          |
We further expand Equation (21) by expressing powers of $\sin x$ as linear combinations of sines and cosines of multiples of x (see Gradshteyn and Ryzhik [14], pp. 25-26). In general, for any positive integer n,
\[ (\sin x)^{2n} = \frac{1}{2^{2n}}\left[\sum_{k=0}^{n-1}(-1)^{n-k}\,2\binom{2n}{k}\cos(2(n-k)x) + \binom{2n}{n}\right] \tag{22} \]
and
\[ (\sin x)^{2n-1} = \frac{1}{2^{2n-2}}\sum_{k=0}^{n-1}(-1)^{n+k-1}\binom{2n-1}{k}\sin((2n-2k-1)x) \tag{23} \]
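Both identities can be spot-checked numerically; the sketch below (illustrative code) verifies the $n = 2$ cases, $\sin^4 x$ and $\sin^3 x$, at an arbitrary point:

```python
import math

x, n = 0.7, 2  # arbitrary evaluation point; n = 2 gives sin^4 and sin^3

# Even power: sin^{2n} x from the cosine multiple-angle combination.
even = (1 / 2 ** (2 * n)) * (
    sum((-1) ** (n - k) * 2 * math.comb(2 * n, k) * math.cos(2 * (n - k) * x)
        for k in range(n))
    + math.comb(2 * n, n)
)

# Odd power: sin^{2n-1} x from the sine multiple-angle combination.
odd = (1 / 2 ** (2 * n - 2)) * sum(
    (-1) ** (n + k - 1) * math.comb(2 * n - 1, k)
    * math.sin((2 * n - 2 * k - 1) * x)
    for k in range(n)
)
```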
Combining (21), (22), and (23) leads to the conclusion that $\ln(\sin\pi u)$ is (approximately) expressible as a linear combination of constant terms and terms of the form $\sin(m\pi u)$ and $\cos(m\pi u)$. Consequently, the integral in Equation (17):
\[ \int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\ln(\sin\pi u)\,du \]
can be written as a linear combination of integrals of the form:
\[ \int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\,du = B(\eta,\gamma) \]
\[ \int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\sin(m\pi u)\,du \]
\[ \int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\cos(m\pi u)\,du \]
which are given analytically by Equations (18) and (19).
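To illustrate the expansion route end-to-end, the sketch below approximates the integral term for $\eta = \gamma = 2$ with $N = 24$ by combining the coefficients $d_j(N)$ with the moments of $\sin^j(\pi u)$. For brevity the moments are computed here by quadrature rather than through the closed forms (18) and (19), so this is an illustrative check rather than the paper's fully analytic evaluation:

```python
import math
from math import comb

# Approximate (1/B) * integral of u^(gam-1)(1-u)^(eta-1) ln(sin pi u) du
# via ln(sin pi u) ~ -sum_j d_j(N) sin^j(pi u), for eta = gam = 2, N = 24.
eta, gam, N = 2, 2, 24

def d_coeffs(N):
    """Coefficients of the truncated ln(sin pi u) expansion (Eq. (21))."""
    d = [sum(1.0 / k for k in range(1, N + 1))]
    for j in range(1, N + 1):
        d.append((-1) ** j * sum(comb(k, j) / k for k in range(j, N + 1)))
    return d

B = math.exp(math.lgamma(eta) + math.lgamma(gam) - math.lgamma(eta + gam))
n = 20_000
du = 1.0 / n
moments = [0.0] * (N + 1)  # moments[j] = Beta-weighted average of sin^j(pi u)
for i in range(n):
    u = (i + 0.5) * du  # midpoint rule
    weight = (u ** (gam - 1)) * ((1 - u) ** (eta - 1)) / B * du
    s = math.sin(math.pi * u)
    power = 1.0
    for j in range(N + 1):
        moments[j] += weight * power
        power *= s

approx_integral = -sum(dj * mj for dj, mj in zip(d_coeffs(N), moments))
# Table 2 lists -0.3268 for the N = 24 approximation at eta = gam = 2
```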
Results for this analytic approximation are shown in Table 2 for two values of N. The last term of Equation (17) is shown, as well as the resulting value of the differential entropy $h_e(Z)$ in nats. Again, the amplitude of the sine wave is taken to be A = 1. Clearly the approximation is a good one, particularly for $N = 45$. We find that for a wide variety of Beta parameters the error in the approximation is less than 5%. For larger values of N, the quality of the results degrades; specifically, we find that values in the range $40 \le N \le 50$ give the best results.
Table 2. Accuracy of the Analytical Approximations of the Entropy for Various Values of $\eta \ge 1$ and $\gamma \ge 1$. The first three value columns give the integral term $\frac{1}{B(\eta,\gamma)}\int_0^1 u^{\gamma-1}(1-u)^{\eta-1}\ln(\sin\pi u)\,du$; the last three give the entropy $h_e(Z)$ in nats.
| $\eta$ | $\gamma$ | Integral (numerical) | Integral (N = 24) | Integral (N = 45) | $h_e(Z)$ (numerical) | $h_e(Z)$ (N = 24) | $h_e(Z)$ (N = 45) |
|-----|-----|---------|---------|---------|---------|---------|---------|
| 1   | 1   | -0.6931 | -0.6677 | -0.6793 | 0.4516  | 0.4771  | 0.4654  |
| 2   | 2   | -0.3278 | -0.3268 | -0.3274 | 0.6919  | 0.6928  | 0.6922  |
| 3   | 3   | -0.2141 | -0.2140 | -0.2141 | 0.6627  | 0.6628  | 0.6628  |
| 5   | 5   | -0.1264 | -0.1264 | -0.1264 | 0.5377  | 0.5377  | 0.5377  |
| 2   | 1   | -0.6931 | -0.6677 | -0.6793 | 0.2584  | 0.2839  | 0.2723  |
| 1.5 | 3   | -0.4970 | -0.4916 | -0.4948 | 0.3747  | 0.3801  | 0.3769  |
| 1.5 | 5   | -0.7485 | -0.7377 | -0.7443 | -0.1837 | -0.1729 | -0.1794 |
| 3   | 1.5 | -0.4970 | -0.4916 | -0.4948 | 0.3747  | 0.3801  | 0.3769  |
| 5   | 1.5 | -0.7485 | -0.7377 | -0.7443 | -0.1837 | -0.1729 | -0.1794 |
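The uniform-phase case ($\eta = \gamma = 1$) can in fact be written in closed form: the integral term reduces to the plain average of $\ln(\sin\pi u)$ over $(0,1)$, which equals $-\ln 2$, and the digamma terms vanish since their coefficients are zero. A minimal check (assuming A = 1):

```python
import math

# For eta = gamma = 1 the Beta weight is uniform, so the integral term is
#   int_0^1 ln(sin pi u) du = -ln 2,
# and with B(1,1) = 1 and all digamma coefficients zero, Equation (17) gives
#   h_e(Z) = ln(pi * A) - ln 2 = ln(pi/2) for A = 1.
integral_term = -math.log(2.0)
h_e_uniform = math.log(math.pi) + integral_term
# These match the first-row Table 2 entries: -0.6931 and 0.4516 nats
```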

3. Conclusions

This paper provides an analytical approximation for the differential entropy of a sine wave with a Beta-distributed phase angle. The results predicted by the expression are in good agreement with those obtained via numerical integration for Beta distribution parameters $\eta \ge 1$ and $\gamma \ge 1$. For all parameter combinations we have examined, the error in the approximation is less than 5% when a reasonable number of terms is used in the approximation ($N = 45$). The result for a uniformly distributed phase angle is also recovered as a special case of our more general result. The derived expression may prove useful in entropy-based calculations for signals in which a non-uniform phase distribution model is appropriate.

Acknowledgments

The authors would like to thank the Naval Research Laboratory for providing funding for this work. The authors would also like to thank an anonymous reviewer for providing us with an alternative derivation for the uniform distributed phase case that provided a clear connection to the Beta-distributed phase case.

References

  1. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423; ibidem 623–656.
  2. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley and Sons: Somerset, NJ, USA, 2006.
  3. Kullback, S. Information Theory and Statistics; Dover Publications: New York, NY, USA, 1968.
  4. Moniz, L.J.; Nichols, J.D.; Nichols, J.M. Mapping the information landscape: Discerning peaks and valleys for ecological monitoring. J. Biol. Phys. 2007, 33, 171–181.
  5. Bricogne, G. A Bayesian statistical theory of the phase problem. I. A multichannel maximum-entropy formalism for constructing generalized joint probability distributions of structure factors. Acta Crystallogr. A 1988, 44, 517–545.
  6. McDonnell, M.D.; Stocks, N.G.; Abbott, D. Optimal stimulus and noise distributions for information transmission via suprathreshold stochastic resonance. Phys. Rev. E 2007, 75, 1–13.
  7. Zhang, X.; Qu, Z.; Yang, G. Probability density function of noise statistics for optically pre-amplified DPSK receivers with optical Mach-Zehnder interferometer demodulation. Opt. Commun. 2006, 258, 177–183.
  8. Ho, K.-P. Impact of nonlinear phase noise to DPSK signals: A comparison of different models. IEEE Photonic. Technol. Lett. 2004, 16, 1403–1405.
  9. Ho, K.-P. Probability density of nonlinear phase noise. J. Opt. Soc. Amer. B 2003, 20, 1875–1879.
  10. Arie, A.; Tur, M.; Goldstein, E.L. Probability-density function of noise at the output of a two-beam interferometer. J. Opt. Soc. Amer. A 1991, 8, 1936–1942.
  11. Freitas, J.M.D. Probability density functions for intensity induced phase noise in CW phase demodulation systems. Meas. Sci. Technol. 2007, 18, 3592–3602.
  12. Damper, R.I. Introduction to Discrete-Time Signals and Systems; Chapman-Hall: London, UK, 1995.
  13. Ebrahimi, N.; Maasoumi, E.; Soofi, E.S. Ordering univariate distributions by entropy and variance. J. Econometrics 1999, 90, 317–336.
  14. Gradshteyn, I.S.; Ryzhik, I.M. Table of Integrals, Series and Products, 4th ed.; Academic Press: New York, NY, USA, 1965.
  15. Hahn, G.J.; Shapiro, S.S. Statistical Models in Engineering; John Wiley and Sons: New York, NY, USA, 1968.