Calculation of Entropy for a Sinusoid with Beta-Distributed Phase

In this paper, an analytical expression is developed for the differential entropy of a sinusoid with a Beta-distributed phase angle. This signal model is prevalent in optical communications; however, an expression for the associated differential entropy does not currently exist. The expression we derive is approximate, as it relies on a series expansion for one of the key terms in the derivation. However, we show that the approximation is accurate (error ≤ 5%) for a wide variety of Beta parameter choices.


Introduction
The concept of entropy as a measure of information was introduced by Shannon [1] and has since been discussed in many books (e.g., see [2], [3]). Entropy has played a central role in the field of communications, providing limits on both data compression and channel capacity [2]. Entropy-based signal processing techniques and analysis have also enjoyed a great deal of success in a diverse set of applications, ranging from ecological system monitoring [4] to crystallography [5]. Successful application of entropy-based approaches is often predicated on having analytical expressions for the entropy of a given signal model, particularly in the communications field (e.g., see [6]). For some probability distributions, expressions for differential entropy are well known (e.g., see Table 17.1 in reference [2]), while for others such expressions have not yet been derived.
This work derives the differential entropy for harmonic signals with non-uniform (Beta-distributed) phase. In a number of applications the signal under study is harmonic, and the primary uncertainty mechanism giving rise to the probability distribution is the phase noise, which can be distributed non-uniformly. Optical communications strategies, for example, often involve phase modulation of a carrier signal and subsequent demodulation at the receiver. The measured photocurrent at the receiver takes the general form [7]

$$i(t) = I_1 \cos\left(\Delta\theta(t) + \Delta\phi(t)\right), \qquad (1)$$

where $I_1$ is a scalar amplitude, $\Delta\theta(t)$ contains the desired information, and $\Delta\phi(t)$ represents the noise. In differential phase shift keying, the probability distribution function associated with $\Delta\phi(t)$ has been modeled both as a Gaussian and as a uni-modal (but not Gaussian) distribution [7], [8]. Specifically, Ho [9] found the distribution to be a convolution of a Gaussian and a non-central chi-squared distribution with two degrees of freedom. An additional situation in which the signal model (1) appears is in interferometric measurement or detection systems [10]. In these applications the goal is the same as in communications: to recover the parameter $\Delta\theta$ given phase noise $\Delta\phi$. The work of Arie et al. considered a Gaussian phase model [10], while Freitas [11] considered a non-Gaussian phase-noise model. Calculation of the differential entropy for the case of a sinusoid with a non-uniform phase distribution does not, however, appear in the literature.
This paper will therefore derive the differential entropy of a sine wave whose phase angle is Beta-distributed. The Beta distribution is extremely flexible: it possesses finite support and can be made, via appropriate choice of parameters, to approximate a number of other well-known distributions. The uniform distribution, for example, is a special case of the Beta distribution. It will be shown, in fact, that for an appropriate choice of parameters the differential entropy for the Beta-distributed phase case reduces to the well-known expression for the differential entropy associated with a uniformly distributed phase.

Mathematical Development and Results
The following notation will be utilized. The differential entropy of a continuous random variable $X$ with probability density function $p(x)$ is given by

$$h(X) = -\int_{S} p(x)\log p(x)\,dx, \qquad (2)$$

where $S = \{x \mid p(x) > 0\}$ is the support set of $X$. Here $\log$ denotes $\log_2$; if the logarithm is taken to base $e$, the notation $h_e(X)$ is used.

Consider the sinusoidal signal $y(t) = A\sin(\omega t + \phi)$, where $\omega$ and $\phi$ are constants. It is well known that this signal, observed at a uniformly distributed random time, has probability density function

$$p(y) = \frac{1}{\pi\sqrt{A^{2}-y^{2}}}, \qquad |y| < A, \qquad (3)$$

and possesses zero mean and variance $\sigma^{2} = A^{2}/2$ [12]. For this distribution the associated differential entropy can be computed by introducing the transformation $z = \frac{y}{2A} + \frac{1}{2}$, which yields the probability density function

$$p(z) = \frac{\Gamma(1)}{\Gamma(\frac{1}{2})\Gamma(\frac{1}{2})}\,z^{-1/2}(1-z)^{-1/2} = \frac{1}{\pi\sqrt{z(1-z)}}, \qquad 0 < z < 1, \qquad (4)$$

where $\Gamma(\cdot)$ is the Gamma function. Equation (4) defines the Beta distribution with parameters $\eta = \gamma = \frac{1}{2}$, whose differential entropy is given by

$$h_e(Z) = \ln B(\eta,\gamma) - (\eta-1)\left[\Psi(\eta) - \Psi(\eta+\gamma)\right] - (\gamma-1)\left[\Psi(\gamma) - \Psi(\eta+\gamma)\right], \qquad (5)$$

where $B(\cdot,\cdot)$ is the Beta function and $\Psi(\cdot)$ is the Digamma function (see Cover & Thomas [2], p. 486). Evaluating (5) at $\eta = \gamma = \frac{1}{2}$ gives

$$h(Z) = \log_2\frac{\pi}{4}\ \text{bits}. \qquad (6)$$

Consequently, since $Y = 2A\left(Z - \frac{1}{2}\right)$,

$$h(Y) = h(Z) + \log_2(2A) = \log_2\frac{\pi A}{2}\ \text{bits}. \qquad (7)$$

Note that $h(Y)$ can be positive or negative, depending on the value of $A$. This is a well-known result (e.g., see [6]); indeed, the entropy of arbitrary Beta distributions is known [13]. Next it will be shown that the result given by Equation (7) is a special case of the differential entropy of a sine wave with a Beta-distributed phase angle. The probability density function for a Beta-distributed random variable on $[0,1]$ is expressed as

$$p(x) = \frac{1}{B(\eta,\gamma)}\,x^{\eta-1}(1-x)^{\gamma-1}, \qquad \eta > 0,\ \gamma > 0. \qquad (8)$$
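As a numerical sanity check on the uniform-phase result of Equation (7) (this check is ours, using SciPy quadrature, and is not part of the original derivation), the substitution $y = A\sin w$ turns the entropy integral for the arcsine density into a smooth integrand:

```python
import numpy as np
from scipy import integrate

# Check of h(Y) = log2(pi*A/2): substituting y = A*sin(w) into
# -int p(y) log2 p(y) dy with p(y) = 1/(pi*sqrt(A^2 - y^2)) gives the
# integrand (1/pi) * log2(pi*A*cos(w)) on (-pi/2, pi/2).
A = 1.0
h_num, _ = integrate.quad(
    lambda w: np.log2(np.pi * A * np.cos(w)) / np.pi, -np.pi / 2, np.pi / 2)
h_closed = np.log2(np.pi * A / 2.0)
print(h_num, h_closed)  # both ≈ 0.6515 bits for A = 1
```

The agreement confirms the closed form, including the fact that the entropy becomes negative once $A < 2/\pi$.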
The change of variables $\theta = \pi\left(x - \frac{1}{2}\right)$ gives

$$p(\theta) = \frac{1}{\pi B(\eta,\gamma)}\left(\frac{1}{2} + \frac{\theta}{\pi}\right)^{\eta-1}\left(\frac{1}{2} - \frac{\theta}{\pi}\right)^{\gamma-1}, \qquad -\frac{\pi}{2} \le \theta \le \frac{\pi}{2}, \qquad (9)$$

as the Beta distribution on the interval $[-\frac{\pi}{2}, \frac{\pi}{2}]$. Consequently, the sine wave $Z = A\sin\theta$, where $\theta$ is Beta-distributed on $[-\frac{\pi}{2}, \frac{\pi}{2}]$, has probability density function

$$p(z) = \frac{1}{\pi B(\eta,\gamma)\sqrt{A^{2}-z^{2}}}\left(\frac{1}{2} + \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\eta-1}\left(\frac{1}{2} - \frac{1}{\pi}\sin^{-1}\frac{z}{A}\right)^{\gamma-1}, \qquad |z| < A. \qquad (10)$$

The differential entropy of $Z$, in nats, is calculated as follows:

$$h_e(Z) = -\int_{-A}^{A} p(z)\ln p(z)\,dz. \qquad (11)$$

Letting $w = \sin^{-1}\frac{z}{A}$, so that $z = A\sin w$ and $dz = A\cos w\,dw$, gives

$$h_e(Z) = -\int_{-\pi/2}^{\pi/2} p(w)\left[\ln p(w) - \ln(A\cos w)\right]dw, \qquad (12)$$

where $p(w)$ denotes the density of Equation (9), so

$$h_e(Z) = \ln\left[\pi A\,B(\eta,\gamma)\right] - (\eta-1)\int_{-\pi/2}^{\pi/2} p(w)\ln\left(\frac{1}{2} + \frac{w}{\pi}\right)dw - (\gamma-1)\int_{-\pi/2}^{\pi/2} p(w)\ln\left(\frac{1}{2} - \frac{w}{\pi}\right)dw + \int_{-\pi/2}^{\pi/2} p(w)\ln\cos w\,dw, \qquad (13)$$

where we have made the substitution $B(\eta,\gamma) = \frac{\Gamma(\eta)\Gamma(\gamma)}{\Gamma(\eta+\gamma)}$ for the Beta function. Setting $u = \frac{1}{2} + \frac{w}{\pi}$ in the first integral of Equation (13) leads to

$$\int_{-\pi/2}^{\pi/2} p(w)\ln\left(\frac{1}{2} + \frac{w}{\pi}\right)dw = \frac{1}{B(\eta,\gamma)}\int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\ln u\,du = \Psi(\eta) - \Psi(\eta+\gamma), \qquad (14)$$

from formula 4.253(1) of Gradshteyn and Ryzhik [14]. Similarly, setting $v = \frac{1}{2} - \frac{w}{\pi}$ in the second integral gives

$$\int_{-\pi/2}^{\pi/2} p(w)\ln\left(\frac{1}{2} - \frac{w}{\pi}\right)dw = \Psi(\gamma) - \Psi(\eta+\gamma). \qquad (15)$$

Finally, setting $u = \frac{1}{2} + \frac{w}{\pi}$ in the third integral and noting that $\sin\pi u = \sin\left(w + \frac{\pi}{2}\right) = \cos w$, we obtain

$$\int_{-\pi/2}^{\pi/2} p(w)\ln\cos w\,dw = \frac{1}{B(\eta,\gamma)}\int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\ln(\sin\pi u)\,du. \qquad (16)$$

Collecting terms gives

$$h_e(Z) = \ln\left[\pi A\,B(\eta,\gamma)\right] - (\eta-1)\left[\Psi(\eta) - \Psi(\eta+\gamma)\right] - (\gamma-1)\left[\Psi(\gamma) - \Psi(\eta+\gamma)\right] + \frac{1}{B(\eta,\gamma)}\int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\ln(\sin\pi u)\,du. \qquad (17)$$

The last term is the average of $\ln(\sin\pi u)$ over the Beta distribution. Unfortunately, there does not appear to be an analytic solution for this integral. A variety of Beta distributions (taken from Hahn and Shapiro [15]) are shown in Figure 1, along with the corresponding differential entropy, $h$, for a sine wave of amplitude $A = 1$. These entropies are expressed in bits, so the values of $h_e$ obtained from Equation (17) are divided by $\ln 2$. The last term in (17) is calculated by standard numerical integration techniques.

We now derive an analytic approximation for the integral term in Equation (17) which is valid when $\eta \ge 1$ and $\gamma \ge 1$. This technique is based on the following integral found in Gradshteyn and Ryzhik ([14], formula 3.768(11)), which can be written as

$$\int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\cos(au)\,du = B(\eta,\gamma)\,\mathrm{Re}\left[{}_{1}F_{1}(\eta;\eta+\gamma;ia)\right], \qquad (18)$$

and its companion (formula 3.768(12)):

$$\int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\sin(au)\,du = B(\eta,\gamma)\,\mathrm{Im}\left[{}_{1}F_{1}(\eta;\eta+\gamma;ia)\right], \qquad (19)$$

both expressed in terms of generalized hypergeometric series. In fact, ${}_{1}F_{1}$ is defined by

$${}_{1}F_{1}(\alpha;\beta;x) = \sum_{n=0}^{\infty}\frac{(\alpha)_{n}}{(\beta)_{n}}\frac{x^{n}}{n!}, \qquad (20)$$

where $(\alpha)_{n}$ denotes the product $\alpha(\alpha+1)(\alpha+2)\cdots(\alpha+n-1)$ for $n > 0$ and $(\alpha)_{0} = 1$; $(\beta)_{n}$ is similarly defined. Note that Mathematica also gives expressions for the integrals in (18) and (19); these formulas are in terms of the ${}_{2}F_{3}$ generalized hypergeometric series and, with a bit of manipulation, can be shown to be equivalent to the expressions above.
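Equation (17) can be evaluated by computing the Digamma terms analytically and the Beta-average of $\ln(\sin\pi u)$ by quadrature. The sketch below (the function name `h_sine_beta` and the SciPy-based implementation are ours, for illustration) also confirms that the uniform-phase case $\eta = \gamma = 1$ recovers $\ln(\pi A/2)$ nats, consistent with Equation (7):

```python
import numpy as np
from scipy import integrate, special

def h_sine_beta(eta, gam, A=1.0):
    """h_e(Z) in nats for Z = A*sin(theta), theta Beta(eta, gam)-distributed
    on [-pi/2, pi/2]: the Digamma terms are analytic, and the Beta-average
    of ln(sin(pi*u)) is evaluated by numerical quadrature."""
    lnB = special.betaln(eta, gam)
    psi = special.digamma
    analytic = (np.log(np.pi * A) + lnB
                - (eta - 1.0) * (psi(eta) - psi(eta + gam))
                - (gam - 1.0) * (psi(gam) - psi(eta + gam)))
    avg, _ = integrate.quad(
        lambda u: u**(eta - 1.0) * (1.0 - u)**(gam - 1.0)
                  * np.log(np.sin(np.pi * u)), 0.0, 1.0)
    return analytic + avg / np.exp(lnB)

# Uniform phase (eta = gam = 1) recovers the classical result:
print(h_sine_beta(1.0, 1.0))  # ≈ ln(pi/2) ≈ 0.4516 nats
```

Dividing the returned value by $\ln 2$ converts to bits, matching the convention used for Figure 1.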
We make use of the power series expansion

$$\ln(1-x) = -\sum_{n=1}^{\infty}\frac{x^{n}}{n},$$

which converges for $|x| < 1$, and apply it to $x = 1 - \sin\pi u$, since $|1 - \sin\pi u| < 1$ for $0 < u < 1$. This gives

$$\ln(\sin\pi u) = -\sum_{n=1}^{\infty}\frac{(1-\sin\pi u)^{n}}{n}.$$

We wish to choose $N$ such that the first $N$ terms of this series approximate $\ln(\sin\pi u)$ closely enough for purposes of calculating the integral in Equation (17). We have found that this approximation is valid when $\eta \ge 1$ and $\gamma \ge 1$. If either $\eta$ or $\gamma$ is less than unity, then the contribution to the integral near the endpoints at $u = 0$ and $u = 1$ will exceed the precision of the power-series convergence near those points. Accordingly,

$$\ln(\sin\pi u) \approx -\left[(1-\sin\pi u) + \frac{1}{2}\left(1 - 2\sin\pi u + \sin^{2}\pi u\right) + \frac{1}{3}\left(1 - 3\sin\pi u + 3\sin^{2}\pi u - \sin^{3}\pi u\right) + \cdots\right].$$
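The convergence behavior of the truncated series can be verified numerically. The following sketch (ours, for illustration) compares the $N = 24$ partial sum against $\ln(\sin\pi u)$ at an interior point, where $1 - \sin\pi u$ is small and convergence is rapid:

```python
from math import sin, log, pi

# Truncated series ln(s) = -sum_{n=1}^{N} (1 - s)^n / n evaluated at
# s = sin(pi*u); convergence is slowest near u = 0 and u = 1, where
# s -> 0 and the expansion variable 1 - s approaches 1.
def ln_sin_series(u, N):
    s = sin(pi * u)
    x = 1.0 - s
    return -sum(x**n / n for n in range(1, N + 1))

u = 0.3
print(ln_sin_series(u, 24), log(sin(pi * u)))  # agree closely at interior u
```

Near the endpoints the same partial sum converges far more slowly, which is why the approximation is restricted to $\eta \ge 1$ and $\gamma \ge 1$: those parameter choices keep the Beta weight from concentrating mass where the truncation error is largest.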
Collecting the coefficients of like powers of $\sin\pi u$, we can write $\ln(\sin\pi u)$ as

$$\ln(\sin\pi u) \approx \sum_{k=0}^{N} c_{k}\sin^{k}\pi u. \qquad (21)$$

This is an unusual approximation, since the coefficients $c_k$ clearly diverge as $N \to \infty$. However, the coefficients have alternating signs, and our calculations show that the approximation improves for increasing $N$ up to about $N = 50$, at which point we begin losing precision. For example, consider the approximation obtained with $N = 24$ terms; in this case the coefficients are given by the values shown in Table 1. We further expand Equation (21) by expressing powers of $\sin x$ as linear combinations of sines and cosines of multiples of $x$ (see Gradshteyn and Ryzhik [14], pp. 25-26). In general, for any positive integer $n$,

$$\sin^{2n} x = \frac{1}{2^{2n}}\binom{2n}{n} + \frac{(-1)^{n}}{2^{2n-1}}\sum_{k=0}^{n-1}(-1)^{k}\binom{2n}{k}\cos\left[2(n-k)x\right] \qquad (22)$$

and

$$\sin^{2n-1} x = \frac{(-1)^{n-1}}{2^{2n-2}}\sum_{k=0}^{n-1}(-1)^{k}\binom{2n-1}{k}\sin\left[(2n-2k-1)x\right]. \qquad (23)$$

Combining (21), (22), and (23) leads to the conclusion that $\ln(\sin\pi u)$ is (approximately) expressible as a linear combination of a constant and terms of the form $\sin(m\pi u)$ and $\cos(m\pi u)$. Consequently, the integral in Equation (17),

$$\int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\ln(\sin\pi u)\,du,$$

can be written as a linear combination of integrals of the form

$$\int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\cos(m\pi u)\,du \quad \text{and} \quad \int_{0}^{1} u^{\eta-1}(1-u)^{\gamma-1}\sin(m\pi u)\,du,$$

which are given analytically by Equations (18) and (19). Results for this analytic approximation are shown in Table 2 for two values of $N$: the last term of Equation (17) is shown, as well as the resulting value of the differential entropy $h_e(Z)$ in nats. Again, the amplitude of the sine wave is taken to be $A = 1$. Clearly the approximation is a good one, particularly for $N = 45$. We find that for a wide variety of Beta parameters the error in the approximation is ≤ 5%. For larger values of $N$ the quality of the results degrades; specifically, we find that values in the range $40 \le N \le 50$ give the best results.
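The coefficient collection in Equation (21) can be carried out mechanically with the binomial theorem. In the sketch below (our illustration; the closed form for $c_k$ shown in the comment is implied by the expansion, though not stated explicitly in the text), the coefficients alternate in sign and grow rapidly with $N$, yet the collected polynomial still reproduces the truncated series:

```python
from math import comb, sin, log, pi

# Expanding -sum_{n=1}^{N} (1 - s)^n / n, s = sin(pi*u), by the binomial
# theorem and collecting like powers of s gives sum_{k=0}^{N} c_k s^k with
#   c_k = (-1)^(k+1) * sum_{n=max(k,1)}^{N} C(n, k) / n.
def sin_poly_coeffs(N):
    return [(-1.0)**(k + 1) * sum(comb(n, k) / n
                                  for n in range(max(k, 1), N + 1))
            for k in range(N + 1)]

N = 24
c = sin_poly_coeffs(N)       # alternating signs, rapidly growing magnitudes
u = 0.4
s = sin(pi * u)
approx = sum(ck * s**k for k, ck in enumerate(c))
print(approx, log(s))        # collected polynomial matches ln(sin(pi*u))
```

Because the large alternating coefficients cancel heavily, double-precision round-off eventually dominates, which is consistent with the observed loss of precision beyond $N \approx 50$.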

Conclusions
This paper provides an analytical approximation for the differential entropy of a sine wave with a Beta-distributed phase angle. The results predicted by the expression are in good agreement with those obtained via numerical integration for Beta distribution parameters η ≥ 1 and γ ≥ 1. For all parameter combinations we have examined, the error in the approximation is ≤ 5% when a reasonable number of terms (N = 45) is used in the approximation. The result for a uniformly distributed phase angle is also recovered as a special case of our more general result. The derived expression may prove useful in entropy-based calculations for signals in which a non-uniform phase distribution model is appropriate.