On an Inequality for Legendre Polynomials

This paper is concerned with orthogonal polynomials. Upper and lower bounds for Legendre polynomials are obtained. Furthermore, entropies associated with discrete probability distributions are considered, and bounds for them, which improve some previously known results, are derived in terms of inequalities. In order to illustrate the results obtained in this paper and to compare them with other results from the literature, several graphs are provided.


Introduction
The classical orthogonal polynomials play an important role in applications of mathematical analysis, in spectral methods with applications in fluid dynamics, and in other areas of interest. In recent years, many theoretical and numerical studies of Jacobi polynomials have appeared. Inequalities for Jacobi polynomials based on entropic and information inequalities were obtained in [1]. Bernstein-type inequalities for Jacobi polynomials, together with their applications to many research topics in mathematics, were considered in [2]. A very recent conjecture (see [3]), which asserts that the sum of the squared Bernstein polynomials is a convex function on [0, 1], was validated using properties of Jacobi polynomials. This conjecture aroused great interest, and new proofs of it were subsequently given (see [4,5]).

The main objective of this paper is to obtain upper and lower bounds for Legendre polynomials and to apply these results in order to give lower and upper bounds for entropies. Since entropies are usually described by complicated expressions, it is useful to establish bounds for them. Entropy, a measure of the uncertainty of a random variable, was introduced by C.E. Shannon in [6]. Later, A. Rényi [7] introduced a parametric family of information measures that includes the Shannon entropy as a special case. The classical Shannon entropy was defined for discrete probability distributions; the concept was later extended to the continuous case, involving continuous probability distributions. For more details on this topic, the reader is referred to the recent papers [8–12]. This paper is devoted to the entropy associated with discrete probability distributions.
If α = β = λ, the Jacobi polynomials $P_n^{(\alpha,\beta)}(x)$ are called Gegenbauer (ultraspherical) polynomials. The usual notation and normalization for the ultraspherical polynomials is the following (see [15]):
$$P_n^{(\lambda)}(x) = \frac{\Gamma\left(\lambda+\frac{1}{2}\right)}{\Gamma(2\lambda)}\, \frac{\Gamma(n+2\lambda)}{\Gamma\left(n+\lambda+\frac{1}{2}\right)}\, P_n^{\left(\lambda-\frac{1}{2},\,\lambda-\frac{1}{2}\right)}(x).$$
Note that $P_n^{(\lambda)}(1) = \binom{n+2\lambda-1}{n}$. Some important special cases are the Chebyshev polynomial of the first kind
$$T_n(x) = \frac{P_n^{\left(-\frac{1}{2},-\frac{1}{2}\right)}(x)}{P_n^{\left(-\frac{1}{2},-\frac{1}{2}\right)}(1)},$$
the Chebyshev polynomial of the second kind
$$U_n(x) = (n+1)\, \frac{P_n^{\left(\frac{1}{2},\frac{1}{2}\right)}(x)}{P_n^{\left(\frac{1}{2},\frac{1}{2}\right)}(1)},$$
and the Legendre polynomial $P_n(x) = P_n^{(0,0)}(x)$.
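The three special cases above can be checked numerically. The following sketch uses SciPy's Jacobi, Chebyshev, and Legendre evaluators; the normalizing values at $x = 1$ are computed directly rather than from the binomial-coefficient formula:

```python
import numpy as np
from scipy.special import eval_jacobi, eval_chebyt, eval_chebyu, eval_legendre

x = np.linspace(-0.99, 0.99, 11)
for n in range(1, 6):
    # T_n(x) = P_n^(-1/2,-1/2)(x) / P_n^(-1/2,-1/2)(1)
    t = eval_jacobi(n, -0.5, -0.5, x) / eval_jacobi(n, -0.5, -0.5, 1.0)
    assert np.allclose(t, eval_chebyt(n, x))

    # U_n(x) = (n+1) * P_n^(1/2,1/2)(x) / P_n^(1/2,1/2)(1)
    u = (n + 1) * eval_jacobi(n, 0.5, 0.5, x) / eval_jacobi(n, 0.5, 0.5, 1.0)
    assert np.allclose(u, eval_chebyu(n, x))

    # Legendre: P_n(x) = P_n^(0,0)(x)
    assert np.allclose(eval_jacobi(n, 0.0, 0.0, x), eval_legendre(n, x))
print("all special-case identities verified")
```

All three assertions pass for every degree tested, confirming the normalizations stated above.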

Preliminary Results
Then the inequalities hold for all n ∈ ℕ and k ∈ ℕ \ {0} with k ≤ n.
Then f attains its minimum value at the points B and C; computing the values at these points, we observe that the stated inequalities hold.

Bounds for Legendre Polynomials
where ⌊x⌋ denotes the integer part of the positive real number x. The following relation holds.

Proof. From (1) and (2), the polynomials $R_n^{(\alpha,\beta)}$ admit a representation in terms of the hypergeometric function. From [14] (Equation (4.7.30), p. 83) we obtain the corresponding representation. Therefore, using Euler's transformation formula (see [17] ((15.3.3), p. 559))
$${}_2F_1(a, b; c; z) = (1-z)^{c-a-b}\, {}_2F_1(c-a, c-b; c; z),$$
we get
$$U_{2m+1}(x) = (-1)^m\, 2(m+1)\, x\, {}_2F_1\!\left(m+2,\, -m;\, \tfrac{3}{2};\, x^2\right).$$
The links between the Chebyshev polynomials of the first and second kind are given by the following two relations (see [14] (Theorem 4.1, p. 59)):
$$T_n(x) = U_n(x) - x\,U_{n-1}(x), \qquad (1-x^2)\,U_{n-1}(x) = x\,T_n(x) - T_{n+1}(x).$$
Using relations (15) and (17) we get the first identity; from relations (16) and (17) the second one follows. Using these equalities we obtain, after a straightforward computation:

Theorem 2. Let $P_n$ be the Legendre polynomial of degree n, n ∈ ℕ. Then, for x ∈ (−1, 1), the stated lower and upper bounds hold, where $U_n$ is the Chebyshev polynomial of the second kind.
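The transformation formula (15.3.3), the two Chebyshev links, and the hypergeometric form of the odd-degree $U_{2m+1}$ can all be verified numerically. A minimal sketch with SciPy (the identities checked are the ones written above, which are assumed to match those cited from [14] and [17]):

```python
import numpy as np
from scipy.special import hyp2f1, eval_chebyt, eval_chebyu

# Euler's transformation (Abramowitz & Stegun 15.3.3):
#   2F1(a, b; c; z) = (1 - z)^(c-a-b) * 2F1(c-a, c-b; c; z)
a, b, c = 1.3, -2.0, 2.5
z = np.linspace(-0.9, 0.9, 19)
lhs = hyp2f1(a, b, c, z)
rhs = (1 - z) ** (c - a - b) * hyp2f1(c - a, c - b, c, z)
assert np.allclose(lhs, rhs)

x = np.linspace(-0.99, 0.99, 21)
for n in range(1, 8):
    # T_n = U_n - x U_{n-1}
    assert np.allclose(eval_chebyt(n, x),
                       eval_chebyu(n, x) - x * eval_chebyu(n - 1, x))
    # (1 - x^2) U_{n-1} = x T_n - T_{n+1}
    assert np.allclose((1 - x**2) * eval_chebyu(n - 1, x),
                       x * eval_chebyt(n, x) - eval_chebyt(n + 1, x))

# Hypergeometric form of the odd-degree Chebyshev polynomial U_{2m+1}
for m in range(5):
    u = (-1)**m * 2 * (m + 1) * x * hyp2f1(m + 2, -m, 1.5, x**2)
    assert np.allclose(u, eval_chebyu(2 * m + 1, x))
print("all identities verified")
```

Since $-m$ is a non-positive integer, the last ${}_2F_1$ is a terminating (polynomial) series, so the evaluation is exact up to floating-point rounding.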
Figure 1 shows the graphs of the polynomial 1 − P_n(x) together with its lower and upper bounds obtained in Theorem 2.

Bounds for Information Potentials
Let p(x) = (p_k(x))_{k≥0} be a parameterized discrete probability distribution. The associated information potential is defined by
$$S(x) = \sum_{k \ge 0} p_k^2(x).$$
The Rényi entropy of order 2 and the Tsallis entropy of order 2 associated with p(x) can be expressed in terms of the information potential S(x) as follows (see [18] (pp. 20–21)):
$$R_2(x) = -\log S(x), \qquad T_2(x) = 1 - S(x).$$
In the following, we consider the binomial distribution $p_{n,k}$ and the negative binomial distribution $p^{[1]}_{n,k}$. The associated information potentials (indices of coincidence) of these distributions are defined as follows:
$$S_{n,-1}(x) = \sum_{k=0}^{n} p_{n,k}^2(x), \qquad S_{n,1}(x) = \sum_{k=0}^{\infty} \left(p^{[1]}_{n,k}(x)\right)^2.$$
From [10] (56) and [12] (21), bounds for these information potentials are known. Lower and upper bounds for the information potential associated with the binomial distribution were also obtained in [11]. Using Remark 1, new bounds for the information potentials $S_{n,-1}$ and $S_{n,1}$ can be obtained, as follows.

Theorem 3. The following inequalities are satisfied.

Remark 2. Let n = 10. Figure 2 gives a graphical representation of the lower bounds of $S_{n,-1}(x)$ from (24) and (25); denote these bounds by LS(24) and LS(25), respectively. Figure 3 gives a graphical representation of the upper bounds of $S_{n,-1}(x)$ from (24) and (25); denote these bounds by RS(24) and RS(25), respectively. Note that on certain intervals the results obtained in Theorem 3 improve the result from [11].
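For the binomial case, the information potential and the two entropies it determines are straightforward to compute. A minimal sketch (the function name `information_potential` is ours; the formulas $R_2 = -\log S$ and $T_2 = 1 - S$ are the order-2 expressions from [18] quoted above):

```python
import numpy as np
from scipy.stats import binom

def information_potential(n, x):
    """S_{n,-1}(x) = sum_{k=0}^{n} p_{n,k}(x)^2 for the binomial distribution."""
    k = np.arange(n + 1)
    p = binom.pmf(k, n, x)   # p_{n,k}(x) = C(n,k) x^k (1-x)^(n-k)
    return float(np.sum(p ** 2))

n, x = 10, 0.3
S = information_potential(n, x)
renyi_2 = -np.log(S)     # Rényi entropy of order 2
tsallis_2 = 1.0 - S      # Tsallis entropy of order 2
print(S, renyi_2, tsallis_2)
```

Since $S_{n,-1}(x) \in (0, 1]$ for $x \in [0, 1]$, both entropies are nonnegative, and both vanish exactly at the degenerate endpoints $x = 0$ and $x = 1$, where the distribution concentrates on a single point.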

Conclusions
This paper is devoted to orthogonal polynomials. Bounds for Legendre polynomials are obtained in terms of inequalities. A more general result regarding the estimate of the coefficients B_{n,k} is obtained and used to give bounds for the information potentials associated with the binomial distribution and the negative binomial distribution. Bounds for entropies are especially useful when the entropies are described by complicated expressions. The results obtained in this paper improve some results from the literature. Motivated by the many applications of entropies in secure data transmission, speech coding, cryptography, and algorithmic complexity theory, finding further bounds for entropies will be a topic of future work.