Article

Discrete Entropies of Chebyshev Polynomials

by Răzvan-Cornel Sfetcu 1,*, Sorina-Cezarina Sfetcu 1 and Vasile Preda 2,3,1
1 Faculty of Mathematics and Computer Science, University of Bucharest, Str. Academiei 14, 010014 Bucharest, Romania
2 “Gheorghe Mihoc-Caius Iacob” Institute of Mathematical Statistics and Applied Mathematics, Calea 13 Septembrie 13, 050711 Bucharest, Romania
3 “Costin C. Kiriţescu” National Institute of Economic Research, Calea 13 Septembrie 13, 050711 Bucharest, Romania
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(7), 1046; https://doi.org/10.3390/math12071046
Submission received: 29 February 2024 / Revised: 19 March 2024 / Accepted: 28 March 2024 / Published: 30 March 2024

Abstract: Because of its flexibility and multiple interpretations, the concept of information entropy, in its continuous or discrete form, has proven to be very relevant in numerous scientific branches. For example, it is used as a measure of disorder in thermodynamics, as a measure of uncertainty in statistical mechanics as well as in classical and quantum information science, as a measure of diversity in ecological structures and as a criterion for the classification of races and species in population dynamics. Orthogonal polynomials are a useful tool in solving and interpreting differential equations, and lately this subject has been studied intensively in many areas. For example, in statistics, fitting the desired model to the data with orthogonal polynomials eliminates collinearity while retaining the same information as simple polynomials. In this paper, we consider the Tsallis, Kaniadakis and Varma entropies of Chebyshev polynomials of the first kind and obtain asymptotic expansions. In the particular case of quadratic entropies, concrete computations are given.
MSC:
33C45; 41A46; 41A58; 42C05; 94A17

1. Introduction

Orthogonal polynomials have applications in multiple branches of applied and pure mathematics. In this paper, we study a class of these polynomials (namely, Chebyshev polynomials) from the point of view of information theory. For example, if the wave function $\Psi(\vec{r})$ is the solution of the time-independent Schrödinger equation in an $n$-dimensional position space for a single-particle system,
$$H \Psi(\vec{r}) = E \Psi(\vec{r}), \quad \text{where } \vec{r} = (x_1, \dots, x_n),$$
then the position density of the system is $\rho(\vec{r}) = |\Psi(\vec{r})|^2$. Similarly, the momentum density $\gamma(\vec{p}) = |\hat{\Psi}(\vec{p})|^2$ is given in terms of the Fourier transform of $\Psi(\vec{r})$, namely the wave function in momentum space $\hat{\Psi}(\vec{p})$.
The fundamental and experimentally measurable physical quantities are related to the information measures of these densities. Hence, the information measures become useful in the study of the structure and dynamics of atomic and molecular systems. The study of the information measures of orthogonal polynomials is motivated by the fact that the densities of many quantum mechanical systems with shape-invariant potentials, such as the harmonic oscillator and the hydrogenic systems, typically contain terms of the form $p_n^2\,\mu$ (an orthogonal polynomial squared times the associated weight).
In the literature, there are plenty of papers in which the authors study the asymptotic expansions of the Shannon entropy of some polynomials (Chebyshev, Laguerre, Hermite, Gegenbauer, etc.). Taking into account that Tsallis entropy, Kaniadakis entropy and Varma entropy have applications in some physical phenomena that cannot be modeled by the Shannon entropy, we believe that it is useful to obtain asymptotic expansions of the Tsallis, Kaniadakis and Varma entropies of the aforementioned polynomials.
Motivated by the above discussion, by the paper [1] (which is among the references of our paper), where the authors define the discrete Shannon entropy of Chebyshev polynomials of the first kind, and by the fact that the Tsallis, Kaniadakis and Varma entropies are generalizations of the Shannon entropy with applications in physics, we decided to introduce the discrete Tsallis, Kaniadakis and Varma entropies of Chebyshev polynomials of the first kind.
The concept of entropy was introduced by Shannon and has its roots in communication theory. In information theory, the concept is directly analogous to the entropy in statistical thermodynamics. Entropy also has relevance in other areas of mathematics such as combinatorics. The definition can be derived from a set of axioms which establish that entropy is a measure of how surprising the average outcome of a variable is.
The notion of the Shannon entropy has multiple generalizations (Tsallis entropy, Varma entropy, Rényi entropy, Kaniadakis entropy, relative entropy, cumulative entropy, weighted entropy, etc.), which are useful in many technological areas such as communication theory, physics, statistics, probability and economics.
The Tsallis entropy was introduced in 1988 in the Journal of Statistical Physics (see [2]). The idea of Tsallis was to consider another formula instead of the classical logarithm used in the Shannon entropy (see [3] for the Shannon entropy).
The Kaniadakis entropy was introduced in 2001 in Physica A (see [4]). As in the case of the Tsallis entropy, the classical logarithm (and, likewise, the Tsallis logarithm) is replaced, this time by the Kaniadakis logarithm.
In recent years, many researchers have focused on various families of orthogonal polynomials. Among other results, they have proven explicit formulas, numerical algorithms and the asymptotic behaviour of the entropy of these polynomials (see [1,5,6,7,8]).
Aptekarev et al. [1] obtained asymptotic expansions of the Shannon entropy of Chebyshev polynomials of the first and second kinds. In the proofs, they make use of results from number theory.
Buyarov et al. [5] introduced an algorithm for an effective and accurate numerical computation of the Shannon entropy of polynomials orthogonal on a segment of the real line from the coefficients of the three-term recurrence relation which they satisfy. The case of Gegenbauer polynomials was studied in detail because of their own relevance (as a very “representative” class of polynomials) and because of their numerous applications. For example, these polynomials control the angular component of the wave functions of single-particle systems in central potentials. The results of several numerical experiments were also discussed, illustrating both the accuracy and efficiency of the algorithm proposed and comparing it with other computing strategies. Finally, using the known relationship between spherical harmonics and Gegenbauer polynomials, the Shannon entropy of spherical harmonics, which measures the spatial complexity of single-particle systems and physical systems with central potentials, was computed.
Dehesa et al. [6] proved how the Shannon entropy of some classical orthogonal polynomials plays a role in some problems related to the harmonic oscillator and the Coulomb potential (hydrogen atom). It was shown that this entropy for orthogonal polynomials is related to the distribution of zeros and to the mutual energy and logarithmic potential of some measures involving the zeros of the orthogonal polynomials. Gegenbauer polynomials, Laguerre polynomials and Hermite polynomials were analyzed in detail. For the logarithmic energy of Chebyshev polynomials of the first and second kinds, the authors provided some closed formulas.
Yáñez et al. [7] initiated a project of computing information entropies of orthogonal polynomials. They defined the entropy of a family of orthonormal polynomials with respect to a measure and computed this entropy exactly for Chebyshev polynomials. In general, there were no simple formulas, and the research largely addressed asymptotics as $n \to \infty$, typically relying on asymptotic formulas for the polynomials.
Sfetcu [8] used a sequence of generalized Jacobi polynomials to define a sequence of discrete probability distributions and introduced the Tsallis divergence and the Rényi divergence between every element of this sequence and the equiprobability distribution with the same index. It was proven that both sequences of divergences obtained in this way are convergent. In the particular case of quadratic divergences, the limits were explicitly computed.
In information theory, orthogonal polynomials play an important role. To support this idea, we mention modern density functional theory, which states that the physical and chemical properties of fermionic systems (atoms, molecules, nuclei, solids) can be completely described by means of the single-particle probability density (see [9,10,11]). Furthermore, it is known that the wave functions of many important systems (for example, the $D$-dimensional harmonic oscillator and the hydrogen atom) can be expressed using families of orthogonal polynomials. As shown in [6,7], the computation of some entropies can often be reduced to integrals involving these polynomials.
Other new interesting results concerning these topics can be found in [12,13,14,15,16,17].
Le Blanc [12] derived the univariate ultraspherical noncentral $t$, normal $N$, $F$ and $\chi^2$ distributions from translated uniform density distributions on unit-radius hyperspheres, and proved that they are expressible as products of their central distributions. He also provided specific generating functions for the Gegenbauer, Hermite, Jacobi and Laguerre orthogonal polynomial families. It was shown that when these duals are expanded on a small number of low-order orthogonal polynomials, the determination of the Gibbs priors in terms of the empirical densities' entropic convex duals is much simplified. Furthermore, he discussed how prior moments in parameter space are directly provided by the Bayes factor orthogonal polynomial expansion coefficients in random variable space. Finally, some applications in genomics and geophysics were provided.
Min and Wang [13] derived the difference equations and differential–difference equations satisfied by the recurrence coefficients by using Chen’s and Ismail’s ladder operator approach for polynomials that are orthogonal with respect to the singularly perturbed Freud weight functions. They also obtained, for the orthogonal polynomials, the differential–difference equations and the second-order differential equations, all the coefficients being expressed in terms of the recurrence coefficients.
Abd-Elhameed and Alsuyuti [14] generalized the class of Chebyshev polynomials of the first kind by introducing a new class of orthogonal polynomials. They established some basic properties of the generalized Chebyshev polynomials and their shifted ones and, additionally, found some new formulas for these generalized polynomials. Some specific problems which appear in many applications regarding multi-term linear fractional differential equations were solved using these generalized orthogonal polynomials. The basic idea behind the derivation of the algorithm is to use a new power form representation of the shifted generalized Chebyshev polynomials, along with the spectral Galerkin method, to transform the fractional differential equation, governed by its initial conditions, into a system of linear equations which can be efficiently solved via a suitable numerical solver. They provided some illustrative examples accompanied by comparisons with some other methods in order to show that the presented algorithm is useful and effective.
Abd-Elhameed et al. [15] obtained numerical solutions of the nonlinear time-fractional generalized Kawahara equation (NTFGKE) by proposing an innovative approach involving a spectral collocation algorithm. They introduced the “eighth-kind Chebyshev polynomials (CPs)”, which are a new set of orthogonal polynomials (OPs) and represent special cases of generalized Gegenbauer polynomials. The shifted eighth-kind CPs were incorporated as fundamental functions in order to achieve the proposed numerical approximations. The aim of this method is to facilitate the transformation of the equation and its inherent conditions into a set of nonlinear algebraic equations. By using Newton's method, they obtained the necessary semi-analytical solutions. The convergence and errors were evaluated following a rigorous analysis, and the reliability and effectiveness of the approach were validated through a series of numerical experiments accompanied by comparative assessments.
Atta et al. [16] introduced a spectral tau solution to the heat conduction equation. The orthogonal polynomials (more exactly, the shifted fifth-kind Chebyshev polynomials (5CPs)) were used as basis functions. The proposed method’s derivation is based on solving the integral equation that corresponds to the original problem. Using the tau approach and some theoretical findings with its underlying conditions, the problem was transformed into a system of equations that can be solved by the Gaussian elimination method. Some numerical examples were given for the precision and applicability of the algorithm.
Zhang et al. [17] presented an introduction to bi-univalent functions linked with $q$-Hermite polynomials.

2. Preliminaries

In the present paper, we obtain asymptotic expansions for the discrete Tsallis, Kaniadakis and Varma entropies of Chebyshev polynomials of the first kind.
We denote $\mathbb{N} = \{0, 1, 2, \dots\}$ and $\mathbb{N}^* = \{1, 2, \dots\}$. Furthermore, we use the notation $GCD(a, b)$ to designate the greatest common divisor of $a$ and $b$.
Definition 1.
Let $w : [a, b] \to [0, \infty)$ be a measurable function, where $a < b$.
1. We say that the polynomials $P_n$ are orthogonal if
$$\int_a^b P_m(\lambda) P_n(\lambda) w(\lambda)\, d\lambda = 0$$
for any $m, n \in \mathbb{N}$, $m \neq n$.
2. We say that the polynomials $P_n$ are orthonormal if
$$\int_a^b P_m(\lambda) P_n(\lambda) w(\lambda)\, d\lambda = \delta_{mn}$$
for any $m, n \in \mathbb{N}$, where $\delta_{mn}$ is the Kronecker symbol, given as
$$\delta_{mn} = \begin{cases} 1 & \text{if } m = n \\ 0 & \text{if } m \neq n. \end{cases}$$
Consider a discrete probability distribution $p = (p_1, \dots, p_n)$ such that $\sum_{i=1}^n p_i = 1$.
For any $\alpha \in \mathbb{R}^*$, the Tsallis logarithm is defined as
$$\log_T x = \begin{cases} \dfrac{x^{\alpha} - 1}{\alpha} & \text{if } x > 0 \\ 0 & \text{if } x = 0. \end{cases}$$
The discrete Tsallis entropy is defined as
$$S_T(p) = -\sum_{i=1}^n p_i \log_T(p_i).$$
For more information about and applications of the Tsallis entropy, we recommend [8,18,19,20,21,22,23,24,25].
Let $\alpha \in \mathbb{R}^*$. The Kaniadakis logarithm is given by
$$\log_K x = \begin{cases} \dfrac{x^{\alpha} - x^{-\alpha}}{2\alpha} & \text{if } x > 0 \\ 0 & \text{if } x = 0. \end{cases}$$
The discrete Kaniadakis entropy is defined as
$$S_K(p) = -\sum_{i=1}^n p_i \log_K(p_i).$$
The reader can find other results and applications concerning the Kaniadakis entropy in [23,26,27,28,29,30,31].
The Varma entropy was introduced in 1966 in the Journal of Mathematical Sciences (see [32]). Let $\gamma, \delta \in \mathbb{R}$ be such that $\delta > 1$ and $\delta - 1 \leq \gamma < \delta$. The discrete Varma entropy is given by
$$S_V(p) = \frac{1}{\delta - \gamma} \log \sum_{i=1}^n p_i^{\gamma + \delta - 1},$$
where “log” designates the classical logarithm.
For other details and applications of the Varma entropy, we recommend [33,34,35,36,37,38]. We emphasize that the Varma entropy is a generalization of the Rényi entropy, which is well known for its applications (see [8,39,40,41,42,43,44,45,46]).
As we have already suggested, these entropies are generalizations of the Shannon entropy.
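For readers who wish to experiment with these quantities, the following minimal sketch (assuming NumPy; the function names are ours and purely illustrative) implements the three discrete entropies exactly as defined above for a probability vector $p$.

```python
import numpy as np

def log_T(x, alpha):
    """Tsallis logarithm: (x**alpha - 1)/alpha for x > 0, and 0 for x = 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, (x**alpha - 1.0) / alpha, 0.0)

def log_K(x, alpha):
    """Kaniadakis logarithm: (x**alpha - x**(-alpha))/(2*alpha) for x > 0, and 0 for x = 0."""
    x = np.asarray(x, dtype=float)
    safe = np.where(x > 0, x, 1.0)            # avoid 0**(-alpha) warnings; masked out below
    val = (safe**alpha - safe**(-alpha)) / (2.0 * alpha)
    return np.where(x > 0, val, 0.0)

def tsallis_entropy(p, alpha):
    return -np.sum(p * log_T(p, alpha))

def kaniadakis_entropy(p, alpha):
    return -np.sum(p * log_K(p, alpha))

def varma_entropy(p, gamma, delta):
    return np.log(np.sum(p**(gamma + delta - 1.0))) / (delta - gamma)

p = np.array([0.5, 0.25, 0.25])
print(tsallis_entropy(p, 1.0))        # quadratic case: 1 - sum(p**2) = 0.625
print(kaniadakis_entropy(p, 1.0))
print(varma_entropy(p, 1.0, 2.0))     # quadratic case: log(sum(p**2)) = log(0.375)
```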

3. Results

Chebyshev polynomials of the first kind are given as follows ($\lambda = \cos\theta$, $\theta \in [0, \pi]$):
$$P_m(\lambda) = \begin{cases} \sqrt{2}\cos(m\theta) & \text{if } m \in \mathbb{N}^* \\ 1 & \text{if } m = 0. \end{cases}$$
It is known that, for any $m, n \in \mathbb{N}$,
$$\int_{-1}^{1} P_m(\lambda) P_n(\lambda) w(\lambda)\, d\lambda = \delta_{mn},$$
where $w(\lambda) = \dfrac{1}{\pi} \cdot \dfrac{1}{\sqrt{1 - \lambda^2}}$.
We consider the $n$-th Christoffel function
$$l_n(\lambda) = \left( \sum_{k=0}^{n-1} P_k^2(\lambda) \right)^{-1}.$$
We define the discrete Tsallis entropy of Chebyshev polynomials $(P_n)_n$ as
$$S_n^T(\lambda) \stackrel{\mathrm{def}}{=} -\sum_{i=1}^{n} l_n(\lambda) P_{i-1}^2(\lambda)\, \log_T\!\left( l_n(\lambda) P_{i-1}^2(\lambda) \right),$$
the discrete Kaniadakis entropy of Chebyshev polynomials $(P_n)_n$ as
$$S_n^K(\lambda) \stackrel{\mathrm{def}}{=} -\sum_{i=1}^{n} l_n(\lambda) P_{i-1}^2(\lambda)\, \log_K\!\left( l_n(\lambda) P_{i-1}^2(\lambda) \right)$$
and the discrete Varma entropy of Chebyshev polynomials $(P_n)_n$ as
$$S_n^V(\lambda) \stackrel{\mathrm{def}}{=} \frac{1}{\delta - \gamma} \log \sum_{i=1}^{n} \left( l_n(\lambda) P_{i-1}^2(\lambda) \right)^{\gamma + \delta - 1}.$$
Because $\log_T 0 = \log_K 0 = 0$, in what follows we make the convention $0^0 = 0$.
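The discrete distribution underlying these definitions is easy to generate numerically. The sketch below (assuming NumPy; helper names are again illustrative) builds, for a fixed $\lambda$, the weights $l_n(\lambda) P_{i-1}^2(\lambda)$, $i = 1, \dots, n$; by the definition of $l_n$ they sum to 1, and feeding them to the entropy routines from the previous sketch gives $S_n^T$, $S_n^K$ and $S_n^V$.

```python
import numpy as np

def chebyshev_orthonormal(m, lam):
    """Orthonormal Chebyshev polynomial of the first kind: sqrt(2)*cos(m*theta) for m >= 1, 1 for m = 0."""
    theta = np.arccos(lam)
    return 1.0 if m == 0 else np.sqrt(2.0) * np.cos(m * theta)

def chebyshev_distribution(n, lam):
    """Probability weights l_n(lam) * P_{i-1}(lam)**2, i = 1..n."""
    P2 = np.array([chebyshev_orthonormal(k, lam)**2 for k in range(n)])
    l_n = 1.0 / P2.sum()                      # n-th Christoffel function
    return l_n * P2

p = chebyshev_distribution(10, 0.3)
print(p.sum())                                # 1.0 by construction
```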
Theorem 1.
We have the following:
$$(a)\quad S_n^T(\lambda) = -l_n(\lambda) \log_T(l_n(\lambda)) - \frac{(2 l_n(\lambda))^{\alpha+1}}{\alpha} \sum_{i=1}^{n-1} \cos^{2\alpha+2}(i\theta) + \frac{2 l_n(\lambda)}{\alpha} \sum_{i=1}^{n-1} \cos^{2}(i\theta).$$
$$(b)\quad S_n^K(\lambda) = -l_n(\lambda) \log_K(l_n(\lambda)) - \frac{(2 l_n(\lambda))^{\alpha+1}}{2\alpha} \sum_{i=1}^{n-1} \cos^{2\alpha+2}(i\theta) + \frac{(2 l_n(\lambda))^{1-\alpha}}{2\alpha} \sum_{i=1}^{n-1} \cos^{2-2\alpha}(i\theta).$$
$$(c)\quad S_n^V(\lambda) = \frac{\gamma+\delta-1}{\delta-\gamma} \log l_n(\lambda) + \frac{1}{\delta-\gamma} \log\left( 1 + 2^{\gamma+\delta-1} \sum_{i=1}^{n-1} \left(\cos^{2}(i\theta)\right)^{\gamma+\delta-1} \right).$$
Proof. 
 
$(a)$
$$\begin{aligned} S_n^T(\lambda) &= -\sum_{i=1}^{n} l_n(\lambda) P_{i-1}^2(\lambda) \log_T\!\left(l_n(\lambda) P_{i-1}^2(\lambda)\right) = -\sum_{i=0}^{n-1} l_n(\lambda) P_i^2(\lambda) \log_T\!\left(l_n(\lambda) P_i^2(\lambda)\right) \\ &= -l_n(\lambda)\log_T(l_n(\lambda)) - l_n(\lambda) \sum_{i=1}^{n-1} P_i^2(\lambda) \log_T\!\left(l_n(\lambda) P_i^2(\lambda)\right) \\ &= -l_n(\lambda)\log_T(l_n(\lambda)) - l_n(\lambda) \sum_{i=1}^{n-1} 2\cos^2(i\theta) \log_T\!\left(2 l_n(\lambda) \cos^2(i\theta)\right) \\ &= -l_n(\lambda)\log_T(l_n(\lambda)) - l_n(\lambda) \sum_{i=1}^{n-1} 2\cos^2(i\theta)\, \frac{\left(2 l_n(\lambda) \cos^2(i\theta)\right)^{\alpha} - 1}{\alpha} \\ &= -l_n(\lambda)\log_T(l_n(\lambda)) - \frac{(2 l_n(\lambda))^{\alpha+1}}{\alpha} \sum_{i=1}^{n-1} \cos^{2\alpha+2}(i\theta) + \frac{2 l_n(\lambda)}{\alpha} \sum_{i=1}^{n-1} \cos^{2}(i\theta). \end{aligned}$$
$(b)$
$$\begin{aligned} S_n^K(\lambda) &= -\sum_{i=1}^{n} l_n(\lambda) P_{i-1}^2(\lambda) \log_K\!\left(l_n(\lambda) P_{i-1}^2(\lambda)\right) = -\sum_{i=0}^{n-1} l_n(\lambda) P_i^2(\lambda) \log_K\!\left(l_n(\lambda) P_i^2(\lambda)\right) \\ &= -l_n(\lambda)\log_K(l_n(\lambda)) - l_n(\lambda) \sum_{i=1}^{n-1} P_i^2(\lambda) \log_K\!\left(l_n(\lambda) P_i^2(\lambda)\right) \\ &= -l_n(\lambda)\log_K(l_n(\lambda)) - l_n(\lambda) \sum_{i=1}^{n-1} 2\cos^2(i\theta) \log_K\!\left(2 l_n(\lambda) \cos^2(i\theta)\right) \\ &= -l_n(\lambda)\log_K(l_n(\lambda)) - l_n(\lambda) \sum_{i=1}^{n-1} 2\cos^2(i\theta)\, \frac{\left(2 l_n(\lambda) \cos^2(i\theta)\right)^{\alpha} - \left(2 l_n(\lambda) \cos^2(i\theta)\right)^{-\alpha}}{2\alpha} \\ &= -l_n(\lambda)\log_K(l_n(\lambda)) - \frac{(2 l_n(\lambda))^{\alpha+1}}{2\alpha} \sum_{i=1}^{n-1} \cos^{2\alpha+2}(i\theta) + \frac{(2 l_n(\lambda))^{1-\alpha}}{2\alpha} \sum_{i=1}^{n-1} \cos^{2-2\alpha}(i\theta). \end{aligned}$$
$(c)$
$$\begin{aligned} S_n^V(\lambda) &= \frac{1}{\delta-\gamma} \log \sum_{i=1}^{n} \left(l_n(\lambda) P_{i-1}^2(\lambda)\right)^{\gamma+\delta-1} = \frac{1}{\delta-\gamma} \log \sum_{i=0}^{n-1} \left(l_n(\lambda) P_i^2(\lambda)\right)^{\gamma+\delta-1} \\ &= \frac{1}{\delta-\gamma} \log\left( l_n(\lambda)^{\gamma+\delta-1} + \sum_{i=1}^{n-1} \left(2 l_n(\lambda) \cos^2(i\theta)\right)^{\gamma+\delta-1} \right) \\ &= \frac{1}{\delta-\gamma} \log\left( l_n(\lambda)^{\gamma+\delta-1} + (2 l_n(\lambda))^{\gamma+\delta-1} \sum_{i=1}^{n-1} \left(\cos^2(i\theta)\right)^{\gamma+\delta-1} \right) \\ &= \frac{1}{\delta-\gamma} \log\left( l_n(\lambda)^{\gamma+\delta-1} \left( 1 + 2^{\gamma+\delta-1} \sum_{i=1}^{n-1} \left(\cos^2(i\theta)\right)^{\gamma+\delta-1} \right) \right) \\ &= \frac{\gamma+\delta-1}{\delta-\gamma} \log l_n(\lambda) + \frac{1}{\delta-\gamma} \log\left( 1 + 2^{\gamma+\delta-1} \sum_{i=1}^{n-1} \left(\cos^2(i\theta)\right)^{\gamma+\delta-1} \right). \end{aligned}$$
□
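As a numerical sanity check of Theorem 1(a), the sketch below (reusing the illustrative helpers chebyshev_distribution and log_T from the previous sketches) compares the entropy computed from the definition with the closed form on the right-hand side; $\cos^{2\alpha+2}(i\theta)$ is implemented as $(\cos^2(i\theta))^{\alpha+1}$ so that non-integer $\alpha$ causes no sign problems.

```python
import numpy as np

n, lam, alpha = 12, 0.3, 0.7
theta = np.arccos(lam)
p = chebyshev_distribution(n, lam)
l_n = p[0]                                    # P_0 = 1, so the first weight equals l_n(lambda)
c2 = np.cos(np.arange(1, n) * theta)**2       # cos^2(i*theta), i = 1..n-1

lhs = -np.sum(p * log_T(p, alpha))            # definition of S_n^T(lambda)
rhs = (-l_n * log_T(l_n, alpha)
       - (2*l_n)**(alpha + 1) / alpha * np.sum(c2**(alpha + 1))
       + 2*l_n / alpha * np.sum(c2))
print(np.isclose(lhs, rhs))                   # True
```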
Remark 1.
Because the zeros of the Chebyshev polynomial of the first kind and degree $n$ are
$$\lambda_j^{(n)} = \cos\frac{(2j-1)\pi}{2n}, \quad j = 1, 2, \dots, n,$$
we have $l_n\left(\lambda_j^{(n)}\right) = \frac{1}{n}$ for any $j = 1, 2, \dots, n$.
Considering $S_{n,j}^T \stackrel{\mathrm{def}}{=} S_n^T\left(\lambda_j^{(n)}\right)$, $S_{n,j}^K \stackrel{\mathrm{def}}{=} S_n^K\left(\lambda_j^{(n)}\right)$ and $S_{n,j}^V \stackrel{\mathrm{def}}{=} S_n^V\left(\lambda_j^{(n)}\right)$, we have (see Theorem 1):
$$(a)\quad S_{n,j}^T = -\frac{1}{n}\log_T\frac{1}{n} - \frac{2^{\alpha+1}}{\alpha\, n^{\alpha+1}} \sum_{i=1}^{n-1}\cos^{2\alpha+2}\!\left(\frac{(2j-1)\pi}{2n}\, i\right) + \frac{2}{\alpha n} \sum_{i=1}^{n-1}\cos^{2}\!\left(\frac{(2j-1)\pi}{2n}\, i\right).$$
$$(b)\quad S_{n,j}^K = -\frac{1}{n}\log_K\frac{1}{n} - \frac{2^{\alpha+1}}{2\alpha\, n^{\alpha+1}} \sum_{i=1}^{n-1}\cos^{2\alpha+2}\!\left(\frac{(2j-1)\pi}{2n}\, i\right) + \frac{2^{1-\alpha}}{2\alpha\, n^{1-\alpha}} \sum_{i=1}^{n-1}\cos^{2-2\alpha}\!\left(\frac{(2j-1)\pi}{2n}\, i\right).$$
$$(c)\quad S_{n,j}^V = \frac{\gamma+\delta-1}{\delta-\gamma}\log\frac{1}{n} + \frac{1}{\delta-\gamma}\log\left( 1 + 2^{\gamma+\delta-1} \sum_{i=1}^{n-1}\cos^{2(\gamma+\delta-1)}\!\left(\frac{(2j-1)\pi}{2n}\, i\right) \right).$$
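A quick numerical check of the identity $l_n\left(\lambda_j^{(n)}\right) = \frac{1}{n}$ (a sketch, assuming NumPy and the chebyshev_distribution helper introduced earlier):

```python
import numpy as np

n = 8
j = np.arange(1, n + 1)
zeros = np.cos((2*j - 1) * np.pi / (2*n))     # zeros of the degree-n Chebyshev polynomial
for lam in zeros:
    p = chebyshev_distribution(n, lam)
    print(np.isclose(p[0], 1.0 / n))          # l_n(lambda_j) = 1/n -> True for every zero
```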
Remark 2.
Let $\beta \in \mathbb{R}$ and
$$A_{n,j}(2\beta) = \sum_{i=1}^{n-1} \cos^{2\beta}\!\left(\frac{(2j-1)\pi}{2n}\, i\right).$$
By Remark 1, we have the following:
$$(a)\quad S_{n,j}^T = -\frac{1}{n}\log_T\frac{1}{n} - \frac{2^{\alpha+1}}{\alpha\, n^{\alpha+1}}\, A_{n,j}(2\alpha+2) + \frac{2}{\alpha n}\, A_{n,j}(2).$$
$$(b)\quad S_{n,j}^K = -\frac{1}{n}\log_K\frac{1}{n} - \frac{2^{\alpha+1}}{2\alpha\, n^{\alpha+1}}\, A_{n,j}(2\alpha+2) + \frac{2^{1-\alpha}}{2\alpha\, n^{1-\alpha}}\, A_{n,j}(2-2\alpha).$$
$$(c)\quad S_{n,j}^V = \frac{\gamma+\delta-1}{\delta-\gamma}\log\frac{1}{n} + \frac{1}{\delta-\gamma}\log\left( 1 + 2^{\gamma+\delta-1} A_{n,j}(2(\gamma+\delta-1)) \right).$$
Proposition 1.
(see [1]). Let $n \in \mathbb{N}^*$ and $j \in \{1, 2, \dots, n\}$. If $GCD(2j-1, n) = d$, then
$$A_{n,j}(2\beta) = \begin{cases} \dfrac{n-1}{2} & \text{if } j = \dfrac{n+1}{2} \\[2mm] d \displaystyle\sum_{i=1}^{(n/d)-1} \cos^{2\beta}\!\left(\dfrac{\pi d}{2n}\, i\right) + \dfrac{d-1}{2} & \text{otherwise}. \end{cases}$$
Moreover, $A_{n,j}(2\beta) = A_{n,n-j+1}(2\beta)$.
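Proposition 1 is easy to illustrate numerically. In the sketch below (assuming NumPy; the exponent $2\beta = 4$ is an even integer, so the cosine powers can be evaluated directly), the defining sum is compared with the GCD-reduced expression and with the symmetry $A_{n,j}(2\beta) = A_{n,n-j+1}(2\beta)$.

```python
import numpy as np
from math import gcd

def A_direct(n, j, two_beta):
    """A_{n,j}(2*beta) evaluated directly from its definition."""
    i = np.arange(1, n)
    return np.sum(np.cos((2*j - 1) * np.pi / (2*n) * i)**two_beta)

n, two_beta = 15, 4
for j in range(1, n + 1):
    d = gcd(2*j - 1, n)
    if 2*j - 1 == n:                          # the case j = (n + 1)/2
        reduced = (n - 1) / 2
    else:
        i = np.arange(1, n // d)
        reduced = d * np.sum(np.cos(np.pi * d / (2*n) * i)**two_beta) + (d - 1) / 2
    assert np.isclose(A_direct(n, j, two_beta), reduced)
    assert np.isclose(A_direct(n, j, two_beta), A_direct(n, n - j + 1, two_beta))
print("Proposition 1 verified for n =", n)
```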
Remark 3.
Let $n \in \mathbb{N}^*$ and $j \in \{1, 2, \dots, n\}$, $j \neq \frac{n+1}{2}$. We denote $d = GCD(2j-1, n)$.
According to the preceding proposition,
$$A_{n,j}(2\beta) = d \sum_{i=1}^{(n/d)-1} \cos^{2\beta}\!\left(\frac{\pi d}{2n}\, i\right) + \frac{d-1}{2} = d \sum_{i=1}^{(\pi/2h)-1} \cos^{2\beta}(h i) + \frac{d-1}{2},$$
where $h \stackrel{\mathrm{def}}{=} \dfrac{\pi d}{2n}$.
We remark that $A_{n,j}(2\beta)$ is related to the Riemann sum of an integral:
$$h\, A_{n,j}(2\beta) \approx d \int_0^{\pi/2} \cos^{2\beta}(u)\, du + h \cdot \frac{d-1}{2} = d \int_0^{\pi/2} \left(\frac{\cos u}{\frac{\pi}{2}-u}\right)^{2\beta} \left(\frac{\pi}{2}-u\right)^{2\beta} du + h \cdot \frac{d-1}{2}.$$
Consider $\cos^{2\beta}(u)$ and $\left(\dfrac{\cos u}{\frac{\pi}{2}-u}\right)^{2\beta}$ as analytic functions at $u = 0$ and at $u = \frac{\pi}{2}$, respectively, whose single-valued branches in the corresponding neighborhoods are fixed by
$$\cos^{2\beta}(u)\Big|_{u=0} = \left(\frac{\cos u}{\frac{\pi}{2}-u}\right)^{2\beta}\Bigg|_{u=\pi/2} = 1.$$
Denote by
$$\cos^{2\beta}(u) = 1 + \sum_{k=1}^{\infty} f_k(2\beta)\, u^k$$
and
$$\left(\frac{\cos u}{\frac{\pi}{2}-u}\right)^{2\beta} = 1 + \sum_{k=1}^{\infty} g_k(2\beta) \left(\frac{\pi}{2}-u\right)^k$$
the Taylor expansions of these functions. It is known that $f_k(2\beta) = g_k(2\beta) = 0$ for odd indices $k$.
By the Euler–Maclaurin summation formula for integrals with endpoint singularities (see [47]), we have
$$h\, A_{n,j}(2\beta) = h d \sum_{i=1}^{\pi/(2h)-1} \cos^{2\beta}(h i) + h \cdot \frac{d-1}{2} \sim d \int_0^{\pi/2} \cos^{2\beta}(u)\, du + d \sum_{k=0}^{\infty} f_k(2\beta)\, \zeta(-k)\, h^{k+1} + d \sum_{k=0}^{\infty} g_k(2\beta)\, \zeta(-k-2\beta)\, h^{k+2\beta+1} + h \cdot \frac{d-1}{2},$$
where $\zeta$ is the Riemann zeta function, $f_0(2\beta) = g_0(2\beta) = 1$, and “∼” means that the right-hand side is an asymptotic expansion as $h \to 0$.
Because $\zeta(0) = -\frac{1}{2}$ and $\zeta(-2k) = 0$ for any $k \in \mathbb{N}^*$, the preceding formula becomes
$$A_{n,j}(2\beta) \sim \frac{d}{h} \int_0^{\pi/2} \cos^{2\beta}(u)\, du - \frac{d}{2} + d \sum_{k=0}^{\infty} g_{2k}(2\beta)\, \zeta(-2k-2\beta)\, h^{2k+2\beta} + \frac{d-1}{2} = \frac{2n}{\pi} \int_0^{\pi/2} \cos^{2\beta}(u)\, du - \frac{d}{2} + d \sum_{k=0}^{\infty} g_{2k}(2\beta)\, \zeta(-2k-2\beta)\, h^{2k+2\beta} + \frac{d-1}{2}.$$
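The expansion can also be observed numerically. In the sketch below (assuming NumPy and the standard-library gamma function; the Wallis formula $\int_0^{\pi/2}\cos^{2\beta}(u)\,du = \frac{\sqrt{\pi}}{2}\frac{\Gamma(\beta+1/2)}{\Gamma(\beta+1)}$ is used for the integral), we take $j = 1$, so that $d = 1$ and $h = \frac{\pi}{2n}$, and watch the difference between $A_{n,j}(2\beta)$ and the leading terms decay at the rate suggested by the $h^{2\beta}$ correction.

```python
import numpy as np
from math import gamma

def A_nj(n, j, beta):
    """A_{n,j}(2*beta); cos^(2*beta) is computed as (cos^2)**beta to allow non-integer beta."""
    c2 = np.cos((2*j - 1) * np.pi / (2*n) * np.arange(1, n))**2
    return np.sum(c2**beta)

beta = 1.3                                    # non-integer, so the h^(2*beta) correction is visible
wallis = np.sqrt(np.pi) / 2 * gamma(beta + 0.5) / gamma(beta + 1)   # int_0^{pi/2} cos^(2*beta)(u) du
for n in (10, 100, 1000):
    leading = 2*n/np.pi * wallis - 0.5        # (2n/pi)*integral - d/2 + (d-1)/2, with d = 1
    print(n, A_nj(n, 1, beta) - leading)      # shrinks roughly like n**(-2*beta)
```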
Remark 4.
According to Remark 3, we get
$$(a)\quad S_{n,j}^T \sim -\frac{1}{n}\log_T\frac{1}{n} - \frac{2^{\alpha+1}}{\alpha\, n^{\alpha+1}} \left( \frac{2n}{\pi} \int_0^{\pi/2} \cos^{2\alpha+2}(u)\, du - \frac{d}{2} + d \sum_{k=0}^{\infty} g_{2k}(2\alpha+2)\, \zeta(-2k-2\alpha-2)\, h^{2k+2\alpha+2} + \frac{d-1}{2} \right) + \frac{2}{\alpha n} \left( \frac{2n}{\pi} \int_0^{\pi/2} \cos^{2}(u)\, du - \frac{d}{2} + d \sum_{k=0}^{\infty} g_{2k}(2)\, \zeta(-2k-2)\, h^{2k+2} + \frac{d-1}{2} \right).$$
$$(b)\quad S_{n,j}^K \sim -\frac{1}{n}\log_K\frac{1}{n} - \frac{2^{\alpha+1}}{2\alpha\, n^{\alpha+1}} \left( \frac{2n}{\pi} \int_0^{\pi/2} \cos^{2\alpha+2}(u)\, du - \frac{d}{2} + d \sum_{k=0}^{\infty} g_{2k}(2\alpha+2)\, \zeta(-2k-2\alpha-2)\, h^{2k+2\alpha+2} + \frac{d-1}{2} \right) + \frac{2^{1-\alpha}}{2\alpha\, n^{1-\alpha}} \left( \frac{2n}{\pi} \int_0^{\pi/2} \cos^{2-2\alpha}(u)\, du - \frac{d}{2} + d \sum_{k=0}^{\infty} g_{2k}(2-2\alpha)\, \zeta(-2k-2+2\alpha)\, h^{2k+2-2\alpha} + \frac{d-1}{2} \right).$$
$$(c)\quad S_{n,j}^V \sim \frac{\gamma+\delta-1}{\delta-\gamma}\log\frac{1}{n} + \frac{1}{\delta-\gamma}\log\left( 1 + 2^{\gamma+\delta-1} \left( \frac{2n}{\pi} \int_0^{\pi/2} \cos^{2(\gamma+\delta-1)}(u)\, du - \frac{d}{2} + d \sum_{k=0}^{\infty} g_{2k}(2(\gamma+\delta-1))\, \zeta(-2k-2(\gamma+\delta-1))\, h^{2k+2(\gamma+\delta-1)} + \frac{d-1}{2} \right) \right).$$
If we consider $\alpha = 1$ in formulas (a) and (b), we obtain the quadratic discrete Tsallis and Kaniadakis entropies of Chebyshev polynomials of the first kind.
Namely, we have the following:
$$(a)\quad \begin{aligned} S_{n,j}^T &\sim -\frac{1}{n}\log_T\frac{1}{n} - \frac{4}{n^2}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^4(u)\,du - \frac{d}{2} + d\sum_{k=0}^{\infty} g_{2k}(4)\,\zeta(-2k-4)\,h^{2k+4} + \frac{d-1}{2} \right) \\ &\quad + \frac{2}{n}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^2(u)\,du - \frac{d}{2} + d\sum_{k=0}^{\infty} g_{2k}(2)\,\zeta(-2k-2)\,h^{2k+2} + \frac{d-1}{2} \right) \\ &= -\frac{1}{n}\log_T\frac{1}{n} - \frac{4}{n^2}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^4(u)\,du - \frac{d}{2} + d\sum_{k=0}^{\infty} g_{2k}(4)\,\zeta(-2(k+2))\,h^{2k+4} + \frac{d-1}{2} \right) \\ &\quad + \frac{2}{n}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^2(u)\,du - \frac{d}{2} + d\sum_{k=0}^{\infty} g_{2k}(2)\,\zeta(-2(k+1))\,h^{2k+2} + \frac{d-1}{2} \right). \end{aligned}$$
$$(b)\quad \begin{aligned} S_{n,j}^K &\sim -\frac{1}{n}\log_K\frac{1}{n} - \frac{2}{n^2}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^4(u)\,du - \frac{d}{2} + d\sum_{k=0}^{\infty} g_{2k}(4)\,\zeta(-2k-4)\,h^{2k+4} + \frac{d-1}{2} \right) + \frac{1}{2}\left( n - d + \frac{d-1}{2} \right) \\ &= -\frac{1}{n}\log_K\frac{1}{n} - \frac{2}{n^2}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^4(u)\,du - \frac{d}{2} + d\sum_{k=0}^{\infty} g_{2k}(4)\,\zeta(-2(k+2))\,h^{2k+4} + \frac{d-1}{2} \right) + \frac{1}{2}\left( n - \frac{d}{2} - \frac{1}{2} \right). \end{aligned}$$
Taking into account that $\zeta(0) = -\frac{1}{2}$ and $\zeta(-2k) = 0$ for any $k \in \mathbb{N}^*$, we obtain the following:
$$(a)\quad S_{n,j}^T \sim -\frac{1}{n}\log_T\frac{1}{n} - \frac{4}{n^2}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^4(u)\,du - \frac{d}{2} + \frac{d-1}{2} \right) + \frac{2}{n}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^2(u)\,du - \frac{d}{2} + \frac{d-1}{2} \right).$$
$$(b)\quad S_{n,j}^K \sim -\frac{1}{n}\log_K\frac{1}{n} - \frac{2}{n^2}\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^4(u)\,du - \frac{d}{2} + \frac{d-1}{2} \right) + \frac{1}{2}\left( n - \frac{d}{2} - \frac{1}{2} \right).$$
After some elementary computations, we obtain the following:
$$(a)\quad S_{n,j}^T \sim \frac{1}{n} - \frac{1}{n^2} - \frac{4}{n^2}\left( \frac{2n}{\pi}\cdot\frac{3\pi}{16} - \frac{1}{2} \right) + \frac{2}{n}\left( \frac{2n}{\pi}\cdot\frac{\pi}{4} - \frac{1}{2} \right) = \frac{1}{n} - \frac{1}{n^2} - \frac{3}{2n} + \frac{2}{n^2} + 1 - \frac{1}{n} = 1 + \frac{1}{n^2} - \frac{3}{2n}.$$
$$(b)\quad S_{n,j}^K \sim \frac{1}{2} - \frac{1}{2n^2} - \frac{2}{n^2}\left( \frac{2n}{\pi}\cdot\frac{3\pi}{16} - \frac{1}{2} \right) + \frac{1}{2}\left( n - \frac{d}{2} - \frac{1}{2} \right) = \frac{1}{4} - \frac{1}{2n^2} - \frac{3}{4n} + \frac{1}{n^2} + \frac{n}{2} - \frac{d}{4} = \frac{1}{4} + \frac{1}{2n^2} - \frac{3}{4n} + \frac{n}{2} - \frac{d}{4} = \frac{n}{2} - \frac{d-1}{4} + \frac{1}{2n^2} - \frac{3}{4n}.$$
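These two closed forms can be checked directly against the definitions. The sketch below (reusing the illustrative helpers chebyshev_distribution, log_T and log_K from the earlier sketches) picks $j$ with $GCD(2j-1, n) = 1$, so that all weights are strictly positive and the convention $0^0 = 0$ plays no role numerically; in this example the two sides agree to machine precision.

```python
import numpy as np
from math import gcd

n, j = 9, 1                                   # gcd(2j - 1, n) = 1, hence d = 1
d = gcd(2*j - 1, n)
lam = np.cos((2*j - 1) * np.pi / (2*n))
p = chebyshev_distribution(n, lam)

S_T = -np.sum(p * log_T(p, 1.0))              # quadratic (alpha = 1) Tsallis entropy from the definition
S_K = -np.sum(p * log_K(p, 1.0))              # quadratic (alpha = 1) Kaniadakis entropy from the definition
print(np.isclose(S_T, 1 + 1/n**2 - 3/(2*n)))                        # True
print(np.isclose(S_K, n/2 - (d - 1)/4 + 1/(2*n**2) - 3/(4*n)))      # True
```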
Now, we take $\gamma = 1$ and $\delta = 2$ in formula (c), obtaining the quadratic discrete Varma entropy of Chebyshev polynomials of the first kind.
We have
$$S_{n,j}^V \sim 2\log\frac{1}{n} + \log\left( 1 + 4\left( \frac{2n}{\pi}\int_0^{\pi/2}\cos^4(u)\,du - \frac{d}{2} + d\sum_{k=0}^{\infty} g_{2k}(4)\,\zeta(-2k-4)\,h^{2k+4} + \frac{d-1}{2} \right) \right).$$
Using previous computations, we obtain
$$S_{n,j}^V \sim \log\frac{1}{n^2} + \log\left( 1 + 4\left( \frac{2n}{\pi}\cdot\frac{3\pi}{16} - \frac{d}{2} + \frac{d-1}{2} \right) \right) = \log\frac{1}{n^2} + \log\left( 1 + \frac{3n}{2} - 2 \right) = \log\left( \frac{3}{2n} - \frac{1}{n^2} \right).$$
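The quadratic Varma formula can be checked in the same way (a sketch reusing the chebyshev_distribution helper): for $\gamma = 1$ and $\delta = 2$ the definition reduces to $\log \sum_i p_i^2$, which matches $\log\left(\frac{3}{2n} - \frac{1}{n^2}\right)$ at the Chebyshev zeros.

```python
import numpy as np

n, j = 10, 3
lam = np.cos((2*j - 1) * np.pi / (2*n))       # a zero of the degree-n Chebyshev polynomial
p = chebyshev_distribution(n, lam)
S_V = np.log(np.sum(p**2))                    # (1/(delta - gamma)) * log(sum p^(gamma + delta - 1))
print(np.isclose(S_V, np.log(3/(2*n) - 1/n**2)))                    # True
```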

4. Conclusions

We introduced the discrete Tsallis, Kaniadakis and Varma entropies of Chebyshev polynomials of the first kind and obtained asymptotic expansions for them (as $n \to \infty$). In the quadratic case, these asymptotic expansions take a much simpler form.

Author Contributions

Conceptualization, R.-C.S., S.-C.S. and V.P.; methodology, R.-C.S., S.-C.S. and V.P.; validation, R.-C.S., S.-C.S. and V.P.; formal analysis, R.-C.S., S.-C.S. and V.P.; investigation, R.-C.S., S.-C.S. and V.P.; resources, R.-C.S., S.-C.S. and V.P.; writing—original draft preparation, R.-C.S., S.-C.S. and V.P.; writing—review and editing, R.-C.S., S.-C.S. and V.P.; visualization, R.-C.S., S.-C.S. and V.P.; supervision, R.-C.S., S.-C.S. and V.P.; project administration, R.-C.S., S.-C.S. and V.P. All authors have contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors are very much indebted to the editors and to the anonymous referees for their most valuable comments and suggestions which improved the quality of this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Aptekarev, A.I.; Dehesa, J.S.; Martínez-Finkelshtein, A.; Yáñez, R.J. Discrete entropies of orthogonal polynomials. Constr. Approx. 2009, 30, 93–119.
2. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
3. Guiaşu, S. Information Theory with Applications; McGraw-Hill Inc.: New York, NY, USA, 1977.
4. Kaniadakis, G. Non-linear kinetics underlying generalized statistics. Physica A 2001, 296, 405–425.
5. Buyarov, V.S.; Dehesa, J.S.; Martínez-Finkelshtein, A.; Sánchez-Lara, J. Computation of the entropy of polynomials orthogonal on an interval. SIAM J. Sci. Comput. 2004, 26, 488–509.
6. Dehesa, J.S.; Van Assche, W.; Yáñez, R.J. Information entropy of classical orthogonal polynomials and their application to the harmonic oscillator and Coulomb potentials. Methods Appl. Anal. 1997, 4, 91–110.
7. Yáñez, R.J.; Van Assche, W.; Dehesa, J.S. Position and momentum information entropies of the D-dimensional harmonic oscillator and hydrogen atom. Phys. Rev. A 1994, 50, 3065–3079.
8. Sfetcu, R.-C. Tsallis and Rényi divergences of generalized Jacobi polynomials. Physica A 2016, 460, 131–138.
9. Dreizler, R.M.; Gross, E.K.U. Density Functional Theory: An Approach to the Quantum Mechanics; Springer: Berlin/Heidelberg, Germany, 1990.
10. Hohenberg, P.; Kohn, W. Inhomogeneous electron gas. Phys. Rev. B 1964, 136, 864–870.
11. March, N.H. Electron Density Theory of Atoms and Molecules; Academic Press: New York, NY, USA, 1992.
12. Le Blanc, R. Jaynes-Gibbs entropic convex duals and orthogonal polynomials. Entropy 2022, 24, 709.
13. Min, C.; Wang, L. Orthogonal polynomials with singularly perturbed Freud weights. Entropy 2023, 25, 829.
14. Abd-Elhameed, W.M.; Alsuyuti, M.M. Numerical treatment of multi-term fractional differential equations via new kind of generalized Chebyshev polynomials. Fractal Fract. 2023, 7, 74.
15. Abd-Elhameed, W.M.; Youssri, Y.H.; Amin, A.K.; Atta, A.G. Eighth-kind Chebyshev polynomials collocation algorithm for the nonlinear time-fractional generalized Kawahara equation. Fractal Fract. 2023, 7, 652.
16. Atta, A.G.; Abd-Elhameed, W.M.; Moatimid, G.M.; Youssri, Y.H. Modal shifted fifth-kind Chebyshev tau integral approach for solving heat conduction equation. Fractal Fract. 2022, 6, 619.
17. Zhang, C.; Khan, B.; Shaba, T.G.; Ro, J.-S.; Araci, S.; Khan, M.G. Applications of q-Hermite polynomials to subclasses of analytic and bi-univalent functions. Fractal Fract. 2022, 6, 420.
18. Calì, C.; Longobardi, M.; Ahmadi, J. Some properties of cumulative Tsallis entropy. Physica A 2017, 486, 1012–1021.
19. Furuichi, S. Information theoretical properties of Tsallis entropies. J. Math. Phys. 2006, 47, 023302.
20. Furuichi, S.; Mitroi, F.-C. Mathematical inequalities for some divergences. Physica A 2012, 391, 388–400.
21. Furuichi, S.; Mitroi-Symeonidis, F.-C.; Symeonidis, E. On some properties of Tsallis hypoentropies and hypodivergences. Entropy 2014, 16, 5377–5399.
22. Rogers, C.; Ruggeri, T. q-Gaussian integrable Hamiltonian reductions in anisentropic gasdynamics. Discrete Contin. Dyn. Syst. Ser. B 2014, 19, 2297–2312.
23. Sfetcu, R.-C.; Sfetcu, S.-C.; Preda, V. On Tsallis and Kaniadakis divergences. Math. Phys. Anal. Geom. 2022, 25, 7.
24. Tsallis, C. Introduction to Nonextensive Statistical Mechanics; Springer Science Business Media LLC: New York, NY, USA, 2009.
25. Rastegin, A.E. Bounds of the Pinsker and Fannes types on the Tsallis relative entropy. Math. Phys. Anal. Geom. 2013, 16, 213–228.
26. Abreul, E.M.C.; Neto, J.A.; Barboza, E.M.; Nunes, R.C. Jeans instability criterion from the viewpoint of Kaniadakis’ statistics. EPL 2016, 114, 55001.
27. Clementi, F.; Gallegati, M.; Kaniadakis, G. A k-generalized statistical mechanics approach to income analysis. J. Stat. Mech. 2009, P02037.
28. Macedo-Filho, A.; Moreira, D.A.; Silva, R.; da Silva, L.R. Maximum entropy principle for Kaniadakis statistics and networks. Phys. Lett. A 2013, 377, 842–846.
29. Sunoj, S.M.; Krishnan, A.S.; Sankaran, P.G. A quantile-based study of cumulative residual Tsallis entropy measures. Physica A 2018, 494, 410–421.
30. Trivellato, B. The minimal k-entropy martingale measure. Int. J. Theor. Appl. Financ. 2012, 15, 1250038.
31. Trivellato, B. Deformed exponentials and applications to finance. Entropy 2013, 15, 3471–3489.
32. Varma, R.S. Generalization of Rényi’s entropy of order α. J. Math. Sci. 1966, 1, 34–48.
33. Ajith, K.K.; Abdul Sathar, E.I. Some results on dynamic weighted Varma’s entropy and its applications. Am. J. Math. Manag. Sci. 2020, 39, 90–98.
34. Kumar, V.; Rani, R. Quantile approach of dynamic generalized entropy (divergence) measure. Statistica 2018, 2, 105–126.
35. Liu, C.; Chang, C.; Chang, Z. Maximum Varma entropy distribution with conditional value at risk constraints. Entropy 2020, 22, 663.
36. Malhotra, G.; Srivastava, R.; Taneja, H.C. Calibration of the risk-neutral density function by maximization of a two-parameter entropy. Physica A 2019, 513, 45–54.
37. Sati, M.M.; Gupta, N. On partial monotonic behaviour of Varma entropy and its application in coding theory. J. Indian Stat. Assoc. 2015, 53, 135–152.
38. Sfetcu, S.-C. Varma quantile entropy order. Analele Ştiinţifice Univ. Ovidius Constanţa 2021, 29, 249–264.
39. Chen, Y.; Feng, J. Spatial analysis of cities using Rényi entropy and fractal parameters. Chaos Solitons Fractals 2017, 105, 279–287.
40. Haven, E. The Blackwell and Dubins theorem and Rényi’s amount of information measure: Some applications. Acta Appl. Math. 2010, 109, 743–757.
41. Raşa, I. Convexity properties of some entropies. Result. Math. 2018, 73, 105.
42. Raşa, I. Convexity properties of some entropies. II. Result. Math. 2019, 74, 154.
43. Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Berkeley, CA, USA, 20 June–30 July 1960; University of California Press: Berkeley, CA, USA, 1961; Volume 1, pp. 547–561.
44. Toscani, G. Rényi entropies and nonlinear diffusion equations. Acta Appl. Math. 2014, 132, 595–604.
45. Xu, M.; Shang, P.; Zhang, S. Multiscale Rényi cumulative residual distribution entropy: Reliability analysis of financial time series. Chaos Solitons Fractals 2021, 143, 110410.
46. Nadarajah, S.; Kotz, S. Mathematical properties of the multivariate t distribution. Acta Appl. Math. 2005, 89, 53–84.
47. Sidi, A. Euler–Maclaurin expansions for integrals with endpoint singularities: A new perspective. Numer. Math. 2004, 98, 371–387.
