Article

Chebyshev-Steffensen Inequality Involving the Inner Product

by Milica Klaričić Bakula 1,* and Josip Pečarić 2

1 Faculty of Science, University of Split, Ruđera Boškovića 33, 21000 Split, Croatia
2 Croatian Academy of Sciences and Arts, Trg Nikole Šubića Zrinskog 11, 10000 Zagreb, Croatia
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(1), 122; https://doi.org/10.3390/math10010122
Submission received: 14 December 2021 / Revised: 29 December 2021 / Accepted: 30 December 2021 / Published: 1 January 2022
(This article belongs to the Special Issue Advances in Mathematical Inequalities and Applications)

Abstract: In this paper, we prove the Chebyshev-Steffensen inequality involving the inner product on the real $m$-space. Some upper bounds for the weighted Chebyshev-Steffensen functional, as well as the Jensen-Steffensen functional involving the inner product under various conditions, are also given.

1. Introduction

Let $f$ be a convex function defined on a real interval $J \subseteq \mathbb{R}$. Jensen's inequality states that if $x = (x_1, \ldots, x_n) \in J^n$, $n \in \mathbb{N}$, then
$$f\Bigg(\frac{1}{P_n}\sum_{i=1}^{n} p_i x_i\Bigg) \le \frac{1}{P_n}\sum_{i=1}^{n} p_i f(x_i) \tag{1}$$
for all nonnegative real $n$-tuples $p = (p_1, \ldots, p_n)$ such that $P_n = p_1 + \cdots + p_n > 0$. For $f$ strictly convex, (1) is strict unless all $x_i$ are equal [1] (p. 43). Jensen's inequality is, without any doubt, one of the most important inequalities, if not the most important one, in convex analysis, with various applications in mathematics, statistics and engineering.
It is also known that the assumptions on $p$ can be relaxed if we put more restrictions on $x$ [2]. Namely, if $p$ is a real $n$-tuple such that
$$0 \le P_i = p_1 + \cdots + p_i \le P_n, \quad i \in \{1, \ldots, n-1\}, \tag{2}$$
and $P_n > 0$, then for any monotonic $n$-tuple $x = (x_1, \ldots, x_n) \in J^n$ we get
$$\bar{x} = \frac{1}{P_n}\sum_{i=1}^{n} p_i x_i \in J,$$
and for any function $f$ convex on $J$, inequality (1) still holds. Note that (2) allows the occurrence of negative weights $p_i$, which usually complicates matters. Under such assumptions, inequality (1) is called the Jensen-Steffensen inequality for convex functions, and conditions (2) are called Steffensen's conditions, after J. F. Steffensen. Again, for a strictly convex function $f$, inequality (1) remains strict under certain additional assumptions on $x$ and $p$ [3]. The Jensen-Steffensen inequality is a proper generalization of the Jensen inequality, since nonnegative weights $p_i$ satisfy condition (2) in every order, which means that for nonnegative weights the monotonicity condition on $x$ becomes irrelevant.
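As a quick numerical illustration (not part of the original argument), the following Python sketch checks inequality (1) for a monotonic tuple and for weights that satisfy Steffensen's conditions (2) even though one of them is negative; the data are chosen purely for illustration.

# Illustrative check of the Jensen-Steffensen inequality (1) under Steffensen's conditions (2).
# The data below are arbitrary examples, not taken from the paper.
def jensen_steffensen_gap(f, x, p):
    """Return (1/P_n) * sum p_i f(x_i) - f((1/P_n) * sum p_i x_i)."""
    Pn = sum(p)
    xbar = sum(pi * xi for pi, xi in zip(p, x)) / Pn
    return sum(pi * f(xi) for pi, xi in zip(p, x)) / Pn - f(xbar)

x = [1.0, 2.0, 4.0, 7.0]       # monotonic n-tuple
p = [3.0, -1.0, 2.0, 1.0]      # partial sums 3, 2, 4, 5 all lie in [0, P_n] = [0, 5]
assert all(0 <= sum(p[:i + 1]) <= sum(p) for i in range(len(p)))
print(jensen_steffensen_gap(lambda t: t * t, x, p))   # 5.76 >= 0 for the convex f(t) = t^2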
Another important inequality in mathematical analysis is the Chebyshev (Čebyšev) inequality, as shown in [1] (p. 197) and [4] (p. 240), which states that
$$\sum_{j=1}^{n} p_j \sum_{i=1}^{n} p_i a_i b_i \ge \sum_{j=1}^{n} p_j a_j \sum_{i=1}^{n} p_i b_i \tag{3}$$
whenever $a = (a_1, \ldots, a_n)$, $b = (b_1, \ldots, b_n)$ are real $n$-tuples monotonic in the same direction and $p = (p_1, \ldots, p_n)$ is a positive $n$-tuple [1] (p. 43). It is also useful to consider the Chebyshev functional (sometimes also called the Chebyshev difference) $C$ defined by
$$C(a, b; p) = \sum_{j=1}^{n} p_j \sum_{i=1}^{n} p_i a_i b_i - \sum_{j=1}^{n} p_j a_j \sum_{i=1}^{n} p_i b_i.$$
Obviously, by the Chebyshev inequality,
$$C(a, b; p) \ge 0$$
when $p$ is positive and $a, b$ are monotonic in the same direction. In the special case $a = b$, we immediately get
$$C(a, a; p) \ge 0.$$
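For a quick numerical feel for the functional $C$, the following Python snippet (with illustrative data) evaluates it for tuples monotonic in the same direction and positive weights:

# Evaluate the Chebyshev functional C(a, b; p) for real n-tuples (illustrative data).
def chebyshev_functional(a, b, p):
    Pn = sum(p)
    return (Pn * sum(pi * ai * bi for pi, ai, bi in zip(p, a, b))
            - sum(pi * ai for pi, ai in zip(p, a)) * sum(pi * bi for pi, bi in zip(p, b)))

a = [1.0, 2.0, 4.0, 8.0]      # monotonic in the same direction as b
b = [0.0, 1.0, 1.5, 3.0]
p = [0.5, 1.0, 2.0, 0.5]      # positive weights
print(chebyshev_functional(a, b, p))        # nonnegative, in line with (3)
print(chebyshev_functional(a, a, p) >= 0)   # True: the special case a = b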
Our goal is to prove the Chebyshev-Steffensen inequality (i.e., the Chebyshev inequality with weights $p$ satisfying Steffensen's conditions (2)) involving the inner product on the real $m$-space $\mathbb{R}^m$, and to establish some upper bounds for the weighted Chebyshev-Steffensen functional. The obtained results are then used to find new Grüss-like upper bounds for the Jensen functional with weights $p$ satisfying (2). It is worth noting here that many interesting results of this type, but with nonnegative weights, can be found in [5].

2. Chebyshev-Steffensen Inequality

In the rest of the paper, for some $n, m \in \mathbb{N}$, $n \ge 2$, we denote
$$I_n = \{1, 2, \ldots, n\},$$
$$X = (x_1, \ldots, x_n), \quad Y = (y_1, \ldots, y_n), \quad x_i, y_i \in \mathbb{R}^m, \ i \in I_n,$$
$\langle \cdot, \cdot \rangle : \mathbb{R}^m \times \mathbb{R}^m \to \mathbb{R}$ is the inner product on the real $m$-space $\mathbb{R}^m$, $\|\cdot\|$ is the norm related to $\langle \cdot, \cdot \rangle$, and $\le$ is the coordinatewise partial order on $\mathbb{R}^m$, i.e., for $\xi, \eta \in \mathbb{R}^m$
$$\xi = (\xi_1, \ldots, \xi_m) \le \eta = (\eta_1, \ldots, \eta_m) \iff \xi_1 \le \eta_1, \ldots, \xi_m \le \eta_m.$$
With
$$C(X, Y; p) = \sum_{j=1}^{n} p_j \sum_{i=1}^{n} p_i \langle x_i, y_i \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i y_i \Bigg\rangle$$
we denote the weighted Chebyshev functional for the inner product on $\mathbb{R}^m$. Furthermore,
$$P_i = p_1 + \cdots + p_i, \quad \bar{P}_i = p_i + \cdots + p_n, \quad i \in I_n,$$
that is,
$$\bar{P}_i P_j = \sum_{r=i}^{n} \sum_{s=1}^{j} p_r p_s, \quad P_i \bar{P}_j = \sum_{r=1}^{i} \sum_{s=j}^{n} p_r p_s.$$
To prove our main results we need the following lemma.
Lemma 1.
Let $X = (x_1, \ldots, x_n)$ and $Y = (y_1, \ldots, y_n)$ be two $n$-tuples of elements from $\mathbb{R}^m$ and let $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$. Then the following identity holds:
$$C(X, Y; p) = \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle + \sum_{j=i+1}^{n} P_i \bar{P}_j \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \Bigg).$$
Proof. 
It can be easily proved (using summation by parts on the coordinates) that for $k \in \{2, \ldots, n-1\}$ and $a_1, \ldots, a_n \in \mathbb{R}^m$ the following identity holds:
$$\sum_{i=1}^{n} p_i a_i = \sum_{i=1}^{k-1} P_i (a_i - a_{i+1}) + P_k a_k + \bar{P}_{k+1} a_{k+1} + \sum_{i=k+2}^{n} \bar{P}_i (a_i - a_{i-1}), \tag{4}$$
and in the border cases $k = 1$ or $k = n$
$$\sum_{i=1}^{n} p_i a_i = \bar{P}_1 a_1 + \sum_{i=2}^{n} \bar{P}_i (a_i - a_{i-1}), \qquad \sum_{i=1}^{n} p_i a_i = P_n a_n - \sum_{i=1}^{n-1} P_i (a_{i+1} - a_i). \tag{5}$$
In all of the cases we assume
$$\sum_{i=k}^{l} x_i = 0 \quad \text{when } k > l.$$
It can be checked directly that
$$\sum_{i=1}^{n} p_i \sum_{j=1}^{n} p_j \langle x_j, y_j \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i y_i \Bigg\rangle = \sum_{i=1}^{n} p_i \sum_{j=1}^{n} p_j \langle x_i, y_i - y_j \rangle,$$
and also
$$\sum_{i=1}^{n} p_i \sum_{j=1}^{n} p_j \langle x_i, y_i - y_j \rangle = \sum_{i=1}^{n-1} \Bigg( \sum_{k=1}^{i} p_k \sum_{j=1}^{n} p_j \langle x_{i+1} - x_i, y_j - y_k \rangle \Bigg),$$
hence
$$\sum_{i=1}^{n} p_i \sum_{j=1}^{n} p_j \langle x_j, y_j \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i y_i \Bigg\rangle = \sum_{i=1}^{n-1} \Bigg( \sum_{k=1}^{i} p_k \sum_{j=1}^{n} p_j \langle x_{i+1} - x_i, y_j - y_k \rangle \Bigg).$$
Using (5) with $a_k = \sum_{j=1}^{n} p_j (y_j - y_k)$ we obtain
$$\begin{aligned} \sum_{k=1}^{i} p_k \sum_{j=1}^{n} p_j (y_j - y_k) &= P_i \sum_{j=1}^{n} p_j (y_j - y_i) - \sum_{k=1}^{i-1} P_k \Bigg( \sum_{j=1}^{n} p_j (y_j - y_{k+1}) - \sum_{j=1}^{n} p_j (y_j - y_k) \Bigg) \\ &= P_i \Bigg( \sum_{j=1}^{n} p_j y_j - P_n y_i \Bigg) - P_n \sum_{k=1}^{i-1} P_k (y_k - y_{k+1}), \end{aligned}$$
and next, using (4) with $a_k = y_k$, we obtain
$$\begin{aligned} \sum_{k=1}^{i} p_k \sum_{j=1}^{n} p_j (y_j - y_k) &= P_i \Bigg( \sum_{j=1}^{i-1} P_j (y_j - y_{j+1}) + P_i y_i + \bar{P}_{i+1} y_{i+1} + \sum_{j=i+2}^{n} \bar{P}_j (y_j - y_{j-1}) - P_n y_i \Bigg) - P_n \sum_{k=1}^{i-1} P_k (y_k - y_{k+1}) \\ &= P_i \Bigg( \sum_{j=1}^{i-1} P_j (y_j - y_{j+1}) - \bar{P}_{i+1} y_i + \bar{P}_{i+1} y_{i+1} + \sum_{j=i+2}^{n} \bar{P}_j (y_j - y_{j-1}) \Bigg) - P_n \sum_{j=1}^{i-1} P_j (y_j - y_{j+1}) \\ &= P_i \sum_{j=1}^{i-1} P_j (y_j - y_{j+1}) + P_i \sum_{j=i+1}^{n} \bar{P}_j (y_j - y_{j-1}) - P_n \sum_{j=1}^{i-1} P_j (y_j - y_{j+1}) \\ &= P_i \sum_{j=i+1}^{n} \bar{P}_j (y_j - y_{j-1}) - \bar{P}_{i+1} \sum_{j=1}^{i-1} P_j (y_j - y_{j+1}). \end{aligned}$$
Hence
$$\begin{aligned} \sum_{i=1}^{n} p_i \sum_{j=1}^{n} p_j \langle x_j, y_j \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i y_i \Bigg\rangle &= \sum_{i=1}^{n-1} \Bigg( P_i \sum_{j=i+1}^{n} \bar{P}_j \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle - \bar{P}_{i+1} \sum_{j=1}^{i-1} P_j \langle x_{i+1} - x_i, y_j - y_{j+1} \rangle \Bigg) \\ &= \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle + \sum_{j=i+1}^{n} P_i \bar{P}_j \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \Bigg). \end{aligned}$$
Note that
$$\sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j + \sum_{j=i+1}^{n} P_i \bar{P}_j \Bigg) = C(e, e; p),$$
where $e = (1, 2, \ldots, n)$.
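As a sanity check (not part of the paper), the following Python sketch compares both sides of the identity in Lemma 1 for randomly generated vectors in $\mathbb{R}^m$ and arbitrary real weights; the two values agree up to rounding error.

# Numerical check of the identity in Lemma 1 for the standard inner product on R^m.
# All data are random; this is an illustration, not part of the proof.
import random

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def chebyshev_functional(X, Y, p):
    """C(X, Y; p) = sum_j p_j sum_i p_i <x_i, y_i> - <sum_i p_i x_i, sum_i p_i y_i>."""
    n, m = len(X), len(X[0])
    sx = [sum(p[i] * X[i][k] for i in range(n)) for k in range(m)]
    sy = [sum(p[i] * Y[i][k] for i in range(n)) for k in range(m)]
    return sum(p) * sum(p[i] * inner(X[i], Y[i]) for i in range(n)) - inner(sx, sy)

def lemma1_rhs(X, Y, p):
    # 0-based translation of the identity: P[i], Pbar[i] and X[i] store the paper's
    # P_{i+1}, Pbar_{i+1} and x_{i+1}, respectively.
    n, m = len(X), len(X[0])
    P = [sum(p[:i + 1]) for i in range(n)]
    Pbar = [sum(p[i:]) for i in range(n)]
    total = 0.0
    for i in range(n - 1):
        dx = [X[i + 1][k] - X[i][k] for k in range(m)]
        for j in range(i):
            dy = [Y[j + 1][k] - Y[j][k] for k in range(m)]
            total += Pbar[i + 1] * P[j] * inner(dx, dy)
        for j in range(i + 1, n):
            dy = [Y[j][k] - Y[j - 1][k] for k in range(m)]
            total += P[i] * Pbar[j] * inner(dx, dy)
    return total

random.seed(0)
n, m = 5, 3
X = [[random.uniform(-1, 1) for _ in range(m)] for _ in range(n)]
Y = [[random.uniform(-1, 1) for _ in range(m)] for _ in range(n)]
p = [random.uniform(-1, 1) for _ in range(n)]
print(abs(chebyshev_functional(X, Y, p) - lemma1_rhs(X, Y, p)) < 1e-9)   # True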
The next theorem states the Chebyshev-Steffensen inequality for the inner product on the real $m$-space $\mathbb{R}^m$ with weights $p$ satisfying (2).
Theorem 1.
Let $X = (x_1, \ldots, x_n)$, $Y = (y_1, \ldots, y_n)$ be two $n$-tuples of elements from $\mathbb{R}^m$ such that
$$x_{i+1} \ge x_i, \quad y_{i+1} \ge y_i, \quad i \in I_{n-1}, \tag{6}$$
or
$$x_{i+1} \le x_i, \quad y_{i+1} \le y_i, \quad i \in I_{n-1}.$$
Then for all real $n$-tuples $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2) the following inequality holds:
$$C(X, Y; p) \ge 0. \tag{7}$$
If
$$x_{i+1} \ge x_i, \quad y_{i+1} \le y_i, \quad i \in I_{n-1},$$
or
$$x_{i+1} \le x_i, \quad y_{i+1} \ge y_i, \quad i \in I_{n-1},$$
then (7) is reversed.
Proof. 
First note that (2) implies
$$\bar{P}_i = p_i + \cdots + p_n \ge 0, \quad i \in I_n,$$
hence all products $P_i \bar{P}_j$ are nonnegative.
Suppose that $X$ and $Y$ are such that (6) holds. Then
$$\langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \ge 0, \quad i \in I_{n-1}, \ j \in \{1, \ldots, i-1\}, \qquad \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \ge 0, \quad i \in I_{n-1}, \ j \in \{i+1, \ldots, n\},$$
and by Lemma 1 we immediately obtain (7). All other cases can be proven similarly. □
In the special case $m = 1$ the Chebyshev-Steffensen functional $C(X, Y; p)$ reduces to $C(x, y; p)$, where $x = (x_1, \ldots, x_n)$ and $y = (y_1, \ldots, y_n)$ are real $n$-tuples, and (7) becomes the classical Chebyshev inequality (3) under Steffensen's conditions or, in other words, the one-dimensional Chebyshev-Steffensen inequality.
The coordinatewise partial order is the most obvious choice of order on $\mathbb{R}^m$, and the conditions on $X$ and $Y$ in Theorem 1 are based on it, but it is possible to consider alternative conditions on $X$ and $Y$. For instance, we can introduce a notion of monotonicity related to the inner product in the following way.
Definition 1.
We say that $X = (x_1, \ldots, x_n)$ and $Y = (y_1, \ldots, y_n)$, where $x_i, y_i \in \mathbb{R}^m$, $i \in I_n$, are monotonic in the same direction with respect to the inner product if
$$\langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \ge 0 \quad \text{for all } i, j \in I_{n-1},$$
and we say that they are monotonic in opposite directions with respect to the inner product if the above inequality is reversed.
It is easy to see that Theorem 1 can be obtained as a simple consequence of the one-dimensional version of the Chebyshev-Steffensen inequality using properties of the inner product. In the following theorem, we prove the Chebyshev-Steffensen inequality under slightly different conditions, which makes the use of Lemma 1 essential.
Theorem 2.
Let $X = (x_1, \ldots, x_n)$ and $Y = (y_1, \ldots, y_n)$ be two $n$-tuples of elements from $\mathbb{R}^m$ which are monotonic in the same direction with respect to the inner product. Then for all real $n$-tuples $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), inequality (7) holds. If $X$ and $Y$ are monotonic in opposite directions, (7) is reversed.
Proof. 
Directly from Lemma 1 and Definition 1. □
A natural question to ask is this: Is there a connection between the conditions on $X$ and $Y$ in Theorem 1 and those in Theorem 2? Obviously, monotonicity of the coordinates as in Theorem 1 implies monotonicity with respect to the inner product as in Definition 1, but not vice versa, as we show in the next example.
Example 1.
Let $X = (x_1, x_2)$ and $Y = (y_1, y_2)$ belong to $\mathbb{R}^2 \times \mathbb{R}^2$, where $x_1 = (1, 2)$, $x_2 = (-1, 3)$, $y_1 = (3, 1)$, $y_2 = (0, -1)$. Then
$$\langle x_2 - x_1, y_2 - y_1 \rangle = \langle (-2, 1), (-3, -2) \rangle = 4 \ge 0$$
and Theorem 2 can be applied. On the other hand, $x_1$ and $x_2$ cannot be compared in the coordinatewise partial order on $\mathbb{R}^2$, so Theorem 1 cannot be applied. If we choose $x_1 = (1, 2)$, $x_2 = (1, 3)$, $y_1 = (3, 1)$, $y_2 = (0, -1)$, then
$$\langle x_2 - x_1, y_2 - y_1 \rangle = \langle (0, 1), (-3, -2) \rangle = -2 \le 0$$
and
$$x_2 \ge x_1, \quad y_2 \le y_1,$$
hence we can choose either Theorem 1 or Theorem 2.
The previous example shows that Theorem 2 is more general than Theorem 1, but from the numerical point of view it is good to have Theorem 1 too.
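For readers who wish to reproduce Example 1 numerically, here is a short Python check; the weights below are an illustrative choice satisfying Steffensen's conditions (2).

# Evaluate C(X, Y; p) for the first data set of Example 1 (weights chosen for illustration).
def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def C(X, Y, p):
    sx = [sum(pi * x[k] for pi, x in zip(p, X)) for k in range(len(X[0]))]
    sy = [sum(pi * y[k] for pi, y in zip(p, Y)) for k in range(len(Y[0]))]
    return sum(p) * sum(pi * inner(x, y) for pi, x, y in zip(p, X, Y)) - inner(sx, sy)

X = [(1.0, 2.0), (-1.0, 3.0)]
Y = [(3.0, 1.0), (0.0, -1.0)]
p = [2.0, 1.0]                 # P_1 = 2 lies in [0, P_2] = [0, 3], so (2) holds
print(C(X, Y, p))              # 8.0 > 0, as predicted by Theorem 2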
Remark 1.
In [6] (Theorem 4) the author considered some other conditions on the weights $p$, such as
$$0 \le P_n \le P_i, \quad i \in I_{n-1},$$
or
$$0 \le P_n \le \bar{P}_i, \quad i \in \{2, \ldots, n\}.$$
It can be easily seen that if the first assumption holds we get
$$\bar{P}_i \le 0, \quad i \in \{2, \ldots, n\},$$
and if the second holds we get
$$P_i \le 0, \quad i \in I_{n-1}.$$
In both cases the products $\bar{P}_{i+1} P_j$ and $P_i \bar{P}_j$ in
$$\sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle + \sum_{j=i+1}^{n} P_i \bar{P}_j \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \Bigg)$$
are nonpositive. From this we conclude that, under such conditions on $p$ and the same conditions on $X$ and $Y$ as in Theorem 1 or Theorem 2, inequality (7) is reversed in all of the cases.

3. Bounds for the Chebyshev-Steffensen Functional

Our next goal is to establish upper bounds for the Chebyshev-Steffensen functional under various conditions on $X$ and $Y$.
Theorem 3.
Let $X = (x_1, \ldots, x_n)$ and $Y = (y_1, \ldots, y_n)$ be two $n$-tuples of elements from $\mathbb{R}^m$ which are monotonic in the same direction with respect to the inner product. Then for all real $n$-tuples $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), the following inequalities hold:
$$0 \le C(X, Y; p) \le \Bigg\langle \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i), \, y_n - y_1 \Bigg\rangle, \tag{8}$$
where
$$\tilde{P}_i = \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j + \sum_{j=i+1}^{n} P_i \bar{P}_j, \quad i \in I_n.$$
Proof. 
The left-hand inequality in (8) follows from Theorem 2.
For all $i \in I_{n-1}$ we can write
$$\langle x_{i+1} - x_i, y_2 - y_1 \rangle + \cdots + \langle x_{i+1} - x_i, y_n - y_{n-1} \rangle = \langle x_{i+1} - x_i, (y_2 - y_1) + \cdots + (y_n - y_{n-1}) \rangle = \langle x_{i+1} - x_i, y_n - y_1 \rangle.$$
Since $X$ and $Y$ are monotonic in the same direction with respect to the inner product, and the sum of nonnegative summands is never smaller than any of its summands, we conclude that for all $i, j \in I_{n-1}$
$$0 \le \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \le \langle x_{i+1} - x_i, y_n - y_1 \rangle.$$
Then, by Lemma 1, we obtain
$$\begin{aligned} C(X, Y; p) &\le \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \langle x_{i+1} - x_i, y_n - y_1 \rangle + \sum_{j=i+1}^{n} P_i \bar{P}_j \langle x_{i+1} - x_i, y_n - y_1 \rangle \Bigg) \\ &= \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j + \sum_{j=i+1}^{n} P_i \bar{P}_j \Bigg) \langle x_{i+1} - x_i, y_n - y_1 \rangle = \Bigg\langle \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i), \, y_n - y_1 \Bigg\rangle. \end{aligned}$$
In the rest of the paper we denote
$$\tilde{P}_i = \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j + \sum_{j=i+1}^{n} P_i \bar{P}_j, \quad i \in I_n,$$
as in Theorem 3.
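The quantities $\tilde{P}_i$ are straightforward to compute. The following Python sketch (with made-up data) evaluates them and checks the two-sided estimate (8) for coordinatewise nondecreasing tuples and weights satisfying (2).

# Compute P~_i and verify the bounds (8) on sample data (illustration only).
def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def C(X, Y, p):
    sx = [sum(pi * x[k] for pi, x in zip(p, X)) for k in range(len(X[0]))]
    sy = [sum(pi * y[k] for pi, y in zip(p, Y)) for k in range(len(Y[0]))]
    return sum(p) * sum(pi * inner(x, y) for pi, x, y in zip(p, X, Y)) - inner(sx, sy)

def P_tilde(p):
    # P~_i = sum_{j<i} Pbar_{i+1} P_j + sum_{j>i} P_i Pbar_j (paper indices, 0-based storage)
    n = len(p)
    P = [sum(p[:i + 1]) for i in range(n)]
    Pbar = [sum(p[i:]) for i in range(n)]
    return [sum(Pbar[i + 1] * P[j] for j in range(i)) +
            sum(P[i] * Pbar[j] for j in range(i + 1, n)) for i in range(n - 1)]

X = [(0.0, 1.0), (1.0, 1.5), (2.0, 3.0), (4.0, 3.5)]   # nondecreasing in each coordinate
Y = [(0.0, 0.0), (0.5, 1.0), (1.0, 2.0), (3.0, 2.5)]
p = [2.0, -1.0, 1.5, 0.5]                              # partial sums 2, 1, 2.5, 3 lie in [0, 3]
pt = P_tilde(p)
upper = inner([sum(pt[i] * (X[i + 1][k] - X[i][k]) for i in range(len(X) - 1)) for k in range(2)],
              [Y[-1][k] - Y[0][k] for k in range(2)])
print(0 <= C(X, Y, p) <= upper)                        # True, in line with Theorem 3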
A simple way to bound the Chebyshev-Steffensen functional without monotonicity conditions is given in the following theorem.
Theorem 4.
Let $X = (x_1, \ldots, x_n)$ and $Y = (y_1, \ldots, y_n)$ be two $n$-tuples of elements from $\mathbb{R}^m$ and let $\mu, \nu \in \mathbb{R}_+$ be such that
$$\|x_{i+1} - x_i\| \le \mu, \quad \|y_{i+1} - y_i\| \le \nu, \quad i \in I_{n-1}. \tag{9}$$
Then for all real $n$-tuples $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), the following inequality holds:
$$C(X, Y; p) \le \mu \nu \sum_{i=1}^{n-1} \tilde{P}_i. \tag{10}$$
Proof. 
Using Lemma 1 and the Cauchy-Bunyakovsky-Schwarz inequality for inner product spaces we obtain
$$\begin{aligned} C(X, Y; p) &\le \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle + \sum_{j=i+1}^{n} P_i \bar{P}_j \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \Bigg) \\ &\le \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \|x_{i+1} - x_i\| \|y_{j+1} - y_j\| + \sum_{j=i+1}^{n} P_i \bar{P}_j \|x_{i+1} - x_i\| \|y_j - y_{j-1}\| \Bigg) \\ &\le \mu \nu \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j + \sum_{j=i+1}^{n} P_i \bar{P}_j \Bigg) = \mu \nu \sum_{i=1}^{n-1} \tilde{P}_i. \end{aligned}$$
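A brief numerical illustration of (10) with random data (illustrative only); here $\mu$ and $\nu$ are taken to be the largest increments that actually occur.

# Check C(X, Y; p) <= mu * nu * sum(P~_i) from Theorem 4 on random illustrative data.
import math, random

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def C(X, Y, p):
    sx = [sum(pi * x[k] for pi, x in zip(p, X)) for k in range(len(X[0]))]
    sy = [sum(pi * y[k] for pi, y in zip(p, Y)) for k in range(len(Y[0]))]
    return sum(p) * sum(pi * inner(x, y) for pi, x, y in zip(p, X, Y)) - inner(sx, sy)

def P_tilde(p):
    n = len(p)
    P = [sum(p[:i + 1]) for i in range(n)]
    Pbar = [sum(p[i:]) for i in range(n)]
    return [sum(Pbar[i + 1] * P[j] for j in range(i)) +
            sum(P[i] * Pbar[j] for j in range(i + 1, n)) for i in range(n - 1)]

random.seed(1)
n, m = 6, 3
X = [tuple(random.uniform(-1, 1) for _ in range(m)) for _ in range(n)]
Y = [tuple(random.uniform(-1, 1) for _ in range(m)) for _ in range(n)]
p = [1.0, -0.5, 1.0, 0.5, -0.25, 1.0]     # partial sums 1, 0.5, 1.5, 2, 1.75, 2.75 satisfy (2)
mu = max(math.dist(X[i + 1], X[i]) for i in range(n - 1))
nu = max(math.dist(Y[i + 1], Y[i]) for i in range(n - 1))
print(C(X, Y, p) <= mu * nu * sum(P_tilde(p)) + 1e-12)   # True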
Observe that in the special case $m = 1$ conditions (9) become
$$|x_{i+1} - x_i| \le \mu, \quad |y_{i+1} - y_i| \le \nu, \quad i \in I_{n-1},$$
and by Theorem 4 we get
$$C(x, y; p) = \sum_{i=1}^{n} p_i \sum_{i=1}^{n} p_i x_i y_i - \sum_{i=1}^{n} p_i x_i \sum_{i=1}^{n} p_i y_i \le \mu \nu \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j + \sum_{j=i+1}^{n} P_i \bar{P}_j \Bigg) = \mu \nu \sum_{i=1}^{n-1} \tilde{P}_i,$$
which (with slightly different notation) is a Grüss-like inequality obtained in [6] (Theorem 4). Some related results concerning positive weights can be found in [7,8].
As in Remark 1, we can consider alternative conditions on the weights $p$:
$$0 \le P_n \le P_i, \quad i \in I_{n-1},$$
or
$$0 \le P_n \le \bar{P}_i, \quad i \in \{2, \ldots, n\}.$$
In either of those cases, (10) becomes (remember the nonpositivity of $\sum_{i=1}^{n-1} \tilde{P}_i$ in these cases)
$$C(X, Y; p) \ge \mu \nu \sum_{i=1}^{n-1} \tilde{P}_i.$$
There is a way to bound the Chebyshev-Steffensen functional without bounding $\|x_{i+1} - x_i\|$ and $\|y_{i+1} - y_i\|$: instead, we have to consider the maximum of the products $P_i \bar{P}_j$.
Theorem 5.
Let $X = (x_1, \ldots, x_n)$ and $Y = (y_1, \ldots, y_n)$ be two $n$-tuples of elements from $\mathbb{R}^m$. Then for all real $n$-tuples $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), the following inequalities hold:
$$C(X, Y; p) \le \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} \big| \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \big| \le \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i, j=1}^{n-1} \|x_{i+1} - x_i\| \, \|y_{j+1} - y_j\|. \tag{11}$$
Proof. 
Similarly as in the proof of Theorem 4, we have
$$\begin{aligned} C(X, Y; p) &\le \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \big| \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \big| + \sum_{j=i+1}^{n} P_i \bar{P}_j \big| \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \big| \Bigg) \\ &\le \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \big| \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \big| + \sum_{j=i+1}^{n} \big| \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \big| \Bigg) \\ &= \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} \big| \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \big| \le \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i, j=1}^{n-1} \|x_{i+1} - x_i\| \, \|y_{j+1} - y_j\|. \qquad \square \end{aligned}$$
In [9], the authors proved the following inequality:
$$\big\| C(\alpha, X; p) \big\| \le \max_{i \in I_{n-1}} P_i \bar{P}_{i+1} \sum_{i=1}^{n-1} |\alpha_{i+1} - \alpha_i| \sum_{i=1}^{n-1} \|x_{i+1} - x_i\|,$$
where $X = (x_1, \ldots, x_n)$ is an $n$-tuple of elements from a normed linear space $(V, \|\cdot\|)$ over $\mathbb{R}$, $\alpha = (\alpha_1, \ldots, \alpha_n) \in \mathbb{R}^n$, and the weights $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ are nonnegative. Obviously, dealing with nonnegative weights gives more liberty because in that case we have
$$\max_{i \in I_{n-1}} \max_{j \in I_{i-1}} P_j \bar{P}_{i+1} \le \max_{i \in I_{n-1}} \max_{j \in I_{i}} P_j \bar{P}_{i+1} = \max_{i \in I_{n-1}} P_i \bar{P}_{i+1},$$
and since for such weights $\bar{P}_i \ge \bar{P}_j$ when $i \le j$ we get
$$\max_{i \in I_{n-1}} \max_{j \in \{i+1, \ldots, n\}} P_i \bar{P}_j \le \max_{i \in I_{n-1}} P_i \bar{P}_{i+1}.$$
This means that for nonnegative weights $p$, inequalities (11) can be reformulated in the following way:
$$C(X, Y; p) \le \max_{i \in I_{n-1}} P_i \bar{P}_{i+1} \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} \big| \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \big| \le \max_{i \in I_{n-1}} P_i \bar{P}_{i+1} \sum_{i, j=1}^{n-1} \|x_{i+1} - x_i\| \, \|y_{j+1} - y_j\|.$$
In the special case $p = \big(\tfrac{1}{n}, \ldots, \tfrac{1}{n}\big)$, we get
$$\max_{i \in I_{n-1}} P_i \bar{P}_{i+1} = P_{\lfloor n/2 \rfloor} \bar{P}_{\lfloor n/2 \rfloor + 1} = \frac{\lfloor n/2 \rfloor}{n} \Bigg( 1 - \frac{\lfloor n/2 \rfloor}{n} \Bigg),$$
and
$$C(X, Y; p) \le \frac{\lfloor n/2 \rfloor}{n} \Bigg( 1 - \frac{\lfloor n/2 \rfloor}{n} \Bigg) \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} \big| \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \big| \le \frac{\lfloor n/2 \rfloor}{n} \Bigg( 1 - \frac{\lfloor n/2 \rfloor}{n} \Bigg) \sum_{i, j=1}^{n-1} \|x_{i+1} - x_i\| \, \|y_{j+1} - y_j\|.$$
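A quick numerical illustration of the last estimate for uniform weights (random illustrative data):

# Check the uniform-weight bound: C <= (k/n)(1 - k/n) * sum_{i,j} ||dx_i|| ||dy_j||, k = floor(n/2).
import math, random

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

random.seed(2)
n, m = 7, 3
X = [tuple(random.uniform(-1, 1) for _ in range(m)) for _ in range(n)]
Y = [tuple(random.uniform(-1, 1) for _ in range(m)) for _ in range(n)]
p = [1.0 / n] * n

sx = [sum(pi * x[k] for pi, x in zip(p, X)) for k in range(m)]
sy = [sum(pi * y[k] for pi, y in zip(p, Y)) for k in range(m)]
C = sum(p) * sum(pi * inner(x, y) for pi, x, y in zip(p, X, Y)) - inner(sx, sy)

coeff = (n // 2) / n * (1 - (n // 2) / n)
bound = coeff * sum(math.dist(X[i + 1], X[i]) * math.dist(Y[j + 1], Y[j])
                    for i in range(n - 1) for j in range(n - 1))
print(C <= bound + 1e-12)   # True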
At the end of this section we give a theorem that combines some of the previous approaches.
Theorem 6.
Let $X = (x_1, \ldots, x_n)$ and $Y = (y_1, \ldots, y_n)$ be two $n$-tuples of elements from $\mathbb{R}^m$, and let $m, M \in \mathbb{R}^m$ be such that
$$x_{i+1} \ge x_i, \quad i \in \{1, \ldots, n-1\},$$
$$m \le y_i \le M, \quad i \in I_n.$$
Then for all real $n$-tuples $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2)
$$C(X, Y; p) \le \Bigg\langle \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i), \, M - m \Bigg\rangle \le \|M - m\| \Bigg\| \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i) \Bigg\| \le \|M - m\| \sum_{i=1}^{n-1} \tilde{P}_i \|x_{i+1} - x_i\|.$$
Proof. 
The conditions of this theorem imply that for all $i, j \in \{1, \ldots, n-1\}$
$$\langle x_{i+1} - x_i, y_{j+1} - y_j \rangle \le \langle x_{i+1} - x_i, M - m \rangle.$$
By Lemma 1 we get
$$\begin{aligned} C(X, Y; p) &= \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \langle x_{i+1} - x_i, y_{j+1} - y_j \rangle + \sum_{j=i+1}^{n} P_i \bar{P}_j \langle x_{i+1} - x_i, y_j - y_{j-1} \rangle \Bigg) \\ &\le \sum_{i=1}^{n-1} \Bigg( \sum_{j=1}^{i-1} \bar{P}_{i+1} P_j \langle x_{i+1} - x_i, M - m \rangle + \sum_{j=i+1}^{n} P_i \bar{P}_j \langle x_{i+1} - x_i, M - m \rangle \Bigg) = \Bigg\langle \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i), \, M - m \Bigg\rangle. \end{aligned}$$
By the Cauchy-Bunyakovsky-Schwarz inequality for inner product spaces and the triangle inequality we obtain
$$C(X, Y; p) \le \Bigg\| \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i) \Bigg\| \, \|M - m\| \le \|M - m\| \sum_{i=1}^{n-1} \tilde{P}_i \|x_{i+1} - x_i\|,$$
which completes the proof. □

4. Steffensen-Grüss Inequality

In this section, we show how some of the results from Section 2 and Section 3 can be used to obtain Grüss-like upper bounds for the Jensen-Steffensen functional. Throughout this section, $\operatorname{conv} S$ denotes the convex hull of a set $S \subseteq \mathbb{R}^m$.
Theorem 7.
Let $U$ be an open convex subset of $\mathbb{R}^m$ and let $X = (x_1, \ldots, x_n) \in U^n$ be such that
$$x_{i+1} \ge x_i, \quad i \in I_{n-1}.$$
Let $f : U \to \mathbb{R}$ be a continuously differentiable function and let $m, M \in \mathbb{R}^m$ be such that
$$m \le \nabla f(x) \le M \quad \text{for all } x \in \operatorname{conv}\{x_1, \ldots, x_n\}. \tag{12}$$
Then for all $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), the following inequalities hold:
$$\begin{aligned} \frac{1}{P_n} \sum_{i=1}^{n} p_i f(x_i) - f\Bigg( \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i \Bigg) &\le \frac{1}{P_n^2} \Bigg\langle \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i), \, M - m \Bigg\rangle \\ &\le \frac{\|M - m\|}{P_n^2} \Bigg\| \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i) \Bigg\| \le \frac{\|M - m\|}{P_n^2} \sum_{i=1}^{n-1} \tilde{P}_i \|x_{i+1} - x_i\|. \tag{13} \end{aligned}$$
Proof. 
First, note that the continuity of the partial derivatives of $f$ on $U$ implies the existence of some $m, M \in \mathbb{R}^m$ such that (12) holds (the set $\operatorname{conv}\{x_1, \ldots, x_n\}$ is a compact subset of $U$). Furthermore, under condition (2) on the weights $p$, we have
$$\bar{x} = \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i \in \operatorname{conv}\{x_1, \ldots, x_n\} \subseteq U.$$
From the mean-value theorem we know that for any $x, y \in \operatorname{conv}\{x_1, \ldots, x_n\}$ there exists some $\theta \in (0, 1)$ such that
$$f(x) - f(y) = \langle \nabla f(z), x - y \rangle,$$
where $z = y + \theta (x - y)$. Applying this to $x = x_i$, $y = \bar{x}$ and $z = z_i = \bar{x} + \theta_i (x_i - \bar{x}) \in \operatorname{conv}\{x_1, \ldots, x_n\}$ we obtain
$$f(x_i) - f(\bar{x}) = \langle \nabla f(z_i), x_i - \bar{x} \rangle, \quad i \in I_n.$$
Multiplying the above equality by $p_i$ and summing over $i$ we obtain
$$\sum_{i=1}^{n} p_i f(x_i) - P_n f(\bar{x}) = \sum_{i=1}^{n} p_i \langle \nabla f(z_i), x_i - \bar{x} \rangle = \sum_{i=1}^{n} p_i \big( \langle x_i, \nabla f(z_i) \rangle - \langle \bar{x}, \nabla f(z_i) \rangle \big),$$
and therefore, after multiplication by $P_n$,
$$P_n \sum_{i=1}^{n} p_i f(x_i) - P_n^2 f(\bar{x}) = P_n \sum_{i=1}^{n} p_i \langle x_i, \nabla f(z_i) \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i \nabla f(z_i) \Bigg\rangle.$$
If in Theorem 6 we choose $y_i = \nabla f(z_i)$, we get
$$C(X, Y; p) = P_n \sum_{i=1}^{n} p_i \langle x_i, y_i \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i y_i \Bigg\rangle \le \|M - m\| \sum_{i=1}^{n-1} \tilde{P}_i \|x_{i+1} - x_i\|.$$
This implies
$$P_n \sum_{i=1}^{n} p_i f(x_i) - P_n^2 f(\bar{x}) \le \Bigg\langle \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i), \, M - m \Bigg\rangle \le \|M - m\| \Bigg\| \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i) \Bigg\| \le \|M - m\| \sum_{i=1}^{n-1} \tilde{P}_i \|x_{i+1} - x_i\|,$$
which, after division by $P_n^2$, becomes (13). □
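To see Theorem 7 at work numerically, the following Python sketch uses a particular smooth function and made-up data (both are assumptions for the illustration, not taken from the paper) and compares the Jensen-Steffensen gap with the last upper bound in (13).

# Illustrative check of the last bound in (13) for f(x) = x_1^2 + x_1 x_2 + x_2^2 on R^2.
import math

def f(x):
    return x[0] ** 2 + x[0] * x[1] + x[1] ** 2

def grad_f(x):
    return (2 * x[0] + x[1], x[0] + 2 * x[1])

def P_tilde(p):
    n = len(p)
    P = [sum(p[:i + 1]) for i in range(n)]
    Pbar = [sum(p[i:]) for i in range(n)]
    return [sum(Pbar[i + 1] * P[j] for j in range(i)) +
            sum(P[i] * Pbar[j] for j in range(i + 1, n)) for i in range(n - 1)]

X = [(0.0, 0.0), (0.5, 1.0), (1.0, 1.5), (2.0, 2.0)]   # nondecreasing in each coordinate
p = [1.5, -0.5, 1.0, 1.0]                              # partial sums 1.5, 1, 2, 3 lie in [0, 3]
Pn = sum(p)
xbar = tuple(sum(pi * x[k] for pi, x in zip(p, X)) / Pn for k in range(2))
gap = sum(pi * f(x) for pi, x in zip(p, X)) / Pn - f(xbar)

# Componentwise bounds m <= grad f <= M on conv{x_1, ..., x_n}: here grad f is affine,
# so its componentwise extrema over the hull are attained at the points x_i themselves.
grads = [grad_f(x) for x in X]
M = tuple(max(g[k] for g in grads) for k in range(2))
m = tuple(min(g[k] for g in grads) for k in range(2))

pt = P_tilde(p)
bound = (math.dist(M, m) / Pn ** 2) * sum(pt[i] * math.dist(X[i + 1], X[i]) for i in range(len(X) - 1))
print(gap <= bound + 1e-12)   # True, as asserted by Theorem 7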
By posing a stronger condition on $\nabla f$, namely Lipschitz continuity, we are able to remove the monotonicity condition on $X = (x_1, \ldots, x_n)$.
Theorem 8.
Let $U$ be an open convex subset of $\mathbb{R}^m$ and let $X = (x_1, \ldots, x_n) \in U^n$. Let $f : U \to \mathbb{R}$ be a differentiable function such that for some $L > 0$ its gradient $\nabla f$ satisfies the Lipschitz condition
$$\|\nabla f(y) - \nabla f(x)\| \le L \|y - x\| \quad \text{for all } x, y \in \operatorname{conv}\{x_1, \ldots, x_n\}.$$
Then for all $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), the following inequality holds:
$$\frac{1}{P_n} \sum_{i=1}^{n} p_i f(x_i) - f\Bigg( \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i \Bigg) \le \frac{L \Delta}{P_n^2} \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i=1}^{n-1} \|x_{i+1} - x_i\|,$$
where
$$\Delta = \max_{1 \le i < j \le n} \|x_i - x_j\|.$$
Proof. 
First, observe that for any $a, b \in \operatorname{conv}\{x_1, \ldots, x_n\}$ there exist some $u_i, v_i \in [0, 1]$, $i \in I_n$, such that $\sum_{i=1}^{n} u_i = \sum_{i=1}^{n} v_i = 1$ and
$$a = \sum_{i=1}^{n} u_i x_i, \quad b = \sum_{i=1}^{n} v_i x_i.$$
Then
$$\|a - b\| = \Bigg\| \sum_{i=1}^{n} v_i \sum_{i=1}^{n} u_i x_i - \sum_{i=1}^{n} u_i \sum_{i=1}^{n} v_i x_i \Bigg\| = \Bigg\| \sum_{i, j=1}^{n} u_i v_j (x_i - x_j) \Bigg\| \le \sum_{i, j=1}^{n} u_i v_j \|x_i - x_j\| \le \Delta \sum_{i, j=1}^{n} u_i v_j = \Delta.$$
Consequently, for $z_i$, $i \in I_n$, defined as in the proof of Theorem 7, we have
$$\|z_i - z_j\| \le \Delta, \quad i, j \in I_n.$$
Now, similarly as in the proof of Theorem 7, we have
$$P_n \sum_{i=1}^{n} p_i f(x_i) - P_n^2 f(\bar{x}) = P_n \sum_{i=1}^{n} p_i \langle x_i, \nabla f(z_i) \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i \nabla f(z_i) \Bigg\rangle.$$
If in Theorem 5 we choose $y_i = \nabla f(z_i)$, we obtain
$$\begin{aligned} C(X, Y; p) &= P_n \sum_{i=1}^{n} p_i \langle x_i, y_i \rangle - \Bigg\langle \sum_{i=1}^{n} p_i x_i, \sum_{i=1}^{n} p_i y_i \Bigg\rangle \le \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i, j=1}^{n-1} \|x_{i+1} - x_i\| \, \|y_{j+1} - y_j\| \\ &= \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i, j=1}^{n-1} \|x_{i+1} - x_i\| \, \|\nabla f(z_{j+1}) - \nabla f(z_j)\| \le L \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i, j=1}^{n-1} \|x_{i+1} - x_i\| \, \|z_{j+1} - z_j\| \\ &\le L \Delta \max_{\substack{i \in I_{n-1} \\ j \in \{2, \ldots, n\}}} P_i \bar{P}_j \sum_{i=1}^{n-1} \|x_{i+1} - x_i\|. \end{aligned}$$
After division by P n 2 we obtain the desired result. □

5. Applications Involving Generalized Convex Functions

The results from Section 4 can be used to establish new upper bounds for the Jensen-Steffensen functional involving certain generalized convex functions, namely P-convex functions and functions with nondecreasing increments.
Let $f$ be a real-valued function defined on $J = [a, b] \subseteq \mathbb{R}$. A $k$-th order divided difference of $f$ at distinct points $x_0, \ldots, x_k \in J$ may be defined recursively by
$$[x_i] f = f(x_i), \qquad [x_0, \ldots, x_k] f = \frac{[x_1, \ldots, x_k] f - [x_0, \ldots, x_{k-1}] f}{x_k - x_0}.$$
A function $f : J \to \mathbb{R}$ is said to be $k$-convex on $[a, b]$ if
$$[x_0, \ldots, x_k] f \ge 0 \quad \text{for all distinct } x_0, \ldots, x_k \in J.$$
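For readers who want to experiment with this definition, here is a direct Python transcription of the recursion (the example function and nodes are illustrative):

# k-th order divided difference [x_0, ..., x_k]f, computed by the recursion above.
def divided_difference(f, xs):
    """Return [x_0, ..., x_k]f for distinct nodes xs = (x_0, ..., x_k)."""
    if len(xs) == 1:
        return f(xs[0])
    return (divided_difference(f, xs[1:]) - divided_difference(f, xs[:-1])) / (xs[-1] - xs[0])

# For f(x) = x^3 every third-order divided difference equals 1 (the leading coefficient),
# and the second-order divided differences are nonnegative on [0, +inf),
# consistent with x^3 being 2-convex (i.e., convex) there.
print(divided_difference(lambda x: x ** 3, (0.5, 1.0, 2.0, 3.5)))   # 1.0
print(divided_difference(lambda x: x ** 3, (0.5, 1.0, 2.0)))        # 3.5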
This definition was generalized in [10] in the following way. Let $J_1 = [a, b]$ and $J_2 = [c, d]$ be two intervals in $\mathbb{R}$ and let $f$ be a real-valued function defined on $J_1 \times J_2$. A divided difference of order $(k, m)$ at $k + 1$ distinct points $x_0, \ldots, x_k$ from $J_1$ and $m + 1$ distinct points $y_0, \ldots, y_m$ from $J_2$ is defined by
$$[x_0, \ldots, x_k][y_0, \ldots, y_m] f = [x_0, \ldots, x_k]\big( [y_0, \ldots, y_m] f \big) = [y_0, \ldots, y_m]\big( [x_0, \ldots, x_k] f \big).$$
A function $f : J_1 \times J_2 \to \mathbb{R}$ is said to be convex of order $(k, m)$ on $J_1 \times J_2$ if
$$[x_0, \ldots, x_k][y_0, \ldots, y_m] f \ge 0$$
for all $x_0, \ldots, x_k \in J_1$, $y_0, \ldots, y_m \in J_2$ such that $x_0 < \cdots < x_k$ and $y_0 < \cdots < y_m$.
A similar class of functions was considered in [11].
Definition 2.
Let $f$ be a real-valued function defined on $J_1 \times J_2$. We say that $f$ is P-convex of order $k$ if
$$[x_0, \ldots, x_i][y_0, \ldots, y_{k-i}] f \ge 0, \quad i \in \{0, 1, \ldots, k\},$$
for all $x_0, \ldots, x_k \in J_1$ and $y_0, \ldots, y_k \in J_2$ such that $x_0 < \cdots < x_k$ and $y_0 < \cdots < y_k$, i.e., if $f$ is convex of order $(i, k - i)$ for all $i \in \{0, 1, \ldots, k\}$. We say that $f$ is P-concave of order $k$ if $-f$ is P-convex of order $k$.
If a function $f$ is P-convex of order 2, we simply say that $f$ is P-convex. Obviously, this definition can be extended to functions of more than two variables. In the same paper [11], the author proved several properties of P-convex functions of order $k$, related to the properties of $k$-convex functions:
(i)
A P-convex function of order $k$ is not necessarily continuous on $J_1 \times J_2$.
(ii)
If the $k$-th partial derivatives of a function $f : J_1 \times J_2 \to \mathbb{R}$ exist, then $f$ is P-convex of order $k$ if these partial derivatives are nonnegative.
(iii)
If the $(k-1)$-th partial derivatives of a function $f : J_1 \times J_2 \to \mathbb{R}$ exist, then $f$ is P-convex of order $k$ if these partial derivatives are nondecreasing in each argument.
An interesting P-convex (but not convex) function is $f : \mathbb{R}^2 \to \mathbb{R}$ defined by
$$f(x, y) = xy,$$
which provides a beautiful connection between the Chebyshev-Steffensen inequality and the Jensen-Steffensen inequality.
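To spell out one way of reading this connection: as noted later in this section, the Jensen-Steffensen inequality holds for P-convex functions, so for real $n$-tuples $x$ and $y$ that are monotonic in the same direction and weights $p$ satisfying (2) it gives
$$\Bigg( \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i \Bigg) \Bigg( \frac{1}{P_n} \sum_{i=1}^{n} p_i y_i \Bigg) \le \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i y_i,$$
which, after multiplication by $P_n^2$, is exactly the one-dimensional Chebyshev-Steffensen inequality (3) under Steffensen's conditions.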
Wright-convex functions have an important generalization to functions of several variables, introduced in [12] and [1] (p. 14).
An interval $[a, b]$ in $\mathbb{R}^m$, where $a, b \in \mathbb{R}^m$ and $a \le b$, is the set
$$[a, b] = \{ x \in \mathbb{R}^m : a \le x \le b \}.$$
Definition 3.
A real-valued function $f$ defined on an interval $J \subseteq \mathbb{R}^m$ is said to have nondecreasing increments if
$$f(x + h) - f(x) \le f(y + h) - f(y)$$
whenever $0 \le h \in \mathbb{R}^m$, $x \le y$, and $x, y + h \in J$.
In the same paper [12] Brunk also proved that:
(i)
A function with nondecreasing increments is not necessarily continuous.
(ii)
If the first partial derivatives of a function $f : J \to \mathbb{R}$ exist for $x \in J$, then $f$ has nondecreasing increments if each of these partial derivatives is nondecreasing in each argument.
(iii)
If the second partial derivatives of a function $f : J \to \mathbb{R}$ exist for $x \in J$, then $f$ has nondecreasing increments if each of these partial derivatives is nonnegative.
We may note here that for functions of two variables possessing second-order partial derivatives, the class of P-convex functions and the class of functions with nondecreasing increments coincide.
P-convex functions and functions with nondecreasing increments have an important common property that ordinary convex functions of several variables do not have: the Jensen-Steffensen inequality holds for them (see [11,13] and [1] (p. 62)).
In the next theorem, we show how Theorem 7 can be used to establish a new upper bound for the Jensen-Steffensen functional for functions with nondecreasing increments.
Theorem 9.
Let $J$ be an interval in $\mathbb{R}^m$ and let $X = (x_1, \ldots, x_n) \in J^n$ be such that
$$x_{i+1} \ge x_i, \quad i \in I_{n-1}.$$
Let $f : J \to \mathbb{R}$ be a function with nondecreasing increments, continuously differentiable on $\operatorname{int}(J)$, and let $m, M \in \mathbb{R}^m$ be such that (12) holds. Then for all $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), the following inequalities hold:
$$0 \le \frac{1}{P_n} \sum_{i=1}^{n} p_i f(x_i) - f\Bigg( \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i \Bigg) \le \frac{\|M - m\|}{P_n^2} \sum_{i=1}^{n-1} \tilde{P}_i \|x_{i+1} - x_i\|. \tag{14}$$
Proof. 
By Theorem 7 we know that
$$\frac{1}{P_n} \sum_{i=1}^{n} p_i f(x_i) - f\Bigg( \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i \Bigg) \le \frac{\|M - m\|}{P_n^2} \Bigg\| \sum_{i=1}^{n-1} \tilde{P}_i (x_{i+1} - x_i) \Bigg\| \le \frac{\|M - m\|}{P_n^2} \sum_{i=1}^{n-1} \tilde{P}_i \|x_{i+1} - x_i\|,$$
and since $f$ is a function with nondecreasing increments we know that
$$\frac{1}{P_n} \sum_{i=1}^{n} p_i f(x_i) - f\Bigg( \frac{1}{P_n} \sum_{i=1}^{n} p_i x_i \Bigg) \ge 0.$$
This completes the proof. □
Obviously, an analogous result can be formulated for P-convex functions.
Theorem 10.
Let $J$ be an interval in $\mathbb{R}^2$ and let $X = (x_1, \ldots, x_n) \in J^n$ be such that
$$x_{i+1} \ge x_i, \quad i \in I_{n-1}.$$
Let $f : J \to \mathbb{R}$ be a P-convex function, continuously differentiable on $\operatorname{int}(J)$, and let $m, M \in \mathbb{R}^2$ be such that (12) holds. Then for all $p = (p_1, \ldots, p_n) \in \mathbb{R}^n$ satisfying (2), inequalities (14) hold.

6. Conclusions

In this paper, we have proven the Chebyshev-Steffensen inequality involving the inner product on the real $m$-space. This new inequality enables us to establish some upper bounds for the weighted Chebyshev-Steffensen functional, as well as for the Jensen-Steffensen functional involving the inner product, under various conditions allowing (possibly) negative weights. The obtained results are, in our opinion, interesting and formulated in a mathematically beautiful way.

Author Contributions

Writing—original draft preparation, M.K.B. and J.P. All authors have read and agreed to the published version of the manuscript.

Funding

This publication was supported by the University of Split, Faculty of Science, and by the Ministry of Education and Science of the Russian Federation (Agreement No. 02.a03.21.0008).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Pečarić, J.E.; Proschan, F.; Tong, Y.L. Convex Functions, Partial Orderings, and Statistical Applications. In Mathematics in Science and Engineering; Academic Press, Inc.: Boston, MA, USA, 1992.
2. Steffensen, J.F. On certain inequalities and methods of approximation. J. Inst. Actuar. 1919, 51, 274–297.
3. Abramovich, S.; Klaričić Bakula, M.; Matić, M.; Pečarić, J. A variant of Jensen-Steffensen's inequality and quasi-arithmetic means. J. Math. Anal. Appl. 2005, 307, 370–386.
4. Mitrinović, D.S.; Pečarić, J.E.; Fink, A.M. Classical and New Inequalities in Analysis. In Mathematics and Its Applications (East European Series); Kluwer Academic Publishers Group: Dordrecht, The Netherlands, 1993.
5. Dragomir, S.S. Advances in Inequalities of the Schwarz, Grüss and Bessel Type in Inner Product Spaces; Nova Science Publishers, Inc.: Hauppauge, NY, USA, 2005.
6. Pečarić, J.E. On the Ostrowski generalization of Čebyšev's inequality. J. Math. Anal. Appl. 1984, 102, 479–487.
7. Izumino, S.; Pečarić, J.E. Some extensions of Grüss' inequality and its applications. Nihonkai Math. J. 2002, 13, 159–166.
8. Pečarić, J.; Tepeš, B. Improvements of some inequalities for moments of guessing function. Math. Inequal. Appl. 2005, 8, 53–62.
9. Pečarić, J.; Tepeš, B. Improvement of a Grüss type inequality of vectors in normed linear spaces and applications. Rad Hrvat. Akad. Znan. Umjet. Mat. Znan. 2005, 15, 129–137.
10. Popoviciu, T. Les Fonctions Convexes. In Actualités Scientifiques et Industrielles; Hermann et Cie: Paris, France, 1944; 76p.
11. Pečarić, J.E. Some inequalities for generalized convex functions of several variables. Period. Math. Hungar. 1991, 22, 83–90.
12. Brunk, H.D. Integral inequalities for functions with nondecreasing increments. Pacific J. Math. 1964, 14, 783–793.
13. Pečarić, J.E. On some inequalities for functions with nondecreasing increments. J. Math. Anal. Appl. 1984, 98, 188–197.
