Abstract
In this article, we establish new results for a 2-pre-Hilbert space, among them the Cauchy-Schwarz inequality. We present several applications to statistical indicators such as the average, variance, standard deviation and correlation coefficient, using the standard 2-inner product and some of its properties. We also give a brief characterization of a linear regression model for discrete random variables.
MSC:
Primary 46C05; secondary 26D10; 26D15
1. Introduction
In Reference [1] Gähler introduced the definitions of a linear 2-normed space and a 2-metric space. In References [2,3], Diminnie, Gähler and White studied the properties of a 2-inner product space.
Several results on the theory of 2-inner product spaces can be found in Reference [4]. In Reference [5], Dragomir et al. prove the corresponding version of the Boas-Bellman inequality in 2-inner product spaces, and in Reference [6] the superadditivity and monotonicity of 2-norms generated by inner products were studied.
We consider a linear space X of dimension greater than 1 over the field $\mathbb{K}$, where $\mathbb{K}$ is the set of the real or the complex numbers. Suppose that $\langle \cdot, \cdot \mid \cdot \rangle$ is a $\mathbb{K}$-valued function defined on $X \times X \times X$ satisfying the following conditions:
- (a)
- $\langle u, u \mid w \rangle \geq 0$ and $\langle u, u \mid w \rangle = 0$ if and only if u and w are linearly dependent;
- (b)
- $\langle u, u \mid w \rangle = \langle w, w \mid u \rangle$;
- (c)
- $\langle u, v \mid w \rangle = \overline{\langle v, u \mid w \rangle}$;
- (d)
- $\langle \alpha u, v \mid w \rangle = \alpha \langle u, v \mid w \rangle$, for any scalar $\alpha \in \mathbb{K}$;
- (e)
- $\langle u + u', v \mid w \rangle = \langle u, v \mid w \rangle + \langle u', v \mid w \rangle$.
The function $\langle \cdot, \cdot \mid \cdot \rangle$ is called a 2-inner product on X, and $(X, \langle \cdot, \cdot \mid \cdot \rangle)$ is called a 2-inner product space (or 2-pre-Hilbert space).
A series of consequences of these requirements can be deduced (see e.g., References [2,4,7]):
$\langle x, \alpha y \mid z \rangle = \bar{\alpha} \langle x, y \mid z \rangle$ and $\langle x, y \mid \alpha z \rangle = |\alpha|^2 \langle x, y \mid z \rangle$, for all $x, y, z \in X$ and $\alpha \in \mathbb{K}$.
The standard 2-inner product is defined on the inner product space $(X, \langle \cdot, \cdot \rangle)$ by:
$\langle x, y \mid z \rangle = \langle x, y \rangle \langle z, z \rangle - \langle x, z \rangle \langle z, y \rangle$, for all $x, y, z \in X$.
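The definition above can be checked numerically. The following Python sketch computes the standard 2-inner product on $\mathbb{R}^n$ from the ordinary dot product; the helper names `dot` and `s2ip` are ours, not from the paper.

```python
# Illustrative sketch: the standard 2-inner product on R^n,
# <x, y | z> = <x, y><z, z> - <x, z><z, y>,
# built from the ordinary dot product (helper names are ours).
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def s2ip(x, y, z):
    # standard 2-inner product generated by the dot product
    return dot(x, y) * dot(z, z) - dot(x, z) * dot(z, y)

x, y, z = [1.0, 2.0, 3.0], [1.0, 1.0, 0.0], [0.0, 0.0, 1.0]
print(s2ip(x, y, z))        # value of <x, y | z> for these sample vectors
print(s2ip(x, x, z) >= 0)   # property (a): <x, x | z> >= 0
```

Note that property (a) holds automatically here, since $\langle x, x \mid z \rangle = \|x\|^2\|z\|^2 - |\langle x, z \rangle|^2 \geq 0$ by the ordinary Cauchy-Schwarz inequality.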
Let $(X, \langle \cdot, \cdot \mid \cdot \rangle)$ be a 2-inner product space. We can define a function $\| \cdot, \cdot \|$ on $X \times X$ by
$\| x, z \| = \sqrt{\langle x, x \mid z \rangle}$,
for all $x, z \in X$. This function satisfies the following conditions:
- (a)
- $\| x, z \| \geq 0$ and $\| x, z \| = 0$ if and only if x and z are linearly dependent;
- (b)
- $\| x, z \| = \| z, x \|$;
- (c)
- $\| \alpha x, z \| = |\alpha| \, \| x, z \|$, for any scalar $\alpha \in \mathbb{K}$;
- (d)
- $\| x + y, z \| \leq \| x, z \| + \| y, z \|$, for all $x, y, z \in X$.
A function $\| \cdot, \cdot \|$ defined on $X \times X$ and satisfying the above conditions is called a 2-norm on X, and $(X, \| \cdot, \cdot \|)$ is called a linear 2-normed space.
It is easy to see that if $(X, \langle \cdot, \cdot \mid \cdot \rangle)$ is a 2-inner product space over the field of real numbers $\mathbb{R}$ or the field of complex numbers $\mathbb{C}$, then $(X, \| \cdot, \cdot \|)$ is a linear 2-normed space and the 2-norm $\| \cdot, \cdot \|$ is generated by the 2-inner product $\langle \cdot, \cdot \mid \cdot \rangle$.
Two consequences of the above properties are given by the following: the parallelogram law [4],
$\| x + y, z \|^2 + \| x - y, z \|^2 = 2 \| x, z \|^2 + 2 \| y, z \|^2$,
for all $x, y, z \in X$, and the Cauchy-Schwarz inequality (see e.g., References [4,7]),
$| \langle x, y \mid z \rangle | \leq \| x, z \| \, \| y, z \|$,
for all $x, y, z \in X$. The equality in (3) holds if and only if the vectors x, y and z are linearly dependent.
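Both the strict inequality and the equality case of Cauchy-Schwarz are easy to verify numerically for the standard 2-inner product on $\mathbb{R}^3$; the snippet below is only an illustration, with helper names of our own.

```python
import math

# Sketch: verify |<x,y|z>| <= ||x,z|| ||y,z|| for the standard
# 2-inner product on R^3 (helper names are ours).
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def s2ip(x, y, z):
    return dot(x, y) * dot(z, z) - dot(x, z) * dot(z, y)

def norm2(x, z):
    # 2-norm generated by the 2-inner product: ||x,z|| = sqrt(<x,x|z>)
    return math.sqrt(s2ip(x, x, z))

x, y, z = [1.0, 2.0, 3.0], [1.0, 1.0, 0.0], [0.0, 0.0, 1.0]
lhs, rhs = abs(s2ip(x, y, z)), norm2(x, z) * norm2(y, z)
print(lhs <= rhs + 1e-12)          # Cauchy-Schwarz holds (strictly here)

# Equality case: x2 = 2*yp + z2 makes {x2, yp, z2} linearly dependent.
x2, yp, z2 = [1.0, 2.0, 3.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]
print(abs(abs(s2ip(x2, yp, z2)) - norm2(x2, z2) * norm2(yp, z2)) < 1e-9)
```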
A reverse of the Cauchy-Schwarz inequality in 2-inner product spaces can be found in Reference [5]: if and are such that or equivalently hold, then
The constant is the best possible.
Another important inequality in a 2-inner product space X is the triangle inequality [4],
$\| x + y, z \| \leq \| x, z \| + \| y, z \|$, for all $x, y, z \in X$.
The Cauchy-Schwarz inequality in the real case (see e.g., References [9,10]) can be obtained from the following identity, as in Reference [11],
for all $x, y \in X$. An inequality that improves on the Cauchy-Schwarz inequality is the Ostrowski inequality. In Reference [12], we find some refinements of Ostrowski's inequality and an extension to a 2-inner product space.
The purpose of this paper is to study some identities in a 2-pre-Hilbert space and to prove new results related to several inequalities there, among them the Cauchy-Schwarz inequality. The novelty of this article is the introduction, for the first time, of the concepts of average, variance, covariance, standard deviation and correlation coefficient for vectors, using the standard 2-inner product and some of its properties. We also present a brief characterization of a linear regression model for discrete random variables.
2. Inequalities in a 2-Pre-Hilbert Space
In this section, we will obtain some characterizations of the Cauchy-Schwarz inequality for a 2-pre-Hilbert space. First, we use an identity, which is given by the following result:
Lemma 1.
If $(X, \langle \cdot, \cdot \mid \cdot \rangle)$ is a 2-inner product space over the field of complex numbers $\mathbb{C}$, and the 2-norm $\| \cdot, \cdot \|$ is generated by the 2-inner product, then we have
for vectors x, y and z in X and $a, b \in \mathbb{C}$.
Proof.
By direct calculation, for all $x, y, z \in X$ and $a, b \in \mathbb{C}$, we have that
which proves the statement. □
Remark 1.
If in relation (8) we take and , then we obtain
for all nonzero vectors x and y in X and the linearly independent pairs of vectors and . If , then we obtain the following relation:
The above equality is the extension of equality (7) to a 2-inner product space.
Using Lemma 1 in two conveniently chosen ways, we obtain an important equality:
Theorem 1.
With the above assumptions in a 2-pre-Hilbert space, the following equality holds
for all vectors x,y and z in X and a,b with .
Proof.
By adding the above relation to relation (8), we obtain the relation of the statement. □
Another equality in 2-inner product spaces is given by the following:
Theorem 2.
If $(X, \langle \cdot, \cdot \mid \cdot \rangle)$ is a 2-inner product space over the field of complex numbers $\mathbb{C}$, and the 2-norm $\| \cdot, \cdot \|$ is generated by the 2-inner product, then we have
for all vectors $x, y, z \in X$ and scalars in $\mathbb{C}$.
Proof.
If we make the substitutions and , in the equality from Lemma 1, then we find the relation
If we apply Lemma 1, then we obtain the relation
Similarly, we deduce the identity
By these relations and using the identity
we deduce relation (14). Therefore, the relation of the statement is true. □
Remark 2.
Corollary 1.
With the above assumptions in a 2-pre-Hilbert space, the following identity holds
for all nonzero vectors x, y and z in X, the linearly independent pairs of vectors (x, z) and (y, z), and $a, b \in \mathbb{C}$.
Corollary 2.
With the above assumptions in a 2-pre-Hilbert space, the following equalities hold
for vectors x and y in X and $a, b \in \mathbb{C}$, and
for nonzero vectors x,y and z in X and the linearly independent pairs of vectors (x,z) and (y,z).
Proof.
Remark 3.
We can rearrange the expression from relation (18) as follows:
Using the inequality for positive real numbers a and b, we deduce the following inequality
for nonzero vectors x,y and z in X and the linearly independent pairs of vectors and . This inequality is an extension of Maligranda’s inequality from Reference [13] to a 2-inner product space over the field of real numbers.
Next, we give an evaluation of the sum of the squares of the norms of two vectors, in a 2-inner product space:
Theorem 3.
If a,b with and is a 2-inner product space over the field of complex numbers , and the 2-norm is generated by the 2-inner product , then we have
for all nonzero vectors x,y and z in X and the linearly independent pairs of vectors (x,z) and (y,z).
Proof.
Using the equality,
and from the triangle inequality we prove the relation; taking the hypothesis into account, we deduce inequality (20). □
Below, we obtain a refinement of the Cauchy-Schwarz inequality and a reverse inequality of the Cauchy-Schwarz inequality in a 2-pre-Hilbert space.
Corollary 3.
With the above assumptions in a 2-pre-Hilbert space, the following inequality holds
for all nonzero vectors x,y and z in X and the linearly independent pairs of vectors (x,z) and (y,z).
Proof.
Remark 4.
If we take in inequality (22), we obtain the following inequality
for all $x, y \in X$, where $z \in X$ is a given nonzero vector.
Next, we will show an estimate of the triangle inequality in a linear 2-normed space.
Theorem 4.
If $(X, \| \cdot, \cdot \|)$ is a linear 2-normed space over the field of real numbers $\mathbb{R}$, then the following inequality holds
for all vectors in X and .
Proof.
If, without loss of generality, we assume that , then we have
Similarly, we make the following calculations:
In the case , we deduce the same results as above. Therefore, the inequalities of the statement are true. □
Remark 5.
If we replace x by and y by in relation (25), we obtain the following inequality:
for nonzero vectors in X and the linearly independent sets of vectors and .
Corollary 4.
If $(X, \| \cdot, \cdot \|)$ is a linear 2-normed space over the field of real numbers $\mathbb{R}$, then we have
for nonzero vectors in X and the linearly independent sets of vectors and .
Proof.
For nonzero vectors in X and the linearly independent sets of vectors and , in Theorem 4, we make the following substitutions: , , then we obtain
which implies the inequalities from (27). □
3. Applications of the Standard 2-Inner Product
If $(X, \langle \cdot, \cdot \rangle)$ is an inner product space, then the standard 2-inner product is defined on X by:
$\langle x, y \mid z \rangle = \langle x, y \rangle \langle z, z \rangle - \langle x, z \rangle \langle z, y \rangle$,
for all $x, y, z \in X$.
Thus, X becomes a linear 2-normed space, with the 2-norm given by the following:
$\| x, z \| = \sqrt{\| x \|^2 \| z \|^2 - |\langle x, z \rangle|^2}$,
for all $x, z \in X$.
(a) We consider the vector space . For , , , we have , ,
and .
(b) In the vector space , for we have
and
These inequalities are improvements of the Cauchy-Schwarz inequality in its discrete and integral versions.
(c) Let X be a real linear space with the inner product $\langle \cdot, \cdot \rangle$. The Chebyshev functional [14] is defined by
for all , where is a given nonzero vector.
It is easy to see that we have and for all .
If we replace x and y by and , , in the Cauchy-Schwarz inequality, then we find the Cauchy-Schwarz inequality in terms of the Chebyshev functional, given by:
Let X be a real linear space with the inner product $\langle \cdot, \cdot \rangle$. Equality (17) can be written in terms of the Chebyshev functional by
for all vectors in X, where is a given nonzero vector and . If , then
for all vectors in X, where
(d) For every subspace $U \subseteq X$, we have the decomposition $X = U \oplus U^{\perp}$. Every $x \in X$ can be uniquely written as $x = u + v$, where $u \in U$ and $v \in U^{\perp}$. We define the orthogonal projection $P_U : X \to U$ by $P_U(x) = u$. It is easy to see that $\langle x - P_U(x), u \rangle = 0$, for every $u \in U$, so we have $\langle P_U(x), x - P_U(x) \rangle = 0$, which implies the equality $\| x \|^2 = \| P_U(x) \|^2 + \| x - P_U(x) \|^2$, where the norm is generated by the inner product $\langle \cdot, \cdot \rangle$.
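The decomposition and the norm identity in (d) can be illustrated numerically for a one-dimensional subspace $U = \mathrm{span}\{u\}$; the snippet below is a sketch under that assumption, with helper names of our own.

```python
# Sketch: orthogonal projection onto U = span{u} in R^3 and the identity
# ||x||^2 = ||P_U(x)||^2 + ||x - P_U(x)||^2 (helper names are ours).
def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def proj(x, u):
    # orthogonal projection of x onto span{u}
    c = dot(x, u) / dot(u, u)
    return [c * t for t in u]

x, u = [3.0, 1.0, 2.0], [1.0, 1.0, 0.0]
p = proj(x, u)
r = [a - b for a, b in zip(x, p)]
print(abs(dot(r, u)) < 1e-12)                          # x - P_U(x) is orthogonal to U
print(abs(dot(x, x) - dot(p, p) - dot(r, r)) < 1e-12)  # Pythagorean identity
```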
For a subspace U of an inner product space X with , and is the standard 2-inner product on X, we deduce the identity:
where we have the decompositions , . Using equality (17) and the above identity, we prove the following equality:
for
4. Applications of the Standard 2-Inner Product to Certain Statistical Indicators
A variety of ways to present data, probability and statistical estimation are mainly characterized by the following statistical indicators: mean (average), variance and standard deviation, as well as covariance and the Pearson correlation coefficient [15].
Taking the mean as the center of a random variable’s probability distribution, the variance is a measure of how much the probability mass is spread out around this center.
If V is a random variable with mean $E(V)$, then the formal definition of the variance is the following: $var(V) = E\big[(V - E(V))^2\big]$. The expression for the variance can thus be expanded: $var(V) = E(V^2) - (E(V))^2$. The standard deviation of V is defined by $\sigma(V) = \sqrt{var(V)}$.
The covariance is a measure of how much two random variables V and W change together at the same time and is defined as $cov(V, W) = E\big[(V - E(V))(W - E(W))\big]$, which is equivalent to the form $cov(V, W) = E(VW) - E(V)E(W)$. We find the Cauchy-Schwarz inequality for discrete random variables given by $cov^2(V, W) \leq var(V)\, var(W)$.
The correlation between sets of data is a measure of how well they are related. A correlation coefficient is a numerical measure of some type of correlation, meaning a statistical relationship between two variables.
The Pearson correlation coefficient is a measure of the strength and direction of the linear relationship between two variables V and W that is defined as the covariance of the variables divided by the product of their standard deviations: $\rho_{V,W} = \dfrac{cov(V, W)}{\sigma(V)\sigma(W)}$.
Using the Cauchy-Schwarz inequality, we deduce that $|\rho_{V,W}| \leq 1$. The variance of a discrete random variable V taking the values $x_i$ with probabilities $p_i$, for any $i \in \{1, 2, \ldots, n\}$, is its second central moment, the expected value of the squared deviation from the mean $E(V)$, thus: $var(V) = \sum_{i=1}^{n} p_i \big(x_i - E(V)\big)^2$.
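For concreteness, the indicators above can be computed for a small discrete distribution; the function names in the following sketch are our own, and the sample data is purely illustrative.

```python
import math

# Sketch: mean, variance, standard deviation, covariance and Pearson
# correlation for discrete random variables given as (values, probabilities).
def mean(vals, probs):
    return sum(p * v for v, p in zip(vals, probs))

def variance(vals, probs):
    m = mean(vals, probs)
    return sum(p * (v - m) ** 2 for v, p in zip(vals, probs))

def covariance(v_vals, w_vals, probs):
    # V and W on the same probability space, with paired values
    mv, mw = mean(v_vals, probs), mean(w_vals, probs)
    return sum(p * (v - mv) * (w - mw) for v, w, p in zip(v_vals, w_vals, probs))

def correlation(v_vals, w_vals, probs):
    return covariance(v_vals, w_vals, probs) / math.sqrt(
        variance(v_vals, probs) * variance(w_vals, probs))

vals, probs = [1.0, 2.0, 3.0], [0.2, 0.5, 0.3]
m = mean(vals, probs)
print(round(m, 10))   # 2.1
# check var(V) = E(V^2) - (E(V))^2
print(abs(variance(vals, probs) - (mean([v * v for v in vals], probs) - m * m)) < 1e-9)
w_vals = [2.0, 1.0, 4.0]
print(abs(correlation(vals, w_vals, probs)) <= 1.0)   # |rho| <= 1 by Cauchy-Schwarz
```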
Let $x_1, x_2, \ldots, x_n$ be real numbers, assume $m \leq x_k \leq M$ for all $k \in \{1, 2, \ldots, n\}$, and denote the average by $\overline{x} = \frac{1}{n}\sum_{k=1}^{n} x_k$. In 1935, Popoviciu (see e.g., References [16,17]) proved the following inequality:
$\frac{1}{n}\sum_{k=1}^{n}(x_k - \overline{x})^2 \leq \frac{1}{4}(M - m)^2.$
The discrete version of the Grüss inequality has the following form (see e.g., References [18,19]):
$\left| \frac{1}{n}\sum_{k=1}^{n} a_k b_k - \frac{1}{n}\sum_{k=1}^{n} a_k \cdot \frac{1}{n}\sum_{k=1}^{n} b_k \right| \leq \frac{1}{4}(A - a)(B - b),$
where $a_k, b_k$ are real numbers so that $a \leq a_k \leq A$ and $b \leq b_k \leq B$ for all $k \in \{1, 2, \ldots, n\}$.
From the relation
and using the Cauchy-Schwarz inequality for discrete random variables, $cov^2(V, W) \leq var(V)\, var(W)$, and inequality (32), we obtain a proof of Grüss's inequality.
Bhatia and Davis show in Reference [16] the following inequality:
$var(V) \leq \big(M - E(V)\big)\big(E(V) - m\big).$
The inequality of Bhatia and Davis represents an improvement of Popoviciu's inequality, because $\big(M - E(V)\big)\big(E(V) - m\big) \leq \frac{1}{4}(M - m)^2$. Therefore, we first obtain an improvement of Grüss's inequality, given by the following relation:
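The chain $var(V) \leq (M - E(V))(E(V) - m) \leq \frac{1}{4}(M - m)^2$ can be checked numerically for the uniform distribution on a finite data set; the snippet below is a sketch with names and data of our own.

```python
# Sketch: Popoviciu's and the Bhatia-Davis bounds on the variance of a
# finite data set x_1, ..., x_n with uniform probabilities 1/n.
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

xs = [1.0, 2.0, 2.0, 3.0, 7.0]
m, M, avg = min(xs), max(xs), mean(xs)
bhatia_davis = (M - avg) * (avg - m)
popoviciu = 0.25 * (M - m) ** 2
print(variance(xs) <= bhatia_davis <= popoviciu)   # Bhatia-Davis refines Popoviciu
```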
In Reference [18], we find some research on refining the Grüss inequality.
The Pearson correlation coefficient is given by
Florea and Niculescu in Reference [20] treated the problem of estimating the deviation of the values of a function from its mean value. The estimation of the deviation of a function from its mean value is characterized below.
We denote by $R[a, b]$ the space of Riemann-integrable functions on the interval $[a, b]$, and by $C[a, b]$ the space of real-valued continuous functions on the interval $[a, b]$.
The integral arithmetic mean for a Riemann-integrable function $f : [a, b] \to \mathbb{R}$ is the number
$M(f) = \frac{1}{b - a}\int_a^b f(x)\,dx.$
If f and h are two integrable functions on $[a, b]$ and $\int_a^b h(x)\,dx \neq 0$, then a generalization of the integral arithmetic mean is the number $M_h(f) = \dfrac{\int_a^b f(x) h(x)\,dx}{\int_a^b h(x)\,dx}$, called the h-integral arithmetic mean for a Riemann-integrable function f. If f is a Riemann-integrable function, we denote by
$var(f) = \frac{1}{b - a}\int_a^b \big(f(x) - M(f)\big)^2\,dx$
the variance of f. The expression for the variance of f can be expanded in this way: $var(f) = M(f^2) - (M(f))^2$. In the same way, we define the h-variance of a Riemann-integrable function f by $var_h(f) = M_h\big((f - M_h(f))^2\big)$. The expression for the h-variance can thus be expanded: $var_h(f) = M_h(f^2) - (M_h(f))^2$.
It is easy to see another form of the h-variance, given by the following: In Reference [21], Aldaz showed a refinement of the AM-GM inequality and used it in the proof that is a measure of the dispersion of about its mean value, which is, in fact, comparable to the variance.
The covariance is a measure of how much two Riemann-integrable functions f and g change together at the same time and is defined as
$cov(f, g) = \frac{1}{b - a}\int_a^b \big(f(x) - M(f)\big)\big(g(x) - M(g)\big)\,dx,$
which is equivalent to the form $cov(f, g) = M(fg) - M(f)M(g)$.
In fact, the covariance is the Chebyshev functional attached to the functions f and g. In Reference [22] it is written as $T(f, g)$. The properties of the Chebyshev functional have been studied by Elezović, Marangunić and Pečarić in Reference [19].
The h-covariance is a measure of how much two random variables change together and is defined as $cov_h(f, g) = M_h\big((f - M_h(f))(g - M_h(g))\big)$, which is equivalent to the form $cov_h(f, g) = M_h(fg) - M_h(f)M_h(g)$.
In Reference [23], Pečarić used the generalization of the Chebyshev functional notion attached to functions f and g to the Chebyshev h-functional attached to functions f and g defined by . Here, Pečarić showed some generalizations of the inequality of Grüss by the Chebyshev h-functional. It is easy to see that, in terms of covariance, this can be written as
In terms of the covariance, the inequality of Grüss becomes
$|cov(f, g)| \leq \frac{1}{4}(\Gamma - \gamma)(\Delta - \delta),$
where $\gamma \leq f(x) \leq \Gamma$ and $\delta \leq g(x) \leq \Delta$ for all $x \in [a, b]$.
In terms of the Chebyshev functional, the inequality of Grüss becomes
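Before passing to vectors, the integral mean, variance and covariance above can be approximated with a simple midpoint Riemann sum; everything in the sketch below (function names, the test function $f(x) = x$ on $[0, 1]$) is our own illustrative choice.

```python
# Sketch: integral mean, variance and covariance approximated by a
# midpoint Riemann sum on [a, b] (names and examples are ours).
def integral(f, a, b, n=10000):
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

def int_mean(f, a, b):
    return integral(f, a, b) / (b - a)

def int_var(f, a, b):
    mf = int_mean(f, a, b)
    return int_mean(lambda x: (f(x) - mf) ** 2, a, b)

def int_cov(f, g, a, b):
    # cov(f, g) = M(fg) - M(f) M(g)
    return int_mean(lambda x: f(x) * g(x), a, b) - int_mean(f, a, b) * int_mean(g, a, b)

f = lambda x: x
print(round(int_mean(f, 0.0, 1.0), 6))   # M(f) = 1/2
print(round(int_var(f, 0.0, 1.0), 6))    # var(f) = 1/3 - 1/4 = 1/12
```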
Next, using the notion of the standard 2-inner product, we extend the above concepts to vectors of $\mathbb{R}^n$. If $(X, \langle \cdot, \cdot \rangle)$ is an inner product space, then the standard 2-inner product is defined on X by:
$\langle x, y \mid z \rangle = \langle x, y \rangle \langle z, z \rangle - \langle x, z \rangle \langle z, y \rangle$,
for all $x, y, z \in X$.
Thus, X becomes a linear 2-normed space, with the 2-norm given by the following:
$\| x, z \| = \sqrt{\| x \|^2 \| z \|^2 - |\langle x, z \rangle|^2}$,
for all $x, z \in X$.
Now, we take the vector space $\mathbb{R}^n$. For $x = (x_1, x_2, \ldots, x_n), y = (y_1, y_2, \ldots, y_n) \in \mathbb{R}^n$ we have
$\langle x, y \rangle = \sum_{i=1}^{n} x_i y_i$
and
$\| x \| = \sqrt{\sum_{i=1}^{n} x_i^2}.$
In Reference [14], Niezgoda studied certain orthoprojectors. The operator $P : X \to X$ defined by
$P(x) = \frac{\langle x, e \rangle}{\langle e, e \rangle}\, e$
is the orthoprojector from X onto span$\{e\}$. If $e = (1, 1, \ldots, 1) \in \mathbb{R}^n$, where $x = (x_1, x_2, \ldots, x_n)$, then the average of vector x is $\overline{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$, and we have $P(x) = \overline{x}\, e$.
Therefore, in $\mathbb{R}^n$, we define the variance of a vector x by
$var(x) = \frac{1}{n^2}\langle x, x \mid e \rangle = \frac{1}{n^2}\| x, e \|^2.$
The standard deviation of $x \in \mathbb{R}^n$ is defined by $\sigma(x) = \sqrt{var(x)}$, so we deduce that $\sigma(x) = \frac{1}{n}\| x, e \|$. Since, using the standard 2-inner product, we have
$\langle x, y \mid e \rangle = n\sum_{i=1}^{n} x_i y_i - \sum_{i=1}^{n} x_i \sum_{i=1}^{n} y_i,$
it is easy to define the covariance of two vectors x and y by
$cov(x, y) = \frac{1}{n^2}\langle x, y \mid e \rangle.$
The correlation coefficient of two vectors x and y can be defined by:
$\rho_{x,y} = \frac{cov(x, y)}{\sigma(x)\sigma(y)} = \frac{\langle x, y \mid e \rangle}{\| x, e \| \, \| y, e \|}.$
Another definition of variance and covariance for vectors from can be made using projection. Vector projection is an important operation in the Gram-Schmidt orthonormalization of vector space bases.
The projection of a vector x onto a vector y is given by
$proj_y\, x = \frac{\langle x, y \rangle}{\langle y, y \rangle}\, y.$
If in $\mathbb{R}^n$ we have the vector $e = (1, 1, \ldots, 1)$, then $proj_e\, x = \overline{x}\, e$.
We remark that the variance of a vector x is given by $var(x) = \frac{1}{n}\| x - proj_e\, x \|^2$ and the covariance of two vectors x and y is given by $cov(x, y) = \frac{1}{n}\langle x - proj_e\, x,\; y - proj_e\, y \rangle.$
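A small numerical sketch can confirm that the 2-inner-product and projection formulations agree, assuming the normalizations $var(x) = \langle x, x \mid e \rangle / n^2$ and $cov(x, y) = \langle x, y \mid e \rangle / n^2$, which are the ones consistent with the usual population variance; all helper names are ours.

```python
import math

# Sketch: variance and covariance of vectors in R^n via the standard
# 2-inner product with e = (1, ..., 1), compared with the projection form
# var(x) = ||x - proj_e x||^2 / n. The normalizations are our reading of
# the text, chosen to match the usual population variance.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def s2ip(x, y, z):
    return dot(x, y) * dot(z, z) - dot(x, z) * dot(z, y)

def proj(x, y):
    c = dot(x, y) / dot(y, y)
    return [c * t for t in y]

x, y = [1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]
n = len(x)
e = [1.0] * n

var_2ip = s2ip(x, x, e) / n ** 2          # variance via the 2-inner product
rx = [a - b for a, b in zip(x, proj(x, e))]
var_proj = dot(rx, rx) / n                # variance via the projection
print(abs(var_2ip - var_proj) < 1e-12)    # the two definitions agree

rho = s2ip(x, y, e) / math.sqrt(s2ip(x, x, e) * s2ip(y, y, e))
print(round(rho, 6))                      # y = 2x, so the correlation is 1.0
```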
Next, we can write some equalities and inequalities, using several results from Section 2, related to variance, covariance and the standard deviation of vectors . Therefore, from relations (8), (10), (11), (15), (18)–(20), (22), (25), we obtain the following relations:
for all with and
for all and
If we take the vector space $C[a, b]$, then for $f, g \in C[a, b]$ we have
$\langle f, g \rangle = \int_a^b f(x) g(x)\,dx$
and
$\| f \| = \sqrt{\int_a^b f^2(x)\,dx}.$
If $e \in C[a, b]$ with $e(x) = 1$ for all $x \in [a, b]$, then $\langle f, e \rangle = \int_a^b f(x)\,dx$ and $\| e \|^2 = b - a$.
Therefore, in $C[a, b]$, we define the variance of a function f by
$var(f) = \frac{1}{(b - a)^2}\langle f, f \mid e \rangle = \frac{1}{(b - a)^2}\| f, e \|^2;$
the standard deviation of $f$ is defined by $\sigma(f) = \sqrt{var(f)}$, so we deduce that $\sigma(f) = \frac{1}{b - a}\| f, e \|$, and the covariance of two functions f and g by
$cov(f, g) = \frac{1}{(b - a)^2}\langle f, g \mid e \rangle.$
The definition of variance of a function f and the covariance of two functions f and g in terms of the projection is given below.
The projection of a vector f onto a vector g is given by
$proj_g\, f = \frac{\langle f, g \rangle}{\langle g, g \rangle}\, g.$
If in $C[a, b]$ we take $e(x) = 1$ for all $x \in [a, b]$, we have $proj_e\, f = M(f)\, e$.
Thus, in $C[a, b]$ we define the variance of a function f by $var(f) = \frac{1}{b - a}\| f - proj_e\, f \|^2$ and the covariance of the functions f and g by $cov(f, g) = \frac{1}{b - a}\langle f - proj_e\, f,\; g - proj_e\, g \rangle.$
Relations (38)–(46) can be written in terms of the elements from . We mention two of them:
and
for all and
Let be vectors in the inner product space over the field of real numbers, with and vectors being linearly independent, such that
where . Using the inner product and its properties, we deduce that and . Therefore, we have to solve this system of two equations in two unknowns. But we have the 2-inner product for all , with
If A is the matrix of the system, then we obtain . Because the vectors are linearly independent, we have . Using Cramer's rule to solve the system, we find that and . Let be vectors in an inner product space over the field of real numbers, with and vectors being linearly independent, such that
where By dividing by , we deduce the relation where , so . Therefore, we obtain and If , then and
In statistics, linear regression is a linear approach to modelling the relationship between a dependent variable and one or more independent variables. The case of one independent variable is called simple linear regression.
We consider two random variables V and W, taking the values $v_i$ and $w_i$ with probabilities $p_i$, for any $i \in \{1, 2, \ldots, n\}$.
A linear regression model assumes that the relationship between the dependent variable W and the independent variable V is linear. Thus, the general linear model for one independent variable may be written as . We can describe the underlying relationship between and involving this error term by .
If we have , then we find . Using the Lagrange method of multipliers, we obtain and . By simple calculations, we deduce and , so we obtain the same coefficients as above.
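In coordinates, and for uniformly weighted data, the least-squares coefficients of the simple linear model reduce to $a = cov(V, W)/var(V)$ and $b = E(W) - a\,E(V)$; the snippet below is a sketch of this computation with our own names and sample data.

```python
# Sketch: simple linear regression W ~ a*V + b for uniformly weighted data,
# using a = cov(V, W) / var(V) and b = mean(W) - a * mean(V).
def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

v = [1.0, 2.0, 3.0, 4.0]
w = [2.0, 3.0, 5.0, 6.0]
a = cov(v, w) / cov(v, v)        # cov(v, v) is var(v)
b = mean(w) - a * mean(v)
print(a, b)                      # slope and intercept

# The residual r = w - (a*v + b) has zero mean and is orthogonal to v,
# which are exactly the normal equations of the least-squares problem.
r = [wi - (a * vi + b) for vi, wi in zip(v, w)]
print(abs(sum(r)) < 1e-9, abs(sum(vi * ri for vi, ri in zip(v, r))) < 1e-9)
```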
Funding
This research received no external funding.
Acknowledgments
The author would like to thank the reviewers for their constructive comments and suggestions which led to a substantial improvement of this article.
Conflicts of Interest
The author declares no conflict of interest.
References
- Gähler, S. Lineare 2-normierte Räume. Math. Nachr. 1965, 28, 1–43.
- Diminnie, C.; Gähler, S.; White, A. 2-inner product spaces. Demonstr. Math. 1973, 6, 525–536.
- Diminnie, C.; Gähler, S.; White, A. 2-inner product spaces II. Demonstr. Math. 1977, 10, 169–188.
- Cho, Y.J.; Lin, P.C.S.; Kim, S.S.; Misiak, A. Theory of 2-inner Product Spaces; Nova Science Publishers, Inc.: New York, NY, USA, 2001; 330p.
- Dragomir, S.S.; Cho, Y.J.; Kim, S.S.; Sofo, A. Some Boas-Bellman type inequalities in 2-inner product spaces. J. Inequal. Pure Appl. Math. 2005, 6, 1–13.
- Dragomir, S.S.; Cho, Y.J.; Kim, S.S. Superadditivity and monotonicity of 2-norms generated by inner products and related results. Soochow J. Math. 1998, 24, 13–32.
- Cho, Y.J.; Matić, M.; Pečarić, J. On Gram's determinant in 2-inner product spaces. J. Korean Math. Soc. 2001, 38, 1125–1156.
- Dragomir, S.S.; Sándor, J. Some inequalities in pre-Hilbertian spaces. Stud. Univ. Babes-Bolyai Math. 1987, 32, 71–78.
- Dragomir, S.S. Improving Schwarz inequality in inner product spaces. Linear Multilinear Algebra 2019, 67, 337–347.
- Mitrinović, D.S.; Pečarić, J.; Fink, A.M. Classical and New Inequalities in Analysis; Kluwer Academic: Dordrecht, The Netherlands, 1992; 740p.
- Niculescu, C.P.; Persson, L.E. Convex Functions and Their Applications: A Contemporary Approach, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2018; 256p.
- Minculete, N. Some refinements of Ostrowski's inequality and an extension to a 2-inner product space. Symmetry 2019, 11, 707.
- Maligranda, L. Some remarks on the triangle inequality for norms. Banach J. Math. Anal. 2008, 2, 31–41.
- Niezgoda, M. On the Chebyshev functional. Math. Inequal. Appl. 2007, 10, 535–546.
- Evans, J.R. Statistics, Data Analysis and Decision Modeling; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2007.
- Bhatia, R.; Davis, C. A better bound on the variance. Am. Math. Mon. 2000, 107, 353–357.
- Furuichi, S. A note on a parametrically extended entanglement-measure due to Tsallis relative entropy. Information 2006, 9, 837–844.
- Minculete, N.; Ciurdariu, L. A generalized form of Grüss type inequality and other integral inequalities. J. Inequal. Appl. 2014, 2014, 119.
- Elezović, N.; Marangunić, L.; Pečarić, J. Some improvements of Grüss type inequality. J. Math. Inequal. 2007, 1, 425–436.
- Florea, A.; Niculescu, C.P. A note on Ostrowski's inequality. J. Inequal. Appl. 2005, 2005, 459–468.
- Aldaz, J.M. A refinement of the inequality between arithmetic and geometric means. J. Math. Inequal. 2008, 2, 473–477.
- Kechriniotis, A.; Delibasis, K. On generalizations of Grüss inequality in inner product spaces and applications. J. Inequal. Appl. 2010, 2010, 167091.
- Pečarić, J. On the Ostrowski generalization of Čebyšev's inequality. J. Math. Anal. Appl. 1984, 102, 479–487.
© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).