Abstract
In this paper, we introduce the notions of nearly Sasakian and nearly Kähler statistical structures with a non-trivial example. The conditions for a real hypersurface in a nearly Kähler statistical manifold to admit a nearly Sasakian statistical structure are given. We also study invariant and anti-invariant statistical submanifolds of nearly Sasakian statistical manifolds. Finally, some conditions under which such a submanifold of a nearly Sasakian statistical manifold is itself a nearly Sasakian statistical manifold are given.
Keywords:
information geometry; nearly Kähler statistical manifold; nearly Sasakian statistical manifold
MSC:
62B11; 53D15; 60D05
1. Introduction
Information geometry, as a well-known theory in geometry, is a tool used to study spaces of probability measures. At present, this interdisciplinary field, which combines differential geometry and statistics, plays an important role in various sciences. For instance, a manifold learning theory in a hypothesis space consisting of models is developed in [1]. The semi-Riemannian metric of this hypothesis space is derived uniquely from the information geometry of the probability distributions. In [2], Amari also presented the geometrical and statistical ideas used to investigate neural networks with hidden units or unobservable variables. For further applications of this geometry in other sciences, see [3,4].
Suppose that is an open subset of , and is a sample space with parameters . A statistical model S is the set of probability density functions defined by
The Fisher information matrix on S is given as
where is the expectation of with respect to , and . The space S, together with the information matrices, is a statistical manifold.
In 1920, Fisher was the first to propose (1) as a mathematical measure of information (see [5]). It is observed that S is a Riemannian manifold if all components of g converge to real numbers and g is positive-definite. In this case, g is called the Fisher metric on S. Using g, an affine connection ∇ with respect to is described by
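For the reader's convenience, we also recall the classical coordinate expressions that lie behind (1) and (2); the notation below (in particular the parameter α of Amari's α-connections) is the standard one and is used here only as an illustration. Writing ℓ_ξ = log p(x; ξ) and ∂_i = ∂/∂ξ^i, one has
\[
g_{ij}(\xi)=\mathbb{E}_{\xi}\!\left[\partial_i\ell_{\xi}\,\partial_j\ell_{\xi}\right],
\qquad
\Gamma^{(\alpha)}_{ij,k}(\xi)=\mathbb{E}_{\xi}\!\left[\left(\partial_i\partial_j\ell_{\xi}
+\frac{1-\alpha}{2}\,\partial_i\ell_{\xi}\,\partial_j\ell_{\xi}\right)\partial_k\ell_{\xi}\right].
\]
For α = 0 this family recovers the Levi–Civita connection of the Fisher metric, and the connections with parameters α and −α are dual to each other with respect to g.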
Nearly Kähler structures on Riemannian manifolds were introduced by Gray [6] to describe a special class of almost Hermitian structures in every even dimension. As an odd-dimensional counterpart of nearly Kähler manifolds, nearly Sasakian manifolds were introduced by Blair, Showers and Yano in [7]. They showed that a normal nearly Sasakian structure is Sasakian and that a hypersurface of a nearly Kähler manifold is nearly Sasakian if and only if it is quasi-umbilical with respect to the (almost) contact form. In particular, the 5-sphere properly embedded in the nearly Kähler 6-sphere inherits a nearly Sasakian structure which is not Sasakian.
A statistical manifold can be considered as an extension of a Riemannian manifold in which the compatibility condition on the Riemannian metric is relaxed to a more general one. Applying this viewpoint, we construct a suitable nearly Sasakian structure on statistical structures and define a nearly Sasakian statistical manifold.
The purpose of this paper is to present nearly Sasakian and nearly Kähler structures on statistical manifolds and to show the relation between these two geometric notions. To achieve this goal, the basic notions and properties of statistical manifolds are recalled in Section 2. In Section 3, we describe a nearly Sasakian structure on statistical manifolds and present some of its properties. In Section 4, we investigate nearly Kähler structures on statistical manifolds. In this context, the conditions needed for a real hypersurface in a nearly Kähler statistical manifold to admit a nearly Sasakian statistical structure are provided. Section 5 is devoted to studying (anti-)invariant statistical submanifolds of nearly Sasakian statistical manifolds. Some conditions under which an invariant submanifold of a nearly Sasakian statistical manifold is itself a nearly Sasakian statistical manifold are given at the end.
2. Preliminaries
For an n-dimensional manifold N, consider as a local chart around the point . With respect to the coordinates on N, the local vector fields form a frame on .
An affine connection ∇ is called a Codazzi connection if it satisfies the Codazzi equations:
for any where
The triplet is also called a statistical manifold if the Codazzi connection ∇ is a statistical connection, i.e., a torsion-free Codazzi connection. Moreover, the affine connection as a (dual) conjugate connection of ∇ with respect to g is determined by
Considering as the Levi–Civita connection on N, one can see and
Thus, forms a statistical manifold. In particular, the torsion-free Codazzi connection ∇ reduces to the Levi–Civita connection if .
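In formulas, with ∇* denoting the dual connection and ∇⁰ the Levi–Civita connection of g (the usual notation, which we assume here), the Codazzi condition, the duality and the averaging relation read
\[
(\nabla_X g)(Y,Z)=(\nabla_Y g)(X,Z),
\qquad
X g(Y,Z)=g(\nabla_X Y,Z)+g(Y,\nabla^{*}_X Z),
\qquad
\nabla^{0}=\tfrac{1}{2}\left(\nabla+\nabla^{*}\right),
\]
for all vector fields X, Y, Z on N. In particular, ∇ = ∇* forces ∇ = ∇⁰, which is exactly the Riemannian case mentioned above.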
A -tensor field K on a statistical manifold is described by
from (2) and (3), we have
Hence, it follows that K satisfies
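Explicitly, in the convention that is standard for statistical manifolds (and which we assume here), the difference tensor and its symmetries can be written as
\[
K_X Y=\nabla_X Y-\nabla^{0}_X Y=\tfrac{1}{2}\left(\nabla_X Y-\nabla^{*}_X Y\right),
\qquad
K_X Y=K_Y X,
\qquad
g(K_X Y,Z)=g(Y,K_X Z),
\]
for all vector fields X, Y, Z; the symmetry in the first two arguments comes from the torsion-freeness of ∇ and ∇*, while the self-adjointness with respect to g reflects the Codazzi equation.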
The curvature tensor of a torsion-free linear connection ∇ is described by
for any . On a statistical structure , we denote the curvature tensor of ∇ by , or for short, and define similarly for the dual connection. It is obvious that
Moreover, setting , we can see that
The statistical curvature tensor field of the statistical structure is given by
using the definition of , it follows that
where .
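In the standard convention (assumed here), with R and R* the curvature tensors of ∇ and ∇*, the statistical curvature tensor field and the duality between the two curvatures take the form
\[
S(X,Y)Z=\tfrac{1}{2}\bigl(R(X,Y)Z+R^{*}(X,Y)Z\bigr),
\qquad
g\bigl(R(X,Y)Z,W\bigr)=-\,g\bigl(Z,R^{*}(X,Y)W\bigr),
\]
so that g(S(X,Y)Z, W) is skew-symmetric in the pair (Z, W) as well as in (X, Y).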
The Lie derivative with respect to a metric tensor g in a statistical manifold , for any is given by
The vector field v is said to be the Killing vector field or infinitesimal isometry if . Hence, using the above equation and (8), it follows that
Similarly, (7) implies
The curvature tensor of a Riemannian manifold admitting a Killing vector field v satisfies the following
for any [8].
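As a sketch (with ∇⁰ the Levi–Civita connection of g and K the difference tensor above; the computation uses only ∇ = ∇⁰ + K and the symmetries of K), the Lie derivative and the Killing condition can be written as
\[
(\mathcal{L}_v g)(X,Y)=g(\nabla^{0}_X v,Y)+g(X,\nabla^{0}_Y v),
\]
and hence, when v is Killing,
\[
g(\nabla_X v,Y)+g(X,\nabla_Y v)=2\,g(K_X Y,v)
=-\bigl(g(\nabla^{*}_X v,Y)+g(X,\nabla^{*}_Y v)\bigr)
\]
for all vector fields X, Y.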
3. Nearly Sasakian Statistical Manifolds
An almost contact manifold is a -dimensional differentiable manifold N equipped with an almost contact structure where is a tensor field of type , v a vector field and u a 1-form, such that
Additionally, N will be called an almost contact metric manifold if it admits a pseudo-Riemannian metric g with the following condition
Moreover, as in the almost contact case, (19) yields and .
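Recall that, writing φ for the structure tensor and using the usual conventions (assumed here), the defining relations and their first consequences are
\[
\varphi^{2}=-\,\mathrm{Id}+u\otimes v,
\qquad u(v)=1,
\qquad
g(\varphi X,\varphi Y)=g(X,Y)-u(X)u(Y),
\]
which imply φv = 0, u∘φ = 0, u(X) = g(X, v) and the skew-symmetry g(φX, Y) = −g(X, φY).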
Theorem 1.
The statistical curvature tensor field of a statistical manifold with an almost contact metric structure , such that the vector field v is Killing, satisfies the equation
for any .
Proof.
Applying (9) to the above equation, we find
Since v is Killing, by differentiating
with respect to , we obtain
Setting the last equation in (20), it follows that
As , and using (12) in the above equation, we can obtain
Similarly, we find
An almost contact metric manifold is called a nearly Sasakian manifold if
for any [7]. In such manifolds, the vector field v is Killing. Moreover, a tensor field h of type is determined by
The last equation immediately shows that h is skew-symmetric and
and
Moreover, Olszak proved the following formulas in [9]:
for any .
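For later use we record, with ∇⁰ the Levi–Civita connection and in the sign conventions that are common in the literature on nearly Sasakian manifolds (see [7,9]; these conventions are assumed here), the defining condition and the tensor h:
\[
(\nabla^{0}_X\varphi)Y+(\nabla^{0}_Y\varphi)X=2g(X,Y)v-u(X)Y-u(Y)X,
\qquad
hX=\nabla^{0}_X v+\varphi X,
\]
so that h is skew-symmetric, hv = 0, u∘h = 0, hφ + φh = 0, and h vanishes exactly when the structure is Sasakian.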
Lemma 1.
For a manifold N with a statistical structure , and an almost contact metric structure , the following holds
for any .
Definition 1.
A nearly Sasakian statistical structure on N is a quintuple consisting of a statistical structure and a nearly Sasakian structure satisfying
for any .
A nearly Sasakian statistical manifold is a manifold that admits a nearly Sasakian statistical structure.
Remark 1.
The tuple is also a nearly Sasakian statistical manifold if is a nearly Sasakian statistical manifold. In this case, from Lemma 1 and Definition 1, we have
for any .
Theorem 2.
If is a statistical manifold and an almost contact metric structure on N, then is a nearly Sasakian statistical structure on N if and only if the following formulas hold:
for any .
Proof.
Example 1.
Let us consider the three-dimensional unit sphere in the complex two-dimensional space . As is isomorphic to the Lie group , set as the basis of the Lie algebra of obtained by
Therefore, the Lie bracket is described by
The Riemannian metric g on is defined by the following
Assume that and u is the 1-form described by for any . Considering as a -tensor field determined by and , the above equations imply that is an almost contact metric manifold. Using Koszul’s formula, it follows that , except
According to the above equations, we can see that
unless
which gives , a nearly Sasakian structure on . By setting
while the other cases are zero, one sees that K satisfies (8). From (6), it follows that
Therefore, we can obtain , except
Hence, is a statistical structure on . Moreover, the equations
hold. Therefore, is a nearly Sasakian statistical manifold.
Proposition 1.
For a nearly Sasakian statistical manifold , the following conditions hold:
for any .
Proof.
Putting in the last equation and using (18), we can obtain
Applying yields
Corollary 1.
A nearly Sasakian statistical manifold satisfies the following
for any .
Proof.
Similarly,
Then, subtracting the above two equations yields
which gives us . Thus, we obtain
Moreover, (iii) implies
Therefore, the assertion follows. □
Corollary 2.
In a nearly Sasakian statistical manifold N, let and . Then,
- 1.
- ,
- 2.
- .
Proposition 2.
On a nearly Sasakian statistical manifold, the following holds
for any .
Proof.
Since v is a Killing vector field in a nearly Sasakian manifold (see [7]), we have
Setting (6) in the above equation, we have the following assertion. □
Lemma 2.
Let be a nearly Sasakian statistical manifold. Then, the statistical curvature tensor field satisfies
for any .
Proof.
Applying (17) in the above equation, we have
We can similarly conclude that
The above two equations imply
from this and Theorem 1, we have
Thus, the assertion follows from (25), (32) and Corollary 1. □
Corollary 3.
On a nearly Sasakian statistical manifold N, the following holds
for any .
Proof.
We have
Applying Lemma 2 to the last equation, (33) follows. To prove (34), using and in the above equation together with the skew-symmetry of h, we obtain
□
Proposition 3.
The statistical curvature tensor field S of a nearly Sasakian statistical manifold N satisfies the following
for any .
Proof.
Using the above equation in (38), we obtain (35). Considering in (35) and using (18), it follows that
Similarly, setting , and , respectively, we have
and
Replacing and by and , we can rewrite the last equation as
Applying (34) in the above equation, we obtain
On the other hand, using (18), it can be seen that
According to Corollary 1 and (32), we have
The above three equations imply (36). □
Corollary 4.
The tensor field K in a nearly Sasakian statistical manifold, N, satisfies the relation
for any .
Proof.
Comparing this with relation (36) yields the following assertion. □
A statistical manifold is called conjugate symmetric if the curvature tensors of ∇ and its dual connection are equal, i.e.,
for all .
Corollary 5.
Let be a conjugate symmetric nearly Sasakian statistical manifold. Then, the following holds
for any .
4. Hypersurfaces in Nearly Kähler Statistical Manifolds
Let be a smooth manifold. A pair is said to be an almost Hermitian structure on if
for any . Let denote the Riemannian connection of . Then, J is Killing if and only if
In this case, the pair is called a nearly Kähler structure and if J is integrable, the structure is Kählerian [7].
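In formulas, with the usual conventions (assumed here) and ∇̃ the Riemannian connection above, the almost Hermitian and nearly Kähler conditions read
\[
J^{2}=-\,\mathrm{Id},
\qquad
g(JX,JY)=g(X,Y),
\qquad
(\widetilde{\nabla}_X J)Y+(\widetilde{\nabla}_Y J)X=0
\ \Longleftrightarrow\
(\widetilde{\nabla}_X J)X=0,
\]
for all vector fields X, Y; the Kähler case corresponds to ∇̃J = 0.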
Lemma 3.
Let be a statistical structure, and a nearly Kähler structure on . We have the following formula:
for any , where is given as (8) for .
Remark 2.
The tuple is also a nearly Kähler statistical manifold if is a nearly Kähler statistical manifold. In this case, from the above lemma, we have
for any .
Definition 2.
A nearly Kähler statistical structure on is a triple , where is a statistical structure, is a nearly Kähler structure on and the following equality is satisfied
for any .
Let N be a hypersurface of a statistical manifold . Considering and g as a unit normal vector field and the induced metric on N, respectively, the following relations hold
for any . It follows that
Furthermore, the second fundamental form is related to the Levi–Civita connections and by
where .
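As a sketch of the relations involved (our notation: a tilde for the ambient connections, ν for the unit normal and h, h* for the second fundamental forms associated with ∇ and ∇*), the Gauss formulas for the hypersurface and their Levi–Civita average can be written as
\[
\widetilde{\nabla}_X Y=\nabla_X Y+h(X,Y)\,\nu,
\qquad
\widetilde{\nabla}^{*}_X Y=\nabla^{*}_X Y+h^{*}(X,Y)\,\nu,
\qquad
\widetilde{\nabla}^{0}_X Y=\nabla^{0}_X Y+\tfrac{1}{2}\bigl(h(X,Y)+h^{*}(X,Y)\bigr)\nu,
\]
for tangent vector fields X, Y; the third formula follows by averaging the first two.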
Remark 3.
Let be a nearly Kähler manifold, and N be a hypersurface with a unit normal vector field . Let g be the induced metric on N, and consider v, u and as a vector field, a 1-form and a tensor of type on N, respectively, such that
for any . Then, is an almost contact metric structure on N [7].
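Concretely, up to sign conventions and writing ν for the unit normal, the induced structure of Remark 3 is obtained from the decomposition of J:
\[
JX=\varphi X+u(X)\,\nu,
\qquad
J\nu=-\,v,
\qquad
u(X)=g(X,v),
\]
for every tangent vector field X on N; one checks directly that φ² = −Id + u⊗v and g(φX, φY) = g(X, Y) − u(X)u(Y).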
Lemma 4.
Let be a nearly Kähler statistical manifold. If is a hypersurface with the induced almost contact metric structure as in Remark 3, and is the induced statistical structure on N as in (42), then the following holds
- (i)
- (ii)
- (iii)
- (iv)
- (v)
- (vi)
for any . For the induced statistical structure on N, we have
- (i)*
- (ii)*
- (iii)*
- (iv)*
- (v)*
- (vi)*
Proof.
According to Definition 2 and (46), we can write
The vanishing of the tangential part yields
Setting in the above equation, it follows that
hence, and imply (i), from which (ii) follows because . From (49) and (50), we have (iii). Considering the vanishing of the normal part in (48), and using and
we obtain (iv). As
thus, (43), (44), (46) and (47) imply
From the above equation, (v) and (vi) follow. In a similar fashion, we have –. □
Theorem 3.
Proof.
Let be a nearly Sasakian statistical structure on N. According to Definition 1, we have
which gives us
Substituting the last equation into part (iii) of Lemma 4, we obtain (51). Similarly, we can prove (52). Conversely, let the shape operators satisfy (51). Part (v) of Lemma 4 yields
In the same way, (v) and (52) imply
According to the above equations and Theorem 2, the proof is completed. □
5. Submanifolds of Nearly Sasakian Statistical Manifolds
Let N be an n-dimensional submanifold of an almost contact metric statistical manifold . We denote the induced metric on N by g. For all and , we put and , where and . If and for any , then N is called -invariant and -anti-invariant, respectively.
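In a common notation (assumed here), the tangential and normal parts of φ on the submanifold are written as
\[
\varphi X=PX+FX,
\qquad
\varphi\zeta=t\zeta+f\zeta,
\]
where X is tangent and ζ is normal to N, PX and tζ denote tangential parts and FX and fζ normal parts; N is then invariant precisely when F = 0 and anti-invariant precisely when P = 0.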
Proposition 4
([10]). Any -invariant submanifold N embedded in an almost contact metric manifold in such a way that the vector field is always tangent to N induces an almost contact metric structure .
For any , the corresponding Gauss formulas are given by
It is proved that and are statistical structures on N, and and are symmetric and bilinear. The mean curvature vector field with respect to is described by
The submanifold N is a totally umbilical submanifold if for all . The submanifold N is called -autoparallel if for any . The submanifold N is said to be dual-autoparallel if it is both - and -autoparallel, i.e., for any . If for any , the submanifold N is called totally geodesic. Moreover, the submanifold N is called -minimal (-minimal) if ().
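As an illustration of the quantities just listed (our notation: a tilde for the ambient connections, σ and σ* for the second fundamental forms associated with ∇ and ∇*, and n = dim N), the Gauss formulas and the mean curvature vector fields can be sketched as
\[
\widetilde{\nabla}_X Y=\nabla_X Y+\sigma(X,Y),
\qquad
\widetilde{\nabla}^{*}_X Y=\nabla^{*}_X Y+\sigma^{*}(X,Y),
\qquad
H=\frac{1}{n}\,\mathrm{trace}_g\,\sigma,
\qquad
H^{*}=\frac{1}{n}\,\mathrm{trace}_g\,\sigma^{*};
\]
in these terms, N is totally umbilical when σ(X, Y) = g(X, Y)H, totally geodesic when σ = σ* = 0, and ∇-minimal (respectively, ∇*-minimal) when H = 0 (respectively, H* = 0).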
For any and , the Weingarten formulas are
where D and are the normal connections on , and the tensor fields , , A and satisfy
The Levi–Civita connections and are associated with the second fundamental form by
where .
On a statistical submanifold of a statistical manifold , for any tangent vector fields , we consider the difference tensor K on N as
More precisely, for the tangential part and the normal part, we have
respectively. Similarly, for and we have
where
Now, suppose that is a submanifold of a nearly Sasakian statistical manifold . As a tensor field of type on is described by , we can set and , where and , for any and . Furthermore, if and , then N is called -invariant and -anti-invariant, respectively.
Proposition 5.
Let N be a submanifold of a nearly Sasakian statistical manifold , where the vector field is normal to N. Then,
Moreover,
(i) N is a -anti-invariant submanifold if and only if N is a -anti-invariant submanifold.
(ii) If , then N is a -anti-invariant submanifold.
(iii) If N is a -invariant and -invariant submanifold, then , for any .
Proof.
Using (22) and Proposition 1 for any , we can write
(54) and the above equation imply
As is symmetric and the operators and g are skew-symmetric, the above equation yields
Lemma 5.
Let be a -anti-invariant statistical submanifold of a nearly Sasakian statistical manifold such that the structure on N is given by Proposition 4.
(i) If is tangent to N, then
(ii) If is normal to N, then
Proof.
Thus, the normal part is and the tangential part is . Similarly, we can obtain their dual parts. Hence, (i) holds. If is normal to N, from (22) and (54), it follows that
Considering the normal and tangential components of the last equation, we obtain (ii). Since , we have the dual part of the assertion. □
Lemma 6.
Let be a -invariant and -invariant statistical submanifold of a nearly Sasakian statistical manifold . Then, for any , if
(i) is tangent to N, then
(ii) is normal to N, then
Proof.
The relations are proved using the method applied to the proof of Lemma 5. □
Theorem 4.
On a nearly Sasakian statistical manifold , if N is a -anti-invariant totally umbilical statistical submanifold of and is tangent to N, then N is -minimal in .
Proof.
According to Lemma 5, . As N is a totally umbilical submanifold, it follows that
which gives us the assertion. □
Theorem 5.
Let N be a -invariant submanifold of a nearly Sasakian statistical manifold , where the vector field is tangent to N. If
for all , then forms a nearly Sasakian statistical structure on N.
Proof.
According to Proposition 4, N induces the almost contact metric structure . Furthermore, (53) shows that is a statistical structure on N. By applying (55), we can write
As is symmetric, from (59), we have . Hence, the above equation implies
On the other hand, since has a nearly Sasakian structure, we have
(59) and the above two equations yield
Thus, is a nearly Sasakian manifold. For the nearly Sasakian statistical manifold , using (27), we have
for any . Applying (57) to the last equation, it follows that
From the above equation and (60), we obtain
Therefore, is a nearly Sasakian statistical manifold. Hence, the proof is completed. □
Proposition 6.
Let N be a -invariant and -invariant statistical submanifold of a nearly Sasakian statistical manifold , such that is tangent to N. Then,
and
for any .
Proof.
We have
for any . According to Proposition 1, part (i) of Lemma 6 and the above equation, we have
Similarly, other parts are obtained. □
Corollary 6.
Let N be a -invariant and -invariant statistical submanifold of a nearly Sasakian statistical manifold . If is tangent to N, then the following conditions are equivalent:
(i) and are parallel with respect to the connection ;
(ii) N is dual-autoparallel.
Author Contributions
Writing—original draft, S.U., E.P. and L.N.; Writing—review and editing, R.B. All authors have read and agreed to the published version of the manuscript.
Funding
This research work was funded by Institutional Fund Projects under grant no. (IFPIP: 1184-130-1443). The authors gratefully acknowledge the technical and financial support provided by the Ministry of Education and King Abdulaziz University, DSR, Jeddah, Saudi Arabia.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Sun, K.; Marchand-Maillet, S. An information geometry of statistical manifold learning. In Proceedings of the 31st International Conference on Machine Learning (ICML-14), Beijing, China, 21–26 June 2014; pp. 1–9.
- Amari, S. Information geometry of the EM and em algorithms for neural networks. Neural Netw. 1995, 8, 1379–1408.
- Belkin, M.; Niyogi, P.; Sindhwani, V. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. J. Mach. Learn. Res. 2006, 7, 2399–2434.
- Caticha, A. Geometry from information geometry. arXiv 2015, arXiv:1512.09076v1.
- Fisher, R.A. On the mathematical foundations of theoretical statistics. Philos. Trans. R. Soc. Lond. 1922, 222, 309–368.
- Gray, A. Nearly Kähler manifolds. J. Differ. Geom. 1970, 4, 283–309.
- Blair, D.E.; Showers, D.K.; Yano, K. Nearly Sasakian structures. Kodai Math. Semin. Rep. 1976, 27, 175–180.
- Blair, D.E. Riemannian Geometry of Contact and Symplectic Manifolds; Birkhäuser: Basel, Switzerland, 2002.
- Olszak, Z. Nearly Sasakian manifolds. Tensor 1979, 33, 277–286.
- Yano, K.; Ishihara, S. Invariant submanifolds of almost contact manifolds. Kōdai Math. Semin. Rep. 1969, 21, 350–364.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).