OPTIMIZATIONS ON STATISTICAL HYPERSURFACES WITH CASORATI CURVATURES

In the present paper, we study Casorati curvatures for statistical hypersurfaces. We show that the normalized scalar curvature of any real hypersurface (i.e., statistical hypersurface) of a holomorphic statistical manifold of constant holomorphic sectional curvature k is bounded above by the generalized normalized δ-Casorati curvatures, and we also consider the equality case of the inequality. Some immediate applications are discussed.


Introduction
In 1985, the notion of a statistical manifold was studied by Amari [1]. Statistical manifolds arise as abstract generalizations of statistical models. The geometry of statistical manifolds lies at the junction of several branches of geometry (information geometry, affine differential geometry and Hessian geometry). A statistical structure can be regarded as a generalization of a Riemannian structure (a pair consisting of a Riemannian metric and its Levi-Civita connection); it includes the notion of a dual connection, also called the conjugate connection. The theory of statistical manifolds and their statistical submanifolds plays a role of central importance in many research fields of differential geometry.
Recently, H. Furuhata investigated the existence of complex structures on statistical manifolds and introduced the concept of a holomorphic statistical manifold as the statistical counterpart of the notion of a complex manifold (see [11,12]). Similarly, by equipping a Sasakian manifold and a Kenmotsu manifold with a natural affine connection, Furuhata defined Sasakian statistical manifolds [13] and Kenmotsu statistical manifolds [14]. The theory of statistical manifolds and their statistical submanifolds is a relatively recent field of geometry; it has therefore attracted many geometers, and several interesting results have been obtained (see, for example, [3-5, 21, 22, 26, 28]).
The Casorati curvature was defined by F. Casorati [6] as the normalized squared length of the second fundamental form of a submanifold of a Riemannian manifold. This notion extends the concept of the principal directions of a hypersurface of a Riemannian manifold. This curvature, which is of interest in computer vision, was preferred by Casorati over the traditional curvatures because it seems to correspond better with the common intuition of curvature. Several geometers have found geometric interpretations and significance of the (extrinsic) Casorati curvatures. It is therefore of great interest to establish families of optimal Casorati inequalities for submanifolds of arbitrary codimension of different ambient space forms (see, for example, [9,10,15,16,18,19,24,25,27]). In this paper, we obtain a family of optimal inequalities relating the normalized scalar curvature to the Casorati curvatures for statistical hypersurfaces of holomorphic statistical manifolds of constant holomorphic sectional curvature; the equality cases are also verified. Such inequalities were recently obtained for statistical submanifolds, of which statistical hypersurfaces are obviously a particular class; see, for instance, [2,8,17,20]. We mention that the ambient spaces in the above-mentioned articles, namely a quaternion Kaehler-like statistical space form, a Kenmotsu statistical manifold, a statistical manifold, and a Sasakian statistical manifold, respectively, differ from the ambient space considered in our work, that is, a holomorphic statistical manifold of constant holomorphic sectional curvature.

Statistical Manifold and its Submanifolds
This section is devoted to a brief review of several fundamental formulae and definitions which are required later.

Definition 2.1 ([12]). Let ∇ be an affine connection on a Riemannian manifold (M, g) with Riemannian metric g.
(a) The affine connection ∇* on M defined by
Xg(Y, Z) = g(∇_X Y, Z) + g(Y, ∇*_X Z),
for any X, Y, Z ∈ Γ(TM), is known as the dual connection of ∇ with respect to g.
(b) The triplet (M, ∇, g) is known as a statistical manifold if the torsion tensor field of ∇ vanishes and ∇g ∈ Γ(TM^{(0,3)}) is symmetric.
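For illustration (a standard fact, not stated explicitly above), the simplest statistical structure is the trivial one induced by the Levi-Civita connection ∇^g of g:
\[
X g(Y,Z) \;=\; g(\nabla^{g}_{X}Y,\, Z) + g(Y,\, \nabla^{g}_{X}Z), \qquad X, Y, Z \in \Gamma(TM),
\]
so that (\nabla^{g})^{*} = \nabla^{g} and \nabla^{g} g = 0 is trivially symmetric; hence (M, \nabla^{g}, g) is a statistical manifold. More generally, if (M, \nabla, g) is a statistical manifold, then so is (M, \nabla^{*}, g), and \nabla^{g} = \tfrac{1}{2}(\nabla + \nabla^{*}).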
We remark that one can also construct higher-dimensional examples by defining the Fisher information metric and the α-connection on a family of statistical distributions (see, for example, [12]). For a holomorphic statistical manifold (M, ∇, g, J), we have the following relation (see [12]): ∇_X(JY) = J∇*_X Y for any X, Y ∈ Γ(TM).

Lemma 2.1 ([11]). Let (M, g, J) be a Kaehler manifold and let a connection ∇ be defined as ∇ := ∇^g + K, where ∇^g is the Levi-Civita connection of g and K is a (1, 2)-tensor field satisfying, for any X, Y ∈ Γ(TM), three conditions (a presumable form of these conditions is sketched below). Then (M, ∇ := ∇^g + K, g, J) is a holomorphic statistical manifold.

Example 2.2. For a Kaehler manifold (M, g, J), a suitable (1, 2)-tensor field K_1 can be chosen; by a simple computation, K_1 satisfies the three conditions of Lemma 2.1, and hence a holomorphic statistical manifold (M, ∇ := ∇^g + K_1, g, J) is obtained.

Example 2.3 ([26]). For a Kaehler manifold (M, g, J), we take a vector field U ∈ Γ(TM) and define a (1, 2)-tensor field K_2 in terms of U, for any X, Y ∈ Γ(TM). Then K_2 ∈ Γ(TM^{(1,2)}) satisfies the three conditions of Lemma 2.1 as in Example 2.2, and hence (M, ∇ := ∇^g + K_2, g, J) becomes a holomorphic statistical manifold.
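A presumable form of the three conditions in Lemma 2.1, following the standard construction of holomorphic statistical structures in [11,12] (a reconstruction, to be checked against the original source), is:
\[
K_{X}Y = K_{Y}X, \qquad g(K_{X}Y, Z) = g(Y, K_{X}Z), \qquad K_{X}(JY) = -J K_{X}Y,
\]
for all X, Y, Z \in \Gamma(TM). Under these conditions, \nabla := \nabla^{g} + K is torsion-free, \nabla g is symmetric, and the Kaehler form \omega = g(\cdot, J\cdot) is \nabla-parallel, which is exactly what is required of a holomorphic statistical manifold (M, \nabla, g, J).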

Example 2.4 ([26]). Let us consider a Kaehler manifold M, where a Riemannian metric g and the standard complex structure J are given on M. Now, for any κ ∈ R, we define a (1, 2)-tensor field K_3 on R^2 by its components k^i_{jk}, where −k^1_{11} = k^2_{12} = k^2_{21} = k^1_{22} = κ and k^2_{11} = k^1_{12} = k^1_{21} = k^2_{22} = 0. Then K_3 satisfies all three conditions of Lemma 2.1, and hence we get a holomorphic statistical manifold (M, ∇ := ∇^g + K_3, g, J).

Now we pay attention to the concept of a statistical hypersurface. Let (M, g) be a statistical hypersurface of a holomorphic statistical manifold (M̄, ḡ, J). By means of the Kaehler structure J, one can decompose any tangent vector field X on M in M̄ as JX = PX + u(X)N, where PX = tan(JX) and N is a unit normal vector field on M in M̄.
This decomposition naturally satisfies a number of relations (see [12]); a sketch of them is given after this paragraph.

The fundamental equations in the geometry of Riemannian submanifolds are the Gauss and Weingarten formulae and the equations of Gauss, Codazzi and Ricci (see [29]). In the statistical setting, the Gauss and Weingarten formulae are, respectively, given by [12]
∇̄_X Y = ∇_X Y + ς(X, Y),   ∇̄*_X Y = ∇*_X Y + ς*(X, Y),
∇̄_X N = −Λ_N X + D_X N,   ∇̄*_X N = −Λ*_N X + D*_X N,
where ∇ and ∇* (resp. ∇̄ and ∇̄*) are the dual connections on M (resp. on M̄). Define ν and ν* by ν(X) = g(D_X N, N) and ν*(X) = g(D*_X N, N), respectively. The symmetric and bilinear imbedding curvature tensors of M in M̄ for ∇̄ and ∇̄* are denoted by ς and ς*, respectively. The relation between ς (resp. ς*) and Λ (resp. Λ*) is given by [12]
g(ς(X, Y), N) = g(Λ*_N X, Y),   g(ς*(X, Y), N) = g(Λ_N X, Y),
for any X, Y ∈ Γ(TM) and N ∈ Γ(T⊥M).
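Writing ξ := −JN for the induced tangent vector field (a notational assumption on our part), the relations following from the decomposition JX = PX + u(X)N presumably read:
\[
P^{2}X = -X + u(X)\,\xi, \qquad u(PX) = 0, \qquad P\xi = 0, \qquad u(\xi) = 1,
\]
\[
u(X) = g(X, \xi), \qquad g(PX, Y) + g(X, PY) = 0, \qquad X, Y \in \Gamma(TM).
\]
These identities follow directly by applying J to the decomposition and comparing tangential and normal parts.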

Definition 2.4 ([5]). Let (M, ∇, g) be a submanifold of arbitrary codimension of a statistical manifold (M̄, ∇̄, ḡ). Then M is said to be
(a) totally geodesic with respect to ∇̄ if ς = 0;
(a)* totally geodesic with respect to ∇̄* if ς* = 0.

The curvature tensors with respect to ∇ and ∇* are denoted by R and R*, respectively; also, R̄ and R̄* are the curvature tensors with respect to ∇̄ and ∇̄*, respectively. Then the statistical curvature tensor fields of M and M̄ are respectively defined as (see [12])
S = (1/2)(R + R*)   and   S̄ = (1/2)(R̄ + R̄*).
The sectional curvature K on M is defined accordingly (see [21,22]) for any orthonormal vectors X, Y ∈ T_℘M, ℘ ∈ M; a presumable form is recalled below.
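Consistently with the definition S = (1/2)(R + R*), the sectional curvature referred to above presumably takes the form (a reconstruction, to be checked against [21,22]):
\[
K(X \wedge Y) \;=\; g\big(S(X,Y)Y,\, X\big) \;=\; \tfrac{1}{2}\Big[\, g\big(R(X,Y)Y, X\big) + g\big(R^{*}(X,Y)Y, X\big) \Big],
\]
for any orthonormal vectors X, Y \in T_{\wp}M, \wp \in M.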

Definition 2.5 ([12]). A holomorphic statistical manifold (M̄, ∇̄, ḡ, J) is said to be of constant holomorphic curvature k ∈ R if its statistical curvature tensor field S̄ satisfies the corresponding curvature equation. The associated Gauss equation is given in [12] for any X, Y, Z ∈ Γ(TM); a presumable form of both equations is recalled below.
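Presumably, following the standard formulation in [12], the two equations take the forms below (a reconstruction, with W an additional tangent vector field on M):
\[
\bar{S}(X,Y)Z \;=\; \frac{k}{4}\Big\{ \bar{g}(Y,Z)X - \bar{g}(X,Z)Y + \bar{g}(JY,Z)JX - \bar{g}(JX,Z)JY + 2\,\bar{g}(X,JY)JZ \Big\}
\]
for the constant holomorphic curvature condition, and
\[
\bar{g}\big(\bar{R}(X,Y)Z,\, W\big) \;=\; g\big(R(X,Y)Z,\, W\big) + \bar{g}\big(\varsigma(X,Z),\, \varsigma^{*}(Y,W)\big) - \bar{g}\big(\varsigma^{*}(X,W),\, \varsigma(Y,Z)\big)
\]
for the statistical Gauss equation, together with the analogous equation for \bar{R}^{*} and R^{*} obtained by interchanging \varsigma and \varsigma^{*}.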

Casorati Curvatures for Statistical Hypersurfaces
In this section, we study Casorati curvatures for a statistical hypersurface M of a holomorphic statistical manifold M̄, and we first recall the relevant curvature quantities, beginning with the normalized scalar curvature of M.
The mean curvature vectors H and H* of M in M̄ are defined in the usual way. Conveniently, let us put 2ς^0 = ς + ς* and 2H^0 = H + H*, where ς^0 and H^0 are the second fundamental form and the mean curvature vector of M with respect to the Levi-Civita connections; the squared norms of the mean curvature vectors of M are considered accordingly. The normalized squared norms of the second fundamental forms ς and ς*, denoted by C and C* respectively, are called the Casorati curvatures of M in M̄. If we consider an r-dimensional subspace W of TM, r ≥ 2, with an orthonormal basis {E_1, . . . , E_r}, then the scalar curvature of the r-plane section W and the Casorati curvatures C(W) and C*(W) of the subspace W are defined analogously. The normalized δ-Casorati curvatures δ_C(m − 1) and δ*_C(m − 1) and, further, the generalized normalized δ-Casorati curvatures δ_C(s; m − 1) and δ*_C(s; m − 1) are then introduced; a presumable explicit form of all these quantities is sketched below. Throughout this paper, we work with the above-mentioned notations only.
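Following the usual conventions in the Casorati-curvature literature (cf. [9]), the quantities introduced above presumably take the following form; the symbols τ and ρ for the (normalized) scalar curvature, as well as the component notation, are our own assumption. For an orthonormal basis \{E_1, \dots, E_m\} of T_{\wp}M and \varsigma_{ij} = \bar{g}(\varsigma(E_i, E_j), N), \varsigma^{*}_{ij} = \bar{g}(\varsigma^{*}(E_i, E_j), N):
\[
\tau(\wp) = \sum_{1 \le i < j \le m} K(E_i \wedge E_j), \qquad \rho = \frac{2\tau}{m(m-1)},
\]
\[
C = \frac{1}{m}\sum_{i,j=1}^{m}(\varsigma_{ij})^{2}, \qquad C^{*} = \frac{1}{m}\sum_{i,j=1}^{m}(\varsigma^{*}_{ij})^{2}, \qquad C(W) = \frac{1}{r}\sum_{i,j=1}^{r}(\varsigma_{ij})^{2}
\]
for an r-dimensional subspace W spanned by E_1, \dots, E_r, and
\[
[\delta_{C}(m-1)]_{\wp} = \tfrac{1}{2}\, C_{\wp} + \tfrac{m+1}{2m}\, \inf\{C(W) : W \text{ a hyperplane of } T_{\wp}M\},
\]
\[
[\delta_{C}(s; m-1)]_{\wp} = s\, C_{\wp} + \frac{(m-1)(m+s)(m^{2}-m-s)}{s\, m}\, \inf\{C(W) : W \text{ a hyperplane of } T_{\wp}M\}, \quad 0 < s < m(m-1),
\]
with \delta^{*}_{C}(m-1) and \delta^{*}_{C}(s; m-1) defined analogously by means of C^{*} and C^{*}(W), and with \inf replaced by \sup (yielding the corresponding hatted curvatures) when s > m(m-1).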

Bounds of Normalized Scalar Curvature
One of the most fascinating problems in the theory of Riemannian submanifolds is to find simple relationships between the various intrinsic and extrinsic invariants of a submanifold. Initially, B.-Y. Chen [7] obtained sharp optimal inequalities involving his intrinsic δ-curvatures and the extrinsic squared mean curvature of submanifolds in a real space form. On the other hand, the study of δ-Casorati curvatures [9] proposed new solutions to this problem. In this section, we prove such inequalities for a statistical hypersurface (M^m, ∇, g) of a holomorphic statistical manifold (M̄^{2n}, ∇̄, ḡ, J) of constant holomorphic sectional curvature k, denoted by M̄^{2n}(k).
Applying the Cauchy-Buniakowski-Schwarz inequality (a presumable form of this step is sketched below), from the resulting inequality we easily obtain (4.1). This is the required inequality.
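A typical application of the Cauchy-Buniakowski-Schwarz inequality at this stage, in the notation above (not necessarily the precise chain of estimates used in the original proof), is the following sketch, using the identification m H = \big(\sum_{i} \varsigma_{ii}\big) N for a hypersurface:
\[
m^{2}\,\|H\|^{2} \;=\; \Big(\sum_{i=1}^{m} \varsigma_{ii}\Big)^{2} \;\le\; m \sum_{i=1}^{m} (\varsigma_{ii})^{2} \;\le\; m \sum_{i,j=1}^{m} (\varsigma_{ij})^{2} \;=\; m^{2}\, C,
\]
so that \|H\|^{2} \le C, and similarly \|H^{*}\|^{2} \le C^{*}.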
Theorem 4.1 shows that the normalized scalar curvature is bounded below. We now turn to our next theorem, which shows that the normalized scalar curvature is bounded above in terms of the Casorati curvatures. The result is as follows.
Let us consider a quadratic polynomial K in the components of the second fundamental form. Without loss of generality, we assume that W is spanned by E_1, . . . , E_{m−1}; together with (4.4), we find the expression (4.5). From (4.5), we observe that the critical points of K are the solutions of the system (4.6) of linear homogeneous equations. Hence, every solution ς^0_c has ς^0_{ij} = 0 for i ≠ j, and the determinant corresponding to the first two equations of the above system is zero. Furthermore, the Hessian matrix Hess(K) of K is given by (4.8), where O denotes the null matrices and I, II and III are the remaining blocks; the eigenvalues of the Hessian matrix Hess(K) then follow. Thus, K is parabolic and reaches a minimum K(ς^0_c) for each solution ς^0_c of the system (4.6). From equations (4.5) and (4.6), we arrive at K(ς^0_c) = 0. Hence K ≥ 0, and this further gives the desired inequality for every tangent hyperplane W of M. Taking the infimum over all tangent hyperplanes W, our assertion (4.2) follows.
In the same manner, we can establish the inequality (4.3) in the second part of the theorem.
Remark 4.1. The proof of Theorem 4.2 is mainly based on a classical optimization procedure, namely showing that a quadratic polynomial in the components of the second fundamental form ς^0 with respect to the Levi-Civita connection is parabolic (see [15,16,18,24,27]). Indeed, we have proved that the Hessian matrix (4.8) is positive semidefinite at every point and admits precisely one eigenvalue equal to zero; therefore K is parabolic and reaches a minimum K(ς^0_c) for each solution ς^0_c of the system (4.6). In fact, because of the convexity, each critical point is a global minimum. We note that an alternative proof of Theorem 4.2 can be given by making use of T. Oprea's optimization technique [23], namely by analyzing a suitable constrained extremum problem (see also [8,19,25]).
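As a toy illustration of this parabolicity argument (our own example, not taken from the paper), consider the quadratic polynomial
\[
f(x,y) = (x-y)^{2}, \qquad \operatorname{Hess} f = \begin{pmatrix} 2 & -2 \\ -2 & 2 \end{pmatrix},
\]
whose Hessian is positive semidefinite with eigenvalues 4 and 0: f is strictly convex transversally to the line x = y and constant along it, so every critical point lies on that line and is a global minimum with f = 0. The polynomial K in the proof above behaves in the same way with respect to the solution set of (4.6).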
The characterisation of the equality cases in Theorem 4.2.

Some Geometric Applications
In this section, we discuss some immediate applications of the results proved in the previous section. Some consequences of Theorem 4.2 are the following.