g-Expectation for Conformable Backward Stochastic Differential Equations

In this paper, we study applications of conformable backward stochastic differential equations driven by Brownian motion and a compensated random measure to nonlinear expectation. Using the comparison theorem, we introduce the concept of g-expectation and establish its basic properties. In addition, we find that properties of conformable backward stochastic differential equations can be deduced from properties of the generator g. Finally, we extend the nonlinear Doob–Meyer decomposition theorem to more general cases.


Introduction
The initial motivation for studying nonlinear expectations came from risk measurement and option pricing in financial applications. The Allais paradox, the Ellsberg paradox, Simon's "bounded rationality" theory, and related findings all show that real-world decision-making contradicts the hypotheses of expected utility theory. Economists found that the linearity of the classical mathematical expectation (that is, the additivity of probability measures) is the main source of this discrepancy, so researchers sought a new tool that not only retains some properties of the classical mathematical expectation but can also handle financial problems with highly dynamic and complex characteristics.
In the 1950s, Choquet [1] extended the Lebesgue integral to non-additive measures and obtained the Choquet expectation. However, this nonlinear expectation is not dynamically consistent and is therefore unsuitable for practical financial problems. In 1997, Peng [2] introduced a new nonlinear expectation, the g-expectation, based on the backward stochastic differential equation driven by Brownian motion. The g-expectation retains all the basic properties of the classical expectation except linearity [3], and it can be applied to dynamic risk measurement in actuarial science and financial valuation. Subsequently, Royer [4] studied the backward stochastic differential equation driven by Brownian motion and a Poisson random measure and introduced the corresponding g-expectation; a large body of work shows that this g-expectation applies to financial problems (see [5][6][7][8][9]). Recently, Long et al. [10] proposed a multi-step scheme on time-space grids for solving backward stochastic differential equations, and Chen and Ye [11] investigated solutions of backward stochastic differential equations in the framework of a Riemannian manifold. The paper [12] establishes an averaging principle for backward stochastic differential equations: under appropriate assumptions, the solutions can be approximated in the mean-square sense by the solutions of averaged stochastic systems.
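For orientation, the construction Peng [2] introduced can be summarized as follows; this is the standard Brownian formulation (in our notation), not the conformable equation studied later in this paper:

```latex
% Classical g-expectation (Peng [2], standard formulation):
% the pair (X, Z) solves the Brownian BSDE
X(t) = \zeta + \int_t^T g\bigl(s, X(s), Z(s)\bigr)\,\mathrm{d}s
            - \int_t^T Z(s)\,\mathrm{d}B(s), \qquad 0 \le t \le T,
% and the g-expectation and conditional g-expectation of \zeta are
\mathcal{E}_g[\zeta] := X(0), \qquad
\mathcal{E}_g[\zeta \mid \mathcal{F}_t] := X(t).
```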
In addition, coupled forward-backward stochastic differential equations driven by G-Brownian motion were studied in [13], while [14] investigated the solvability of fully coupled forward-backward stochastic differential equations with irregular coefficients.
The papers above concern integer-order derivatives, while works on conformable-type derivatives are still few ([15][16][17][18][19][20]). The conformable derivative shares some properties with fractional derivatives and others with integer-order derivatives. We discussed the necessity of studying conformable backward stochastic differential equations in [21]. In the present paper, we study the g-expectation for conformable backward stochastic differential equations. The paper is organized as follows. In the second section, we give some definitions and theorems. In the third section, we study the relationship between the g-expectation and filtration-consistent expectations, and we give some properties of the g-expectation; we find that the g-expectation can be regarded as a nonlinear extension of the Girsanov transformation. In the final section, we prove the Doob–Meyer decomposition theorem under mild assumptions.
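For readers unfamiliar with the conformable calculus, the conformable derivative of order ρ ∈ (0, 1] can be stated as follows (this is the formulation of Khalil et al.; [21] gives the precise version used by the authors):

```latex
% Conformable derivative of order \rho \in (0,1] (Khalil et al.):
T_\rho f(t) = \lim_{\varepsilon \to 0}
  \frac{f\bigl(t + \varepsilon\, t^{1-\rho}\bigr) - f(t)}{\varepsilon},
  \qquad t > 0.
% For differentiable f this reduces to
T_\rho f(t) = t^{1-\rho} f'(t),
% so, unlike the Caputo or Riemann--Liouville derivatives,
% T_\rho obeys the product and chain rules of integer-order calculus.
```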

Preliminaries
Let B(·) be a standard Brownian motion defined on the complete probability space (Ω, F, P) with the filtration {F_t}_{0≤t≤T} satisfying the usual hypotheses of completeness and right continuity. B(R) denotes the Borel sets of R and E denotes the expected value. A stochastic process V(ω, t) is a real function defined on Ω × [a, T]. The natural filtration is completed with the sets of measure zero. By P we denote the predictable σ-field; a process V is called predictable if it is F-adapted and P-measurable. A process is called càdlàg if its trajectories are right-continuous and have left limits. The term a.s. means almost surely with respect to the probability measure P. Inspired by [22], we define the spaces H^2, H^2_N and S^2 that we will use; for any constant σ, we introduce the corresponding norms.

Definition 1. (see [2] (Definition 3.2)) A nonlinear expectation E is a filtration-consistent expectation (F-consistent expectation) if for any ζ ∈ L^2(Ω, F_T, P) and a ≤ t ≤ T, there exists a random variable ξ ∈ L^2(Ω, F_t, P) such that E[ζ1_A] = E[ξ1_A] for every A ∈ F_t, where ξ is uniquely defined. We denote ξ = E[ζ|F_t], which is called the conditional expectation of ζ with respect to F_t.

Lemma 1. (see [4] (Lemma A.1)) Let A(·) be an increasing predictable process, and consider its decomposition as a sum of a continuous and a purely discontinuous process, A(t) = A^c(t) + A^d(t). We also consider a càdlàg martingale W(·), bounded in L^2.
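The norms themselves do not survive in the text above; a standard choice in the jump-BSDE literature (following [4,22], with an exponential weight built from the constant σ — the authors' exact weights may differ) is:

```latex
% Assumed standard weighted norms on [a, T]:
\|Y\|_{H^2}^2   = \mathbb{E}\int_a^T e^{\sigma s}\,|Y(s)|^2\,\mathrm{d}s,
\qquad
\|Z\|_{H^2_N}^2 = \mathbb{E}\int_a^T\!\!\int_E e^{\sigma s}\,
                  |Z(s,e)|^2\,\nu(\mathrm{d}e)\,\mathrm{d}s,
\qquad
\|X\|_{S^2}^2   = \mathbb{E}\Bigl[\sup_{a \le t \le T}
                  e^{\sigma t}\,|X(t)|^2\Bigr].
```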
(i) For any stopping time τ such that a ≤ τ ≤ T, and (ii) for any predictable stopping time τ such that a ≤ τ ≤ T, the identities stated in [4] (Lemma A.1) hold.
Define the following two stopping times. Then we get σ̃ ≤ τ̃ ≤ T. Since X(t) − X_n(t) is right continuous, we have:
Suppose X̃(t) is the solution with the terminal value X̃(τ̃) = X_n(τ̃) on [a, τ̃]. From (2) and the comparison theorem, we get X_n(σ̃) ≤ X̃(σ̃). This is a contradiction. Thus, X(t) ≥ X_n(t).

The Main Results of g-Expectations
Consider the following conformable backward stochastic differential equation, where ζ ∈ L^2(Ω, F_T, P), X is an adapted process, Y and Z are control processes, B is a given Brownian motion and N is a compensated random measure.
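Equation (3) itself is not reproduced here; based on the conformable integral of order ρ ∈ (0, 1], which carries the weight u^(ρ−1), a plausible form is the following sketch (the reader should consult [21] for the authors' exact formulation):

```latex
% Assumed form of the conformable BSDE (3), order \rho \in (0,1]:
X(t) = \zeta
  + \int_t^T u^{\rho-1}\, g\bigl(u, X(u), Y(u), Z(u)\bigr)\,\mathrm{d}u
  - \int_t^T u^{\rho-1}\, Y(u)\,\mathrm{d}B(u)
  - \int_t^T\!\!\int_E u^{\rho-1}\, Z(u,e)\,
      \widetilde{N}(\mathrm{d}u,\mathrm{d}e),
  \qquad a \le t \le T.
```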
where K is a positive constant.
(ii) For any z, z′ ∈ R, there exist constants −1 < C_1 ≤ 0 and C_2 ≥ 0 such that the stated inequality holds. Notice that the comparison theorems in [21], together with Definition 1, show that a nonlinear expectation can be defined by conformable backward stochastic differential equations.
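Only the constants of Assumption 1 survive in the text; the corresponding standard conditions in the jump-BSDE literature (cf. [4]), which we sketch here as an assumption about the intended statement, read:

```latex
% (i) Lipschitz continuity: for all x, x', y, y', z, z',
|g(t,x,y,z) - g(t,x',y',z')|
  \le K\bigl(|x - x'| + |y - y'| + \|z - z'\|\bigr);
% (ii) monotonicity in the jump component: there exists a predictable
% process \gamma with -1 < C_1 \le \gamma(e) \le C_2 such that
g(t,x,y,z) - g(t,x,y,z')
  \le \int_E \gamma(e)\,\bigl(z(e) - z'(e)\bigr)\,\nu(\mathrm{d}e).
```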
Definition 2. Suppose that the generator g of Equation (3) satisfies Assumption 1. We define the g-expectation as E_g[ζ] = X(a), where the triple (X, Y, Z) is the unique solution of Equation (3) and X(a) denotes the initial value of the solution.
Definition 3. For any a ≤ t ≤ T, suppose that the generator g of Equation (3) satisfies Assumption 1. We define the conditional g-expectation as E_g[ζ|F_t] = X(t), where the triple (X, Y, Z) is the unique solution of Equation (3) and ζ ∈ L^2(Ω, F_T, P) denotes the terminal value of the solution.

Proposition 1.
We have the following results. Case (ii): For any A ∈ F_s and a ≤ s ≤ t ≤ T, we have A ∈ F_t. From the result of (i), one has:
For any a ≤ t ≤ T and 0 < ρ ≤ 1, consider the following equations, where ζ ∈ L^2(Ω, F_T, P), X_1(u) = E_g[ζ1_A|F_t] and the generator g satisfies Assumption 1.
Multiplying both sides of (7) by 1_A, we get (8), where a ≤ t ≤ T and 0 < ρ ≤ 1. Notice that g(t, X(t), Y(t), Z(t))1_A = g(t, 1_A X(t), 1_A Y(t), 1_A Z(t)), and then (8) can be rewritten accordingly. By the uniqueness of the solution of the conformable backward stochastic differential equation, we get X_1(t) = X_3(t) = E_g[ζ1_A|F_t]1_A for a ≤ t ≤ T. From Definition 3 and Proposition 1, we have:
Next, we prove the uniqueness of ξ. Assume that there exists another random variable ξ′ such that E_g[ξ1_A] = E_g[ξ′1_A] and ξ ≠ ξ′; without loss of generality, choose ξ > ξ′. According to the comparison theorem in [21] and Definition 3, we obtain E_g[ξ1_A] > E_g[ξ′1_A], which is a contradiction. Combining the existence and uniqueness of ξ, we conclude that the g-expectation is an F-consistent expectation. The proof is complete.
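To build numerical intuition for the g-expectation, the following minimal sketch evaluates E_g[ζ] by a backward Euler recursion for the simplest possible generator, a constant g ≡ c, under an assumed conformable weight u^(ρ−1); the function name, the discretization, and the assumed equation form are ours for illustration, not the authors' scheme.

```python
import numpy as np

def g_expectation_constant_driver(c, rho, a, T, zeta_samples, n_steps=1000):
    """Backward Euler sketch of E_g[zeta] for the constant generator
    g(t, x, y, z) = c, under an assumed conformable weight u**(rho - 1).

    With g constant the conditional expectations telescope, so the
    recursion reduces to E[zeta] plus a Riemann sum approximating
    c * integral_a^T u**(rho - 1) du = c * (T**rho - a**rho) / rho.
    """
    ts = np.linspace(a, T, n_steps + 1)
    x = float(np.mean(zeta_samples))  # martingale part collapses to E[zeta]
    for i in range(n_steps - 1, -1, -1):
        u = ts[i + 1]                 # right endpoint avoids u = 0 blow-up
        x += c * u ** (rho - 1) * (ts[i + 1] - ts[i])
    return x

# Sanity check against the closed form E[zeta] + c * (T^rho - a^rho) / rho
# with E[zeta] = 0, c = 1, rho = 0.5 on [0.5, 1]:
exact = (1.0 ** 0.5 - 0.5 ** 0.5) / 0.5
approx = g_expectation_constant_driver(1.0, 0.5, 0.5, 1.0, np.array([1.0, -1.0]))
```

For a constant generator the g-expectation is simply a deterministic shift of the classical expectation; a genuinely nonlinear g would require estimating conditional expectations (for example, by regression) at every time step.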
Next, we give two kinds of g-expectation with the special generators g 1 and g 2 .
(i) Translation invariance: for any constant c ∈ R and a ≤ t ≤ T, provided the generator g is independent of X(·), we have:
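The displayed formula for translation invariance is missing above; in its standard form (cf. [2,3], valid when g does not depend on X) it reads:

```latex
% Translation invariance (standard form, g independent of X):
\mathcal{E}_g[\zeta + c \mid \mathcal{F}_t]
  = \mathcal{E}_g[\zeta \mid \mathcal{F}_t] + c,
  \qquad c \in \mathbb{R},\; a \le t \le T.
```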

Doob-Meyer Decomposition Theorem
We first give some definitions.

Definition 5.
The process X(·) is called a g-martingale if for any a ≤ s ≤ t ≤ T, we have E[|X(t)| 2 ] < ∞ and E g [X(t)|F s ] = X(s).

Definition 6.
The process X(·) is called a g-supermartingale if for any a ≤ s ≤ t ≤ T, we have E[|X(t)| 2 ] < ∞ and E g [X(t)|F s ] ≤ X(s).
In other words, the sequences (X_n(t))_{n∈N+}, (Y_n(t))_{n∈N+} and (Z_n(t, ·))_{n∈N+} converge weakly in their respective spaces, and then for every stopping time ς, we have:
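For reference, the nonlinear Doob–Meyer decomposition that this section generalizes has the following classical shape (Peng's Brownian result, extended to jumps in [4]; our notation, without the conformable weight):

```latex
% Nonlinear Doob--Meyer decomposition (classical shape): every
% g-supermartingale X admits an increasing, predictable, cadlag
% process A with A(a) = 0 and processes (Y, Z) such that
X(t) = X(T) + \int_t^T g\bigl(s, X(s), Y(s), Z(s)\bigr)\,\mathrm{d}s
  + A(T) - A(t)
  - \int_t^T Y(s)\,\mathrm{d}B(s)
  - \int_t^T\!\!\int_E Z(s,e)\,\widetilde{N}(\mathrm{d}s,\mathrm{d}e),
  \qquad a \le t \le T.
```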