Abstract
The objective of this paper is to analyze the approximate controllability of a semilinear stochastic integrodifferential system with nonlocal conditions in Hilbert spaces. The nonlocal initial condition is a generalization of the classical initial condition and is motivated by physical phenomena. The results are obtained by using Sadovskii’s fixed point theorem. At the end, an example is given to show the effectiveness of the result.
Keywords:
approximate controllability; impulsive systems; mild solutions; semilinear systems; Sadovskii’s fixed point theorem
MSC:
34K30; 34K35; 93C25
1. Introduction
Controllability is one of the essential concepts in mathematical control theory and plays a crucial role in both deterministic and stochastic control systems. It is well documented that the controllability of deterministic systems is widely employed in many fields of science and technology. A control system is said to be controllable if every state of the process can be steered, within the corresponding time, by some control signals. In many such systems, it is possible to guide the dynamical system from an arbitrary initial state to a desired final state with the help of the set of admissible controls.
Kalman [1] introduced the idea of controllability for finite-dimensional deterministic linear control systems. The fundamental ideas of control theory in finite- and infinite-dimensional spaces were introduced in [2] and [3], respectively. However, in several cases, some randomness can appear in the problem, so that the system should be modeled in a stochastic form. Only a few authors have studied the extension of deterministic controllability ideas to stochastic control systems. Dauer and Mahmudov [4] studied the controllability of a semilinear stochastic system by using the Banach fixed point technique. In [5,6,7,8,9], Mahmudov et al. established results on the controllability of linear and semilinear stochastic systems in Hilbert spaces. Along these lines, Sakthivel, Balachandran, Dauer, and others considered the approximate controllability of nonlinear stochastic systems in [4,10,11,12]. Sakthivel et al. studied existence results for fractional stochastic differential equations; see [13,14,15,16,17,18,19,20] and the references therein.
On the other hand, only a few authors have investigated the controllability of neutral functional integrodifferential systems in Banach spaces by using semigroup theory. Recently, in [21,22,23], Balachandran, Karthikeyan, and others studied the controllability of stochastic integrodifferential systems in finite-dimensional spaces.
To date, to the best of our knowledge, there are no results in the literature on the approximate controllability of semilinear stochastic integrodifferential systems with nonlocal conditions obtained via Sadovskii’s fixed point theorem. Therefore, this paper is devoted to the study of the approximate controllability of semilinear stochastic integrodifferential control systems with nonlocal conditions using Sadovskii’s fixed point theorem.
In this work, we shall study the approximate controllability of the following semilinear stochastic integrodifferential system:
where is a closed, linear, and densely-defined operator on , which generates a compact semigroup on . Let B be a bounded linear operator from the Hilbert space U into . The control ; ; ; are suitable nonlinear functions. is a measurable -valued random variable independent of w; g is a continuous function from . For simplicity, we generally assume that the set of admissible controls is .
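For orientation, a typical formulation of such a semilinear stochastic integrodifferential system with a nonlocal condition, with the nonlinearities tentatively written as f, σ, and h (our notation, not necessarily that of the authors), reads:
\[
dx(t) = \Big[ A x(t) + B u(t) + f\Big(t, x(t), \int_0^t h(t,s,x(s))\, ds\Big) \Big] dt + \sigma\big(t, x(t)\big)\, dw(t), \qquad t \in J := [0,b],
\]
\[
x(0) = x_0 + g(x),
\]
which should be read only as an indicative sketch of the system referred to as (1)–(2) below.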
2. Preliminaries
Let be a complete probability space equipped with a normal filtration , . Let , and E be separable Hilbert spaces, and let be a Q-Wiener process on with covariance operator Q such that . We assume that there exist a complete orthonormal system in E, a bounded sequence of nonnegative real numbers such that , , and a sequence of independent Brownian motions such that:
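In this setting, the Q-Wiener process admits the standard series representation
\[
w(t) = \sum_{n=1}^{\infty} \sqrt{\lambda_n}\, \beta_n(t)\, e_n, \qquad t \ge 0, \qquad Q e_n = \lambda_n e_n, \qquad \operatorname{Tr} Q = \sum_{n=1}^{\infty} \lambda_n < \infty,
\]
a well-known fact that we record here for the reader's convenience.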
Let and , where is the -algebra generated by . Let be the space of all Hilbert–Schmidt operators from to with the norm . Let be the space of all -adapted, -valued, measurable, square-integrable processes on .
Let be the Banach space of continuous maps from into satisfying the condition:
Let . Now, is the closed subspace of consisting of measurable and -adapted -valued processes endowed with the norm:
Definition 1.
A stochastic process is a mild solution of (1)–(2) if for each , it satisfies the following integral equation:
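Under the tentative notation of the sketch of (1)–(2) given in the Introduction, the mild solution would typically take the variation-of-constants form
\[
x(t) = S(t)\big[x_0 + g(x)\big] + \int_0^t S(t-s) B u(s)\, ds + \int_0^t S(t-s)\, f\Big(s, x(s), \int_0^s h(s,\tau,x(\tau))\, d\tau\Big)\, ds + \int_0^t S(t-s)\, \sigma\big(s, x(s)\big)\, dw(s), \qquad t \in J,
\]
which is offered only as an indicative form; the exact integral equation used by the authors may differ.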
Let us introduce the following operators and sets [24]:
where denotes the set of bounded linear operators from to . Then, its adjoint operator is given by:
The set of all states reachable in time b from initial state using admissible controls is defined as:
Let us introduce the linear controllability operator as follows:
The corresponding controllability operator for the deterministic model is:
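In Mahmudov-type treatments of stochastic controllability (see, e.g., [5,6,9]), these operators are usually of the form
\[
\Pi_0^b\{\cdot\} = \int_0^b S(b-t)\, B B^{*} S^{*}(b-t)\, E\{\cdot \mid \mathcal{F}_t\}\, dt, \qquad \Gamma_0^b = \int_0^b S(b-s)\, B B^{*} S^{*}(b-s)\, ds,
\]
where the first operator acts on square-integrable random variables and the second is the deterministic controllability Gramian; the labels \(\Pi_0^b\) and \(\Gamma_0^b\) are ours and are used only as an indicative sketch.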
Definition 2.
The stochastic system (1)–(2) is approximately controllable on if , where , and is the closed subspace of , consisting of all -adapted, U-valued stochastic processes.
Lemma 1.
[25] Let be a strongly measurable mapping such that . Then:
for all and , where is a constant depending on p and b.
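A commonly cited form of this estimate (stated here only as an indication of what the lemma asserts; the exact constant and the admissible range of p may differ) is
\[
E\Big| \int_0^t \Phi(s)\, dw(s) \Big|^{p} \;\le\; c_{p,b} \int_0^t E\big\| \Phi(s) \big\|^{p}_{L_2^0}\, ds, \qquad 0 \le t \le b, \quad p \ge 2,
\]
with a constant \( c_{p,b} \) depending only on p and b.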
Lemma 2.
(Sadovskii’s fixed point theorem) Suppose that N is a nonempty, closed, bounded, and convex subset of a Banach space and is a condensing operator. Then, the operator has a fixed point in N.
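For completeness, recall the standard background behind this lemma: an operator \( T \colon N \to X \) is condensing if \( \alpha(T(D)) < \alpha(D) \) for every bounded set \( D \subseteq N \) with \( \alpha(D) > 0 \), where
\[
\alpha(D) = \inf\{ d > 0 : D \text{ admits a finite cover by sets of diameter at most } d \}
\]
is the Kuratowski measure of noncompactness. In particular, the sum of a completely continuous operator and a contraction with constant less than one is condensing, which is precisely the structure exploited in Step 3 of the proof of Theorem 1 below.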
3. Main Result
To prove our main results, we list the following hypotheses:
Hypothesis 1 (H1).
A is the infinitesimal generator of a compact semigroup on .
Hypothesis 2 (H2).
The function satisfies linear growth and Lipschitz conditions, i.e., there exist positive constants , such that:
Hypothesis 3 (H3).
The function satisfies linear growth and Lipschitz conditions, i.e., there exist positive constants , such that:
Hypothesis 4 (H4).
The function satisfies linear growth and Lipschitz conditions, i.e., there exist positive constants , such that:
Hypothesis 5 (H5).
The function h is a continuous function, and there exist positive constants such that:
for all .
Hypothesis 6 (H6).
For each , the operator in the strong operator topology as , where:
is the controllability Gramian.
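In the resolvent framework of Mahmudov (see, e.g., [6]), this hypothesis is usually stated as
\[
\alpha R(\alpha, \Gamma_0^b) := \alpha\, (\alpha I + \Gamma_0^b)^{-1} \longrightarrow 0 \quad \text{in the strong operator topology as } \alpha \to 0^{+},
\]
with \( \Gamma_0^b \) the controllability Gramian sketched above; we record this only as an indicative reading of (H6). This resolvent condition is known to be equivalent to the approximate controllability of the associated linear deterministic system, which is the content of the observation that follows.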
Observe that the linear deterministic system corresponding to (1)–(2):
is approximately controllable on if and only if the operator strongly as . For simplicity, let us take:
Two lemmas concerning approximate controllability will be used in the main result. The following lemma is needed to define the control function.
Lemma 3.
[7] For any , there exists such that:
Now, for any and , we define the control function in the form below:
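A typical choice of control in this resolvent framework, written here only as a sketch under the tentative notation introduced above, is
\[
u^{\alpha}(t) = B^{*} S^{*}(b-t)\, (\alpha I + \Gamma_0^b)^{-1}\, p\big(x(\cdot)\big),
\]
where \( p(x(\cdot)) \) collects the target state minus the contributions of the nonlocal initial data and of the nonlinear and stochastic terms of the mild solution; in the stochastic setting, this term typically also involves conditional expectations with respect to \( \mathcal{F}_t \). The exact expression used by the authors may differ.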
Lemma 4.
There exists a positive constant such that for all , we have:
Proof.
Let . From Hölder’s inequality, Lemma 1, and the assumptions on the data, we obtain:
where . When , the second inequality can be proven in the same way. □
Theorem 1.
Proof.
The proof of this theorem is divided into three steps. For any , define the operator by:
- Step 1.
- For any , is continuous on J in the -sense. Let . Then, for any fixed , it follows from Hölder’s inequality, Lemma 1, and the assumptions of the theorem that:
Thus, by the Lebesgue dominated convergence theorem, we infer that the right-hand side of the above inequality tends to zero as . Accordingly, we conclude that is continuous from the right in . A similar argument shows that it is also continuous from the left in . Consequently, is continuous on J in the -sense.
- Step 2.
- For each positive integer q, let ; then the set is clearly a bounded, closed, and convex set in :
From Lemma 1, Hölder’s inequality, and the assumption (H1), we have:
which implies that is integrable on J and, by Bochner’s theorem, is well defined on . Next, from the assumption (H4), it follows that:
Similarly from the assumption (H2) and Lemma 1, we have:
Now, we claim that there exists a positive number q such that .
If this is not true, then for each positive number q, there is a function such that does not belong to ; that is, for some . On the other hand, from the assumptions (H2), (H3), and Lemma 4, we have:
Dividing both sides by q and taking the limit as , we get:
This contradicts Condition (5). Hence, for some positive number q,
- Step 3.
- Define the operators and as:
Now, we will prove that is completely continuous, while is a contraction operator.
It is clear that is completely continuous by the assumption (H3). To prove that is a contraction, let us take . Then, from the assumptions (H2) and (H3), for each , we have:
Therefore,
where:
Thus, is a contraction mapping. Since the sum of a completely continuous operator and a contraction is condensing, it follows from Sadovskii’s fixed point theorem (Lemma 2) that has a fixed point in , which is a mild solution of (1)–(2). □
Theorem 2.
Proof.
Let be a fixed point of in . By using the stochastic Fubini theorem, it is easy to see that:
By the assumption that , and g are uniformly bounded, there exists such that:
in .
Then, there is a subsequence denoted by:
weakly converging to, say, in and in . The compactness of implies that:
On the other hand, by the assumption (H6), for all , the operator:
and moreover:
Hence, by the Lebesgue dominated convergence theorem, we obtain:
This yields the approximate controllability of the system (1)–(2). □
4. Example
Consider the stochastic control system:
Let . Here, B is a bounded linear operator from a Hilbert space U into , and , , and are all continuous and uniformly bounded; is a feedback control; and w is a Q-Wiener process.
Let be an operator defined by:
with domain:
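Assuming the common choice \( A z = \partial^{2} z / \partial x^{2} \) on \( (0, \pi) \) with Dirichlet boundary conditions (an assumption on our part), one has the familiar spectral representation
\[
A z = -\sum_{n=1}^{\infty} n^{2} \langle z, e_n \rangle e_n, \qquad e_n(x) = \sqrt{\tfrac{2}{\pi}} \sin(n x), \qquad S(t) z = \sum_{n=1}^{\infty} e^{-n^{2} t} \langle z, e_n \rangle e_n,
\]
so that \( S(t) \), \( t > 0 \), is a compact analytic semigroup and (H1) is satisfied.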
Let ,
Let ,
Let ,
The function ,
for and .
With this choice of , and , (1)–(2) is the abstract formulation of (7)–(9), and the conditions (H1) and (H2) are fulfilled. Then:
Now, define the infinite-dimensional space:
with the norm defined by:
and a linear continuous mapping B from as follows:
It is well known that for :
Moreover,
for and .
Now, let , . It follows that:
5. Conclusions
In this paper, we study the approximate controllability of a semilinear stochastic integrodifferential system with nonlocal conditions in Hilbert spaces. The nonlocal initial condition is a generalization of the classical initial condition and is motivated by physical phenomena. The results are obtained by using Sadovskii’s fixed point theorem.
In future work, we intend to extend these results to a new class of stochastic differential equations driven by fractional Brownian motion.
Author Contributions
A.A. and K.R. conceived and designed the control problem and studied the approximate controllability of a semilinear stochastic integrodifferential system with nonlocal conditions in Hilbert spaces, where the nonlocal initial condition generalizes the classical initial condition and is motivated by physical phenomena. A.A. and K.R. analyzed the paper and contributed materials/analysis tools.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Kalman, R.E. Controllability of linear systems. Contrib. Differ. Equ. 1963, 1, 190–213.
- Barnett, S. Introduction to Mathematical Control Theory; Clarendon Press: Oxford, UK, 1975.
- Curtain, R.F.; Zwart, H. Introduction to Infinite-Dimensional Linear Systems Theory. In Texts in Applied Mathematics; Springer: New York, NY, USA, 1995.
- Dauer, J.P.; Mahmudov, N.I. Approximate controllability of semilinear functional equations in Hilbert spaces. J. Math. Anal. Appl. 2002, 273, 310–327.
- Mahmudov, N.I. Controllability of linear stochastic systems. IEEE Trans. Autom. Control 2001, 46, 724–731.
- Mahmudov, N.I. Approximate controllability of semilinear deterministic and stochastic evolution equations in abstract spaces. SIAM J. Control Optim. 2003, 45, 1604–1622.
- Mahmudov, N.I.; Zorlu, S. Controllability of nonlinear stochastic systems. Int. J. Control 2003, 76, 95–104.
- Mahmudov, N.I. Controllability of linear stochastic systems in Hilbert spaces. J. Math. Anal. Appl. 2001, 259, 64–82.
- Mahmudov, N.I. Controllability of semilinear stochastic systems in Hilbert spaces. J. Math. Anal. Appl. 2003, 288, 197–211.
- Bashirov, A.E.; Kerimov, K.R. On controllability conception for stochastic systems. SIAM J. Control Optim. 1997, 35, 384–398.
- Balachandran, K.; Dauer, J.P. Controllability of nonlinear systems in Banach spaces. J. Optim. Theory Appl. 2002, 115, 7–28.
- Sakthivel, R.; Kim, J.H.; Mahmudov, N.I. On controllability of nonlinear stochastic systems. Rep. Math. Phys. 2006, 58, 433–443.
- Chang, Y.K.; Nieto, J.J.; Li, W.S. Controllability of semilinear differential systems with nonlocal initial conditions in Banach spaces. J. Optim. Theory Appl. 2009, 142, 267–273.
- Shukla, A.; Sukavanam, N.; Pandey, D.N. Approximate controllability of fractional semilinear stochastic system of order α ∈ (1,2]. Nonlinear Stud. 2015, 22, 131–138.
- Kumar, S.; Sukavanam, N. On the approximate controllability of fractional order control systems with delay. Nonlinear Dyn. Syst. Theory 2013, 13, 69–78.
- Ito, K. On stochastic differential equations. Mem. Am. Math. Soc. 1951, 4, 1–51.
- Sakthivel, R.; Revathi, P.; Ren, Y. Existence of solutions for nonlinear fractional stochastic differential equations. Nonlinear Anal. TMA 2013, 81, 70–86.
- Sakthivel, R.; Anandhi, E.R. Approximate controllability of impulsive differential equations with state-dependent delay. Int. J. Control 2010, 83, 387–393.
- Shukla, A.; Sukavanam, N.; Pandey, D.N. Controllability of semilinear stochastic system with multiple delays in control. Adv. Control Optim. Dyn. Syst. 2014, 3, 306–312.
- Sakthivel, R.; Nieto, J.J.; Mahmudov, N.I. Approximate controllability of nonlinear deterministic and stochastic systems with unbounded delay. Taiwan. J. Math. 2010, 14, 1777–1797.
- Balachandran, K.; Karthikeyan, S. Controllability of stochastic integrodifferential systems. Int. J. Control 2007, 80, 486–491.
- Balachandran, K.; Sakthivel, R. Controllability of integrodifferential systems in Banach spaces. Appl. Math. Comput. 2001, 118, 63–71.
- Balachandran, K.; Kim, J.H.; Karthikeyan, S. Controllability of semilinear stochastic integrodifferential systems. Kybernetika 2007, 43, 31–44.
- Luo, J.; Liu, K. Stability of infinite dimensional stochastic evolution equations with memory and Markovian jumps. Stoch. Process. Appl. 2008, 118, 864–895.
- Da Prato, G.; Zabczyk, J. Stochastic Equations in Infinite Dimensions. In Encyclopedia of Mathematics and Its Applications; Cambridge University Press: Cambridge, UK, 1992.