Robust Stability of Complex-Valued Stochastic Neural Networks with Time-Varying Delays and Parameter Uncertainties

In practical applications, stochastic effects are often a major source of undesired behaviours when modelling real neural systems, which makes the study of network models with stochastic effects significant. In view of this, this paper analyses the robust stability of a class of uncertain complex-valued stochastic neural networks (UCVSNNs) with time-varying delays. Based on a real-imaginary separate-type activation function, the original UCVSNN model is analysed through an equivalent representation consisting of two real-valued neural networks. By constructing a proper Lyapunov–Krasovskii functional and applying Jensen's inequality, together with Itô's formula, the homeomorphism principle, the linear matrix inequality (LMI) technique, and other analytic tools, new sufficient conditions are derived that ensure robust global asymptotic stability in the mean square for the considered UCVSNN models. Numerical simulations are presented to illustrate the merit of the obtained results.


Background and Motivation
The dynamical analysis of a variety of neural networks (NNs) has recently attracted increasing attention from researchers. Results on NNs have been used extensively in different domains, including signal processing, pattern recognition, optimal control, and other science and engineering areas [1][2][3][4][5][6][7][8][9][10]. On the other hand, stability is a key requirement for a system to function properly and safely. As such, the stability analysis of NNs is important, and it has received considerable attention [11][12][13][14][15][16][17][18][19][20][21][22][23][24][25][26][27][28][29][30]. Motivated by these observations, this paper studies a class of UCVSNNs with time-varying delays, parameter uncertainties, and stochastic disturbances. Firstly, in the NN model, the uncertain parameters are considered to be of the norm-bounded type, and the stochastic disturbances are assumed to be Brownian motions. Secondly, based on the real-imaginary separate-type activation function, the original UCVSNN model is separated into an equivalent representation consisting of two real-valued NNs. On the basis of the homeomorphism principle, Itô's formula, the Lyapunov–Krasovskii functional (LKF), as well as the linear matrix inequality (LMI), sufficient conditions are derived in terms of simplified LMIs, whose feasible solutions can be verified by MATLAB. Numerical simulations are presented to ascertain the merits of the presented results.

Organization
This paper contains five sections. In Section 2, the problem definition is formally presented. In Section 3, the main results of this paper are presented. In Section 4, the usefulness of the results is illustrated by numerical examples. Concluding remarks are given in Section 5.

List of Symbols
Throughout this paper, R^n and C^n denote the n-dimensional Euclidean space and unitary space, while the sets of n × n real and complex matrices are denoted by R^{n×n} and C^{n×n}, respectively; ‖·‖ denotes the Euclidean norm in R^n. A symmetric positive-definite matrix is denoted by P > 0; transposition and the complex conjugate transpose are denoted by the superscripts T and *, respectively. Besides that, I_n denotes the n-dimensional identity matrix; in block matrices, an omitted block denotes an entry that can be inferred by symmetry. The space of continuous functions φ mapping [−d, 0] into C^n is denoted by C([−d, 0]; C^n). (Ω, F, P) represents a complete probability space with a filtration {F_t}_{t≥0}, and the family of all F_0-measurable C([−d, 0]; C^n)-valued random variables is denoted by L²_{F_0}([−d, 0]; C^n); diag{·} stands for a block-diagonal matrix. The notation ‖z(t)‖²_P stands for z^T(t)Pz(t), and E{·} indicates the mathematical expectation.

Problem Definition
A CVSNN model with parameter uncertainties is expressed as in (1), where z(t) = [z_1(t), ..., z_n(t)]^T ∈ C^n is the state vector, while A = (a_kj)_{n×n} ∈ C^{n×n} and D = diag{d_1, ..., d_n} ∈ R^{n×n} with d_k > 0 (k = 1, ..., n) are the delayed connection weight matrix and the self-feedback connection weight matrix, respectively. In addition, the vector-valued activation function is denoted by g(z(t)) = [g_1(z_1(t)), ..., g_n(z_n(t))]^T : C^n → C^n; J ∈ C^n is the external input vector; d(t) is the time-varying delay, which satisfies 0 ≤ d(t) ≤ d and 0 ≤ ḋ(t) ≤ µ, in which d and µ are known real constants; ω(t) is a Brownian motion defined on (Ω, F, {F_t}_{t≥0}, P); and B, C ∈ R^{n×n} are known constant matrices.

Assumption 1. The activation function g_j(·), j = 1, ..., n, satisfies the following Lipschitz condition for all z_1, z_2 ∈ C: |g_j(z_1) − g_j(z_2)| ≤ l_j|z_1 − z_2|, where l_j is a constant.
The parameter uncertainties in (1) are assumed to satisfy (4) and (5), where G and H_1, H_2, H_3 are known matrices, while F(t) is a time-varying uncertain matrix that satisfies F^T(t)F(t) ≤ I. For further analysis, let z(t) = x(t) + iy(t), and separate the activation function and system matrices into their real and imaginary parts accordingly. Both NN models in (6) and (7) can then be equivalently re-written as (8). From (4) and (5), it is easy to see that the resulting parameter uncertainties satisfy the same norm-bounded structure, and from (3), a corresponding Lipschitz bound is obtained for the separated activation functions. Let the initial condition of the NN model (8) be:
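The real-imaginary separation above rests on a simple algebraic identity: a complex linear map is equivalent to a real map of doubled dimension with a particular block structure. The following minimal numerical sketch illustrates this identity; the 2×2 matrices here are random placeholders, not the paper's parameters.

```python
import numpy as np

# For z = x + iy and A = A_R + i*A_I, the complex product w = A z is
# equivalent to the real 2n-dimensional product
#   [Re(w); Im(w)] = [[A_R, -A_I], [A_I, A_R]] @ [x; y],
# which is the structure behind the equivalent real-valued NN model (8).
rng = np.random.default_rng(0)
n = 2
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

A_R, A_I = A.real, A.imag
A_tilde = np.block([[A_R, -A_I], [A_I, A_R]])   # real equivalent of A
zr = np.concatenate([z.real, z.imag])           # stacked real state [x; y]

w = A @ z
wr = A_tilde @ zr
assert np.allclose(wr, np.concatenate([w.real, w.imag]))
```

The same block pattern applies to each system matrix of (1), which is why (8) is a 2n-dimensional real-valued NN.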

Remark 1.
It should be noted that, if we let all of the uncertainty terms D̃ = Ã = B̃ = C̃ ≡ 0, the NN model in (8) turns into the following NN model:

Fundamentals
The main results can be derived by using the following lemmas.
Definition 1. For the NN model (1) and every φ ∈ L²_{F_0}([−d, 0]; C^n), the trivial solution is globally robustly asymptotically stable in the mean square if, for all admissible uncertainties, lim_{t→∞} E{‖z(t; φ)‖²} = 0.

Definition 2 ([45]). Suppose that Ω is an open set of R^n and H : Ω → R^n is an operator. With the Euclidean norm ‖·‖_2, the nonlinear measure of H on Ω is defined as m_Ω(H) = sup_{x,y∈Ω, x≠y} ⟨H(x) − H(y), x − y⟩ / ‖x − y‖²_2.

Lemma 1 ([45]). H is an injective mapping on Ω if m_Ω(H) < 0. Besides that, H is homeomorphic in R^n if Ω = R^n.
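The role of the nonlinear measure can be sketched numerically. The snippet below estimates m(H) by Monte-Carlo sampling for a simple linear operator H(x) = −Dx with a hypothetical positive-definite D (chosen only for illustration); the estimate is negative, consistent with Lemma 1's injectivity conclusion.

```python
import numpy as np

# Monte-Carlo estimate of the nonlinear measure (Definition 2):
#   m(H) = sup_{x != y} <H(x) - H(y), x - y> / ||x - y||_2^2.
# For H(x) = -D x with D = diag(1, 2) > 0 (hypothetical), every sampled
# quotient lies in [-2, -1], so the estimate stays strictly negative.
rng = np.random.default_rng(1)
D = np.diag([1.0, 2.0])          # assumed self-feedback matrix, d_k > 0
H = lambda x: -D @ x

est = max(
    (H(x) - H(y)) @ (x - y) / np.dot(x - y, x - y)
    for x, y in (rng.standard_normal((2, 2)) for _ in range(10_000))
)
assert est < 0   # m(H) <= -min(d_k) < 0, so H is injective by Lemma 1
```

In the paper's proofs the bound is established analytically rather than sampled, but the sign condition being verified is the same.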

Lemma 2.
([31]) The following inequality holds for any vectors M, N ∈ R^n, a scalar ε > 0, and a positive-definite matrix P ∈ R^{n×n}: 2M^T N ≤ εM^T P M + ε^{−1}N^T P^{−1} N.
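This Young-type bound (reconstructed here in its standard form from [31]) can be checked numerically for random data:

```python
import numpy as np

# Numerical check of the Lemma 2 inequality:
#   2 M^T N <= eps * M^T P M + (1/eps) * N^T P^{-1} N
# for random vectors M, N, random eps > 0, and a random P > 0.
rng = np.random.default_rng(2)
n = 4
Q = rng.standard_normal((n, n))
P = Q @ Q.T + n * np.eye(n)          # random positive-definite P
P_inv = np.linalg.inv(P)

for _ in range(1000):
    M, N = rng.standard_normal(n), rng.standard_normal(n)
    eps = rng.uniform(0.1, 10.0)
    lhs = 2 * M @ N
    rhs = eps * M @ P @ M + (1 / eps) * N @ P_inv @ N
    assert lhs <= rhs + 1e-9         # tolerance for floating-point error
```

The inequality follows from expanding (√ε P^{1/2}M − ε^{−1/2}P^{−1/2}N)^T(√ε P^{1/2}M − ε^{−1/2}P^{−1/2}N) ≥ 0, which is how it is typically used to decouple cross terms in the stability proofs below.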

Main Results
This section establishes delay-dependent robust stability criteria for the considered CVNN models based on the Lyapunov functional and LMI method.

Theorem 1. Under Assumption 1, if there exist a matrix P > 0 and a diagonal matrix W > 0 such that the following LMI holds: then a unique equilibrium point exists for the CVNN model in (12).
Proof. Define the following operator: where P > 0. As such, one can infer that the equilibrium points of the NN model in (12) coincide with the zero points of H(υ). Subsequently, we prove that m_{R^{2n}}(H) < 0.

By Definition 2, we have: By (10), (14), and Lemma 2, for a diagonal matrix W > 0, we have: Using Lemma 5, it follows from (13) that: By (15), one can conclude that m_{R^{2n}}(H) < 0. Next, by Lemma 1, H(υ) is a homeomorphism of R^{2n}. As a result, H(υ) has a unique zero point, which indicates that the NN model in (12) has a unique equilibrium point.
Let υ* denote the unique equilibrium point of the CVNN model in (12). By the change of variables υ̃ = υ − υ*, the equilibrium can be shifted to the origin, which yields the following NN model: The following Theorem 2 establishes the robust global asymptotic stability criterion for the considered NN model (1), or equivalently the NN model (8).

Theorem 2.
Based on Assumption 1, the activation function is divided into its real and imaginary parts. The model in (8) is robustly globally asymptotically stable in the mean square for any scalars d > 0 and µ > 0 if there exist matrices P > 0, Q > 0, R > 0, S > 0, U > 0, as well as any matrix X, a diagonal matrix W > 0, and a scalar γ > 0, such that the following LMIs hold: where:

Proof. For the NN model in (8), the Lyapunov function candidate can be expressed as: By Itô's differential rule, taking the stochastic derivative of V(t, υ̃_t) along the trajectories of the NN model in (8), we have: By using Lemma 2, we obtain: The single-integral term in (21) can be estimated as follows. Using Lemmas 3 and 4, we obtain: The double-integral term in (21) can be estimated as follows. Using Lemma 3, we obtain: Moreover, from (10), it follows that: Then, combining (21)–(24), we have: where: By applying Lemma 5, condition (25) is equivalent to the form in (18). Therefore, for Π < 0, there exists a scalar β > 0 such that: Taking the mathematical expectation on both sides of (25), we have: As a result, the model in (8) is robustly globally asymptotically stable in the mean-square sense. This completes the proof.
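In practice, the feasibility of LMI conditions such as (18) and (19) is checked numerically; the paper uses MATLAB's LMI toolbox. The sketch below illustrates the idea on a much simpler Lyapunov-type LMI, A^T P + P A < 0 with P > 0, for a hypothetical stable matrix A (not one of the paper's LMI blocks): a feasible P is constructed by solving the Lyapunov equation via a Kronecker-product formulation, and the matrix inequalities are then verified by eigenvalue checks.

```python
import numpy as np

# Solve A^T P + P A = -I using vec(A^T P + P A) = (I kron A^T + A^T kron I) vec(P),
# then verify the LMI pair  P > 0  and  A^T P + P A < 0  via eigenvalues.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])        # hypothetical stable matrix
n = A.shape[0]
I = np.eye(n)

K = np.kron(I, A.T) + np.kron(A.T, I)
P = np.linalg.solve(K, -I.flatten()).reshape(n, n)
P = (P + P.T) / 2                  # symmetrize against round-off

assert np.all(np.linalg.eigvalsh(P) > 0)                 # P > 0
assert np.all(np.linalg.eigvalsh(A.T @ P + P @ A) < 0)   # LMI satisfied
```

The LMIs of Theorem 2 involve more decision variables (P, Q, R, S, U, X, W, γ) and block structure, so a general-purpose semidefinite programming solver is used for them, but the feasibility check is conceptually the same eigenvalue test.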

Remark 2.
In the situation where there are no parameter uncertainties, (1) becomes: Simultaneously, System (8) then turns into: Corollary 1 is obtained when we set D̃ = Ã = 0 in the proof of Theorem 2.

Corollary 1.
Based on Assumption 1, the activation function is divided into its real and imaginary parts. The NN model in (29) is globally asymptotically stable in the mean square for any scalars d > 0 and µ > 0 if there exist matrices P > 0, Q > 0, R > 0, S > 0, U > 0, a diagonal matrix W > 0, and a matrix X such that the following LMI holds: where:

Simultaneously, the model in (8) becomes: Corollary 2 is obtained when we set D̃ = Ã = B̃ = C̃ = 0 in the proof of Theorem 2.

Corollary 2.
Based on Assumption 1, the activation function is divided into its real and imaginary parts. The model in (33) is globally asymptotically stable for any scalars d > 0 and µ > 0 if there exist matrices P > 0, Q > 0, R > 0, S > 0, U > 0, a diagonal matrix W > 0, and a matrix X such that the following LMI holds: where:

Remark 4. In this study, we designed a general system model, namely UCVSNNs in continuous time. Thus, the NN model discussed in [9] is a special case of the proposed NN model in this paper. This means that, when there are no parameter uncertainties, stochastic disturbances, or time-varying delays in (1), it reduces to the following system model proposed in [9]: ż(t) = −Dz(t) + Ag(z(t)) + J.

Remark 5. In this paper, we adopted the approach proposed in [30], but extended the results to the complex domain. That is, in [30], the authors studied robust stability criteria for a class of uncertain stochastic NNs, while, in our paper, a new class of uncertain stochastic CVNNs is developed by introducing complex algebra into RVNNs, in order to generalize RVNN models with a complex-valued state vector, input vector, and neuron activation functions. Similar robust stability criteria are derived for the considered UCVSNN models in Theorem 2. The approach used in this paper is more concise and powerful.

Remark 6.
It should be noted that network models are susceptible to stochastic disturbances. In this respect, numerous researchers have investigated stability issues with stochastic inputs for different types of NN models, including passivity, robust stability, mean-square exponential stability, global exponential stability, as well as robust state estimation [41][42][43][44][48]. In our study, Theorem 2 provides sufficient conditions for robust global asymptotic stability in the mean square in terms of the LMI approach. These conditions are more concise than those obtained in [41] and much easier to check.

Remark 7.
If the complex-valued activation function g_j(·), j = 1, 2, ..., n, is not divided into its real and imaginary parts separately, the main results of this paper are no longer valid.

Illustrative Examples
This section demonstrates the merits of the present results via two numerical examples.

Example 1. Consider a two-neuron UCVSNN model in (1)
with: For this NN model, the activation function takes the form g_j(z) = g_j^R(x, y) + ig_j^I(x, y), j = 1, 2. After some simple calculations, we obtain: By taking d(t) = 0.3 sin t + 0.4, which satisfies d = 0.7 and µ = 0.5, and by applying Theorem 2, it is straightforward to verify that the CVNN model in (1), or the equivalent CVNN model in (8), is robustly globally asymptotically stable in the mean-square sense. The following feasible solutions are obtained by solving the LMIs in (18) and (19):
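A trajectory of such a model can be simulated with the Euler–Maruyama scheme. The sketch below is in the spirit of Example 1, but all parameters (D, A, B, C, the tanh-type separable activation, the delay d(t) = 0.3 sin t + 0.4) are placeholders chosen for illustration; the example's actual matrices are not reproduced here.

```python
import numpy as np

# Euler-Maruyama simulation of a nominal (uncertainty-free) CVSNN:
#   dz = [-D z + A g(z(t - d(t)))] dt + [B z + C z(t - d(t))] dw
# with a real-imaginary separable activation g and constant initial history.
rng = np.random.default_rng(3)
dt, T = 1e-3, 20.0
steps = int(T / dt)
lag = int(0.7 / dt)                                   # max delay in steps

D = np.diag([5.0, 5.0])                               # hypothetical parameters
A = 0.2 * (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2)))
B = 0.1 * np.eye(2)
C = 0.1 * np.eye(2)
g = lambda z: np.tanh(z.real) + 1j * np.tanh(z.imag)  # separable activation

z = np.zeros((steps + 1, 2), dtype=complex)
z[: lag + 1] = 0.4 + 0.3j                             # constant history

for k in range(lag, steps):
    t = k * dt
    delay = int((0.3 * np.sin(t) + 0.4) / dt)         # d(t) in steps
    zd = z[k - delay]
    dw = rng.standard_normal(2) * np.sqrt(dt)         # Brownian increment
    drift = -D @ z[k] + A @ g(zd)
    diffusion = B @ z[k] + C @ zd
    z[k + 1] = z[k] + drift * dt + diffusion * dw

assert np.abs(z[-1]).max() < np.abs(z[lag]).max()     # state has contracted
```

With a dominant self-feedback D, sample paths contract toward the equilibrium, which is the qualitative behaviour the mean-square stability criterion guarantees for the feasible LMI parameters.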

Example 2.
Let the parameters of a CVNN model defined in (32) be: For this NN model, the activation function takes the form g_j(z) = g_j^R(x, y) + ig_j^I(x, y), j = 1, 2. After some simple calculations, we obtain: By taking d(t) = 0.5 sin t + 0.8, which satisfies d = 1.3 and µ = 0.5, and by applying the MATLAB LMI toolbox, we obtain the following feasible solutions for the LMIs (34) and (35) in Corollary 2: Figure 3 depicts the state trajectories of the real and imaginary parts of the CVNN model in (32), subject to the initial conditions z_1(t) = x_1(t) + iy_1(t) = 0.4 + 0.3i, z_2(t) = x_2(t) + iy_2(t) = 0.2 − 0.5i. The phase trajectories of the real and imaginary parts of the CVNN model in (32) are given in Figure 4. By Corollary 2, it is straightforward to confirm that the equilibrium point of the CVNN model in (32) is globally asymptotically stable.

Conclusions
The problem of the robust stability of UCVSNNs with time-varying delays was investigated in this paper. To analyse more realistic behaviours, a general form of the NN model including the effects of parameter uncertainties and stochastic disturbances was considered. Based on the real-imaginary separate-type activation function, the original UCVSNN model was separated into an equivalent representation consisting of two real-valued NNs. By applying the homeomorphism principle, Itô's formula, the LKF, as well as the LMI technique, the associated sufficient conditions were derived. The results confirmed that the model equilibrium is unique and globally asymptotically stable, and the feasibility of the conditions can easily be verified by MATLAB. To demonstrate the usefulness of the obtained results, two illustrative examples were presented. For further research, we will extend the proposed approach to other relevant types of stochastic quaternion-valued neural network models. In this regard, we intend to investigate Cohen–Grossberg quaternion-valued neural network models and bi-directional associative memory quaternion-valued neural network models, for which new methodologies will be developed and evaluated comprehensively.