Discrete-Time Stochastic Quaternion-Valued Neural Networks with Time Delays: An Asymptotic Stability Analysis

Abstract: Stochastic disturbances often cause undesirable characteristics in real-world system modeling. As a result, investigations of stochastic disturbances in neural network (NN) modeling are important. In this study, stochastic disturbances are considered in the formulation of a new class of NN models, i.e., discrete-time stochastic quaternion-valued neural networks (DSQVNNs), and the mean-square asymptotic stability of DSQVNNs is studied. Firstly, we decompose the original DSQVNN model into four real-valued models using the real-imaginary separation method, in order to avoid the difficulties caused by non-commutative quaternion multiplication. Secondly, new sufficient conditions for the mean-square asymptotic stability of the considered DSQVNN model are obtained via the linear matrix inequality (LMI) approach, based on a Lyapunov functional and stochastic analysis. Finally, examples are presented to demonstrate the usefulness of the obtained theoretical results.


Introduction
Research on the dynamical behavior analysis of NN models has attracted increasing attention in recent years, and the results have been widely used in a variety of science and engineering disciplines [1][2][3][4][5][6][7][8][9]. The stability analysis of NN models is fundamental and important in NN applications, and it has received significant attention recently.
Indeed, most NN analyses deal with the continuous-time case. Nevertheless, in today's digital world, nearly all signals are digitalized for computer processing before and after transmission. In this regard, instead of continuous-time analysis, it is important to study discrete-time signals when implementing NN models. As a result, several researchers have studied various dynamical behaviors of discrete-time NN models, and a number of scientific results in the discrete-time case have been reported for real-valued neural networks (RVNNs) as well as for models involving stochastic concepts. Note that several known results can be viewed as special cases of the results of our work. Finally, we provide numerical examples to illustrate the usefulness of the proposed results.
This study presents four key contributions. (1) This is the first analysis of the mean-square asymptotic stability of the considered DSQVNN models. (2) Unlike traditional stability analyses, we establish new mean-square asymptotic stability criteria for the considered DSQVNN models through a Lyapunov functional and real-imaginary separate-type activation functions. (3) The developed sufficient conditions can be directly solved by the standard MATLAB LMI toolbox. (4) The results of this study are more general and powerful than those for existing discrete-time QVNN models in the literature.
In Section 2, we define the proposed problem model formally. We explain the new stability criterion in Section 3. The numerical examples are given in Section 4. Concluding remarks are given in the last section.

Notations
We use R, C, and H to denote the real field, the complex field, and the skew field of quaternions, respectively. The m × m matrices with entries from R, C, and H are denoted by R^{m×m}, C^{m×m}, and H^{m×m}, while the m-dimensional vectors are denoted by R^m, C^m, and H^m, respectively. For any matrix Q, the transpose and conjugate transpose are denoted by Q^T and Q^*, respectively. In addition, a block diagonal matrix is denoted by diag{·}, while the smallest and largest eigenvalues of Q are denoted by λ_min(Q) and λ_max(Q), respectively. The Euclidean norm of a vector x and the mathematical expectation of a stochastic variable x are represented by ‖x‖ and E{x}, respectively. Meanwhile, given integers a, b with a < b, the discrete interval is denoted by N[a, b] = {a, a + 1, ..., b − 1, b}, while the set of all functions φ : N[−ς, 0] → H^m is denoted by C(N[−ς, 0], H^m). Moreover, we let (Ω, F, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions. In a given matrix, a term induced by symmetry is denoted by ∗.

Quaternion Algebra
Firstly, we address the quaternion and its operating rules. A quaternion is expressed in the form x = x^R + ix^I + jx^J + kx^K, where the real constants are denoted by x^R, x^I, x^J, x^K ∈ R, while the fundamental quaternion units are denoted by i, j, and k. The following Hamilton rules are satisfied:

i^2 = j^2 = k^2 = ijk = −1, ij = −ji = k, jk = −kj = i, ki = −ik = j, (1)

which implies that quaternion multiplication is non-commutative.
The following expressions define the operations between quaternions x = x^R + ix^I + jx^J + kx^K and y = y^R + iy^I + jy^J + ky^K. Note that the definitions of addition and subtraction of complex numbers carry over to quaternions.

(i) Addition: x + y = (x^R + y^R) + i(x^I + y^I) + j(x^J + y^J) + k(x^K + y^K).
(ii) Subtraction: x − y = (x^R − y^R) + i(x^I − y^I) + j(x^J − y^J) + k(x^K − y^K).
(iii) Multiplication, in line with the Hamilton rules (1):
xy = (x^R y^R − x^I y^I − x^J y^J − x^K y^K) + i(x^R y^I + x^I y^R + x^J y^K − x^K y^J) + j(x^R y^J − x^I y^K + x^J y^R + x^K y^I) + k(x^R y^K + x^I y^J − x^J y^I + x^K y^R).

The module of a quaternion x = x^R + ix^I + jx^J + kx^K ∈ H is |x| = √(x x̄) = √((x^R)^2 + (x^I)^2 + (x^J)^2 + (x^K)^2), where the conjugate of x is denoted by x̄ = x^R − ix^I − jx^J − kx^K.
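These operating rules can be checked with a short Python sketch; the function names qmul, qconj, and qmod are illustrative, not part of the paper:

```python
import math

def qmul(x, y):
    """Hamilton product of quaternions, each a 4-tuple (xR, xI, xJ, xK).
    Non-commutative: qmul(x, y) != qmul(y, x) in general."""
    xr, xi, xj, xk = x
    yr, yi, yj, yk = y
    return (xr*yr - xi*yi - xj*yj - xk*yk,
            xr*yi + xi*yr + xj*yk - xk*yj,
            xr*yj - xi*yk + xj*yr + xk*yi,
            xr*yk + xi*yj - xj*yi + xk*yr)

def qconj(x):
    """Conjugate x_bar = xR - i xI - j xJ - k xK."""
    xr, xi, xj, xk = x
    return (xr, -xi, -xj, -xk)

def qmod(x):
    """Module |x| = sqrt(x * x_bar) = Euclidean norm of the 4 components."""
    return math.sqrt(sum(c*c for c in x))

i = (0.0, 1.0, 0.0, 0.0)
j = (0.0, 0.0, 1.0, 0.0)

print(qmul(i, j))  # ij = k  -> (0.0, 0.0, 0.0, 1.0)
print(qmul(j, i))  # ji = -k -> (0.0, 0.0, 0.0, -1.0)
```

The two printed products differ in sign, confirming the non-commutativity stated after the Hamilton rules (1).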

Problem Definition
The DSQVNN model with time delays is considered; i.e.,

x_p(k + 1) = d_p x_p(k) + Σ_{q=1}^{m} a_{pq} g_q(x_q(k − ς)) + u_p + σ_p(k, x_p(k))w(k), (2)

where p = 1, ..., m and k ∈ N. The model in (2) can be expressed in the equivalent vector form

x(k + 1) = Dx(k) + Ag(x(k − ς)) + U + σ(k, x(k))w(k), (3)

where the state variable and the quaternion-valued neuron activation function are denoted by x(k) = [x_1(k), ..., x_m(k)]^T ∈ H^m and g(x(k)) = [g_1(x_1(k)), ..., g_m(x_m(k))]^T ∈ H^m, respectively. In addition, U = [u_1, ..., u_m]^T ∈ H^m is the input vector. The self-feedback connection weight matrix with 0 ≤ d_p < 1 is denoted by D = diag{d_1, ..., d_m} ∈ R^{m×m}, the connection weight matrix is denoted by A = (a_pq)_{m×m} ∈ H^{m×m}, and the transmission delay is denoted by a positive scalar ς. Here, w(k) is a scalar stochastic sequence defined on (Ω, F, P) with E{w(k)} = 0, E{w(k)^2} = 1, and E{w(i)w(j)} = 0 for i ≠ j. Given the model in (3), its initial condition is x(s) = φ(s), s ∈ N[−ς, 0], where φ ∈ C(N[−ς, 0], H^m), which is similar to [20][21][22].

Suppose the solutions of the model in (3) are x(k, φ) and x(k, ψ), starting from φ and ψ, respectively, for any φ, ψ ∈ Ω, where Ω is a Banach space with the topology of uniform convergence. Let y(k) = x(k, φ) − x(k, ψ) and f(y(k)) = g(x(k, φ)) − g(x(k, ψ)). As a result, we can express the difference system as

y(k + 1) = Dy(k) + Af(y(k − ς)) + σ(k, y(k))w(k). (8)

A1: For y = y^R + iy^I + jy^J + ky^K ∈ H, with y^R, y^I, y^J, y^K ∈ R, we can divide f_q(y) into real and imaginary parts as follows: f_q(y) = f_q^R(y^R) + if_q^I(y^I) + jf_q^J(y^J) + kf_q^K(y^K), where f_q^R(·), f_q^I(·), f_q^J(·), f_q^K(·) : R → R.

For further analysis, we divide the NN model in (8) into real and imaginary parts through the use of quaternion multiplication. The resulting four real-valued models carry the noise terms σ^R(k, y^R(k))w(k), σ^I(k, y^I(k))w(k), σ^J(k, y^J(k))w(k), and σ^K(k, y^K(k))w(k), where σ^R, σ^I, σ^J, and σ^K denote the real part and the i-, j-, and k-parts of σ(k, y(k)), respectively. We refer to this decomposed system as the model in (9).
The initial condition of the model in (9) is given by the corresponding real and imaginary parts of φ. By stacking the four real-valued parts, the model in (9) can be represented compactly as

ȳ(k + 1) = D̄ȳ(k) + Āf̄(ȳ(k − ς)) + σ̄(k)w(k), (11)

where ȳ(k) = [y^R(k)^T, y^I(k)^T, y^J(k)^T, y^K(k)^T]^T and D̄, Ā, f̄, and σ̄ are assembled accordingly. The initial condition of the model in (11) is the stacked function φ̄(s), s ∈ N[−ς, 0], with its norm defined through |φ^R(s)|, |φ^I(s)|, |φ^J(s)|, and |φ^K(s)|.
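The separation step above can be sketched numerically: given the four real matrices A^R, A^I, A^J, A^K and the four real part-vectors of x, the quaternion product Ax is computed entirely with real-valued matrix products following the Hamilton rules. The helper names below are illustrative, not from the paper:

```python
import numpy as np

def qmul(x, y):
    """Hamilton product of two scalar quaternions given as 4-tuples."""
    xr, xi, xj, xk = x
    yr, yi, yj, yk = y
    return (xr*yr - xi*yi - xj*yj - xk*yk,
            xr*yi + xi*yr + xj*yk - xk*yj,
            xr*yj - xi*yk + xj*yr + xk*yi,
            xr*yk + xi*yj - xj*yi + xk*yr)

def qmatvec_split(AR, AI, AJ, AK, xR, xI, xJ, xK):
    """Quaternion matrix-vector product Ax expressed as four
    real-valued products, mirroring the real-imaginary separation."""
    return (AR @ xR - AI @ xI - AJ @ xJ - AK @ xK,
            AR @ xI + AI @ xR + AJ @ xK - AK @ xJ,
            AR @ xJ - AI @ xK + AJ @ xR + AK @ xI,
            AR @ xK + AI @ xJ - AJ @ xI + AK @ xR)

rng = np.random.default_rng(0)
m = 3
A = [rng.standard_normal((m, m)) for _ in range(4)]  # A^R, A^I, A^J, A^K
x = [rng.standard_normal(m) for _ in range(4)]       # x^R, x^I, x^J, x^K

# Reference: entrywise quaternion multiply-accumulate.
ref = [np.zeros(m) for _ in range(4)]
for p in range(m):
    for q in range(m):
        prod = qmul(tuple(Ap[p, q] for Ap in A), tuple(xs[q] for xs in x))
        for s in range(4):
            ref[s][p] += prod[s]

out = qmatvec_split(*A, *x)
print(max(np.max(np.abs(o - r)) for o, r in zip(out, ref)))  # ~ machine epsilon
```

The agreement between the split real-valued computation and the direct entrywise quaternion product is exactly why the decomposed four-model system is equivalent to the original DSQVNN.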

Definition 2.
The trivial solution of the NN model in (8) is asymptotically stable in the mean square sense if, for any solution y(k),

lim_{k→∞} E{‖y(k)‖^2} = 0.

Lemma 1 ([43]). Given a matrix 0 < W = W^T ∈ R^{m×m}, integers τ_1 and τ_2 satisfying τ_1 < τ_2, and a vector function y : N[τ_1, τ_2] → R^m such that the sums concerned are well defined, we have

−(τ_2 − τ_1 + 1) Σ_{s=τ_1}^{τ_2} y(s)^T W y(s) ≤ −(Σ_{s=τ_1}^{τ_2} y(s))^T W (Σ_{s=τ_1}^{τ_2} y(s)).
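As a hedged illustration of mean-square asymptotic stability, the following Monte-Carlo sketch uses a hypothetical scalar system y(k+1) = d·y(k) + σ·y(k)·w(k) (parameters chosen for illustration only). Here E{|y(k+1)|^2} = (d^2 + σ^2)·E{|y(k)|^2}, so the origin is mean-square asymptotically stable whenever d^2 + σ^2 < 1:

```python
import random

# Illustrative scalar stochastic recursion: y(k+1) = d*y(k) + s*y(k)*w(k),
# with w(k) ~ N(0, 1) i.i.d.  Here d^2 + s^2 = 0.45 < 1, so E|y(k)|^2 -> 0.
d, s, y0 = 0.6, 0.3, 1.0
random.seed(1)

def ms_estimate(k, trials=20000):
    """Monte-Carlo estimate of E{|y(k)|^2}."""
    acc = 0.0
    for _ in range(trials):
        y = y0
        for _ in range(k):
            y = d*y + s*y*random.gauss(0.0, 1.0)
        acc += y*y
    return acc / trials

print(ms_estimate(5), ms_estimate(20))  # second moment decays toward 0
```

The estimates track the closed form (d^2 + s^2)^k = 0.45^k, matching the limit required by Definition 2.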

Remark 1.
When stochastic disturbances are excluded, the NN model in (11) reduces to

ȳ(k + 1) = D̄ȳ(k) + Āf̄(ȳ(k − ς)).

The proof of Theorem 1 can be applied to yield Corollary 1.
By the complex-valued representation y(k) = y^R(k) + iy^I(k), the NN model in (8) becomes the model in (27), where σ^R(k, y^R(k)) = Re(σ(k, y(k))) and σ^I(k, y^I(k)) = Im(σ(k, y(k))). In compact form, the model in (27) becomes

ŷ(k + 1) = D̂ŷ(k) + Âf̂(ŷ(k − ς)) + σ̂(k)w(k), (28)

with initial condition ŷ(s) = φ̂(s), s ∈ N[−ς, 0].

A3: For y = y^R + iy^I ∈ C, with y^R, y^I ∈ R, we can divide f_q(y) into real and imaginary parts as follows: f_q(y) = f_q^R(y^R) + if_q^I(y^I), where f_q^R(·), f_q^I(·) : R → R. There exist constants ξ_q^R−, ξ_q^R+, ξ_q^I−, ξ_q^I+ such that, for any α, β ∈ R with α ≠ β,

ξ_q^R− ≤ (f_q^R(α) − f_q^R(β))/(α − β) ≤ ξ_q^R+,  ξ_q^I− ≤ (f_q^I(α) − f_q^I(β))/(α − β) ≤ ξ_q^I+.

A4:
The noise intensity function σ^s(k, y^s(k)) : R × R^m → R^{m×m} (s = R, I) is (i) Borel measurable and (ii) locally Lipschitz continuous, and it satisfies the growth conditions

trace[σ^R(k, y^R(k))^T σ^R(k, y^R(k))] ≤ ρ_1 y^R(k)^T y^R(k),  trace[σ^I(k, y^I(k))^T σ^I(k, y^I(k))] ≤ ρ_2 y^I(k)^T y^I(k),

where ρ_1, ρ_2 are known positive constants.
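As a quick numerical sanity check, assume (as in the examples later in the paper) the linear noise intensity σ^s(k, y^s) = 0.1·y^s; then the trace growth bound holds with ρ = 0.02, since trace[σ^T σ] = 0.01·‖y‖^2 ≤ 0.02·‖y‖^2:

```python
import numpy as np

# Hypothetical linear noise intensity sigma(y) = 0.1*y; verify the
# trace bound trace[sigma(y)^T sigma(y)] <= rho * y^T y with rho = 0.02
# on a batch of random test vectors.
rho = 0.02

def sigma(y):
    return 0.1 * y

rng = np.random.default_rng(0)
ok = all(
    np.trace(np.outer(sigma(y), sigma(y))) <= rho * (y @ y) + 1e-12
    for y in rng.standard_normal((100, 4))
)
print(ok)  # True
```
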
The proof of Theorem 1 can be applied to yield Corollary 2.

Corollary 2.
The activation function can be separated into real and imaginary parts based on assumption (A3). Given the existence of matrices 0 < P_1, 0 < P_2, 0 < Q_1, 0 < Q_2, 0 < R_1, 0 < R_2, diagonal matrices 0 < L_1, 0 < L_2, and scalars 0 < λ*_1, 0 < λ*_2, the NN model in (28) is asymptotically stable in the mean square sense if the LMI conditions (30)-(33) are satisfied. When stochastic disturbances are excluded, the NN model in (28) reduces to

ŷ(k + 1) = D̂ŷ(k) + Âf̂(ŷ(k − ς)). (34)

The proof of Theorem 1 can be applied to yield Corollary 3.

Corollary 3.
The activation function can be separated into real and imaginary parts based on assumption (A3). Given the existence of matrices 0 < P_1, 0 < P_2, 0 < Q_1, 0 < Q_2, 0 < R_1, 0 < R_2 and diagonal matrices 0 < L_1, 0 < L_2, the NN model in (34) is globally asymptotically stable if the corresponding LMI holds, where the block matrix Θ̂ is defined analogously to that in Corollary 2.

Remark 3.
In the literature on QVNN models, how to choose a suitable quaternion-valued activation function is still an open question. Several activation functions have recently been used to study QVNN models; e.g., non-monotonic piecewise nonlinear activation functions [30], linear threshold activation functions [37][38][39], and real-imaginary separate-type activation functions [28,32,34]. Under the assumption that the activation functions can be divided into real and imaginary parts, our results provide criteria to ascertain the mean-square asymptotic stability of the considered DSQVNN models with time delays.

Remark 4.
In [37], the authors used the semi-discretization technique to obtain discrete-time analogues of continuous-time QVNNs with linear threshold neurons and studied their global asymptotic stability without considering time delays. Compared with the previous work [37], by separating the real and imaginary parts of the DSQVNNs with time delays and constructing suitable Lyapunov-Krasovskii functional candidates, we obtain sufficient conditions for the mean-square asymptotic stability of the DSQVNNs in the form of LMIs. The LMI conditions in this paper are more concise than those obtained in [37][38][39] and much easier to check.
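For readers unfamiliar with the semi-discretization technique of [37], the following hedged sketch applies it to a single hypothetical continuous-time neuron x' = −a·x + b·f(x) + u: the nonlinearity is frozen over each sampling interval of length h and the linear part is integrated exactly. All parameter values are illustrative, not taken from [37]:

```python
import math

# Illustrative parameters for one neuron x' = -a*x + b*f(x) + u.
a, b, u, h = 1.0, 0.5, 0.2, 0.1
f = math.tanh

def semi_discrete_step(x):
    """One semi-discretization step: freeze f at x(kh) and integrate the
    linear dynamics exactly over [kh, (k+1)h]:
      x(k+1) = e^{-a h} x(k) + (1 - e^{-a h})/a * (b f(x(k)) + u)."""
    e = math.exp(-a * h)
    return e * x + (1.0 - e) / a * (b * f(x) + u)

# Compare one semi-discrete step against a fine Euler integration of the ODE.
x0 = 0.3
xs = semi_discrete_step(x0)
x, n = x0, 1000
for _ in range(n):
    x += (h / n) * (-a * x + b * f(x) + u)
print(abs(xs - x))  # small gap: the two discretizations nearly agree
```

The resulting recursion has the same self-feedback-plus-activation structure as the discrete-time models studied in this paper, which is why results for the discrete analogue transfer naturally.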

Remark 5. Different dynamics of DCVNN models without stochastic disturbances have been examined in
previous studies [20][21][22]. In this study, we not only focus on the mean-square asymptotic stability criteria with respect to a class of discrete-time SNN models by using the same method proposed in [20][21][22] but also extend our results to the quaternion domain. As such, the approach proposed in this paper is more general and powerful.

Illustrative Examples
This section presents two numerical examples to show the usefulness of the proposed method.

Example 1.
By separating the activation function into real and imaginary parts, we can find A^R, A^I, A^J, and A^K. Choose the noise intensity functions as σ^R(k, y^R(k)) = 0.1y^R(k), σ^I(k, y^I(k)) = 0.1y^I(k), σ^J(k, y^J(k)) = 0.1y^J(k), and σ^K(k, y^K(k)) = 0.1y^K(k); it can be verified that A2 is satisfied with ρ_1 = ρ_2 = ρ_3 = ρ_4 = 0.02. Take the time delay ς = 3, with real-imaginary separate-type activation functions satisfying A1. By Theorem 1, it is easy to conclude that the NN model in (8) with the given parameters is mean-square asymptotically stable based on Lyapunov stability theory. The state trajectories y_1^R(k), y_1^I(k), y_1^J(k), y_1^K(k) and y_2^R(k), y_2^I(k), y_2^J(k), y_2^K(k) of the NN model in (8) with stochastic disturbances are depicted in Figures 1 and 2, respectively. Figures 3 and 4 show the corresponding state trajectories of the NN model in (8) with σ(k, y(k)) = 0.
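Because the example's weight matrices did not survive extraction, the following sketch simulates only a simplified surrogate of the decomposed model (9): two neurons, d_p = 0.3, a uniform within-part connection weight 0.1 (the full decomposition mixes all four parts via the Hamilton rules), tanh activations, noise 0.1·y^s(k)·w(k), and delay ς = 3. None of these values are the paper's; the sketch merely illustrates the mean-square decay predicted by Theorem 1:

```python
import math
import random

random.seed(2)
m, varsigma = 2, 3          # neurons, delay
d, a, s = 0.3, 0.1, 0.1     # self-feedback, coupling weight, noise gain
f = math.tanh

# State history: varsigma+1 past states, each state = 4 parts x m neurons,
# all initialized at 0.5.
hist = [[[0.5] * m for _ in range(4)] for _ in range(varsigma + 1)]

def step(hist):
    """One step of the surrogate delayed stochastic recursion:
    y^s_p(k+1) = d*y^s_p(k) + a*sum_q f(y^s_q(k - varsigma)) + s*y^s_p(k)*w(k)."""
    y, ydel = hist[-1], hist[-(varsigma + 1)]
    w = random.gauss(0.0, 1.0)  # one common noise sample per step
    new = [[0.0] * m for _ in range(4)]
    for part in range(4):
        for p in range(m):
            coupling = sum(a * f(ydel[part][q]) for q in range(m))
            new[part][p] = d * y[part][p] + coupling + s * y[part][p] * w
    return new

for k in range(200):
    hist.append(step(hist))
    hist = hist[-(varsigma + 1):]   # keep only the delay window

norm = math.sqrt(sum(c * c for part in hist[-1] for c in part))
print(norm)  # near zero: trajectories settle at the origin
```

The decay of the state norm mirrors the behavior of the trajectories shown in Figures 1-4.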

Example 2.
The following parameters pertaining to the NN model in (27) are considered. By separating the activation function into real and imaginary parts, we obtain A^R and A^I. The noise intensity functions are taken as σ^R(k, y^R(k)) = 0.1y^R(k) and σ^I(k, y^I(k)) = 0.1y^I(k); we can verify that A4 is satisfied with ρ_1 = ρ_2 = 0.02. Take the time delay ς = 3, with real-imaginary separate-type activation functions.

It can be verified that A3 is satisfied with the corresponding sector bounds ξ_q^R−, ξ_q^R+, ξ_q^I−, ξ_q^I+.
We can verify that conditions (30)-(33) hold by using the LMI toolbox in MATLAB. According to Corollary 2, we can conclude that the NN model in (27) with the aforementioned parameters is asymptotically stable in the mean square sense based on Lyapunov stability theory. The state trajectories y_1^R(k), y_1^I(k) and y_2^R(k), y_2^I(k) of the NN model in (27) with stochastic disturbances are depicted in Figures 5 and 6, respectively. Figures 7 and 8 show the state trajectories of the NN model in (27) without stochastic disturbances.

Figure 5. Time responses of the states y_1^R(k), y_1^I(k) of the NN model in (27) with σ^R(k, y^R(k)) = 0.1y^R(k), σ^I(k, y^I(k)) = 0.1y^I(k) in Example 2.
Figure 6. Time responses of the states y_2^R(k), y_2^I(k) of the NN model in (27) with σ^R(k, y^R(k)) = 0.1y^R(k), σ^I(k, y^I(k)) = 0.1y^I(k) in Example 2.
Figure 7. Time responses of the states y_1^R(k), y_1^I(k) of the NN model in (27) with σ^R(k, y^R(k)) = σ^I(k, y^I(k)) = 0 in Example 2.
Figure 8. Time responses of the states y_2^R(k), y_2^I(k) of the NN model in (27) with σ^R(k, y^R(k)) = σ^I(k, y^I(k)) = 0 in Example 2.

Conclusions
In this study, we have investigated mean-square asymptotic stability criteria for the considered DSQVNN models. The designed DSQVNN models encompass discrete-time stochastic CVNNs and discrete-time stochastic RVNNs as special cases. By exploiting the real-imaginary separation method, we have derived four equivalent RVNNs from the original QVNN model. By formulating appropriate Lyapunov functional candidates that incorporate more system information, and by employing stochastic analysis, we have established new LMI-based sufficient conditions for the mean-square asymptotic stability of the DSQVNN models. It is worth noting that previously known results can be treated as special cases of our results. The effectiveness of our investigation has been demonstrated through numerical examples.
For future work, a variety of stochastic QVNN models will be examined. Specifically, BAM (bidirectional associative memory)-type and Cohen-Grossberg-type QVNN models in the discrete-time case will be investigated in our next study.

Acknowledgments: The authors are grateful to Chiang Mai University for supporting this research.

Conflicts of Interest:
The authors declare no conflicts of interest.

Proof of Theorem 1. Given the NN model in (11), the following Lyapunov functional candidate is considered: