Article

Almost Sure Exponential Stability of Uncertain Stochastic Hopfield Neural Networks Based on Subadditive Measures

Zhifu Jia 1,*,† and Cunlin Li 2,3,†
1 School of Sciences and Arts, Suqian University, Suqian 223800, China
2 Ningxia Key Laboratory of Intelligent Information and Big Data Processing, Governance and Social Management Research Center of Northwest Ethnic Regions, North Minzu University, Yinchuan 750021, China
3 School of Mathematical and Computational Science, Hunan University of Science and Technology, Xiangtan 411201, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Mathematics 2023, 11(14), 3110; https://doi.org/10.3390/math11143110
Submission received: 8 June 2023 / Revised: 1 July 2023 / Accepted: 6 July 2023 / Published: 14 July 2023

Abstract: In this paper, we consider the almost sure exponential stability of uncertain stochastic Hopfield neural networks based on subadditive measures. Firstly, we deduce two corollaries of the Itô–Liu formula. Then, we introduce the concept of almost sure exponential stability for uncertain stochastic Hopfield neural networks. Next, we investigate their almost sure exponential stability using the Lyapunov method, the Liu inequality, the Liu lemma, and the exponential martingale inequality. In addition, we prove two sufficient conditions for almost sure exponential stability. Furthermore, we consider stabilization with linear uncertain stochastic perturbation and present some illustrative examples. Finally, we present our conclusions.

1. Introduction

An artificial neural network (ANN) is a computational model inspired by the human brain. ANNs comprise interconnected neurons that process and transmit information. ANNs excel in parallel processing and handling complex, nonlinear problems. ANNs learn from data, recognize patterns, and solve tasks like image recognition and natural language processing. With different architectures such as feedforward, recurrent, and convolutional networks, ANNs have become a crucial component of modern artificial intelligence, enabling machines to learn, adapt, and perform tasks that have traditionally required human intelligence. The Hopfield neural network, as a type of ANN [1], has witnessed steady advancement and intensive investigation over the past few decades, leading to a rich reservoir of research outcomes that have found widespread applications across diverse domains, including combinatorial optimization [2], signal processing [3], pattern recognition [4], and robust control [5]; however, the successful application of neural networks in these fields is closely linked to their dynamic behavior, and stochastic stability is the most important property [6,7,8,9,10,11,12,13]. The above literature shows that the ability of a neural network to maintain stochastic stability (exponential stability and instability [6], exponential stability with time delay [7,8], global stability of stochastic high-order neural networks [9], mean square exponential stability with time-varying delays [10], mean square global asymptotic stability with distributed delays [11], and almost sure exponential stability [12,13]) is crucial for its overall performance, especially when dealing with complex processes. Hence, significant efforts have been directed towards exploring and enhancing the stability of neural networks.
It is well known that stability is a crucial property of stochastic neural networks, which are often affected simultaneously by parameter uncertainties and random interference factors that can impact their stability, arising from system modeling, measurement errors, and system linearization, as documented in Refs. [14,15,16,17]. For example, Huang et al. [14] examined the exponential stability of uncertain stochastic neural networks with multiple delays, and Wang et al. [15] studied the exponential stability of uncertain stochastic neural networks with mixed time delays. Chen et al. [16] investigated the mean square exponential stability of uncertain stochastic delayed neural networks, and Syed Ali [17] surveyed the stochastic stability of uncertain recurrent neural networks with Markovian jumping parameters. However, these studies [14,15,16,17] only focused on the robust stability and asymptotic stability of stochastic neural networks with uncertain parameters, while the almost sure exponential stability of neural networks with both uncertain and random disturbances remains unexplored.
As noted above, the stochastic differential equation is a good tool for describing the stability of a stochastic neural network, and the dynamics of the stochastic differential system may be influenced by many other unknown, uncertain, and random disturbances. To address these, Itô [18] established the theory of stochastic analysis and stochastic differential equations driven by the Wiener process based on additive measures. Over the past 70 years, stochastic differential equations have matured, both in theory and practice, and they have become a vital tool in fields such as physics, systems science, management science, finance, and space science, especially in the development of stochastic stability, as in [19,20,21,22]. An uncertain process, on the other hand, is a sequence of uncertain variables, with subadditive measures, that change over time. Liu [23] introduced the concept of a Liu process, which is the uncertain counterpart of the Wiener process, in 2008. The Liu process is a Lipschitz-continuous process with stationary and independent increments, and its increments follow a normal uncertainty distribution. Based on this process, Liu [24] introduced the chain rule of uncertainty analysis to study the differentials and integrals of functions of uncertain processes, as well as a class of differential equations driven by standard Liu processes called uncertain differential equations [25]. Consequently, the stability of uncertain differential equations was discussed. When faced with a system that exhibits both uncertainty and randomness simultaneously, the noise should be modeled using the Wiener–Liu process, and the system evolution can be described through a hybrid differential equation, leading to the development of uncertain stochastic hybrid neural network systems [26]. In 2013, Liu [27] first introduced chance theory to investigate such uncertain stochastic systems based on subadditive measures, and subsequent works by Fei et al. [28,29] have further explored the use of the Wiener–Liu process and the Itô–Liu formula in uncertain stochastic differential equations. Researchers have made progress in studying various forms of the stability of stochastic neural networks based on additive measures, but the analysis of indeterminate neural networks, including both random and uncertain factors, requires chance theory's subadditive measures. This paper reviews some research results based on chance theory and explores the stability of uncertain stochastic neural networks using the Itô–Liu formula and the Lyapunov method. The main contributions of this paper are the extension of two corollaries of the Itô–Liu formula under subadditive measures, the introduction of the concept of almost sure exponential stability for uncertain stochastic Hopfield neural networks for the first time, and the derivation of sufficient conditions for almost sure exponential stability and stabilization with linear uncertain stochastic perturbation.
In Section 2, we recall some results about Hopfield neural networks and some concepts, lemmas, theorems, and corollaries about chance theory, which are essential for our analysis. In Section 3, we present our main results about the almost sure exponential stability of uncertain stochastic neural networks. In Section 4, we present our conclusion.

2. Preliminaries

2.1. The Explanation of Symbols

We provide a table of nomenclature so that the symbols used in this paper can be referenced easily (Table 1).

2.2. The Basic Knowledge

A Hopfield neural network [1] can be described in the form of an ordinary differential equation as follows:
$F_i \dot{u}_i(k) = -\frac{u_i(k)}{R_i} + \sum_{j=1}^{m} T_{ij} f_j(u_j(k)), \quad 1 \le i \le m, \; k \ge 0,$  (1)
where $u_i(k)$ denotes the voltage on the input of the $i$th neuron, $F_i$ denotes the input capacitance, $T_{ij}$ is the connection matrix element, and $f_i(u)$ is a nondecreasing transfer function (see Table 1) with $f_i(0) = 0$; the following $\varsigma_i$ is the slope of $f_i(u)$ at $u = 0$, satisfying
$u f_i(u) \ge 0, \qquad |f_i(u)| \le 1 \wedge \varsigma_i |u|, \qquad -\infty < u < +\infty,$  (2)
where $1 \wedge \varsigma_i |u|$ determines the upper bound of the function $|f_i(u)|$. Denote
$e_i = \frac{1}{F_i R_i}, \qquad b_{ij} = \frac{T_{ij}}{F_i};$
then,
$\dot{u}_k = -E u_k + B f(u_k), \quad k \ge 0,$  (3)
where
$u_k = (u_{1k}, \ldots, u_{mk})^T, \quad E = \mathrm{diag}(e_1, \ldots, e_m), \quad B = (b_{ij})_{m \times m}, \quad f(u) = (f_1(u_1), \ldots, f_m(u_m))^T.$
Furthermore, assume that
$e_i = \sum_{j=1}^{m} |b_{ij}|, \quad 1 \le i \le m.$  (4)
It is easy to see that for any given initial state $u_0 = z_0 \in \mathbb{R}^m$, the equation has a unique solution. In particular, the equation has the unique equilibrium solution $u \equiv 0$ when $u_0 = 0$. In other words, the zero point is the equilibrium point of the neural network system. The aim of this paper is to investigate the effects of uncertain stochastic perturbations on this stability. The following reviews chance theory, including some concepts, lemmas, theorems, and corollaries which are essential for our analysis.
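For readers who wish to experiment numerically, the following is a minimal sketch of system (3); the concrete values of $E$ and $B$ and the tanh-type transfer functions are our assumptions, not part of the paper.

```python
import numpy as np

# A minimal numerical sketch of system (3), u' = -E u + B f(u), assuming
# hypothetical values for E and B and tanh-type transfer functions.
E = np.diag([2.0, 2.5, 3.0])                 # E = diag(e_1, ..., e_m)
B = np.array([[ 0.0, 0.5, -0.3],
              [ 0.4, 0.0,  0.2],
              [-0.1, 0.3,  0.0]])            # b_ij = T_ij / F_i
f = np.tanh                                  # nondecreasing, f(0) = 0

def simulate(u0, dt=1e-3, steps=20000):
    """Forward-Euler integration of u' = -E u + B f(u)."""
    u = np.array(u0, dtype=float)
    for _ in range(steps):
        u = u + dt * (-E @ u + B @ f(u))
    return u

print(simulate([1.0, -0.5, 0.8]))            # decays toward the equilibrium u = 0
```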
Let Γ be a nonempty set, and L a σ -algebra over Γ . Each element Λ in L is called an event and M { Λ } is the belief degree. The uncertain measure dealing with belief degree satisfies the following axioms [23,25]:
  • Axiom 1 (Normality Axiom). M { Λ } = 1 for the universal set Γ .
  • Axiom 2 (Duality Axiom). M { Λ } + M { Λ c } = 1 for any event Λ .
  • Axiom 3 (Subadditivity Axiom). For every countable sequence of events $\Lambda_1, \Lambda_2, \ldots$,
    $\mathcal{M}\left\{\bigcup_{i=1}^{\infty} \Lambda_i\right\} \le \sum_{i=1}^{\infty} \mathcal{M}\{\Lambda_i\}$
    holds.
  • Axiom 4 (Product Axiom). Let $(\Gamma_j, \mathcal{L}_j, \mathcal{M}_j)$ be uncertainty spaces for $j = 1, 2, \ldots$. The product uncertain measure $\mathcal{M}$ is an uncertain measure satisfying
    $\mathcal{M}\left\{\prod_{j=1}^{\infty} \Lambda_j\right\} = \bigwedge_{j=1}^{\infty} \mathcal{M}_j\{\Lambda_j\},$
    where $\Lambda_j$ are arbitrary events chosen from $\mathcal{L}_j$ for $j = 1, 2, \ldots$, respectively.
Remark 1.
Axioms 1 and 2 are similar to those of probability theory, while Axioms 3 and 4 are fundamentally different from it. In particular, Axiom 3 embodies subadditivity, which differs from the additivity of probability theory, and the product axiom (Axiom 4) embodies a minimization operation, which differs from the product axiom of probability theory. A detailed analysis can be found in Refs. [23,25].
Definition 1
([23]). An uncertain variable is a measurable function ξ from an uncertainty space ( Γ , L , M ) to the set of real numbers, i.e., for any Borel set B of real numbers, the set
$\{\xi \in B\} = \{\gamma \in \Gamma \mid \xi(\gamma) \in B\}$
 is an event.
Definition 2
([23]). Let T be an index set and ( Γ , L , M ) an uncertainty space. An uncertain process is a measurable function from T × ( Γ , L , M ) to the set of real numbers such that { Z k B } is an event for any Borel set B for each time k.
Definition 3
([23]). An uncertain process C k is said to be a Liu process if
  • (i) C 0 = 0 and almost all sample paths are Lipschitz continuous;
  • (ii) C k has stationary and independent increments;
  • (iii) every increment $C_{r+k} - C_r$ is a normal uncertain variable with expected value 0 and variance $k^2$, whose uncertainty distribution is
    $\Phi(x) = \left(1 + \exp\left(\frac{-\pi x}{\sqrt{3}\,k}\right)\right)^{-1}, \quad x \in \mathbb{R}.$
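As a small illustration, the increment distribution $\Phi$ and its inverse (convenient for $\alpha$-path style numerics) can be coded directly; the parameter values below are hypothetical.

```python
import numpy as np

# A small sketch of the normal uncertainty distribution of a Liu-process
# increment C_{r+k} - C_r: expected value 0, variance k^2,
# Phi(x) = (1 + exp(-pi * x / (sqrt(3) * k)))**(-1).
def liu_cdf(x, k):
    return 1.0 / (1.0 + np.exp(-np.pi * x / (np.sqrt(3.0) * k)))

def liu_inverse(alpha, k):
    # Inverse distribution: Phi^{-1}(alpha) = (sqrt(3) k / pi) * ln(alpha / (1 - alpha)).
    return (np.sqrt(3.0) * k / np.pi) * np.log(alpha / (1.0 - alpha))

k = 0.5                                      # hypothetical time span
print(liu_cdf(0.0, k))                       # 0.5: the increment is symmetric about 0
print(liu_inverse(liu_cdf(0.3, k), k))       # round-trip recovers 0.3
```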
Definition 4
([23]). Let $Z_k$ be an uncertain process with respect to time $k$ and $C_k$ a Liu process with respect to time $k$. For any partition of the closed interval $[a, b]$ with $a = k_1 < k_2 < \cdots < k_{j+1} = b$, the mesh is written as
$\Delta = \max_{1 \le i \le j} |k_{i+1} - k_i|.$
Then, the uncertain integral of $Z_k$ with respect to $C_k$ is
$\int_a^b Z_k \, \mathrm{d}C_k = \lim_{\Delta \to 0} \sum_{i=1}^{j} Z_{k_i} \cdot (C_{k_{i+1}} - C_{k_i}),$
provided that the limit exists almost surely and is finite. In this case, the uncertain process Z k is said to be integrable.
Lemma 1
([25] (Liu inequality)). Let $C_k$ be a Liu process on the uncertainty space $(\Gamma, \mathcal{L}, \mathcal{M})$. Then, there exists an uncertain variable $K$ such that $K(\gamma)$ is a Lipschitz constant of the sample path $C_k(\gamma)$ for each $\gamma$,
$\lim_{x \to +\infty} \mathcal{M}\{\gamma \in \Gamma \mid K(\gamma) \le x\} = 1,$
and
$\mathcal{M}\{\gamma \in \Gamma \mid K(\gamma) \le x\} \ge 2\Phi(x) - 1.$
Lemma 2
([26] (Liu lemma)). Suppose that $C_k$ is a Liu process, and $Z_k$ is an integrable uncertain process on $[a, b]$ with respect to $k$. Then, the inequality
$\left| \int_a^b Z_k(\gamma) \, \mathrm{d}C_k \right| \le K(\gamma) \int_a^b |Z_k(\gamma)| \, \mathrm{d}k$
holds, where $K(\gamma)$ is the Lipschitz constant of the sample path $C_k(\gamma)$.
Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a complete probability space with a filtration $\{\mathcal{F}_k\}_{k \in [0,T]}$ satisfying the usual conditions; that is, it is increasing and right continuous, and $\mathcal{F}_0$ contains all $\mathbb{P}$-null sets.
Let $(\Gamma, \mathcal{L}, \mathcal{M})$ be an uncertainty space in which the normality, duality, subadditivity, and product measure axioms hold. Let $C_k$ be a Liu process defined on $(\Gamma, \mathcal{L}, \mathcal{M})$. The Liu process filtration $\{\mathcal{L}_k\}_{k \in [0,T]}$ is the sub-$\sigma$-field family $(\mathcal{L}_k, k \in [0,T])$ of $\mathcal{L}$ satisfying the usual conditions; it is generated by $\sigma(C_s : s \le k)$ and the $\mathcal{M}$-null sets of $\mathcal{L}$, with $\mathcal{L}_T = \mathcal{L}$.
Liu [27] first introduced chance theory to investigate hybrid systems with both uncertainty about belief degree and randomness. To investigate uncertain stochastic differential systems, Fei [29] extended this to a filtered chance space $(\Gamma \times \Omega, \mathcal{L} \otimes \mathcal{F}, (\mathcal{L}_k \otimes \mathcal{F}_k)_{k \in [0,T]}, \mathcal{M} \times \mathbb{P})$, on which the following concepts and theorems are presented.
Definition 5 ([29]).
(i) Let $B$ be a Borel set; an uncertain random variable is a measurable function $\xi \in \mathbb{R}^p$ (or $\mathbb{R}^{p \times m}$) from a chance space
$(\Gamma \times \Omega, \mathcal{L} \otimes \mathcal{F}, \mathcal{M} \times \mathbb{P})$
to $\mathbb{R}^p$ (or $\mathbb{R}^{p \times m}$); that is, for any $B \subset \mathbb{R}^p$ (or $\mathbb{R}^{p \times m}$), the set
$\{\xi \in B\} = \{(\gamma, \omega) \in \Gamma \times \Omega : \xi(\gamma, \omega) \in B\} \in \mathcal{L} \otimes \mathcal{F}.$
(ii) For any $B$, $\{\xi \in B\}$ is an uncertain random event with chance measure
$\mathrm{Ch}\{\xi \in B\} = \int_0^1 \mathbb{P}\left\{\omega \in \Omega \,\middle|\, \mathcal{M}\{\gamma \in \Gamma \mid \xi(\gamma, \omega) \in B\} \ge x\right\} \mathrm{d}x.$
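On a finite probability space, the chance measure in Definition 5(ii) reduces to a one-dimensional integral that can be evaluated directly. A toy sketch follows, with assumed values $m(\omega) = \mathcal{M}\{\gamma \mid \xi(\gamma, \omega) \in B\}$; note that for $m \in [0,1]$ the integral coincides with the $\mathbb{P}$-expectation of $m$.

```python
import numpy as np

# A toy sketch of Definition 5(ii) on a finite probability space, assuming
# hypothetical values: for each random outcome omega we are given
# m(omega) = M{ gamma : xi(gamma, omega) in B } in [0, 1].
p = np.array([0.2, 0.5, 0.3])        # P-weights of three outcomes omega_1..omega_3
m = np.array([0.9, 0.4, 0.6])        # uncertain measure of the section at each omega

# Ch{xi in B} = integral_0^1 P{ omega : m(omega) >= x } dx (Riemann discretization)
xs = np.linspace(0.0, 1.0, 10001)
ch = np.trapz([p[m >= x].sum() for x in xs], xs)

print(ch)                 # ~ 0.56
print(np.dot(p, m))       # same number: the integral equals the P-expectation of m
```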
Definition 6 ([29]).
(a) An uncertain stochastic process is essentially a sequence of uncertain random variables indexed by time: if $Z_k$ is an uncertain random variable for each time $k \in [0,T]$, then we call $Z_k$ an uncertain stochastic process (or hybrid process). If the sample paths of $Z_k$ are continuous functions of $k$ for almost all $(\gamma, \omega) \in \Gamma \times \Omega$, then we call it continuous.
(b) If $Z(k, \gamma)$ is $\mathcal{F}_k$-measurable for all $k \in [0,T]$, $\gamma \in \Gamma$, then we call it $\mathcal{F}_k$-adapted. Further, if $Z(k)$ is $\mathcal{L}_k \otimes \mathcal{F}_k$-measurable for all $k \in [0,T]$, then we call it $\mathcal{L}_k \otimes \mathcal{F}_k$-adapted (or adapted).
(c) If the uncertain stochastic process is measurable with respect to the $\sigma$-algebra
$(\mathcal{L}_k \otimes \mathcal{F}_k) = \{A \in \mathcal{B}([0,T]) \otimes \mathcal{L} \otimes \mathcal{F} : A \cap ([0,k] \times \Gamma \times \Omega) \in \mathcal{B}([0,k]) \otimes \mathcal{L}_k \otimes \mathcal{F}_k\},$
then we call it progressively measurable.
Further, if the uncertain stochastic process $Z(k) : \Gamma \times \Omega \to \mathbb{R}^p$ (or $Z(k) : \Gamma \times \Omega \to \mathbb{R}^{p \times m}$) is progressively measurable and satisfies $E\left[\int_0^T |Z_k|^2 \, \mathrm{d}k\right] < \infty$, then we call it $L^2$-progressively measurable, where $L^2(0,T;\mathbb{R}^p)$ (or $L^2(0,T;\mathbb{R}^{p \times m})$) denotes the set of $L^2$-progressively measurable uncertain random processes.
Definition 7
([28]). Let W k be a Wiener process and C k a Liu process. Then, H k = ( W k , C k ) is called a Wiener–Liu process. The Wiener–Liu process is said to be standard if both W k and C k are standard.
Definition 8
([28]). Let $Z_k = (\hat{Z}_k, \tilde{Z}_k)$, where $\hat{Z}_k$ and $\tilde{Z}_k$ are scalar uncertain stochastic processes, and let $H_k = (W_k, C_k)$ be a standard Wiener–Liu process. For any partition of a closed interval $[a, b]$ with $a = k_1 < k_2 < \cdots < k_{N+1} = b$, the mesh is written as
$\Delta = \max_{1 \le i \le N} |k_{i+1} - k_i|.$
Then, the uncertain stochastic integral of $Z_k$ with respect to $H_k$ is
$\int_a^b Z_k \, \mathrm{d}H_k = \lim_{\Delta \to 0} \sum_{i=1}^{N} \left( \hat{Z}_{k_i} \cdot (W_{k_{i+1}} - W_{k_i}) + \tilde{Z}_{k_i} \cdot (C_{k_{i+1}} - C_{k_i}) \right),$
 provided that the limit exists almost surely and is finite. In this case, the uncertain stochastic process Z k is said to be integrable.
Remark 2.
The uncertain stochastic integral may also be written as follows:
$\int_a^b Z_k \, \mathrm{d}H_k = \int_a^b \hat{Z}_k \, \mathrm{d}W_k + \int_a^b \tilde{Z}_k \, \mathrm{d}C_k.$
The following theorem presents the Itô–Liu formula in the one-dimensional case.
Theorem 1
([28] (Itô–Liu formula)). Let $H_k$ be a Wiener–Liu process given by
$H_k = (Z_k, \bar{Z}_k) = (\mu_1 k + \sigma_1 W_k, \; \mu_2 k + \sigma_2 C_k),$
where $W_k$ is a Wiener process and $C_k$ is a Liu process, and let $g(k, z, \bar{z})$ be a twice continuously differentiable function. Define $G_k = g(k, Z_k, \bar{Z}_k)$. Then, we have the following chain rule:
$\mathrm{d}G_k = g_k(k, Z_k, \bar{Z}_k)\,\mathrm{d}k + g_z(k, Z_k, \bar{Z}_k)\,\mathrm{d}Z_k + g_{\bar{z}}(k, Z_k, \bar{Z}_k)\,\mathrm{d}\bar{Z}_k + \frac{\sigma_1^2}{2}\,\frac{\partial^2 g}{\partial z^2}(k, Z_k, \bar{Z}_k)\,\mathrm{d}k.$
Using Theorem 1, we can easily obtain the following two corollaries.
Corollary 1.
The infinitesimal increments $\mathrm{d}W_k$ and $\mathrm{d}C_k$ may be replaced with those of the derived Wiener–Liu process
$Z_k = \int_0^k \mu_u \, \mathrm{d}u + \int_0^k \alpha_u \, \mathrm{d}W_u + \int_0^k \beta_u \, \mathrm{d}C_u,$
where $\mu_k$ and $\beta_k$ are absolutely integrable uncertain stochastic processes and $\alpha_k$ is a square-integrable uncertain stochastic process; then, for $\Phi \in C^2(\mathbb{R})$ ($C^2$ means twice continuously differentiable), we obtain
$\Phi(Z_k) = \Phi(Z_0) + \int_0^k \Phi'(Z_u)\mu_u \, \mathrm{d}u + \int_0^k \Phi'(Z_u)\alpha_u \, \mathrm{d}W_u + \int_0^k \Phi'(Z_u)\beta_u \, \mathrm{d}C_u + \frac{1}{2}\int_0^k \Phi''(Z_u)\alpha_u^2 \, \mathrm{d}u.$
Let $W_k = (W_{1k}, W_{2k}, \ldots, W_{pk})$ and $C_k = (C_{1k}, C_{2k}, \ldots, C_{qk})$ be a $p$-dimensional standard Wiener process and a $q$-dimensional standard Liu process, respectively. If $r_i$ and $v_{ij}$ are absolutely integrable hybrid processes, and $w_{ij}$ are square-integrable hybrid processes, for $i = 1, 2, \ldots, m$, $j = 1, 2, \ldots, q$, then the $m$-dimensional hybrid process $Z_k = (Z_{1k}, Z_{2k}, \ldots, Z_{mk})$ is given by
$\mathrm{d}Z_{ik} = r_i \, \mathrm{d}k + \sum_{j=1}^{p} w_{ij} \, \mathrm{d}W_{jk} + \sum_{j=1}^{q} v_{ij} \, \mathrm{d}C_{jk}, \quad i = 1, \ldots, m,$
or, in matrix notation, simply
$\mathrm{d}Z_k = r \, \mathrm{d}k + w \, \mathrm{d}W_k + v \, \mathrm{d}C_k,$
where
$r = (r_1, \ldots, r_m)^T, \quad w = (w_{ij})_{m \times p}, \quad v = (v_{ij})_{m \times q}, \quad \mathrm{d}W_k = (\mathrm{d}W_{1k}, \ldots, \mathrm{d}W_{pk})^T, \quad \mathrm{d}C_k = (\mathrm{d}C_{1k}, \ldots, \mathrm{d}C_{qk})^T.$
Corollary 2.
Assume the $m$-dimensional hybrid process $Z_k$ is given by
$\mathrm{d}Z_k = r \, \mathrm{d}k + w \, \mathrm{d}W_k + v \, \mathrm{d}C_k,$
and let $g(k, z_1, \ldots, z_m)$ be a multivariate continuously differentiable function. Define $G_k = g(k, Z_{1k}, \ldots, Z_{mk})$. Then,
$\mathrm{d}G_k = g_k(k, Z_{1k}, \ldots, Z_{mk})\,\mathrm{d}k + \sum_{i=1}^{m} g_{z_i}(k, Z_{1k}, \ldots, Z_{mk})\,\mathrm{d}Z_{ik} + \frac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{m} \frac{\partial^2 g}{\partial z_i \partial z_j}(k, Z_{1k}, \ldots, Z_{mk})\,\mathrm{d}Z_{ik}\,\mathrm{d}Z_{jk},$  (5)
where $\mathrm{d}W_{ik}\,\mathrm{d}W_{jk} = \delta_{ij}\,\mathrm{d}k$ and $\mathrm{d}W_{ik}\,\mathrm{d}k = \mathrm{d}k\,\mathrm{d}W_{ik} = \mathrm{d}C_{lk}\,\mathrm{d}C_{l'k} = \mathrm{d}k\,\mathrm{d}C_{lk} = \mathrm{d}W_{ik}\,\mathrm{d}C_{lk} = 0$, for $i, j = 1, 2, \ldots, p$ and $l, l' = 1, 2, \ldots, q$, with
$\delta_{ij} = \begin{cases} 0, & i \ne j, \\ 1, & i = j. \end{cases}$  (6)
In other words, it can be expressed as
$\mathrm{d}G_k = g_k \, \mathrm{d}k + \sum_{i=1}^{p} g_{z_i}(k, W_{1k}, \ldots, W_{pk}, C_{1k}, \ldots, C_{qk})\,\mathrm{d}W_{ik} + \sum_{j=1}^{q} g_{z_{p+j}}(k, W_{1k}, \ldots, W_{pk}, C_{1k}, \ldots, C_{qk})\,\mathrm{d}C_{jk} + \frac{1}{2}\sum_{i=1}^{p} \frac{\partial^2 g}{\partial z_i^2}(k, W_{1k}, \ldots, W_{pk}, C_{1k}, \ldots, C_{qk})\,\mathrm{d}k.$
Definition 9
([28]). Suppose $W_k$ is a standard Wiener process, $C_k$ is a standard Liu process, and $f$, $g$, and $h$ are some given functions. Then,
$\mathrm{d}Z_k = f(k, Z_k)\,\mathrm{d}k + g(k, Z_k)\,\mathrm{d}W_k + h(k, Z_k)\,\mathrm{d}C_k$  (7)
is called an uncertain stochastic differential equation.

3. Main Results

Let us consider a hypothetical scenario in which an uncertain stochastic perturbation is introduced to the neural network; as a result, the perturbed network can be modeled using an uncertain stochastic differential equation:
$\mathrm{d}z(k) = [-E z(k) + B f(z(k))]\,\mathrm{d}k + g(z(k))\,\mathrm{d}W(k) + h(z(k))\,\mathrm{d}C(k), \quad k \ge 0, \qquad z(0) = z_0 \in \mathbb{R}^m,$  (8)
where $W(k) = (W_1(k), \ldots, W_n(k))^T$ denotes an $n$-dimensional Wiener process and $g : \mathbb{R}^m \to \mathbb{R}^{m \times n}$ (i.e., $g(z) = (g_{ij}(z))_{m \times n}$). Additionally, let $C(k) = (C_1(k), \ldots, C_n(k))^T$ be an $n$-dimensional Liu process and $h : \mathbb{R}^m \to \mathbb{R}^{m \times n}$, i.e., $h(z) = (h_{ij}(z))_{m \times n}$. In addition, $g(z)$ and $h(z)$ are Lipschitz continuous and satisfy the linear growth condition. Consequently, we can deduce from Refs. [28,29] that for $k \ge 0$, Equation (8) possesses a unique global solution $z(k, z_0)$; we assume $g(0) = h(0) = 0$ for the sake of the stability analysis in this paper. As a result, Equation (8) possesses the equilibrium solution $z(k, 0) \equiv 0$. Additionally, when $z_0 \ne 0$, uniqueness holds with chance measure one; that is, $z(k, z_0) \ne 0$ for all $k \ge 0$ almost surely.
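A heuristic simulation sketch of Equation (8) is given below. It assumes scalar linear noises $g(z) = \zeta z$ and $h(z) = \vartheta z$ and approximates the Liu part by a fixed $\alpha$-path increment $\mathrm{d}C \approx \Phi^{-1}(\alpha)\,\mathrm{d}k$; this numerical device, and all parameter values, are our assumptions and not part of the paper.

```python
import numpy as np

# Heuristic Euler scheme for dz = [-E z + B f(z)] dk + g(z) dW + h(z) dC,
# with g(z) = zeta*z, h(z) = vartheta*z (n = 1) and the Liu increment frozen
# along an alpha-path: dC ~ Phi^{-1}(alpha) dk (an assumed numerical device).
rng = np.random.default_rng(0)

E = np.diag([2.0, 2.5, 3.0])
B = np.array([[0.0, 0.5, -0.3], [0.4, 0.0, 0.2], [-0.1, 0.3, 0.0]])
zeta, vartheta, alpha = 0.8, 0.1, 0.95
phi_inv = (np.sqrt(3.0) / np.pi) * np.log(alpha / (1.0 - alpha))  # standard Liu, k = 1

def simulate(z0, dt=1e-3, steps=50000):
    z = np.array(z0, dtype=float)
    path = [np.linalg.norm(z)]
    for _ in range(steps):
        dW = rng.normal(0.0, np.sqrt(dt))                 # scalar Wiener increment
        drift = -E @ z + B @ np.tanh(z)
        z = z + drift * dt + zeta * z * dW + vartheta * z * phi_inv * dt
        path.append(np.linalg.norm(z))
    return np.array(path)

norms = simulate([1.0, -0.5, 0.8])
print(norms[0], norms[-1])     # |z(k)| decays for these parameter choices
```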
In contrast to Equation (3), Equation (8) represents a system with an uncertain stochastic perturbation. It is intriguing to explore the influence of uncertain stochastic perturbation on the stability characteristics of the neural network. In the next section, we will delve into these issues in great depth.

3.1. Almost Sure Exponential Stability

Definition 10.
Firstly, we assume that Equation (8) has the solution $z \equiv 0$ when $z_0 = 0$. Further, we assume that for any $\epsilon_1, \epsilon_2 > 0$ there exist two sets $\Gamma_{\epsilon_1}$ and $\Omega_{\epsilon_2}$ with $\mathcal{M}\{\Gamma_{\epsilon_1}\} \le \epsilon_1$ and $\mathbb{P}\{\Omega_{\epsilon_2}\} \le \epsilon_2$ such that, for all $\gamma \in \Gamma \setminus \Gamma_{\epsilon_1}$ and $\omega \in \Omega \setminus \Omega_{\epsilon_2}$, the nonzero solution $z(k, z_0)$ of Equation (8) with $z_0 \ne 0$ satisfies the following condition:
$\limsup_{k \to \infty} \frac{1}{k} \ln(|z(k, z_0)|) < 0;$  (9)
then, we call the uncertain stochastic neural network (8) almost surely exponentially stable, simply denoted as
$\limsup_{k \to \infty} \frac{1}{k} \ln(|z(k, z_0)|) < 0, \quad a.s.$  (10)
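Given a simulated trajectory such as the one in the sketch above, the sample rate $\frac{1}{k}\ln|z(k)|$ in Definition 10 can be estimated directly; the stand-in decaying path below is purely illustrative.

```python
import numpy as np

# A small sketch estimating the pathwise rate (1/k) ln|z(k)| from a simulated
# trajectory; `norms` and `dt` would come from a simulation like the one above.
def sample_rate(norms, dt):
    k = np.arange(1, len(norms)) * dt
    return np.log(norms[1:]) / k              # pathwise (1/k) ln |z(k)|

norms = np.exp(-0.7 * np.arange(0, 10, 1e-3)) # stand-in path decaying at rate -0.7
rates = sample_rate(norms, 1e-3)
print(rates[-1])                              # ~ -0.7, the a.s. exponential rate
```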
Theorem 2.
Assume there exist a symmetric positive definite matrix $P = (p_{ij})_{m \times m}$ and some constants $\mu \in \mathbb{R}$ and $\rho_1, \rho_2, H > 0$ such that
$2 z^T P [-E z + B f(z)] + \mathrm{tr}[g^T(z) P g(z)] \le \mu \, z^T P z,$  (11)
$z^T P g(z) g^T(z) P z \ge \rho_1 (z^T P z)^2,$  (12)
$|z^T P h(z)| \le \frac{\rho_2}{n} \, z^T P z$  (13)
for all $z \in \mathbb{R}^m$. Then, the solution of Equation (8) satisfies
$\limsup_{k \to \infty} \frac{1}{k} \ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 - \frac{\mu}{2}\right) \quad a.s.$  (14)
whenever $z_0 \ne 0$. In particular, if $\rho_1 - H\rho_2 > \mu/2$, then the uncertain stochastic neural network (8) is almost surely exponentially stable.
Proof. 
Take the Lyapunov function
$V(z, k) = z^T P z.$
Choose any nonzero value of $z_0$ and write $z(k, z_0)$ as $z_k$. Since the solution is unique, $z_k$ is almost surely nonzero for all $k > 0$. The Itô–Liu formula implies that
$\mathrm{d}(\ln[z_k^T P z_k]) = \frac{1}{z_k^T P z_k}\left(2 z_k^T P [-E z_k + B f(z_k)] + \mathrm{tr}[g^T(z_k) P g(z_k)]\right)\mathrm{d}k - \frac{2}{[z_k^T P z_k]^2}\left(z_k^T P g(z_k) g^T(z_k) P z_k\right)\mathrm{d}k + \frac{2}{z_k^T P z_k}\, z_k^T P g(z_k)\,\mathrm{d}W_k + \frac{2}{z_k^T P z_k}\, z_k^T P h(z_k)\,\mathrm{d}C_k.$
Considering condition (11), we obtain
$\ln[z_k^T P z_k] \le \ln[z_0^T P z_0] + \mu k - 2\langle M \rangle_k + 2 M_k + 2 N_k, \quad a.s.$  (15)
where
$N_k = \int_0^k \frac{1}{z_s^T P z_s}\, z_s^T P h(z_s)\,\mathrm{d}C_s$
for all $k > 0$; here, $N_k$ is an uncertain process with $N_0 = 0$, and
$M_k = \int_0^k \frac{1}{z_s^T P z_s}\, z_s^T P g(z_s)\,\mathrm{d}W_s$
is a continuous martingale that vanishes at $k = 0$. This martingale's quadratic variation is denoted by $\langle M \rangle_k$; that is,
$\langle M \rangle_k = \int_0^k \frac{1}{[z_s^T P z_s]^2}\left(z_s^T P g(z_s) g^T(z_s) P z_s\right)\mathrm{d}s.$
By condition (12), we obtain
$\langle M \rangle_k \ge \rho_1 k.$  (16)
Let $l = 1, 2, \ldots$ and let $\epsilon \in (0, 1)$ be arbitrary. The exponential martingale inequality implies
$\mathbb{P}\left\{\omega : \sup_{0 \le k \le l}\left[M_k - \epsilon \langle M \rangle_k\right] > \frac{1}{2\epsilon}\ln l\right\} \le \frac{1}{l}.$
Therefore, according to the Borel–Cantelli lemma, for almost every $\omega \in \Omega$ there exists a random integer $l_0(\omega)$ such that, for all $l \ge l_0$,
$\sup_{0 \le k \le l}\left[M_k - \epsilon \langle M \rangle_k\right] \le \frac{1}{2\epsilon}\ln l,$
that is,
$M_k \le \epsilon \langle M \rangle_k + \frac{1}{2\epsilon}\ln l, \quad 0 \le k \le l.$
By condition (13), for any $\gamma \in \Gamma$, we have
$N_k(\gamma) \le |N_k(\gamma)| \le n \cdot K(\gamma)\int_0^k \frac{1}{z_s^T P z_s}\left|z_s^T P h(z_s)\right|\mathrm{d}s \le n \cdot K(\gamma) \cdot \frac{\rho_2}{n}\, k = K(\gamma)\rho_2 k,$
where $K(\gamma) = \max_i K_i(\gamma)$ and $K_i(\gamma)$ is a Lipschitz constant of $C_{ik}$. By Lemma 1, for any $\epsilon > 0$, there exists a positive constant $H = H(\epsilon)$ such that
$\mathcal{M}\{\gamma \in \Gamma \mid K(\gamma) \le H\} > 1 - \epsilon;$
namely, for any $\epsilon > 0$, there exists $\Gamma_\epsilon$ with $\mathcal{M}\{\Gamma_\epsilon\} \le \epsilon$ such that
$N_k(\gamma) \le H \rho_2 k \quad \text{for all } \gamma \in \Gamma \setminus \Gamma_\epsilon.$
Substituting this into (15) yields
$\ln[z_k^T P z_k] \le \ln[z_0^T P z_0] + \mu k - (2 - 2\epsilon)\langle M \rangle_k + 2 H \rho_2 k + \frac{1}{\epsilon}\ln l$
for all $0 \le k \le l$ and $l \ge l_0$, almost surely. By (16), we can obtain that
$\ln[z_k^T P z_k] \le \ln[z_0^T P z_0] + \mu k - (2 - 2\epsilon)\rho_1 k + 2 H \rho_2 k + \frac{1}{\epsilon}\ln l$
for all $0 \le k \le l$ and $l \ge l_0$, almost surely. So, for almost all $\omega \in \Omega$ and all $\gamma \in \Gamma \setminus \Gamma_\epsilon$, if $l - 1 \le k \le l$ and $l \ge l_0$, then
$\frac{1}{k}\ln[z_k^T P z_k] \le -\left[(2 - 2\epsilon)\rho_1 - 2 H \rho_2 - \mu\right] + \frac{1}{l - 1}\left(\ln[z_0^T P z_0] + \frac{1}{\epsilon}\ln l\right).$
Letting $\epsilon \to 0$ (and $l \to \infty$), we obtain
$\limsup_{k \to \infty} \frac{1}{k}\ln[z_k^T P z_k] \le -\left[2\rho_1 - 2 H \rho_2 - \mu\right].$
Because $P$ is a symmetric positive definite matrix, its minimum eigenvalue $\lambda_{\min} > 0$, and then
$\lambda_{\min}|z|^2 \le z^T P z, \quad z \in \mathbb{R}^m.$
Thus,
$\limsup_{k \to \infty} \frac{1}{k}\ln[z_k^T P z_k] \ge \limsup_{k \to \infty} \frac{1}{k}\ln(\lambda_{\min}|z_k|^2) = \limsup_{k \to \infty} \frac{1}{k}\left(\ln \lambda_{\min} + 2\ln|z_k|\right) = 2\limsup_{k \to \infty} \frac{1}{k}\ln|z_k|.$
Thus,
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z_k|) \le -\left(\rho_1 - H \rho_2 - \frac{\mu}{2}\right).$
We complete the proof. □
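The hypotheses (11)–(13) can be probed numerically for concrete noise coefficients by sampling directions $z$ and estimating admissible $\mu$, $\rho_1$, $\rho_2$. The sketch below does this for hypothetical linear noises; the value $H = 1$ is assumed purely for illustration, and the sampled extrema are only crude estimates of the true bounds.

```python
import numpy as np

# Crude numeric probe of conditions (11)-(13) for assumed linear noises
# g(z) = G z and h(z) = H1 z (single W and C, so n = 1); for this g,
# tr[g^T P g] = z^T G^T P G z.
rng = np.random.default_rng(1)

E = np.diag([2.0, 2.5, 3.0])
B = np.array([[0.0, 0.5, -0.3], [0.4, 0.0, 0.2], [-0.1, 0.3, 0.0]])
G = 0.8 * np.eye(3)
H1 = 0.1 * np.eye(3)
P = np.eye(3)
n = 1

def ratios(z):
    v = z @ P @ z
    lhs11 = 2 * z @ P @ (-E @ z + B @ np.tanh(z)) + z @ G.T @ P @ G @ z
    lhs12 = (z @ P @ G @ z) ** 2
    lhs13 = abs(z @ P @ H1 @ z)
    return lhs11 / v, lhs12 / v ** 2, n * lhs13 / v

r = np.array([ratios(z) for z in rng.normal(size=(4000, 3))])
mu, rho1, rho2 = r[:, 0].max(), r[:, 1].min(), r[:, 2].max()
print(mu, rho1, rho2, rho1 - 1.0 * rho2 - mu / 2)   # H = 1 assumed; > 0 => stable
```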
By Theorem 2, the following two sufficient conclusions can be obtained.
Theorem 3.
Suppose (2) is satisfied, and there exists a diagonal matrix $P = \mathrm{diag}(p_1, p_2, \ldots, p_m)$ with $p_i > 0$ for all $i$. Let $\mu > 0$, $\rho_1$, $\rho_2$ be real numbers and $H > 0$ a constant such that
$\mathrm{tr}[g^T(z) P g(z)] \le \mu \, z^T P z,$
$z^T P g(z) g^T(z) P z \ge \rho_1 (z^T P z)^2,$
$|z^T P h(z)| \le \frac{\rho_2}{n} \, z^T P z$
for all $z \in \mathbb{R}^m$. Denote by $\lambda_{\max}(Q)$ the largest eigenvalue of the symmetric matrix $Q = (q_{ij})_{m \times m}$, where $q_{ij}$ is defined as follows:
$q_{ij} = \begin{cases} 2 p_i [-e_i + (0 \vee b_{ii}) \varsigma_i], & \text{for } i = j, \\ p_i |b_{ij}| \varsigma_j + p_j |b_{ji}| \varsigma_i, & \text{for } i \ne j. \end{cases}$  (17)
Then, the solution of Equation (8) satisfies:
(i) if $\lambda_{\max}(Q) \ge 0$,
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 - \frac{1}{2}\left[\mu + \frac{\lambda_{\max}(Q)}{\min_{1 \le i \le m} p_i}\right]\right), \quad a.s.;$  (18)
(ii) if $\lambda_{\max}(Q) < 0$,
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 - \frac{1}{2}\left[\mu + \frac{\lambda_{\max}(Q)}{\max_{1 \le i \le m} p_i}\right]\right), \quad a.s.;$  (19)
whenever $z_0 \ne 0$.
Proof. 
It holds from (2) that
$2 z^T P B f(z) = 2\sum_{i,j=1}^{m} z_i p_i b_{ij} f_j(z_j) \le 2\sum_{i} p_i (0 \vee b_{ii}) z_i f_i(z_i) + 2\sum_{i \ne j} |z_i|\, p_i |b_{ij}| \varsigma_j |z_j| \le 2\sum_{i} p_i (0 \vee b_{ii}) \varsigma_i z_i^2 + \sum_{i \ne j} |z_i| \left(p_i |b_{ij}| \varsigma_j + p_j |b_{ji}| \varsigma_i\right) |z_j|.$
Thus, when $\lambda_{\max}(Q) \ge 0$,
$2 z^T P [-E z + B f(z)] \le (|z_1|, \ldots, |z_m|)\, Q\, (|z_1|, \ldots, |z_m|)^T \le \lambda_{\max}(Q)\, |z|^2 \le \frac{\lambda_{\max}(Q)}{\min_{1 \le i \le m} p_i}\, z^T P z.$
We can easily arrive at conclusion (18) by applying Theorem 2. Additionally, when $\lambda_{\max}(Q) < 0$,
$2 z^T P [-E z + B f(z)] \le (|z_1|, \ldots, |z_m|)\, Q\, (|z_1|, \ldots, |z_m|)^T \le \lambda_{\max}(Q)\, |z|^2 \le \frac{\lambda_{\max}(Q)}{\max_{1 \le i \le m} p_i}\, z^T P z.$
By utilizing Theorem 2 once more, we can arrive at conclusion (19). Hence, we complete the proof. □
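The matrix $Q$ of Theorem 3 is straightforward to build and test numerically; the following sketch uses toy values for $p_i$, $e_i$, $b_{ij}$, and the slopes $\varsigma_i$ (all hypothetical).

```python
import numpy as np

# Constructing Q from Theorem 3 for a hypothetical network and reading off
# lambda_max(Q); p_i, e_i, b_ij and the slopes varsigma_i are toy values.
p = np.array([1.0, 1.0, 1.0])
e = np.array([2.0, 2.5, 3.0])
b = np.array([[0.0, 0.5, -0.3], [0.4, 0.0, 0.2], [-0.1, 0.3, 0.0]])
slope = np.array([1.0, 1.0, 1.0])            # varsigma_i

m = len(p)
Q = np.empty((m, m))
for i in range(m):
    for j in range(m):
        if i == j:
            Q[i, j] = 2 * p[i] * (-e[i] + max(0.0, b[i, i]) * slope[i])
        else:
            Q[i, j] = p[i] * abs(b[i, j]) * slope[j] + p[j] * abs(b[j, i]) * slope[i]

lam = np.linalg.eigvalsh(Q).max()            # Q is symmetric by construction
print(lam)      # lambda_max(Q) < 0 here, so case (ii) of Theorem 3 applies
```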
Theorem 4.
Suppose both (2) and (4) are satisfied, where $\delta_{ij}$ is defined as in (6). Additionally, assume that there exist $m$ positive numbers $p_1, p_2, \ldots, p_m$ such that
$\varsigma_j^2 \sum_{i=1}^{m} p_i \left[0 \vee \mathrm{sign}(b_{ii})\right]^{\delta_{ij}} |b_{ij}| \le p_j e_j, \quad 1 \le j \le m,$
and
$\mathrm{tr}[g^T(z) P g(z)] \le \mu \, z^T P z,$
$z^T P g(z) g^T(z) P z \ge \rho_1 (z^T P z)^2,$
$|z^T P h(z)| \le \frac{\rho_2}{n} \, z^T P z,$
where $P = \mathrm{diag}(p_1, p_2, \ldots, p_m)$ and the real numbers satisfy $\mu > 0$ and $\rho_1, \rho_2, H > 0$. Then, for all $z \in \mathbb{R}^m$, the solution of Equation (8) satisfies
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 - \frac{\mu}{2}\right) \quad a.s.$
Proof. 
By the conditions, we can obtain that
$2 z^T P B f(z) = 2\sum_{i,j=1}^{m} z_i p_i b_{ij} f_j(z_j) \le 2\sum_{i,j=1}^{m} |z_i|\, p_i \left[0 \vee \mathrm{sign}(b_{ii})\right]^{\delta_{ij}} |b_{ij}| \varsigma_j |z_j| \le \sum_{i,j=1}^{m} p_i \left[0 \vee \mathrm{sign}(b_{ii})\right]^{\delta_{ij}} |b_{ij}| \left(z_i^2 + \varsigma_j^2 z_j^2\right) \le \sum_{i=1}^{m} p_i \left(\sum_{j=1}^{m} |b_{ij}|\right) z_i^2 + \sum_{j=1}^{m} \left(\varsigma_j^2 \sum_{i=1}^{m} p_i \left[0 \vee \mathrm{sign}(b_{ii})\right]^{\delta_{ij}} |b_{ij}|\right) z_j^2 \le \sum_{i=1}^{m} p_i e_i z_i^2 + \sum_{j=1}^{m} p_j e_j z_j^2 = 2 z^T P E z.$
Hence,
$2 z^T P [-E z + B f(z)] + \mathrm{tr}[g^T(z) P g(z)] \le \mu \, z^T P z.$
So, by Theorem 2 again, we complete the proof. □
Theorem 5.
Suppose both (2) and (4) are satisfied. We assume that the network is symmetric, meaning that
$|b_{ij}| = |b_{ji}|, \quad 1 \le i, j \le m.$
Moreover, assume that
$\mathrm{tr}[g^T(z) g(z)] \le \mu |z|^2,$
$z^T g(z) g^T(z) z \ge \rho_1 |z|^4,$
$|z^T h(z)| \le \frac{\rho_2}{n} |z|^2$
hold for all $z \in \mathbb{R}^m$, where $\mu > 0$ and $\rho_1, \rho_2, H > 0$ are constants. Then, the solution to Equation (8) satisfies
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 + \hat{e}(1 - \check{\varsigma}) - \frac{\mu}{2}\right) \quad a.s.$
if $\check{\varsigma} \le 1$, or
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 - \check{e}(\check{\varsigma} - 1) - \frac{\mu}{2}\right) \quad a.s.$
if $\check{\varsigma} > 1$, whenever $z_0 \ne 0$, where
$\check{\varsigma} = \max_{1 \le i \le m} \varsigma_i, \qquad \check{e} = \max_{1 \le i \le m} e_i, \qquad \hat{e} = \min_{1 \le i \le m} e_i.$
Proof. 
By the conditions, we can obtain that
$2 z^T B f(z) = 2\sum_{i,j=1}^{m} z_i b_{ij} f_j(z_j) \le 2\sum_{i,j=1}^{m} |z_i| |b_{ij}| \varsigma_j |z_j| \le \check{\varsigma}\sum_{i,j=1}^{m} |b_{ij}| \left(z_i^2 + z_j^2\right) = \check{\varsigma}\left[\sum_{i=1}^{m}\left(\sum_{j=1}^{m} |b_{ij}|\right) z_i^2 + \sum_{j=1}^{m}\left(\sum_{i=1}^{m} |b_{ji}|\right) z_j^2\right] = \check{\varsigma}\left[\sum_{i=1}^{m} e_i z_i^2 + \sum_{j=1}^{m} e_j z_j^2\right] = 2\check{\varsigma}\, z^T E z,$
and
$2 z^T [-E z + B f(z)] + \mathrm{tr}[g^T(z) g(z)] \le -2(1 - \check{\varsigma})\, z^T E z + \mu |z|^2.$
Therefore, in the case $\check{\varsigma} \le 1$,
$2 z^T [-E z + B f(z)] + \mathrm{tr}[g^T(z) g(z)] \le \left[-2\hat{e}(1 - \check{\varsigma}) + \mu\right]|z|^2.$
Applying Theorem 2 with $P$ being the identity matrix, we can deduce that
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 + \hat{e}(1 - \check{\varsigma}) - \frac{\mu}{2}\right) \quad a.s.$
When $\check{\varsigma} > 1$,
$2 z^T [-E z + B f(z)] + \mathrm{tr}[g^T(z) g(z)] \le \left[2\check{e}(\check{\varsigma} - 1) + \mu\right]|z|^2.$
It follows from Theorem 2 again that
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 - \check{e}(\check{\varsigma} - 1) - \frac{\mu}{2}\right) \quad a.s.$
We complete the proof. □

3.2. Stabilization by Linear Uncertain Stochastic Perturbation

We are aware that the neural network
$\dot{u}_k = -E u_k + B f(u_k)$
can sometimes be unstable. It may be assumed that subjecting an unstable neural network to an uncertain stochastic perturbation would cause it to behave even worse, or become more unstable. However, this is not always the case: an uncertain stochastic perturbation can actually make an unstable neural network more stable. In this section, we will demonstrate that any neural network of the form (3) can be stabilized by uncertain stochastic perturbation. For practical purposes, we will only consider linear uncertain stochastic perturbations; that is, we will focus on perturbations of the form:
$g(z(k))\,\mathrm{d}W_k = \sum_{l=1}^{n} G_l z(k)\,\mathrm{d}W_l(k), \qquad h(z(k))\,\mathrm{d}C_k = \sum_{l=1}^{n} H_l z(k)\,\mathrm{d}C_l(k),$
i.e., $g(z) = (G_1 z, G_2 z, \ldots, G_n z)$ and $h(z) = (H_1 z, H_2 z, \ldots, H_n z)$, where $G_l, H_l$, $1 \le l \le n$, are all $m \times m$ matrices. In this case, the uncertain stochastic perturbed network (8) becomes
$\mathrm{d}z(k) = [-E z(k) + B f(z(k))]\,\mathrm{d}k + \sum_{l=1}^{n} G_l z(k)\,\mathrm{d}W_l(k) + \sum_{l=1}^{n} H_l z(k)\,\mathrm{d}C_l(k), \quad k \ge 0, \qquad z(0) = z_0 \in \mathbb{R}^m.$  (29)
Note that
$\mathrm{tr}[g^T(z) P g(z)] = \sum_{l=1}^{n} z^T G_l^T P G_l z, \qquad \mathrm{tr}[h^T(z) P h(z)] = \sum_{l=1}^{n} z^T H_l^T P H_l z,$
$z^T P g(z) g^T(z) P z = \mathrm{tr}[g^T(z) P z z^T P g(z)] = \sum_{l=1}^{n} (z^T G_l^T P z)(z^T P G_l z) = \sum_{l=1}^{n} (z^T P G_l z)^2,$
and
$z^T P h(z) h^T(z) P z = \mathrm{tr}[h^T(z) P z z^T P h(z)] = \sum_{l=1}^{n} (z^T H_l^T P z)(z^T P H_l z) = \sum_{l=1}^{n} (z^T P H_l z)^2.$
The proof can be obtained easily by Theorem 2, which we omit here.
Theorem 6.
Assume there exist a symmetric positive definite matrix $P = (p_{ij})_{m \times m}$ and some constants $\mu \in \mathbb{R}$ and $\rho_1, \rho_2, H > 0$ such that
$2 z^T P [-E z + B f(z)] + \sum_{l=1}^{n} z^T G_l^T P G_l z \le \mu \, z^T P z$
and
$\sum_{l=1}^{n} (z^T P G_l z)^2 \ge \rho_1 (z^T P z)^2,$
$\sqrt{\sum_{l=1}^{n} (z^T P H_l z)^2} \le \frac{\rho_2}{n} \, z^T P z$
for all $z \in \mathbb{R}^m$. Then, the solution of Equation (29) satisfies
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\rho_1 - H\rho_2 - \frac{\mu}{2}\right) \quad a.s.$
whenever $z_0 \ne 0$. In particular, if $\rho_1 - H\rho_2 > \mu/2$, then the uncertain stochastic neural network (29) is almost surely exponentially stable.
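The algebraic identities displayed before Theorem 6 can be verified numerically for randomly chosen matrices; a short sanity-check sketch, with hypothetical $G_l$ and $P$, is:

```python
import numpy as np

# Numeric check of the linear-perturbation identities:
#   z^T P g(z) g(z)^T P z = sum_l (z^T P G_l z)^2   for g(z) = (G_1 z, ..., G_n z),
#   tr[g^T P g]           = sum_l z^T G_l^T P G_l z.
rng = np.random.default_rng(2)
m, n = 3, 2
P = np.eye(m)
Gs = [rng.normal(size=(m, m)) for _ in range(n)]

z = rng.normal(size=m)
g = np.column_stack([G @ z for G in Gs])            # g(z) as an m x n matrix

lhs = z @ P @ g @ g.T @ P @ z
rhs = sum((z @ P @ G @ z) ** 2 for G in Gs)
print(np.allclose(lhs, rhs))                        # True

trace_lhs = np.trace(g.T @ P @ g)
trace_rhs = sum(z @ G.T @ P @ G @ z for G in Gs)
print(np.allclose(trace_lhs, trace_rhs))            # True
```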

3.3. Some Examples

Example 1.
Let
$G_l = \zeta_l I, \qquad H_l = \vartheta_l I, \quad 1 \le l \le n,$
where $\zeta_l, \vartheta_l$, $1 \le l \le n$, are all real numbers and $I$ is the identity matrix. Then, Equation (29) becomes
$\mathrm{d}z(k) = [-E z(k) + B f(z(k))]\,\mathrm{d}k + \sum_{l=1}^{n} \zeta_l z(k)\,\mathrm{d}W_l(k) + \sum_{l=1}^{n} \vartheta_l z(k)\,\mathrm{d}C_l(k).$  (32)
The parameters $\zeta_l, \vartheta_l$, $1 \le l \le n$, denote the strengths of the stochastic and uncertain perturbations, respectively. By selecting the identity matrix as the value of $P$, we observe that
$\sum_{l=1}^{n} z^T G_l^T P G_l z = \sum_{l=1}^{n} |G_l z|^2 = \sum_{l=1}^{n} \zeta_l^2 |z|^2$  (33)
and
$\sum_{l=1}^{n} (z^T P G_l z)^2 = \sum_{l=1}^{n} (\zeta_l z^T z)^2 = \sum_{l=1}^{n} \zeta_l^2 |z|^4.$  (34)
Similarly, we have
$\sqrt{\sum_{l=1}^{n} (z^T P H_l z)^2} = \sqrt{\sum_{l=1}^{n} \vartheta_l^2 |z|^4} = \sqrt{\sum_{l=1}^{n} \vartheta_l^2}\;|z|^2.$  (35)
Moreover, by (2), we have
$2 z^T P B f(z) \le 2 |z| \, |B f(z)| \le 2 \check{\varsigma} \|B\| \, |z|^2,$
where $\check{\varsigma} = \max_{1 \le l \le m} \varsigma_l$ and $\|B\| = \sup\{|B z| : z \in \mathbb{R}^m, |z| = 1\}$. Hence,
$2 z^T P [-E z + B f(z)] \le 2\left(\check{\varsigma}\|B\| - \hat{e}\right) |z|^2,$  (36)
where $\hat{e} = \min_{1 \le l \le m} e_l$. By combining Equations (33)–(36) and utilizing Theorem 6, we can conclude that the solution to Equation (32) satisfies
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\sum_{l=1}^{n} \zeta_l^2 - n H \sqrt{\sum_{l=1}^{n} \vartheta_l^2} - \left(\check{\varsigma}\|B\| - \hat{e}\right)\right), \quad a.s.$
whenever $z_0 \ne 0$. In particular, if
$\sum_{l=1}^{n} \zeta_l^2 - n H \sqrt{\sum_{l=1}^{n} \vartheta_l^2} > \check{\varsigma}\|B\| - \hat{e}$
holds, then the uncertain stochastic neural network (32) is almost surely exponentially stable.
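As a quick numeric illustration of Example 1's sufficient condition, one may evaluate both sides for assumed strengths $\zeta_l$, $\vartheta_l$ and network data; the value $H = 1$ is simply assumed here.

```python
import numpy as np

# Checking Example 1's condition for hypothetical strengths and network data;
# H (the Lipschitz bound from Lemma 1) is assumed to be 1 for illustration.
zeta = np.array([0.9, 0.7])
vartheta = np.array([0.05, 0.05])
n, H = len(zeta), 1.0

E = np.diag([2.0, 2.5, 3.0])
B = np.array([[0.0, 0.5, -0.3], [0.4, 0.0, 0.2], [-0.1, 0.3, 0.0]])
slope_max = 1.0                          # largest slope varsigma (tanh => 1)
normB = np.linalg.norm(B, 2)             # ||B|| = sup{ |Bz| : |z| = 1 }
e_min = np.diag(E).min()                 # e-hat

lhs = zeta @ zeta - n * H * np.sqrt(vartheta @ vartheta)
rhs = slope_max * normB - e_min
print(lhs, rhs, lhs > rhs)    # the condition of Example 1 holds for these values
```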
Remark 3.
If we set $\zeta_l = \vartheta_l = 0$ for $2 \le l \le n$, then Equation (32) simplifies even further to
$\mathrm{d}z(k) = [-E z(k) + B f(z(k))]\,\mathrm{d}k + \zeta_1 z(k)\,\mathrm{d}W_1(k) + \vartheta_1 z(k)\,\mathrm{d}C_1(k);$
here, we rely on just a scalar Wiener–Liu process as the source of the uncertain stochastic perturbation. This uncertain stochastic network is almost surely exponentially stable provided
$\zeta_1^2 - H\vartheta_1 > \check{\varsigma}\|B\| - \hat{e}.$
The neural network described by $\dot{u}_k = -E u_k + B f(u_k)$ can therefore be stabilized by incorporating a sufficiently strong uncertain stochastic perturbation in a particular way. In other words, we can draw the following corollary, which this simple example illustrates.
Corollary 3.
If (2) is satisfied, a Wiener–Liu process can stabilize any neural network of the given form
$\dot{u}_k = -E u_k + B f(u_k).$
Notably, it is also feasible to utilize a single scalar Wiener–Liu process for this purpose.
Example 2.
For each $l$, choose positive definite $m \times m$ matrices $U_l$ and $V_l$ such that
$z^T U_l z \ge \frac{\sqrt{3}}{2}\|U_l\|\,|z|^2, \qquad z^T V_l z \le \frac{1}{2}\|V_l\|\,|z|^2.$
There are numerous matrices that meet these criteria. Let $\zeta$ be a real number and define $G_l = \zeta U_l$; let $\vartheta$ be a real number and define $H_l = \vartheta V_l$. Then, Equation (29) becomes
$\mathrm{d}z(k) = [-E z(k) + B f(z(k))]\,\mathrm{d}k + \zeta \sum_{l=1}^{n} U_l z(k)\,\mathrm{d}W_l(k) + \vartheta \sum_{l=1}^{n} V_l z(k)\,\mathrm{d}C_l(k).$  (37)
Let $P$ be the identity matrix, noting that
$\sum_{l=1}^{n} z^T G_l^T P G_l z = \sum_{l=1}^{n} |\zeta U_l z|^2 \le \zeta^2 \sum_{l=1}^{n} \|U_l\|^2 |z|^2,$
$\sum_{l=1}^{n} (z^T P G_l z)^2 = \zeta^2 \sum_{l=1}^{n} (z^T U_l z)^2 \ge \frac{3\zeta^2}{4}\sum_{l=1}^{n} \|U_l\|^2 |z|^4,$
and
$\sqrt{\sum_{l=1}^{n} (z^T P H_l z)^2} = |\vartheta|\sqrt{\sum_{l=1}^{n} (z^T V_l z)^2} \le \frac{|\vartheta|}{2}\sqrt{\sum_{l=1}^{n} \|V_l\|^2}\,|z|^2.$
By merging (36) with the above and then utilizing Theorem 6, we can deduce that the solution to (37) satisfies
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le -\left(\frac{3\zeta^2}{4}\sum_{l=1}^{n} \|U_l\|^2 - \frac{1}{2} n H |\vartheta| \sqrt{\sum_{l=1}^{n} \|V_l\|^2} - \left(\check{\varsigma}\|B\| - \hat{e}\right)\right) \quad a.s.$
whenever $z_0 \ne 0$. So, if
$\frac{3\zeta^2}{4}\sum_{l=1}^{n} \|U_l\|^2 - \frac{1}{2} n H |\vartheta| \sqrt{\sum_{l=1}^{n} \|V_l\|^2} > \check{\varsigma}\|B\| - \hat{e},$
then the uncertain stochastic neural network (37) is almost surely exponentially stable.
Example 3.
We examine the scenario where the network's dimension, denoted as $m$, is an even number, specifically $m = 2q$ ($q \ge 1$). Suppose we set $n = 1$, meaning we select a scalar Wiener–Liu process $(W_1(k), C_1(k))$. Additionally, let $\zeta$ and $\vartheta$ be real numbers and $P$ the identity matrix again; then, we define
$G_1 = \begin{pmatrix} 0 & \zeta & & & \\ -\zeta & 0 & & & \\ & & \ddots & & \\ & & & 0 & \zeta \\ & & & -\zeta & 0 \end{pmatrix}, \qquad H_1 = \begin{pmatrix} 0 & \vartheta & & & \\ -\vartheta & 0 & & & \\ & & \ddots & & \\ & & & 0 & \vartheta \\ & & & -\vartheta & 0 \end{pmatrix}.$
Then, Equation (29) becomes
$\mathrm{d}z(k) = [-E z(k) + B f(z(k))]\,\mathrm{d}k + \zeta \begin{pmatrix} z_2(k) \\ -z_1(k) \\ \vdots \\ z_{2q}(k) \\ -z_{2q-1}(k) \end{pmatrix} \mathrm{d}W_1(k) + \vartheta \begin{pmatrix} z_2(k) \\ -z_1(k) \\ \vdots \\ z_{2q}(k) \\ -z_{2q-1}(k) \end{pmatrix} \mathrm{d}C_1(k).$  (39)
Note that
$z^T G_1^T P G_1 z = \zeta^2 |z|^2, \qquad (z^T P G_1 z)^2 = (z^T P H_1 z)^2 = 0,$  (40)
and
$2 z^T P [-E z + B f(z)] \le 2\left(\check{\varsigma}\|B\| - \hat{e}\right) |z|^2.$  (41)
By integrating (40) with (41), and subsequently utilizing Theorem 6 (here $\rho_1 = \rho_2 = 0$ and $\mu = 2(\check{\varsigma}\|B\| - \hat{e}) + \zeta^2$), we can derive that the solution to (39) satisfies
$\limsup_{k \to \infty} \frac{1}{k}\ln(|z(k, z_0)|) \le \frac{1}{2}\zeta^2 + \left(\check{\varsigma}\|B\| - \hat{e}\right), \quad a.s.$
whenever $z_0 \ne 0$. So, the uncertain stochastic neural network (39) is almost surely exponentially stable if $\zeta^2 < 2(\hat{e} - \check{\varsigma}\|B\|)$.
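The block structure of $G_1$ and $H_1$ can be generated and checked programmatically; the sketch below verifies the two facts in (40) for the skew blocks as reconstructed above (the same check applies to $H_1$).

```python
import numpy as np

# Building the block matrices G_1, H_1 of Example 3 (m = 2q) and verifying
# z^T G_1 z = 0 and z^T G_1^T G_1 z = zeta^2 |z|^2.
def block_skew(q, c):
    blk = np.array([[0.0, c], [-c, 0.0]])
    return np.kron(np.eye(q), blk)           # block-diagonal skew matrix

q, zeta, vartheta = 2, 0.9, 0.2
G1, H1 = block_skew(q, zeta), block_skew(q, vartheta)

z = np.random.default_rng(3).normal(size=2 * q)
print(np.isclose(z @ G1 @ z, 0.0))                       # True: skew symmetry
print(np.isclose(z @ G1.T @ G1 @ z, zeta**2 * z @ z))    # True: |G_1 z| = |zeta||z|
```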
Remark 4.
Different from the almost sure exponential stability of stochastic Hopfield neural networks based on the probability theory of additive measures [6,12], uncertain stochastic Hopfield neural networks are more complex in terms of the conditions and processes involved in establishing almost sure exponential stability, such as the conditions of Theorems 2–6. In addition, we use the Itô–Liu formula, the Liu inequality (Lemma 1), the Liu lemma (Lemma 2), etc., and these conclusions are all obtained using subadditive measures.
Remark 5.
The practical significance of almost sure exponential stability in uncertain stochastic Hopfield neural networks is that it ensures robust and reliable performance in real-world applications, such as image or speech recognition, financial analysis, or control systems. Almost sure exponential stability enables the network to reliably handle uncertainties and variations in the input data. It improves the neural network’s ability to generalize and make accurate predictions, even when faced with Liu noises and Wiener noises. This stability increases the neural network’s practical usefulness and applicability in real-world scenarios.

4. Conclusions

The main focus of this paper is the stability of Hopfield neural network dynamical systems with uncertain stochastic perturbations. The paper presents a theorem for judging the stability of such systems, along with two sufficient conditions for stability. To facilitate the discussion, the stability of neural network systems with linear uncertain stochastic perturbations is also studied. We note that uncertain stochastic neural networks can be divided into two types: one has uncertain stochastic neuron activation functions, such as the Boltzmann machine model, and the other has uncertain stochastic weighted connections. Therefore, when considering uncertain stochastic neural networks, both of these cases should be considered. The uncertain stochastic neural network model studied in this paper is of the second type, involving neural networks with uncertain stochastic weighted connections. Overall, this paper provides a valuable contribution to the field of neural networks by considering the effects of both stochastic and uncertain elements on network stability and by proposing methods for analyzing such systems. This work can also be extended to two-layer cellular neural networks, impulsive models, or reaction–diffusion models, as in Refs. [30,31,32]. There are currently no corresponding research results for neural networks with uncertain stochastic neuron activation functions, uncertain stochastic two-layer cellular neural networks, uncertain stochastic impulsive models, or uncertain stochastic reaction–diffusion models, and researchers can develop these areas in the near future.

Author Contributions

Conceptualization, Z.J. and C.L.; methodology, Z.J. and C.L.; software, Z.J. and C.L.; validation, Z.J. and C.L.; formal analysis, Z.J.; investigation, Z.J. and C.L.; writing—original draft preparation, Z.J.; writing—review and editing, Z.J. and C.L.; supervision, C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Natural Science Foundation of Ningxia (no. 2020AAC03242), the Major Projects of North Minzu University (no. ZDZX201805), the Governance and Social Management Research Center of Northwest Ethnic Regions, and the First-Class Disciplines Foundation of Ningxia (grant no. NXYLXK2017B09).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hopfield, J.; Tank, D. "Neural" computation of decisions in optimization problems. Biol. Cybern. 1985, 52, 141–152.
  2. Haykin, S. Neural Networks: A Comprehensive Foundation; Prentice Hall: Hoboken, NJ, USA, 1998.
  3. Joya, G.; Atencia, M.A.; Soval, F. Hopfield neural networks for optimization: Study of the different dynamics. Neurocomputing 2002, 43, 219–237.
  4. Young, S.S.; Scott, P.D.; Nasrabadi, N.M. Object recognition using multilayer Hopfield neural network. IEEE Trans. Image Process. 1997, 6, 357–372.
  5. Wang, Y.Y.; Xie, L.H.; De Souza, C.E. Robust control of a class of uncertain nonlinear systems. Syst. Control Lett. 1992, 19, 139–149.
  6. Liao, X.; Mao, X. Exponential stability and instability of stochastic neural networks. Stoch. Anal. Appl. 1996, 14, 165–185.
  7. He, Y.; Liu, G.P.; Rees, D.; Wu, M. Stability analysis for neural networks with time-varying interval delay. IEEE Trans. Neural Netw. 2007, 18, 1850–1854.
  8. Wang, Q.; Liu, X.Z. Exponential stability of impulsive cellular neural networks with time delays via Lyapunov functions. Appl. Math. Comput. 2007, 194, 186–198.
  9. Wang, Z.D.; Fang, J.A.; Liu, X.H. Global stability of stochastic high-order neural networks with discrete and distributed delays. Chaos Solitons Fractals 2008, 36, 388–396.
  10. Huang, C.X.; He, Y.G.; Wang, H.N. Mean square exponential stability of stochastic recurrent neural networks with time-varying delays. Comput. Math. Appl. 2008, 56, 1773–1778.
  11. Guo, Y.X. Mean square global asymptotic stability of stochastic recurrent neural networks with distributed delays. Appl. Math. Comput. 2009, 215, 791–795.
  12. Liu, L.; Zhu, Q.X. Almost sure exponential stability of numerical solutions to stochastic delay Hopfield neural networks. Appl. Math. Comput. 2015, 266, 698–712.
  13. Zhao, Y.; Zhu, Q.X. Stabilization of stochastic highly nonlinear delay systems with neutral term. IEEE Trans. Autom. Control 2023, 68, 2544–2551.
  14. Huang, H.; Cao, J. Exponential stability analysis of uncertain stochastic neural networks with multiple delays. Nonlinear Anal. Real World Appl. 2007, 8, 646–653.
  15. Wang, Z.; Lauria, S.; Fang, J.; Liu, X. Exponential stability of uncertain stochastic neural networks with mixed time-delays. Chaos Solitons Fractals 2007, 32, 62–72.
  16. Chen, W.H.; Lu, X.M. Mean square exponential stability of uncertain stochastic delayed neural networks. Phys. Lett. A 2008, 372, 1061–1069.
  17. Ali, M.S. Stochastic stability of uncertain recurrent neural networks with Markovian jumping parameters. Acta Math. Sci. 2015, 35, 1122–1136.
  18. Itô, K. On stochastic differential equations. Mem. Am. Math. Soc. 1951, 4, 1–51.
  19. Yu, J.J.; Zhang, K.J.; Fei, S.M. Further results on mean square exponential stability of uncertain stochastic delayed neural networks. Commun. Nonlinear Sci. Numer. Simul. 2009, 14, 1582–1589.
  20. Deng, F.Q.; Luo, Q.; Mao, X.R. Stochastic stabilization of hybrid differential equations. Automatica 2012, 48, 2321–2328.
  21. Guo, Q.; Mao, X.R.; Yue, R.X. Almost sure exponential stability of stochastic differential delay equations. SIAM J. Control Optim. 2016, 54, 1919–1933.
  22. Zhu, Q.X. Stabilization of stochastic nonlinear delay systems with exogenous disturbances and the event-triggered feedback control. IEEE Trans. Autom. Control 2019, 64, 3764–3771.
  23. Liu, B. Fuzzy process, hybrid process and uncertain process. J. Uncertain Syst. 2008, 2, 3–16.
  24. Liu, B. Some research problems in uncertainty theory. J. Uncertain Syst. 2009, 3, 3–10.
  25. Chen, X.; Liu, B. Existence and uniqueness theorem for uncertain differential equations. Fuzzy Optim. Decis. Mak. 2010, 9, 69–81.
  26. Yao, K.; Gao, J.; Gao, Y. Some stability theorems of uncertain differential equation. Fuzzy Optim. Decis. Mak. 2013, 12, 3–13.
  27. Liu, Y. Uncertain random variables: A mixture of uncertainty and randomness. Soft Comput. 2013, 17, 625–634.
  28. Fei, W. Optimal control of uncertain stochastic systems with Markovian switching and its applications to portfolio decisions. Cybern. Syst. 2014, 45, 69–88.
  29. Fei, W. On existence and uniqueness of solutions to uncertain backward stochastic differential equations. Appl. Math. 2014, 29, 53–66.
  30. Arena, P.; Baglio, S.; Fortuna, L.; Manganaro, G. Self-organization in a two-layer CNN. IEEE Trans. Circuits Syst. I 1998, 45, 157–162.
  31. Zhang, T.W.; Xiong, L.L. Periodic motion for impulsive fractional functional differential equations with piecewise Caputo derivative. Appl. Math. Lett. 2020, 101, 106072.
  32. Huang, H.; Zhao, K.; Liu, X. On solvability of BVP for a coupled Hadamard fractional systems involving fractional derivative impulses. AIMS Math. 2022, 7, 19221–19236.
Table 1. The explanation of symbols related to this paper.
Number     Symbol            Explanation
1          $u_i(k)$          voltage on the input of the $i$th neuron
2          $F_i$             input capacitance
3          $T_{ij}$          connection matrix element
4          $f_i(u)$          nondecreasing transfer function
5          $\varsigma_i$     slope of $f_i(u)$ at $u = 0$
6          $\mathcal{M}$     uncertain measure
7          $k$               time
8          $C_k$             Liu process
9          $\mathrm{Ch}$     chance measure
10         $\mathbb{P}$      probability measure
11         $W_k$             Wiener process
12         $Z_k$             uncertain process or uncertain stochastic process
13         $\sup$            supremum