Article

Robust Stability of Fractional Order Memristive BAM Neural Networks with Mixed and Additive Time Varying Delays

1 School of Mathematics and Statistics, Shandong Normal University, Jinan 250014, China
2 Department of Mathematics, Thiruvalluvar University, Vellore 632115, India
3 Department of Computer Science, College of Arts and Science, Prince Sattam Bin Abdulaziz University, Wadi Aldawaser 11991, Saudi Arabia
* Author to whom correspondence should be addressed.
Fractal Fract. 2022, 6(2), 62; https://doi.org/10.3390/fractalfract6020062
Submission received: 5 November 2021 / Revised: 5 January 2022 / Accepted: 17 January 2022 / Published: 25 January 2022
(This article belongs to the Special Issue Frontiers in Fractional-Order Neural Networks)

Abstract

This paper is concerned with the problem of the robust stability of fractional-order memristive bidirectional associative memory (BAM) neural networks. Based on Lyapunov theory, fractional-order differential inequalities and linear matrix inequalities (LMIs) are applied to establish robust asymptotic stability criteria. Finally, numerical examples are presented.

1. Introduction

Fractional calculus has a history of more than three hundred years. It can be regarded as the generalization of traditional calculus from integer orders to arbitrary orders [1], and it concerns integrals and derivatives of orders that may be real or complex. Very recently, studies of fractional-order neural networks (FNNs) with mixed and additive time-varying delays have made this an active research topic, with applications in various fields such as viscoelastic systems, diffusion waves, and quantitative finance [2,3,4,5,6,7,8,9].
The BAM neural network is a two-layer neural network that can realize not only auto-associative memory but also hetero-associative memory [10,11,12,13,14,15,16]. It has been widely applied in many fields, such as image processing, pattern recognition, automatic control, and optimization problems. With the development and application of memristors, BAM memristive neural networks have become an active area of research [17,18,19,20]. Memristive BAM neural networks (MBAMNNs) combine memristors with BAMNNs. By adjusting the connection weight matrices, they can simulate the human brain better than traditional BAMNNs.
On the other hand, signals transmitted from one point to another may pass through two segments of a network, and because network transmission conditions vary, the corresponding time delays have different features; this gives rise to another kind of delay, called additive (successive) delay. Recently, a model for neural networks with a pair of additive time-varying delays was considered in [21]. By means of some advanced techniques, a new asymptotic stability criterion for neural networks with two successive delay components was derived in [22]. Inspired by [23,24], we consider systems with two additive delay components. As is well known, in the circuit realization of neural networks time delay is inevitable, and its presence often destroys system stability or deteriorates performance [25,26,27,28,29].
In practice, the stability of a well-designed neural network system may be destroyed by unavoidable uncertainty arising from modeling errors, external disturbances, and parameter fluctuations during implementation. Many results on the dynamical analysis of fractional-order neural networks (FONNs) have therefore concentrated on robust stability [30,31,32,33,34,35]. One intuitive way to manage this issue is to estimate the disturbance, or its effect, from measurable variables, and then take a control action based on the disturbance estimate to compensate for its effect. This basic idea can naturally be extended to handle uncertainties or unmodelled dynamics, which can be treated as part of the disturbance. When analyzing the stability of neural networks, not only delays but also parameter uncertainties should be considered because of disturbances in the environment [36,37,38,39].
Motivated by the above discussions, we investigate the robust stability of fractional-order memristive BAM neural networks. The main contributions are as follows:
(1) The proposed memristive BAM neural network model contains mixed and additive time-varying delays.
(2) The main results are proved with some effective analytical techniques.
(3) A new sufficient criterion is derived in terms of LMIs, which can be solved effectively with the MATLAB LMI toolbox.
(4) Finally, we provide a numerical example.

2. Preliminaries

Definition 1
([40]). The Caputo fractional derivative of order γ of a function h(t) is given by
$$ {}_0^C D_t^{\gamma} h(t) = \frac{1}{\Gamma(m-\gamma)} \int_0^t (t-s)^{m-\gamma-1} h^{(m)}(s)\,ds, $$
in which m − 1 < γ < m.
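For orders 0 < γ < 1, the Caputo derivative of Definition 1 can be approximated numerically. The following is a minimal sketch using the standard L1 finite-difference discretization; the test function and step count are illustrative choices, not part of the paper.

```python
import math

def caputo_l1(h, t, gamma, n=1000):
    """L1 approximation of the Caputo derivative of order gamma in (0, 1)
    of a function h at time t, using n uniform steps on [0, t]."""
    dt = t / n
    total = 0.0
    for k in range(n):
        # weight b_k = (k+1)^(1-gamma) - k^(1-gamma)
        b_k = (k + 1) ** (1 - gamma) - k ** (1 - gamma)
        # backward difference h(t_{n-k}) - h(t_{n-k-1})
        total += b_k * (h((n - k) * dt) - h((n - k - 1) * dt))
    return total * dt ** (-gamma) / math.gamma(2 - gamma)

# For h(t) = t, the exact Caputo derivative of order gamma is
# t^(1-gamma) / Gamma(2-gamma); the L1 scheme reproduces it here.
gamma, t = 0.5, 1.0
approx = caputo_l1(lambda s: s, t, gamma)
exact = t ** (1 - gamma) / math.gamma(2 - gamma)
```

Note that the Caputo derivative of a constant is zero, which is what makes it convenient for initial-value problems such as the networks considered below.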
Lemma 1
([41]). If U > 0 is a constant and P, Q are real matrices of compatible dimensions, then
$$ P^T Q + Q^T P \le U P^T P + U^{-1} Q^T Q. $$
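Lemma 1 can be checked numerically: the gap U P^T P + U^{-1} Q^T Q − (P^T Q + Q^T P) is always positive semidefinite, since it equals (√U P − U^{-1/2} Q)^T(√U P − U^{-1/2} Q). A minimal pure-Python sketch on random 2 × 2 matrices (the helper names are illustrative):

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(A, c):
    return [[c * x for x in row] for row in A]

def is_psd_2x2(M):
    # a symmetric 2x2 matrix is PSD iff its (1,1) entry and its
    # determinant are both nonnegative
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return M[0][0] >= -1e-9 and det >= -1e-9

random.seed(0)
U = 1.7  # any positive constant works
P = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
Q = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]

lhs = add(matmul(transpose(P), Q), matmul(transpose(Q), P))   # P^T Q + Q^T P
rhs = add(scale(matmul(transpose(P), P), U),
          scale(matmul(transpose(Q), Q), 1.0 / U))            # U P^T P + U^{-1} Q^T Q
gap = sub(rhs, lhs)
assert is_psd_2x2(gap)
```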
Lemma 2
([41]). Given constant matrices P, Q, R, where P = P^T and Q = Q^T, then
$$ \begin{pmatrix} P & R \\ R^T & -Q \end{pmatrix} < 0 $$
iff Q > 0 and P + R Q^{-1} R^T < 0.
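In the scalar case, Lemma 2 reduces to: [[p, r], [r, −q]] < 0 iff q > 0 and p + r²/q < 0. The sketch below verifies this equivalence on random samples, testing negative definiteness of a symmetric 2 × 2 matrix through its leading principal minors:

```python
import random

def neg_def_2x2(M):
    # a symmetric 2x2 matrix is negative definite iff its (1,1) entry
    # is negative and its determinant is positive
    return M[0][0] < 0 and M[0][0] * M[1][1] - M[0][1] * M[1][0] > 0

random.seed(1)
for _ in range(1000):
    p = random.uniform(-3, 3)
    r = random.uniform(-3, 3)
    q = random.uniform(0.1, 3)   # Q > 0, as the lemma requires
    block = [[p, r], [r, -q]]
    schur = p + r * r / q        # P + R Q^{-1} R^T in the scalar case
    assert neg_def_2x2(block) == (schur < 0)
```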
Lemma 3
([42]). Let T_i ∈ R^{n×n} (i = 0, 1, …, p) be symmetric matrices. The condition ξ^T T_0 ξ > 0 for all ξ ≠ 0 such that ξ^T T_i ξ ≥ 0 (i = 1, 2, …, p) holds if there exist τ_i ≥ 0 (i = 1, 2, …, p) such that
$$ T_0 - \sum_{i=1}^{p} \tau_i T_i > 0. $$
Lemma 4
([43]). Let μ(·) : [a, b] → R^n be a vector-valued function with scalars a < b, and let S be a symmetric positive definite matrix. Then
$$ \left( \int_a^b \mu(s)\,ds \right)^{T} S \left( \int_a^b \mu(s)\,ds \right) \le (b-a) \int_a^b \mu^T(s)\, S\, \mu(s)\,ds. $$
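Lemma 4 is the Jensen integral inequality; in the scalar case with S = 1 it reads (∫ μ)² ≤ (b − a) ∫ μ². A quick numerical sketch with midpoint Riemann sums (the integrand is an illustrative choice):

```python
import math

def riemann(f, a, b, n=2000):
    # midpoint Riemann sum of f over [a, b]
    dt = (b - a) / n
    return sum(f(a + (k + 0.5) * dt) for k in range(n)) * dt

a, b = 0.0, 2.0
mu = math.sin                     # scalar mu, S = 1
lhs = riemann(mu, a, b) ** 2      # (integral of mu)^2
rhs = (b - a) * riemann(lambda s: mu(s) ** 2, a, b)
```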
Hypothesis 1.
The activation functions h_j(·) and g_i(·) satisfy, for constants p_j and q_i,
$$ |h_j(x) - h_j(y)| \le p_j |x - y|, \quad j = 1, 2, \ldots, n, \qquad |g_i(x) - g_i(y)| \le q_i |x - y|, \quad i = 1, 2, \ldots, m. $$
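The tanh activation used in the example of Section 4 satisfies Hypothesis 1 with Lipschitz constants p_j = q_i = 1, since |tanh′| ≤ 1. A quick randomized check:

```python
import math
import random

random.seed(2)
for _ in range(1000):
    x = random.uniform(-5.0, 5.0)
    y = random.uniform(-5.0, 5.0)
    # |tanh(x) - tanh(y)| <= 1 * |x - y|
    assert abs(math.tanh(x) - math.tanh(y)) <= abs(x - y) + 1e-12
```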

3. Main Results

Consider fractional-order memristive BAM neural networks with additive and mixed time-varying delays,
$$ \begin{aligned} D^{\alpha} u_i(t) &= -r_i u_i(t) + \sum_{j=1}^{n} d_{ji}(u_i(t))\, h_j(v_j(t)) + \sum_{j=1}^{n} b_{ji}(u_i(t))\, h_j(v_j(t-\sigma_1(t)-\sigma_2(t))) \\ &\quad + \sum_{j=1}^{n} a_{ji}(u_i(t)) \int_{t-\tau(t)}^{t} h_j(v_j(s))\,ds + I_i(t), \quad i = 1, \ldots, m, \\ D^{\alpha} v_j(t) &= -m_j v_j(t) + \sum_{i=1}^{m} n_{ij}(v_j(t))\, g_i(u_i(t)) + \sum_{i=1}^{m} c_{ij}(v_j(t))\, g_i(u_i(t-\eta_1(t)-\eta_2(t))) \\ &\quad + \sum_{i=1}^{m} e_{ij}(v_j(t)) \int_{t-\upsilon(t)}^{t} g_i(u_i(s))\,ds + J_j(t), \quad j = 1, \ldots, n, \end{aligned} $$
where u_i(t) and v_j(t) denote the state variables related to the ith and jth neurons; (d_{ji}(u_i(t)))_{n×m}, (n_{ij}(v_j(t)))_{m×n}, (b_{ji}(u_i(t)))_{n×m}, (c_{ij}(v_j(t)))_{m×n}, (a_{ji}(u_i(t)))_{n×m} and (e_{ij}(v_j(t)))_{m×n} are the memristive connection weight matrices; r_i > 0 and m_j > 0 form positive diagonal matrices; I_i(t) and J_j(t) are the external inputs. The delays σ_1(t), σ_2(t), η_1(t) and η_2(t) are assumed to satisfy
$$ 0 \le \sigma_1(t) \le \sigma_1, \quad \dot{\sigma}_1(t) \le \delta_1, \qquad 0 \le \sigma_2(t) \le \sigma_2, \quad \dot{\sigma}_2(t) \le \delta_2, $$
$$ 0 \le \eta_1(t) \le \eta_1, \quad \dot{\eta}_1(t) \le \alpha_1, \qquad 0 \le \eta_2(t) \le \eta_2, \quad \dot{\eta}_2(t) \le \alpha_2. $$
By using differential inclusion theory, the above system becomes
$$ \begin{aligned} D^{\alpha} u_i(t) &\in -r_i u_i(t) + \sum_{j=1}^{n} co[d_{ji}(u_i(t))]\, h_j(v_j(t)) + \sum_{j=1}^{n} co[b_{ji}(u_i(t))]\, h_j(v_j(t-\sigma_1(t)-\sigma_2(t))) \\ &\quad + \sum_{j=1}^{n} co[a_{ji}(u_i(t))] \int_{t-\tau(t)}^{t} h_j(v_j(s))\,ds + I_i(t), \\ D^{\alpha} v_j(t) &\in -m_j v_j(t) + \sum_{i=1}^{m} co[n_{ij}(v_j(t))]\, g_i(u_i(t)) + \sum_{i=1}^{m} co[c_{ij}(v_j(t))]\, g_i(u_i(t-\eta_1(t)-\eta_2(t))) \\ &\quad + \sum_{i=1}^{m} co[e_{ij}(v_j(t))] \int_{t-\upsilon(t)}^{t} g_i(u_i(s))\,ds + J_j(t). \end{aligned} $$
For the sake of convenience, let
$$ \begin{aligned} co[d_{ji}(u_i(t))]\, h_j(v_j(t)) &= h_d(u_i(t)), & co[b_{ji}(u_i(t))]\, h_j(v_j(t-\sigma_1(t)-\sigma_2(t))) &= h_b(u_i(t)), \\ co[a_{ji}(u_i(t))] \int_{t-\tau(t)}^{t} h_j(v_j(s))\,ds &= h_a(u_i(t)), & co[n_{ij}(v_j(t))]\, g_i(u_i(t)) &= g_n(v_j(t)), \\ co[c_{ij}(v_j(t))]\, g_i(u_i(t-\eta_1(t)-\eta_2(t))) &= g_c(v_j(t)), & co[e_{ij}(v_j(t))] \int_{t-\upsilon(t)}^{t} g_i(u_i(s))\,ds &= g_e(v_j(t)). \end{aligned} $$
We now shift the equilibrium point ϱ* = (ϱ_1*, ϱ_2*, …, ϱ_n*)^T, χ* = (χ_1*, χ_2*, …, χ_m*)^T of neural network (2) to the origin. The transformation ϱ_i(t) = u_i(t) − u_i*, χ_j(t) = v_j(t) − v_j* puts neural network (2) into
$$ \begin{aligned} D^{\alpha} \varrho_i(t) &= -r_i \varrho_i(t) + \sum_{j=1}^{n} h_d(\varrho_i(t)) + \sum_{j=1}^{n} h_b(\varrho_i(t)) + \sum_{j=1}^{n} h_a(\varrho_i(t)), \quad i = 1, 2, \ldots, m, \\ D^{\alpha} \chi_j(t) &= -m_j \chi_j(t) + \sum_{i=1}^{m} g_n(\chi_j(t)) + \sum_{i=1}^{m} g_c(\chi_j(t)) + \sum_{i=1}^{m} g_e(\chi_j(t)), \quad j = 1, 2, \ldots, n. \end{aligned} $$
The compact form is
$$ D^{\alpha} \varrho(t) = -R \varrho(t) + H_d(\varrho(t)) + H_b(\varrho(t)) + H_a(\varrho(t)), \qquad D^{\alpha} \chi(t) = -M \chi(t) + G_n(\chi(t)) + G_c(\chi(t)) + G_e(\chi(t)). $$
The initial condition of system (4) is taken as
$$ \varrho(s) = \Psi(s) \in \mathbb{R}^n, \qquad \chi(s) = \phi(s) \in \mathbb{R}^m. $$
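As a toy illustration of fractional-order dynamics (not the paper's method), the scalar relaxation equation D^α x(t) = −r x(t), a one-neuron analogue of (4) with the couplings dropped, can be integrated with a Grünwald–Letnikov discretization; the state decays toward the origin, mirroring the asymptotic stability established below. The order, gain and step size are illustrative assumptions.

```python
alpha, r = 0.9, 1.0      # illustrative fractional order and decay gain
h, steps = 0.01, 500     # step size and number of steps

# Grunwald-Letnikov binomial weights: w_0 = 1, w_k = w_{k-1} * (1 - (alpha+1)/k)
w = [1.0]
for k in range(1, steps + 1):
    w.append(w[-1] * (1.0 - (alpha + 1.0) / k))

x = [1.0]  # initial state x(0)
for n in range(1, steps + 1):
    # Caputo-type GL step for D^alpha x = -r x (explicit right-hand side):
    # h^{-alpha} * sum_k w_k (x_{n-k} - x_0) = -r * x_{n-1}
    memory = sum(w[k] * (x[n - k] - x[0]) for k in range(1, n + 1))
    x.append(x[0] - memory - (h ** alpha) * r * x[n - 1])
```

The trajectory approximates the Mittag-Leffler decay E_α(−r t^α), which is slower than exponential but still tends to zero.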
Theorem 1.
The equilibrium point of system (4) is globally robustly asymptotically stable if there exist positive definite matrices P_1, Q_1 such that
$$ -2 P_1 R + P_1 U_1 P_1 + \mu D^T U_1^{-1} D \mu + P_1 U_2 P_1 + \mu B^T U_2^{-1} B \mu + P_1 U_3 P_1 + \mu A^T U_3^{-1} A \mu < 0, $$
$$ -2 Q_1 M + Q_1 U_4 Q_1 + \beta N^T U_4^{-1} N \beta + Q_1 U_5 Q_1 + \beta C^T U_5^{-1} C \beta + Q_1 U_6 Q_1 + \beta E^T U_6^{-1} E \beta < 0. $$
Proof. Define a Lyapunov functional
$$ V(t) = \sum_{k=1}^{2} V_k(t), $$
where,
$$ V_1(t) = D^{-(1-\alpha)} [\varrho^T(t) P_1 \varrho(t)], \qquad V_2(t) = D^{-(1-\alpha)} [\chi^T(t) Q_1 \chi(t)]. $$
Taking the time derivative of V(t) along the trajectories of (4) and using Lemma 1, we obtain
$$ \dot V_1(t) = 2 \varrho^T(t) P_1 \{ -R \varrho(t) + H_d(\varrho(t)) + H_b(\varrho(t)) + H_a(\varrho(t)) \}, \qquad \dot V_2(t) = 2 \chi^T(t) Q_1 \{ -M \chi(t) + G_n(\chi(t)) + G_c(\chi(t)) + G_e(\chi(t)) \}. $$
$$ \begin{aligned} 2 \varrho^T(t) P_1 H_d(\varrho(t)) &\le \varrho^T(t) P_1 U_1 P_1 \varrho(t) + \varrho^T(t) \mu D^T U_1^{-1} D \mu\, \varrho(t), \\ 2 \varrho^T(t) P_1 H_b(\varrho(t)) &\le \varrho^T(t) P_1 U_2 P_1 \varrho(t) + \varrho^T(t) \mu B^T U_2^{-1} B \mu\, \varrho(t), \\ 2 \varrho^T(t) P_1 H_a(\varrho(t)) &\le \varrho^T(t) P_1 U_3 P_1 \varrho(t) + \varrho^T(t) \mu A^T U_3^{-1} A \mu\, \varrho(t), \\ 2 \chi^T(t) Q_1 G_n(\chi(t)) &\le \chi^T(t) Q_1 U_4 Q_1 \chi(t) + \chi^T(t) \beta N^T U_4^{-1} N \beta\, \chi(t), \\ 2 \chi^T(t) Q_1 G_c(\chi(t)) &\le \chi^T(t) Q_1 U_5 Q_1 \chi(t) + \chi^T(t) \beta C^T U_5^{-1} C \beta\, \chi(t), \\ 2 \chi^T(t) Q_1 G_e(\chi(t)) &\le \chi^T(t) Q_1 U_6 Q_1 \chi(t) + \chi^T(t) \beta E^T U_6^{-1} E \beta\, \chi(t). \end{aligned} $$
Substituting, we obtain
$$ \begin{aligned} \dot V(t) \le{} & \varrho^T(t) \{ -2 P_1 R + P_1 U_1 P_1 + \mu D^T U_1^{-1} D \mu + P_1 U_2 P_1 + \mu B^T U_2^{-1} B \mu + P_1 U_3 P_1 + \mu A^T U_3^{-1} A \mu \} \varrho(t) \\ & + \chi^T(t) \{ -2 Q_1 M + Q_1 U_4 Q_1 + \beta N^T U_4^{-1} N \beta + Q_1 U_5 Q_1 + \beta C^T U_5^{-1} C \beta + Q_1 U_6 Q_1 + \beta E^T U_6^{-1} E \beta \} \chi(t). \end{aligned} $$
By the conditions of Theorem 1, V̇(t) < 0. As a result, the equilibrium point of (4) is globally robustly asymptotically stable. The proof is complete.  □
Remark 1. Take system (1) as the drive system; the corresponding response system is
$$ \begin{aligned} D^{\alpha} \varsigma_i(t) &= -r_i \varsigma_i(t) + \sum_{j=1}^{n} d_{ji}(\varsigma_i(t))\, h_j(s_j(t)) + \sum_{j=1}^{n} b_{ji}(\varsigma_i(t))\, h_j(s_j(t-\sigma_1(t)-\sigma_2(t))) \\ &\quad + \sum_{j=1}^{n} a_{ji}(\varsigma_i(t)) \int_{t-\tau(t)}^{t} h_j(s_j(s))\,ds + I_i(t), \\ D^{\alpha} s_j(t) &= -m_j s_j(t) + \sum_{i=1}^{m} n_{ij}(s_j(t))\, g_i(\varsigma_i(t)) + \sum_{i=1}^{m} c_{ij}(s_j(t))\, g_i(\varsigma_i(t-\eta_1(t)-\eta_2(t))) \\ &\quad + \sum_{i=1}^{m} e_{ij}(s_j(t)) \int_{t-\upsilon(t)}^{t} g_i(\varsigma_i(s))\,ds + J_j(t). \end{aligned} $$
The memristive connection weights d_{ji}(u_i(t)), b_{ji}(u_i(t)), a_{ji}(u_i(t)), n_{ij}(v_j(t)), c_{ij}(v_j(t)), e_{ij}(v_j(t)), d_{ji}(ς_i(t)), b_{ji}(ς_i(t)), a_{ji}(ς_i(t)), n_{ij}(s_j(t)), c_{ij}(s_j(t)) and e_{ij}(s_j(t)) change with time. We then let
$$ d_{ji}(u_i(t)) = \begin{cases} \hat d_{ji}, & |u_i(t)| \le T_i, \\ \check d_{ji}, & |u_i(t)| > T_i, \end{cases} \qquad a_{ji}(u_i(t)) = \begin{cases} \hat a_{ji}, & |u_i(t)| \le T_i, \\ \check a_{ji}, & |u_i(t)| > T_i, \end{cases} \qquad b_{ji}(u_i(t)) = \begin{cases} \hat b_{ji}, & |u_i(t)| \le T_i, \\ \check b_{ji}, & |u_i(t)| > T_i, \end{cases} $$
and likewise d_{ji}(ς_i(t)), a_{ji}(ς_i(t)), b_{ji}(ς_i(t)) with ς_i(t) in place of u_i(t); similarly,
$$ n_{ij}(v_j(t)) = \begin{cases} \hat n_{ij}, & |v_j(t)| \le W_j, \\ \check n_{ij}, & |v_j(t)| > W_j, \end{cases} \qquad c_{ij}(v_j(t)) = \begin{cases} \hat c_{ij}, & |v_j(t)| \le W_j, \\ \check c_{ij}, & |v_j(t)| > W_j, \end{cases} \qquad e_{ij}(v_j(t)) = \begin{cases} \hat e_{ij}, & |v_j(t)| \le W_j, \\ \check e_{ij}, & |v_j(t)| > W_j, \end{cases} $$
and likewise n_{ij}(s_j(t)), c_{ij}(s_j(t)), e_{ij}(s_j(t)) with s_j(t) in place of v_j(t).
Using the differential inclusion theory, the above systems can be rewritten as
$$ \begin{aligned} D^{\alpha} u_i(t) &\in -r_i u_i(t) + \sum_{j=1}^{n} co[d_{ji}(u_i(t))]\, h_j(v_j(t)) + \sum_{j=1}^{n} co[b_{ji}(u_i(t))]\, h_j(v_j(t-\sigma_1(t)-\sigma_2(t))) \\ &\quad + \sum_{j=1}^{n} co[a_{ji}(u_i(t))] \int_{t-\tau(t)}^{t} h_j(v_j(s))\,ds + I_i(t), \\ D^{\alpha} v_j(t) &\in -m_j v_j(t) + \sum_{i=1}^{m} co[n_{ij}(v_j(t))]\, g_i(u_i(t)) + \sum_{i=1}^{m} co[c_{ij}(v_j(t))]\, g_i(u_i(t-\eta_1(t)-\eta_2(t))) \\ &\quad + \sum_{i=1}^{m} co[e_{ij}(v_j(t))] \int_{t-\upsilon(t)}^{t} g_i(u_i(s))\,ds + J_j(t), \end{aligned} $$
and
$$ \begin{aligned} D^{\alpha} \varsigma_i(t) &\in -r_i \varsigma_i(t) + \sum_{j=1}^{n} co[d_{ji}(\varsigma_i(t))]\, h_j(s_j(t)) + \sum_{j=1}^{n} co[b_{ji}(\varsigma_i(t))]\, h_j(s_j(t-\sigma_1(t)-\sigma_2(t))) \\ &\quad + \sum_{j=1}^{n} co[a_{ji}(\varsigma_i(t))] \int_{t-\tau(t)}^{t} h_j(s_j(s))\,ds + I_i(t) + U_i(t), \\ D^{\alpha} s_j(t) &\in -m_j s_j(t) + \sum_{i=1}^{m} co[n_{ij}(s_j(t))]\, g_i(\varsigma_i(t)) + \sum_{i=1}^{m} co[c_{ij}(s_j(t))]\, g_i(\varsigma_i(t-\eta_1(t)-\eta_2(t))) \\ &\quad + \sum_{i=1}^{m} co[e_{ij}(s_j(t))] \int_{t-\upsilon(t)}^{t} g_i(\varsigma_i(s))\,ds + J_j(t) + V_j(t), \end{aligned} $$
with
$$ co[d_{ji}(u_i(t))] = \begin{cases} \hat d_{ji}, & |u_i(t)| < T_i, \\ co\{\hat d_{ji}, \check d_{ji}\}, & |u_i(t)| = T_i, \\ \check d_{ji}, & |u_i(t)| > T_i, \end{cases} $$
with the analogous definitions for co[a_{ji}(u_i(t))] and co[b_{ji}(u_i(t))] and, with ς_i(t) in place of u_i(t), for co[d_{ji}(ς_i(t))], co[a_{ji}(ς_i(t))] and co[b_{ji}(ς_i(t))]; similarly,
$$ co[n_{ij}(v_j(t))] = \begin{cases} \hat n_{ij}, & |v_j(t)| < W_j, \\ co\{\hat n_{ij}, \check n_{ij}\}, & |v_j(t)| = W_j, \\ \check n_{ij}, & |v_j(t)| > W_j, \end{cases} $$
with the analogous definitions for co[c_{ij}(v_j(t))] and co[e_{ij}(v_j(t))] and, with s_j(t) in place of v_j(t), for co[n_{ij}(s_j(t))], co[c_{ij}(s_j(t))] and co[e_{ij}(s_j(t))].
For the sake of convenience, let
$$ \begin{aligned} co[d_{ji}(u_i(t))]\, h_j(v_j(t)) &= h_d(u_i(t)), & co[b_{ji}(u_i(t))]\, h_j(v_j(t-\sigma_1(t)-\sigma_2(t))) &= h_b(u_i(t)), \\ co[a_{ji}(u_i(t))] \int_{t-\tau(t)}^{t} h_j(v_j(s))\,ds &= h_a(u_i(t)), & co[d_{ji}(\varsigma_i(t))]\, h_j(s_j(t)) &= h_d(\varsigma_i(t)), \\ co[b_{ji}(\varsigma_i(t))]\, h_j(s_j(t-\sigma_1(t)-\sigma_2(t))) &= h_b(\varsigma_i(t)), & co[a_{ji}(\varsigma_i(t))] \int_{t-\tau(t)}^{t} h_j(s_j(s))\,ds &= h_a(\varsigma_i(t)), \\ co[n_{ij}(v_j(t))]\, g_i(u_i(t)) &= g_n(v_j(t)), & co[c_{ij}(v_j(t))]\, g_i(u_i(t-\eta_1(t)-\eta_2(t))) &= g_c(v_j(t)), \\ co[e_{ij}(v_j(t))] \int_{t-\upsilon(t)}^{t} g_i(u_i(s))\,ds &= g_e(v_j(t)), & co[n_{ij}(s_j(t))]\, g_i(\varsigma_i(t)) &= g_n(s_j(t)), \\ co[c_{ij}(s_j(t))]\, g_i(\varsigma_i(t-\eta_1(t)-\eta_2(t))) &= g_c(s_j(t)), & co[e_{ij}(s_j(t))] \int_{t-\upsilon(t)}^{t} g_i(\varsigma_i(s))\,ds &= g_e(s_j(t)). \end{aligned} $$
Before proceeding to our main results, we set ς_i(t) − u_i(t) = ϖ_i(t) and s_j(t) − v_j(t) = ψ_j(t); then, the synchronization error system is
$$ \begin{aligned} D^{\alpha} \varpi_i(t) &= -r_i \varpi_i(t) + \sum_{j=1}^{n} [h_d(\varsigma_i(t)) - h_d(u_i(t))] + \sum_{j=1}^{n} [h_b(\varsigma_i(t)) - h_b(u_i(t))] \\ &\quad + \sum_{j=1}^{n} [h_a(\varsigma_i(t)) - h_a(u_i(t))] + U(t), \\ D^{\alpha} \psi_j(t) &= -m_j \psi_j(t) + \sum_{i=1}^{m} [g_n(s_j(t)) - g_n(v_j(t))] + \sum_{i=1}^{m} [g_c(s_j(t)) - g_c(v_j(t))] \\ &\quad + \sum_{i=1}^{m} [g_e(s_j(t)) - g_e(v_j(t))] + V(t). \end{aligned} $$
By using Hypothesis 1, we have,
$$ \begin{aligned} |h_d(\varsigma_i(t)) - h_d(u_i(t))| &\le d_{ji} p_j |s_j(t) - v_j(t)| \le d_{ji} p_j |\psi_j(t)|, \\ |h_b(\varsigma_i(t)) - h_b(u_i(t))| &\le b_{ji} p_j |s_j(t-\sigma_1(t)-\sigma_2(t)) - v_j(t-\sigma_1(t)-\sigma_2(t))| \le b_{ji} p_j |\psi_j(t-\sigma_1(t)-\sigma_2(t))|, \\ |h_a(\varsigma_i(t)) - h_a(u_i(t))| &\le a_{ji} p_j \int_{t-\tau(t)}^{t} |s_j(s) - v_j(s)|\,ds \le a_{ji} p_j \int_{t-\tau(t)}^{t} |\psi_j(s)|\,ds, \\ |g_n(s_j(t)) - g_n(v_j(t))| &\le n_{ij} q_i |\varsigma_i(t) - u_i(t)| \le n_{ij} q_i |\varpi_i(t)|, \\ |g_c(s_j(t)) - g_c(v_j(t))| &\le c_{ij} q_i |\varsigma_i(t-\eta_1(t)-\eta_2(t)) - u_i(t-\eta_1(t)-\eta_2(t))| \le c_{ij} q_i |\varpi_i(t-\eta_1(t)-\eta_2(t))|, \\ |g_e(s_j(t)) - g_e(v_j(t))| &\le e_{ij} q_i \int_{t-\upsilon(t)}^{t} |\varsigma_i(s) - u_i(s)|\,ds \le e_{ij} q_i \int_{t-\upsilon(t)}^{t} |\varpi_i(s)|\,ds. \end{aligned} $$
The compact form of (12) is
$$ \begin{aligned} D^{\alpha} \varpi(t) &= -R \varpi(t) + H_d(\psi(t)) + H_b(\psi(t-\sigma_1(t)-\sigma_2(t))) + \int_{t-\tau(t)}^{t} H_a(\psi(s))\,ds + U(t), \\ D^{\alpha} \psi(t) &= -M \psi(t) + G_n(\varpi(t)) + G_c(\varpi(t-\eta_1(t)-\eta_2(t))) + \int_{t-\upsilon(t)}^{t} G_e(\varpi(s))\,ds + V(t). \end{aligned} $$
When U(t) = V(t) = 0, the system reduces to
$$ \begin{aligned} D^{\alpha} \varpi(t) &= -R \varpi(t) + H_d(\psi(t)) + H_b(\psi(t-\sigma_1(t)-\sigma_2(t))) + \int_{t-\tau(t)}^{t} H_a(\psi(s))\,ds, \\ D^{\alpha} \psi(t) &= -M \psi(t) + G_n(\varpi(t)) + G_c(\varpi(t-\eta_1(t)-\eta_2(t))) + \int_{t-\upsilon(t)}^{t} G_e(\varpi(s))\,ds. \end{aligned} $$
Theorem 2.
Under Hypothesis 1, suppose there exist real matrices R_1, R_2, R_3, R_4, Z_1, Z_2, T_1, …, T_6, W_1, …, W_8 such that the following LMIs hold:
$$ \Phi = \mathrm{diag}(\Phi_{1,1}, \ldots, \Phi_{11,11}) < 0, \qquad \Psi = \mathrm{diag}(\Psi_{1,1}, \ldots, \Psi_{11,11}) < 0, $$
where
$$ \begin{aligned} \Phi_{1,1} &= -2 R R_1 + R_1 U_1 R_1 + R_1 U_2 R_1 + R_1 U_3 R_1 + \beta N^T U_4^{-1} N \beta + \sigma_1^2 R_3 + \upsilon^2 W_7 + Q_1 + \sigma_2^2 R_4 \\ &\quad + T_1 + T_2 + T_3 + T_4 + T_5 + T_6, \\ \Phi_{2,2} &= \beta C^T U_5^{-1} C \beta - Q_1, \quad \Phi_{3,3} = -T_1 (1 - \alpha_1 - \alpha_2), \quad \Phi_{4,4} = -T_2 (1 - \alpha_1), \quad \Phi_{5,5} = -T_3 (1 - \alpha_2), \\ \Phi_{6,6} &= -T_4, \quad \Phi_{7,7} = -T_5, \quad \Phi_{8,8} = -T_6, \quad \Phi_{9,9} = \beta E^T U_6^{-1} E \beta - W_7, \quad \Phi_{10,10} = -R_3, \quad \Phi_{11,11} = -R_4, \end{aligned} $$
and
$$ \begin{aligned} \Psi_{1,1} &= -2 M R_2 + R_2 U_4 R_2 + R_2 U_5 R_2 + R_2 U_6 R_2 + \mu D^T U_1^{-1} D \mu + \eta_1^2 Z_1 + \tau^2 W_8 + Q_2 + \eta_2^2 Z_2 \\ &\quad + W_1 + W_2 + W_3 + W_4 + W_5 + W_6, \\ \Psi_{2,2} &= \mu B^T U_2^{-1} B \mu - Q_2, \quad \Psi_{3,3} = -W_1 (1 - \delta_1 - \delta_2), \quad \Psi_{4,4} = -W_2 (1 - \delta_1), \quad \Psi_{5,5} = -W_3 (1 - \delta_2), \\ \Psi_{6,6} &= -W_4, \quad \Psi_{7,7} = -W_5, \quad \Psi_{8,8} = -W_6, \quad \Psi_{9,9} = \mu A^T U_3^{-1} A \mu - W_8, \quad \Psi_{10,10} = -Z_1, \quad \Psi_{11,11} = -Z_2. \end{aligned} $$
then the system (13) is globally asymptotically stable.
Appendix A contains the detailed proof of Theorem 2.
Remark 2.
When the additive delays merge into single delays, i.e., σ_1(t) + σ_2(t) = σ(t) and η_1(t) + η_2(t) = η(t), system (13) correspondingly reduces to
$$ \begin{aligned} D^{\alpha} \varpi(t) &= -R \varpi(t) + H_d(\psi(t)) + H_b(\psi(t-\sigma(t))) + \int_{t-\tau(t)}^{t} H_a(\psi(s))\,ds, \\ D^{\alpha} \psi(t) &= -M \psi(t) + G_n(\varpi(t)) + G_c(\varpi(t-\eta(t))) + \int_{t-\upsilon(t)}^{t} G_e(\varpi(s))\,ds. \end{aligned} $$
Theorem 3.
Under Hypothesis 1, suppose there exist real matrices Z_1, Z_2, R_3, R_4, Q_1, Y_1, T_7, W_7 such that the following LMIs hold:
$$ \Lambda = \mathrm{diag}(\Lambda_{1,1}, \ldots, \Lambda_{5,5}) < 0, \qquad \Upsilon = \mathrm{diag}(\Upsilon_{1,1}, \ldots, \Upsilon_{5,5}) < 0, $$
$$ \begin{aligned} \Lambda_{1,1} &= -2 R R_3 + R_3 U_7 R_3 + R_3 U_8 R_3 + R_3 U_9 R_3 + \beta N^T U_{10}^{-1} N \beta + \sigma^2 Q_1 + T_7 + \upsilon^2 Z_1 + R_1, \\ \Lambda_{2,2} &= \beta C^T U_{11}^{-1} C \beta - R_1, \quad \Lambda_{3,3} = \beta E^T U_{12}^{-1} E \beta - Z_1, \quad \Lambda_{4,4} = -Q_1, \quad \Lambda_{5,5} = -T_7, \end{aligned} $$
$$ \begin{aligned} \Upsilon_{1,1} &= -2 M R_4 + R_4 U_{10} R_4 + R_4 U_{11} R_4 + R_4 U_{12} R_4 + \mu D^T U_7^{-1} D \mu + \eta^2 Y_1 + W_7 + \tau^2 Z_2 + R_2, \\ \Upsilon_{2,2} &= \mu B^T U_8^{-1} B \mu - R_2, \quad \Upsilon_{3,3} = \mu A^T U_9^{-1} A \mu - Z_2, \quad \Upsilon_{4,4} = -Y_1, \quad \Upsilon_{5,5} = -W_7. \end{aligned} $$
then, system (16) is globally asymptotically stable.
Appendix B contains the detailed proof of Theorem 3.
Remark 3.
When H_a = G_e = 0, system (13) correspondingly reduces to
$$ \begin{aligned} D^{\alpha} \varpi(t) &= -R \varpi(t) + H_d(\psi(t)) + H_b(\psi(t-\sigma_1(t)-\sigma_2(t))), \\ D^{\alpha} \psi(t) &= -M \psi(t) + G_n(\varpi(t)) + G_c(\varpi(t-\eta_1(t)-\eta_2(t))). \end{aligned} $$
Theorem 4.
Under Hypothesis 1, suppose there exist real matrices R_1, R_2, T_1, …, T_6, W_1, …, W_6 such that the following LMIs hold:
$$ \xi = \mathrm{diag}(\xi_{1,1}, \ldots, \xi_{8,8}) < 0, \qquad \pi = \mathrm{diag}(\pi_{1,1}, \ldots, \pi_{8,8}) < 0, $$
$$ \begin{aligned} \xi_{1,1} &= -2 R R_1 + R_1 U_1 R_1 + R_1 U_2 R_1 + T_1 + T_2 + T_3 + T_4 + T_5 + T_6 + Q_1 + \beta N^T U_4^{-1} N \beta, \\ \xi_{2,2} &= \beta C^T U_5^{-1} C \beta - Q_1, \quad \xi_{3,3} = -T_1 (1 - \alpha_1 - \alpha_2), \quad \xi_{4,4} = -T_2 (1 - \alpha_1), \quad \xi_{5,5} = -T_3 (1 - \alpha_2), \\ \xi_{6,6} &= -T_4, \quad \xi_{7,7} = -T_5, \quad \xi_{8,8} = -T_6, \end{aligned} $$
and
$$ \begin{aligned} \pi_{1,1} &= -2 M R_2 + R_2 U_4 R_2 + R_2 U_5 R_2 + W_1 + W_2 + W_3 + W_4 + W_5 + W_6 + Q_2 + \mu D^T U_1^{-1} D \mu, \\ \pi_{2,2} &= \mu B^T U_2^{-1} B \mu - Q_2, \quad \pi_{3,3} = -W_1 (1 - \delta_1 - \delta_2), \quad \pi_{4,4} = -W_2 (1 - \delta_1), \quad \pi_{5,5} = -W_3 (1 - \delta_2), \\ \pi_{6,6} &= -W_4, \quad \pi_{7,7} = -W_5, \quad \pi_{8,8} = -W_6. \end{aligned} $$
then system (19) is globally asymptotically stable.
Appendix C contains the detailed proof of Theorem 4.

4. Illustrative Example

Example 1.
Consider system (13) with the following parameters:
Here each memristive weight is written as w = {w̄, w̲}, meaning w takes the value w̄ when the indicated state is within the threshold (T_1, T_2 for u_1(t), u_2(t); W_1, W_2 for ς_1(t), ς_2(t)) and the value w̲ otherwise:
d_{11}(u_1(t)) = {0.3, 0.1}, d_{12}(u_2(t)) = {4.05, 3.95}, d_{21}(u_1(t)) = {0.15, 0.05}, d_{22}(u_2(t)) = {0.35, 0.25},
a_{11}(u_1(t)) = {0.2, 0}, a_{12}(u_2(t)) = {5.05, 4.95}, a_{21}(u_1(t)) = {0.05, 0.05}, a_{22}(u_2(t)) = {0.35, 0.25},
b_{11}(u_1(t)) = {0.1, 0}, b_{12}(u_2(t)) = {0.05, 0.25}, b_{21}(u_1(t)) = {0.05, 0.36}, b_{22}(u_2(t)) = {0.05, 0.11},
n_{11}(ς_1(t)) = {0.1, 0.4}, n_{12}(ς_2(t)) = {0.06, 0.1}, n_{21}(ς_1(t)) = {0.06, 0.5}, n_{22}(ς_2(t)) = {0.06, 0.3},
c_{11}(ς_1(t)) = {0.4, 0.38}, c_{12}(ς_2(t)) = {0.2, 0.25}, c_{21}(ς_1(t)) = {0.87, 0.5}, c_{22}(ς_2(t)) = {0.5, 0.38},
e_{11}(ς_1(t)) = {1.3, 0.2}, e_{12}(ς_2(t)) = {0.28, 0.1}, e_{21}(ς_1(t)) = {1.1, 0.4}, e_{22}(ς_2(t)) = {0.88, 0.5}.
Here, we take the activation functions as h(s) = g(s) = tanh(s), and let σ_1 = 0.87, σ_2 = 0.88, α_1 = 0.27, α_2 = 0.36, β = 0.75, τ = 0.87, η_1 = 0.78, η_2 = 0.76, μ = 0.65, δ_1 = 0.78, δ_2 = 0.76, υ = 0.76. The corresponding matrices are:
R = [4, 0; 0, 6], M = [0.1, 0; 0, 0.2], D = [0.2, 4; 0.1, 0.3], A = [0.1, 5; 0, 0.3], B = [0.1, 0.05; 0.05, 0.05], C = [0.4, 0.25; 0.87, 0.5], N = [0.1, 0.06; 0.06, 0.06], E = [0.3, 0.28; 1.1, 0.88].
The following feasible solutions are obtained by solving LMIs (14) and (15):
R_1 = [1.8987, 0.0564; 0.0564, 1.2887], R_2 = [98.0141, 3.4251; 3.4251, 0.2475], R_3 = [1.5071, 0; 0, 1.5071], R_4 = [1.5071, 0; 0, 1.5071], T_1 = [4.0678, 0; 0, 4.0680], T_2 = [2.0641, 0; 0, 2.0641], T_3 = [2.3541, 0; 0, 2.3541], T_4 = [1.4091, 0.0011; 0.0011, 1.5072], T_5 = [1.5071, 0; 0, 1.5071], T_6 = [1.5071, 0; 0, 1.5071], Z_1 = [1.4475, 0.0007; 0.0007, 1.5072], Z_2 = [1.4505, 0.0007; 0.0007, 1.5072], W_1 = [3.1255, 0.0039; 0.0039, 2.7892], W_2 = [4.8069, 0.0235; 0.0235, 6.8262], W_3 = [4.5634, 0.0198; 0.0198, 6.2613], W_4 = [1.4091, 0.0011; 0.0011, 1.5072], W_5 = [1.4091, 0.0011; 0.0011, 1.5072], W_6 = [1.3111, 0.0023; 0.0023, 1.5074], W_7 = [0.6275, 0.7118; 0.7118, 0.9301], W_8 = [1.4294, 0.1632; 0.1632, 6.7240], Q_1 = [1.5456, 0.0540; 0.0540, 1.4589], Q_2 = [1.2582, 0.0307; 0.0307, 1.4745].
U 1 = 2.7767 , U 2 = 0.3447 , U 3 = 0.5048 , U 4 = 0.0022 , U 5 = 0.0643 , U 6 = 0.9023 .
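The reported solutions can be sanity-checked for positive definiteness, which a valid Lyapunov functional requires, with Sylvester's criterion. A minimal sketch over a subset of the matrices (values copied as printed above):

```python
def is_pd_2x2(M):
    # Sylvester's criterion: a symmetric 2x2 matrix is positive definite
    # iff its leading diagonal entry and its determinant are both positive
    return M[0][0] > 0 and M[0][0] * M[1][1] - M[0][1] * M[1][0] > 0

# a subset of the feasible solution matrices reported above
solutions = {
    "R1": [[1.8987, 0.0564], [0.0564, 1.2887]],
    "R2": [[98.0141, 3.4251], [3.4251, 0.2475]],
    "W7": [[0.6275, 0.7118], [0.7118, 0.9301]],
    "W8": [[1.4294, 0.1632], [0.1632, 6.7240]],
    "Q1": [[1.5456, 0.0540], [0.0540, 1.4589]],
    "Q2": [[1.2582, 0.0307], [0.0307, 1.4745]],
}
for name, M in solutions.items():
    assert is_pd_2x2(M), name
```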
Therefore, it follows from Theorem 2 that the memristive BAM neural network with the given parameters is globally asymptotically stable.

5. Conclusions

A new sufficient condition has been established to guarantee the global robust asymptotic stability of the equilibrium point of fractional-order memristive BAM neural networks with mixed and additive time-varying delays. Using the Lyapunov functional and LMI approaches, robust stability results were obtained, and a numerical example was presented. Future research topics include stochastic BAM neural networks and stochastic recurrent neural networks.

Author Contributions

X.H., methodology; M.H., problem formulation; S.S., derivation works; B.D., numerical results; M.S.A., simulation results. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by National Natural Science Foundation of China (62173215), Major Basic Research Program of the Natural Science Foundation of Shandong Province in China (ZR202105090005), and the Support Plan for Outstanding Youth Innovation Team in Shandong Higher Education Institutions (2019KJI008).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Proof of Theorem 2. Define a Lyapunov functional
$$ V(t) = \sum_{k=1}^{8} V_k(t), $$
where
$$ \begin{aligned} V_1(t) &= D^{-(1-\alpha)} [\varpi^T(t) R_1 \varpi(t)], \qquad V_2(t) = D^{-(1-\alpha)} [\psi^T(t) R_2 \psi(t)], \\ V_3(t) &= \sigma_1 \int_{-\sigma_1}^{0} \int_{t+\theta}^{t} \varpi^T(s) R_3 \varpi(s)\,ds\,d\theta + \sigma_2 \int_{-\sigma_2}^{0} \int_{t+\theta}^{t} \varpi^T(s) R_4 \varpi(s)\,ds\,d\theta, \\ V_4(t) &= \eta_1 \int_{-\eta_1}^{0} \int_{t+\theta}^{t} \psi^T(s) Z_1 \psi(s)\,ds\,d\theta + \eta_2 \int_{-\eta_2}^{0} \int_{t+\theta}^{t} \psi^T(s) Z_2 \psi(s)\,ds\,d\theta, \\ V_5(t) &= \int_{t-\eta(t)}^{t} \varpi^T(s) T_1 \varpi(s)\,ds + \int_{t-\eta_1(t)}^{t} \varpi^T(s) T_2 \varpi(s)\,ds + \int_{t-\eta_2(t)}^{t} \varpi^T(s) T_3 \varpi(s)\,ds \\ &\quad + \int_{t-\eta}^{t} \varpi^T(s) T_4 \varpi(s)\,ds + \int_{t-\eta_1}^{t} \varpi^T(s) T_5 \varpi(s)\,ds + \int_{t-\eta_2}^{t} \varpi^T(s) T_6 \varpi(s)\,ds, \\ V_6(t) &= \int_{t-\sigma(t)}^{t} \psi^T(s) W_1 \psi(s)\,ds + \int_{t-\sigma_1(t)}^{t} \psi^T(s) W_2 \psi(s)\,ds + \int_{t-\sigma_2(t)}^{t} \psi^T(s) W_3 \psi(s)\,ds \\ &\quad + \int_{t-\sigma}^{t} \psi^T(s) W_4 \psi(s)\,ds + \int_{t-\sigma_1}^{t} \psi^T(s) W_5 \psi(s)\,ds + \int_{t-\sigma_2}^{t} \psi^T(s) W_6 \psi(s)\,ds, \\ V_7(t) &= \upsilon(t) \int_{-\upsilon(t)}^{0} \int_{t+\theta}^{t} \varpi^T(s) W_7 \varpi(s)\,ds\,d\theta, \qquad V_8(t) = \tau(t) \int_{-\tau(t)}^{0} \int_{t+\theta}^{t} \psi^T(s) W_8 \psi(s)\,ds\,d\theta. \end{aligned} $$
The time derivative of V(t) is computed as follows:
$$ \begin{aligned} \dot V_1(t) &= 2 \varpi^T(t) R_1 \Big\{ -R \varpi(t) + H_d(\psi(t)) + H_b(\psi(t-\sigma_1(t)-\sigma_2(t))) + \int_{t-\tau(t)}^{t} H_a(\psi(s))\,ds \Big\}, \\ \dot V_2(t) &= 2 \psi^T(t) R_2 \Big\{ -M \psi(t) + G_n(\varpi(t)) + G_c(\varpi(t-\eta_1(t)-\eta_2(t))) + \int_{t-\upsilon(t)}^{t} G_e(\varpi(s))\,ds \Big\}. \end{aligned} $$
By applying Lemma 1 and Hypothesis 1, we obtain
$$ \begin{aligned} 2 \varpi^T(t) R_1 H_d(\psi(t)) &\le \varpi^T(t) R_1 U_1 R_1 \varpi(t) + \psi^T(t) \mu D^T U_1^{-1} D \mu\, \psi(t), \\ 2 \varpi^T(t) R_1 H_b(\psi(t-\sigma_1(t)-\sigma_2(t))) &\le \varpi^T(t) R_1 U_2 R_1 \varpi(t) + \psi^T(t-\sigma_1(t)-\sigma_2(t)) \mu B^T U_2^{-1} B \mu\, \psi(t-\sigma_1(t)-\sigma_2(t)), \\ 2 \varpi^T(t) R_1 \Big[ \int_{t-\tau(t)}^{t} H_a(\psi(s))\,ds \Big] &\le \varpi^T(t) R_1 U_3 R_1 \varpi(t) + \Big[ \int_{t-\tau(t)}^{t} \psi(s)\,ds \Big]^T [\mu A^T U_3^{-1} A \mu] \Big[ \int_{t-\tau(t)}^{t} \psi(s)\,ds \Big], \\ 2 \psi^T(t) R_2 G_n(\varpi(t)) &\le \psi^T(t) R_2 U_4 R_2 \psi(t) + \varpi^T(t) \beta N^T U_4^{-1} N \beta\, \varpi(t), \\ 2 \psi^T(t) R_2 G_c(\varpi(t-\eta_1(t)-\eta_2(t))) &\le \psi^T(t) R_2 U_5 R_2 \psi(t) + \varpi^T(t-\eta_1(t)-\eta_2(t)) \beta C^T U_5^{-1} C \beta\, \varpi(t-\eta_1(t)-\eta_2(t)), \\ 2 \psi^T(t) R_2 \Big[ \int_{t-\upsilon(t)}^{t} G_e(\varpi(s))\,ds \Big] &\le \psi^T(t) R_2 U_6 R_2 \psi(t) + \Big[ \int_{t-\upsilon(t)}^{t} \varpi(s)\,ds \Big]^T [\beta E^T U_6^{-1} E \beta] \Big[ \int_{t-\upsilon(t)}^{t} \varpi(s)\,ds \Big]. \end{aligned} $$
then, one has
$$ \begin{aligned} \dot V_1(t) \le{} & \varpi^T(t) [-2 R R_1] \varpi(t) + \varpi^T(t) [R_1 U_1 R_1] \varpi(t) + \psi^T(t) [\mu D^T U_1^{-1} D \mu] \psi(t) + \varpi^T(t) [R_1 U_2 R_1] \varpi(t) \\ & + \psi^T(t-\sigma_1(t)-\sigma_2(t)) [\mu B^T U_2^{-1} B \mu] \psi(t-\sigma_1(t)-\sigma_2(t)) + \varpi^T(t) [R_1 U_3 R_1] \varpi(t) \\ & + \Big[ \int_{t-\tau(t)}^{t} \psi(s)\,ds \Big]^T [\mu A^T U_3^{-1} A \mu] \Big[ \int_{t-\tau(t)}^{t} \psi(s)\,ds \Big], \\ \dot V_2(t) \le{} & \psi^T(t) [-2 R_2 M] \psi(t) + \psi^T(t) [R_2 U_4 R_2] \psi(t) + \varpi^T(t) [\beta N^T U_4^{-1} N \beta] \varpi(t) + \psi^T(t) [R_2 U_5 R_2] \psi(t) \\ & + \varpi^T(t-\eta_1(t)-\eta_2(t)) [\beta C^T U_5^{-1} C \beta] \varpi(t-\eta_1(t)-\eta_2(t)) + \psi^T(t) [R_2 U_6 R_2] \psi(t) \\ & + \Big[ \int_{t-\upsilon(t)}^{t} \varpi(s)\,ds \Big]^T [\beta E^T U_6^{-1} E \beta] \Big[ \int_{t-\upsilon(t)}^{t} \varpi(s)\,ds \Big], \end{aligned} $$
$$ \begin{aligned} \dot V_3(t) &= \sigma_1^2 \varpi^T(t) R_3 \varpi(t) - \sigma_1 \int_{t-\sigma_1}^{t} \varpi^T(s) R_3 \varpi(s)\,ds + \sigma_2^2 \varpi^T(t) R_4 \varpi(t) - \sigma_2 \int_{t-\sigma_2}^{t} \varpi^T(s) R_4 \varpi(s)\,ds, \\ \dot V_4(t) &= \eta_1^2 \psi^T(t) Z_1 \psi(t) - \eta_1 \int_{t-\eta_1}^{t} \psi^T(s) Z_1 \psi(s)\,ds + \eta_2^2 \psi^T(t) Z_2 \psi(t) - \eta_2 \int_{t-\eta_2}^{t} \psi^T(s) Z_2 \psi(s)\,ds, \\ \dot V_5(t) &\le [\varpi^T(t) T_1 \varpi(t) - \varpi^T(t-\eta(t)) T_1 \varpi(t-\eta(t)) (1-\alpha_1-\alpha_2)] + [\varpi^T(t) T_2 \varpi(t) - \varpi^T(t-\eta_1(t)) T_2 \varpi(t-\eta_1(t)) (1-\alpha_1)] \\ &\quad + [\varpi^T(t) T_3 \varpi(t) - \varpi^T(t-\eta_2(t)) T_3 \varpi(t-\eta_2(t)) (1-\alpha_2)] + [\varpi^T(t) T_4 \varpi(t) - \varpi^T(t-\eta) T_4 \varpi(t-\eta)] \\ &\quad + [\varpi^T(t) T_5 \varpi(t) - \varpi^T(t-\eta_1) T_5 \varpi(t-\eta_1)] + [\varpi^T(t) T_6 \varpi(t) - \varpi^T(t-\eta_2) T_6 \varpi(t-\eta_2)], \\ \dot V_6(t) &\le [\psi^T(t) W_1 \psi(t) - \psi^T(t-\sigma(t)) W_1 \psi(t-\sigma(t)) (1-\delta_1-\delta_2)] + [\psi^T(t) W_2 \psi(t) - \psi^T(t-\sigma_1(t)) W_2 \psi(t-\sigma_1(t)) (1-\delta_1)] \\ &\quad + [\psi^T(t) W_3 \psi(t) - \psi^T(t-\sigma_2(t)) W_3 \psi(t-\sigma_2(t)) (1-\delta_2)] + [\psi^T(t) W_4 \psi(t) - \psi^T(t-\sigma) W_4 \psi(t-\sigma)] \\ &\quad + [\psi^T(t) W_5 \psi(t) - \psi^T(t-\sigma_1) W_5 \psi(t-\sigma_1)] + [\psi^T(t) W_6 \psi(t) - \psi^T(t-\sigma_2) W_6 \psi(t-\sigma_2)], \\ \dot V_7(t) &= \upsilon^2(t) \varpi^T(t) W_7 \varpi(t) - \upsilon(t) \int_{t-\upsilon(t)}^{t} \varpi^T(s) W_7 \varpi(s)\,ds, \\ \dot V_8(t) &= \tau^2(t) \psi^T(t) W_8 \psi(t) - \tau(t) \int_{t-\tau(t)}^{t} \psi^T(s) W_8 \psi(s)\,ds. \end{aligned} $$
From Hypothesis 1, we have
$$ \begin{aligned} 0 &\le \varpi^T(t) Q_1 \varpi(t) - \varpi^T(t-\eta_1(t)-\eta_2(t)) Q_1 \varpi(t-\eta_1(t)-\eta_2(t)), \\ 0 &\le \psi^T(t) Q_2 \psi(t) - \psi^T(t-\sigma_1(t)-\sigma_2(t)) Q_2 \psi(t-\sigma_1(t)-\sigma_2(t)). \end{aligned} $$
$$ \dot V(t) \le \xi^T(t) \Phi \xi(t) + \zeta^T(t) \Psi \zeta(t), $$
where
$$ \begin{aligned} \xi^T(t) = \Big[ &\varpi^T(t),\ \varpi^T(t-\eta_1(t)-\eta_2(t)),\ \varpi^T(t-\eta(t)),\ \varpi^T(t-\eta_1(t)),\ \varpi^T(t-\eta_2(t)),\ \varpi^T(t-\eta),\ \varpi^T(t-\eta_1), \\ &\varpi^T(t-\eta_2),\ \Big( \int_{t-\upsilon(t)}^{t} \varpi(s)\,ds \Big)^T,\ \Big( \int_{t-\sigma_1}^{t} \varpi(s)\,ds \Big)^T,\ \Big( \int_{t-\sigma_2}^{t} \varpi(s)\,ds \Big)^T \Big], \\ \zeta^T(t) = \Big[ &\psi^T(t),\ \psi^T(t-\sigma_1(t)-\sigma_2(t)),\ \psi^T(t-\sigma(t)),\ \psi^T(t-\sigma_1(t)),\ \psi^T(t-\sigma_2(t)),\ \psi^T(t-\sigma),\ \psi^T(t-\sigma_1), \\ &\psi^T(t-\sigma_2),\ \Big( \int_{t-\tau(t)}^{t} \psi(s)\,ds \Big)^T,\ \Big( \int_{t-\eta_1}^{t} \psi(s)\,ds \Big)^T,\ \Big( \int_{t-\eta_2}^{t} \psi(s)\,ds \Big)^T \Big]. \end{aligned} $$
Hence, V̇(t) ≤ 0, and based on Lyapunov theory, system (13) is globally asymptotically stable. The proof is complete.  □

Appendix B

Proof of Theorem 3. Define a Lyapunov functional
$$ V(t) = \sum_{k=1}^{8} V_k(t), $$
where
$$ \begin{aligned} V_1(t) &= D^{-(1-\alpha)} [\varpi^T(t) R_3 \varpi(t)], \qquad V_2(t) = D^{-(1-\alpha)} [\psi^T(t) R_4 \psi(t)], \\ V_3(t) &= \sigma \int_{-\sigma}^{0} \int_{t+\theta}^{t} \varpi^T(s) Q_1 \varpi(s)\,ds\,d\theta, \qquad V_4(t) = \eta \int_{-\eta}^{0} \int_{t+\theta}^{t} \psi^T(s) Y_1 \psi(s)\,ds\,d\theta, \\ V_5(t) &= \int_{t-\eta}^{t} \varpi^T(s) T_7 \varpi(s)\,ds, \qquad V_6(t) = \int_{t-\sigma}^{t} \psi^T(s) W_7 \psi(s)\,ds, \\ V_7(t) &= \upsilon(t) \int_{-\upsilon(t)}^{0} \int_{t+\theta}^{t} \varpi^T(s) Z_1 \varpi(s)\,ds\,d\theta, \qquad V_8(t) = \tau(t) \int_{-\tau(t)}^{0} \int_{t+\theta}^{t} \psi^T(s) Z_2 \psi(s)\,ds\,d\theta. \end{aligned} $$
By applying Lemma 2, we have
$$
\begin{aligned}
2\varpi^T(t)R_3H_d(\psi_j(t)) &\le \varpi^T(t)R_3U_7R_3\varpi(t)+\psi^T(t)\mu D^TU_7^{-1}D\mu\,\psi(t),\\
2\varpi^T(t)R_3H_b(\psi_j(t-\sigma(t))) &\le \varpi^T(t)R_3U_8R_3\varpi(t)+\psi^T(t-\sigma(t))\mu B^TU_8^{-1}B\mu\,\psi(t-\sigma(t)),\\
2\varpi^T(t)R_3H_a\Big[\int_{t-\tau(t)}^{t}\psi_j(s)\,ds\Big] &\le \varpi^T(t)R_3U_9R_3\varpi(t)+\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big]^T\big[\mu A^TU_9^{-1}A\mu\big]\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big],\\
2\psi^T(t)R_4G_n(\varpi_i(t)) &\le \psi^T(t)R_4U_{10}R_4\psi(t)+\varpi^T(t)\beta N^TU_{10}^{-1}N\beta\,\varpi(t),\\
2\psi^T(t)R_4G_c(\varpi_i(t-\eta(t))) &\le \psi^T(t)R_4U_{11}R_4\psi(t)+\varpi^T(t-\eta(t))\beta C^TU_{11}^{-1}C\beta\,\varpi(t-\eta(t)),\\
2\psi^T(t)R_4G_e\Big[\int_{t-\upsilon(t)}^{t}\varpi_i(s)\,ds\Big] &\le \psi^T(t)R_4U_{12}R_4\psi(t)+\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big]^T\big[\beta E^TU_{12}^{-1}E\beta\big]\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big].
\end{aligned}
$$
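Each application of Lemma 2 above is an instance of the matrix Young inequality $2x^Ty \le x^TUx + y^TU^{-1}y$, valid for any $U \succ 0$, since $(U^{1/2}x-U^{-1/2}y)^T(U^{1/2}x-U^{-1/2}y)\ge 0$. A quick numerical sanity check of this bound (random data and a hypothetical dimension, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Build a random symmetric positive definite weight U
A = rng.standard_normal((n, n))
U = A @ A.T + n * np.eye(n)

for _ in range(100):
    x = rng.standard_normal(n)
    y = rng.standard_normal(n)
    lhs = 2.0 * x @ y
    rhs = x @ U @ x + y @ np.linalg.inv(U) @ y
    # The bound 2 x^T y <= x^T U x + y^T U^{-1} y must hold for every sample
    assert lhs <= rhs + 1e-9
```

The free matrices $U_7,\dots,U_{12}$ play exactly the role of `U` here: they trade off how much of each cross term is charged to the $\varpi$ side versus the $\psi$ side, and they later enter the LMI as decision variables.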
The time derivatives of $V(t)$ are as follows:
$$
\begin{aligned}
\dot V_1(t)\le{}&\varpi^T(t)\big[-2RR_3\big]\varpi(t)+\varpi^T(t)\big[R_3U_7R_3\big]\varpi(t)+\psi^T(t)\big[\mu D^TU_7^{-1}D\mu\big]\psi(t)\\
&+\varpi^T(t)\big[R_3U_8R_3\big]\varpi(t)+\psi^T(t-\sigma(t))\big[\mu B^TU_8^{-1}B\mu\big]\psi(t-\sigma(t))\\
&+\varpi^T(t)\big[R_3U_9R_3\big]\varpi(t)+\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big]^T\big[\mu A^TU_9^{-1}A\mu\big]\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big],\\
\dot V_2(t)\le{}&\psi^T(t)\big[-2R_4M\big]\psi(t)+\psi^T(t)\big[R_4U_{10}R_4\big]\psi(t)+\varpi^T(t)\big[\beta N^TU_{10}^{-1}N\beta\big]\varpi(t)\\
&+\psi^T(t)\big[R_4U_{11}R_4\big]\psi(t)+\varpi^T(t-\eta(t))\big[\beta C^TU_{11}^{-1}C\beta\big]\varpi(t-\eta(t))\\
&+\psi^T(t)\big[R_4U_{12}R_4\big]\psi(t)+\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big]^T\big[\beta E^TU_{12}^{-1}E\beta\big]\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big],\\
\dot V_3(t)={}&\sigma^2\,\varpi^T(t)Q_1\varpi(t)-\sigma\int_{t-\sigma}^{t}\varpi^T(s)Q_1\varpi(s)\,ds,\\
\dot V_4(t)={}&\eta^2\,\psi^T(t)Y_1\psi(t)-\eta\int_{t-\eta}^{t}\psi^T(s)Y_1\psi(s)\,ds,\\
\dot V_5(t)={}&\varpi^T(t)T_7\varpi(t)-\varpi^T(t-\eta)T_7\varpi(t-\eta),\\
\dot V_6(t)={}&\psi^T(t)W_7\psi(t)-\psi^T(t-\sigma)W_7\psi(t-\sigma),\\
\dot V_7(t)={}&\upsilon^2(t)\,\varpi^T(t)Z_1\varpi(t)-\upsilon(t)\int_{t-\upsilon(t)}^{t}\varpi^T(s)Z_1\varpi(s)\,ds,\\
\dot V_8(t)={}&\tau^2(t)\,\psi^T(t)Z_2\psi(t)-\tau(t)\int_{t-\tau(t)}^{t}\psi^T(s)Z_2\psi(s)\,ds.
\end{aligned}
$$
From Hypothesis 1, we have
$$
\begin{aligned}
0 &\le \varpi^T(t)R_1\varpi(t)-\varpi^T(t-\eta(t))R_1\varpi(t-\eta(t)),\\
0 &\le \psi^T(t)R_2\psi(t)-\psi^T(t-\sigma(t))R_2\psi(t-\sigma(t)).
\end{aligned}
$$
According to Λ and Υ , this implies that
$$
\dot V(t) \le \vartheta^T(t)\,\Lambda\,\vartheta(t)+\Omega^T(t)\,\Upsilon\,\Omega(t),
$$
where
$$
\vartheta^T(t)=\Big[\varpi^T(t),\ \varpi^T(t-\eta(t)),\ \Big(\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big)^T,\ \Big(\int_{t-\sigma}^{t}\varpi(s)\,ds\Big)^T,\ \varpi^T(t-\eta)\Big],
$$
$$
\Omega^T(t)=\Big[\psi^T(t),\ \psi^T(t-\sigma(t)),\ \Big(\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big)^T,\ \Big(\int_{t-\eta}^{t}\psi(s)\,ds\Big)^T,\ \psi^T(t-\sigma)\Big].
$$
Hence $\dot V(t) \le 0$, and by Lyapunov theory system (16) is globally asymptotically stable. This completes the proof.  □

Appendix C

Proof of Theorem 11. Define a Lyapunov functional
$$
V(t)=\sum_{k=1}^{4}V_k(t),
$$
where
$$
\begin{aligned}
V_1(t)&=D^{-(1-\alpha)}\big[\varpi^T(t)R_1\varpi(t)\big], \qquad
V_2(t)=D^{-(1-\alpha)}\big[\psi^T(t)R_2\psi(t)\big],\\
V_3(t)&=\int_{t-\eta(t)}^{t}\varpi^T(s)T_1\varpi(s)\,ds+\int_{t-\eta_1(t)}^{t}\varpi^T(s)T_2\varpi(s)\,ds+\int_{t-\eta_2(t)}^{t}\varpi^T(s)T_3\varpi(s)\,ds\\
&\quad+\int_{t-\eta}^{t}\varpi^T(s)T_4\varpi(s)\,ds+\int_{t-\eta_1}^{t}\varpi^T(s)T_5\varpi(s)\,ds+\int_{t-\eta_2}^{t}\varpi^T(s)T_6\varpi(s)\,ds,\\
V_4(t)&=\int_{t-\sigma(t)}^{t}\psi^T(s)W_1\psi(s)\,ds+\int_{t-\sigma_1(t)}^{t}\psi^T(s)W_2\psi(s)\,ds+\int_{t-\sigma_2(t)}^{t}\psi^T(s)W_3\psi(s)\,ds\\
&\quad+\int_{t-\sigma}^{t}\psi^T(s)W_4\psi(s)\,ds+\int_{t-\sigma_1}^{t}\psi^T(s)W_5\psi(s)\,ds+\int_{t-\sigma_2}^{t}\psi^T(s)W_6\psi(s)\,ds.
\end{aligned}
$$
By applying Lemma 2, we have
$$
\begin{aligned}
2\varpi^T(t)R_1H_d(\psi_j(t)) &\le \varpi^T(t)R_1U_1R_1\varpi(t)+\psi^T(t)\mu D^TU_1^{-1}D\mu\,\psi(t),\\
2\varpi^T(t)R_1H_b(\psi_j(t-\sigma_1(t)-\sigma_2(t))) &\le \varpi^T(t)R_1U_2R_1\varpi(t)+\psi^T(t-\sigma_1(t)-\sigma_2(t))\mu B^TU_2^{-1}B\mu\,\psi(t-\sigma_1(t)-\sigma_2(t)),\\
2\varpi^T(t)R_1H_a\Big[\int_{t-\tau(t)}^{t}\psi_j(s)\,ds\Big] &\le \varpi^T(t)R_1U_3R_1\varpi(t)+\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big]^T\big[\mu A^TU_3^{-1}A\mu\big]\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big],\\
2\psi^T(t)R_2G_n(\varpi_i(t)) &\le \psi^T(t)R_2U_4R_2\psi(t)+\varpi^T(t)\beta N^TU_4^{-1}N\beta\,\varpi(t),\\
2\psi^T(t)R_2G_c(\varpi_i(t-\eta_1(t)-\eta_2(t))) &\le \psi^T(t)R_2U_5R_2\psi(t)+\varpi^T(t-\eta_1(t)-\eta_2(t))\beta C^TU_5^{-1}C\beta\,\varpi(t-\eta_1(t)-\eta_2(t)),\\
2\psi^T(t)R_2G_e\Big[\int_{t-\upsilon(t)}^{t}\varpi_i(s)\,ds\Big] &\le \psi^T(t)R_2U_6R_2\psi(t)+\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big]^T\big[\beta E^TU_6^{-1}E\beta\big]\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big].
\end{aligned}
$$
The time derivatives of $V(t)$ are as follows:
$$
\begin{aligned}
\dot V_1(t)\le{}&\varpi^T(t)\big[-2RR_1\big]\varpi(t)+\varpi^T(t)\big[R_1U_1R_1\big]\varpi(t)+\psi^T(t)\big[\mu D^TU_1^{-1}D\mu\big]\psi(t)\\
&+\varpi^T(t)\big[R_1U_2R_1\big]\varpi(t)+\psi^T(t-\sigma_1(t)-\sigma_2(t))\big[\mu B^TU_2^{-1}B\mu\big]\psi(t-\sigma_1(t)-\sigma_2(t))\\
&+\varpi^T(t)\big[R_1U_3R_1\big]\varpi(t)+\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big]^T\big[\mu A^TU_3^{-1}A\mu\big]\Big[\int_{t-\tau(t)}^{t}\psi(s)\,ds\Big],\\
\dot V_2(t)\le{}&\psi^T(t)\big[-2R_2M\big]\psi(t)+\psi^T(t)\big[R_2U_4R_2\big]\psi(t)+\varpi^T(t)\big[\beta N^TU_4^{-1}N\beta\big]\varpi(t)\\
&+\psi^T(t)\big[R_2U_5R_2\big]\psi(t)+\varpi^T(t-\eta_1(t)-\eta_2(t))\big[\beta C^TU_5^{-1}C\beta\big]\varpi(t-\eta_1(t)-\eta_2(t))\\
&+\psi^T(t)\big[R_2U_6R_2\big]\psi(t)+\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big]^T\big[\beta E^TU_6^{-1}E\beta\big]\Big[\int_{t-\upsilon(t)}^{t}\varpi(s)\,ds\Big],
\end{aligned}
$$
$$
\begin{aligned}
\dot V_3(t) \le{}& \big[\varpi^T(t)T_1\varpi(t)-\varpi^T(t-\eta(t))T_1\varpi(t-\eta(t))(1-\alpha_1-\alpha_2)\big]
+\big[\varpi^T(t)T_2\varpi(t)-\varpi^T(t-\eta_1(t))T_2\varpi(t-\eta_1(t))(1-\alpha_1)\big]\\
&+\big[\varpi^T(t)T_3\varpi(t)-\varpi^T(t-\eta_2(t))T_3\varpi(t-\eta_2(t))(1-\alpha_2)\big]
+\big[\varpi^T(t)T_4\varpi(t)-\varpi^T(t-\eta)T_4\varpi(t-\eta)\big]\\
&+\big[\varpi^T(t)T_5\varpi(t)-\varpi^T(t-\eta_1)T_5\varpi(t-\eta_1)\big]
+\big[\varpi^T(t)T_6\varpi(t)-\varpi^T(t-\eta_2)T_6\varpi(t-\eta_2)\big],\\
\dot V_4(t) \le{}& \big[\psi^T(t)W_1\psi(t)-\psi^T(t-\sigma(t))W_1\psi(t-\sigma(t))(1-\delta_1-\delta_2)\big]
+\big[\psi^T(t)W_2\psi(t)-\psi^T(t-\sigma_1(t))W_2\psi(t-\sigma_1(t))(1-\delta_1)\big]\\
&+\big[\psi^T(t)W_3\psi(t)-\psi^T(t-\sigma_2(t))W_3\psi(t-\sigma_2(t))(1-\delta_2)\big]
+\big[\psi^T(t)W_4\psi(t)-\psi^T(t-\sigma)W_4\psi(t-\sigma)\big]\\
&+\big[\psi^T(t)W_5\psi(t)-\psi^T(t-\sigma_1)W_5\psi(t-\sigma_1)\big]
+\big[\psi^T(t)W_6\psi(t)-\psi^T(t-\sigma_2)W_6\psi(t-\sigma_2)\big].
\end{aligned}
$$
From Hypothesis 1, we have
$$
\begin{aligned}
0 &\le \varpi^T(t)Q_1\varpi(t)-\varpi^T(t-\eta_1(t)-\eta_2(t))Q_1\varpi(t-\eta_1(t)-\eta_2(t)),\\
0 &\le \psi^T(t)Q_2\psi(t)-\psi^T(t-\sigma_1(t)-\sigma_2(t))Q_2\psi(t-\sigma_1(t)-\sigma_2(t)).
\end{aligned}
$$
Adding the above bounds, we obtain
$$
\dot V(t) \le \varrho^T(t)\,\xi\,\varrho(t)+\varsigma^T(t)\,\pi\,\varsigma(t) < 0,
$$
where
$$
\varrho^T(t)=\big[\varpi^T(t),\ \varpi^T(t-\eta_1(t)-\eta_2(t)),\ \varpi^T(t-\eta(t)),\ \varpi^T(t-\eta_1(t)),\ \varpi^T(t-\eta_2(t)),\ \varpi^T(t-\eta),\ \varpi^T(t-\eta_1),\ \varpi^T(t-\eta_2)\big],
$$
$$
\varsigma^T(t)=\big[\psi^T(t),\ \psi^T(t-\sigma_1(t)-\sigma_2(t)),\ \psi^T(t-\sigma(t)),\ \psi^T(t-\sigma_1(t)),\ \psi^T(t-\sigma_2(t)),\ \psi^T(t-\sigma),\ \psi^T(t-\sigma_1),\ \psi^T(t-\sigma_2)\big].
$$
Hence $\dot V(t) \le 0$, and by Lyapunov theory system (19) is globally asymptotically stable. This completes the proof.  □
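In practice, the LMI conditions behind these theorems are checked numerically: a symmetric matrix is negative definite exactly when its largest eigenvalue is negative, and then every quadratic form built from it is negative on nonzero vectors, which is what makes the Lyapunov derivative strictly negative. The sketch below uses a randomly generated stand-in matrix `Phi` (not one of the paper's LMI matrices) and an eigenvalue test in place of a dedicated LMI solver:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
# Hypothetical stand-in for an LMI matrix; constructed negative definite
M = rng.standard_normal((n, n))
Phi = -(M @ M.T) - 0.5 * np.eye(n)

# Negative definiteness: largest eigenvalue of the symmetric matrix is < 0
assert np.linalg.eigvalsh(Phi).max() < 0

# Consequently every quadratic form x^T Phi x is negative for x != 0
xi = rng.standard_normal(n)
assert xi @ Phi @ xi < 0
```

For the block matrices that actually arise from such theorems, the same check is typically delegated to a semidefinite-programming solver, with the weight matrices as decision variables; the eigenvalue test above is the feasibility check applied to a returned candidate.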

References

1. Kilbas, A.A.; Srivastava, H.M.; Trujillo, J.J. Theory and Applications of Fractional Differential Equations; Elsevier: Amsterdam, The Netherlands, 2006.
2. Podlubny, I. Fractional Differential Equations; Academic Press: New York, NY, USA, 1998.
3. Heaviside, O. Electromagnetic Theory; Chelsea: New York, NY, USA, 1971.
4. Ma, W.; Li, C.; Wu, Y.; Wu, Y. Adaptive synchronization of fractional neural networks with unknown parameters and time delays. Entropy 2014, 16, 6286–6299.
5. Koeller, R.C. Application of fractional calculus to the theory of viscoelasticity. J. Appl. Mech. 1984, 51, 294–298.
6. Bagley, R.L.; Calico, R.A. Fractional order state equations for the control of viscoelastically damped structures. J. Guid. Control Dyn. 1991, 14, 304–311.
7. Hartley, T.T.; Lorenzo, C.F. Dynamics and control of initialized fractional-order systems. Nonlinear Dyn. 2002, 29, 201–233.
8. Ahmeda, E.; Elgazzar, A.S. On fractional order differential equations model for nonlocal epidemics. Phys. A Stat. Mech. Its Appl. 2007, 379, 607–614.
9. Benchohra, M.; Henderson, J.; Ntouyas, S.K.; Ouahab, A. Existence results for fractional functional differential inclusions with infinite delay and applications to control theory. Fract. Calc. Appl. Anal. 2008, 11, 35–56.
10. Zhang, Z.; Cao, J.; Zhou, D. Novel LMI-based condition on global asymptotic stability for a class of Cohen-Grossberg BAM networks with extended activation functions. IEEE Trans. Neural Netw. Learn. Syst. 2014, 25, 1161–1172.
11. Ji, H.; Zhang, H.; Senping, T. Reachable set estimation for inertial Markov jump BAM neural network with partially unknown transition rates and bounded disturbances. J. Franklin Inst. 2017, 354, 7158–7182.
12. Wang, F.; Yang, Y.; Xu, X.; Li, L. Global asymptotic stability of impulsive fractional-order BAM neural networks with time delay. Neural Comput. Appl. 2012, 28, 345–352.
13. Kosko, B. Bidirectional associative memories. IEEE Trans. Syst. Man Cybern. 1988, 18, 49–60.
14. Ali, M.; Hymavathi, M.; Rajchakit, G.; Sumit, S.; Palanisamy, L.; Porpattama, H. Synchronization of fractional order fuzzy BAM neural networks with time varying delays and reaction diffusion terms. IEEE Access 2020, 8, 186551–186571.
15. Yang, W.; Yu, W.; Cao, J.; Alsaadi, F.E.; Hayat, T. Global exponential stability and lag synchronization for delayed memristive fuzzy Cohen-Grossberg BAM neural networks with impulses. Neural Netw. 2018, 98, 122–153.
16. Kosko, B. Adaptive bidirectional associative memories. Appl. Opt. 1987, 26, 4947–4960.
17. Huang, M.; Xu, C.J. Existence and exponential stability of anti-periodic solutions in bidirectional associative memory neural networks with distributed delays. J. Comput. Theor. Nanosci. 2016, 13, 964–970.
18. Cai, Z.; Huang, L. Functional differential inclusions and dynamic behaviors for memristor-based BAM neural networks with time-varying delays. Commun. Nonlinear Sci. Numer. Simul. 2014, 19, 1279–1300.
19. Wang, W.; Yu, M.; Luo, X.; Liu, L.; Yuan, M.; Zhao, W. Synchronization of memristive BAM neural networks with leakage delay and additive time-varying delay components via sampled-data control. Chaos Solitons Fractals 2017, 104, 84–97.
20. Rajivganthi, C.; Rihan, F.; Lakshmanan, S.; Rakkiyappan, R.; Muthukumar, P. Synchronization of memristor-based delayed BAM neural networks with fractional-order derivatives. Complexity 2016, 21, 412–426.
21. Wang, P.; Zhong, S.; Lei, H. Delay-dependent stability analysis for neural networks with two additive time-varying delay components. In Proceedings of the 2012 International Conference on Computer Science and Service System, Nanjing, China, 11–13 August 2012; pp. 2181–2184.
22. Cao, J. Global asymptotic stability of delayed bi-directional associative memory neural networks. Appl. Math. Comput. 2003, 142, 333–339.
23. Shao, H.; Han, Q. New delay-dependent stability criteria for neural networks with two additive time-varying delay components. IEEE Trans. Neural Netw. 2011, 22, 812–818.
24. Lam, J.; Gao, H.; Wang, C. Stability analysis for continuous systems with two additive time-varying delay components. Syst. Control Lett. 2007, 56, 16–24.
25. Gao, H.; Chen, T.; Lam, J. A new delay system approach to network-based control. Automatica 2008, 44, 39–52.
26. Lou, X.Y.; Cui, B.T. Stochastic exponential stability for Markovian jumping BAM neural networks with time-varying delays. IEEE Trans. Syst. Man Cybern. B 2007, 37, 713–719.
27. Cao, Y.; Wang, S.; Guo, Z.; Huang, T.; Wen, S. Synchronization of memristive neural networks with leakage delay and parameters mismatch via event-triggered control. Neural Netw. 2019, 119, 178–189.
28. Guo, Z.; Wang, J.; Yan, Z. Global exponential synchronization of two memristor-based recurrent neural networks with time delays via static or dynamic coupling. IEEE Trans. Syst. Man Cybern. 2015, 45, 235–249.
29. Xiong, L.L.; Zhang, H.Y.; Li, Y.K.; Liu, Z.X. Improved stability and H∞ performance for neutral systems with uncertain Markovian jump. Nonlinear Anal. Hybrid Syst. 2016, 19, 13–25.
30. Cao, J.; Chen, T. Globally exponentially robust stability and periodicity of delayed neural networks. Chaos Solitons Fractals 2004, 22, 957–963.
31. Ensari, T.; Arik, S. New results for robust stability of dynamical neural networks with discrete time delays. Expert Syst. Appl. 2010, 37, 5925–5930.
32. Li, X.; She, K.; Zhong, S.; Shi, K.; Kang, W.; Cheng, J.; Yu, Y. Extended robust global exponential stability for uncertain switched memristor-based neural networks with time-varying delays. Appl. Math. Comput. 2018, 325, 271–290.
33. Lakshmanan, S.; Balasubramaniam, P. New results of robust stability analysis for neutral-type neural networks with time-varying delays and Markovian jumping parameters. Can. J. Phys. 2011, 89, 827–840.
34. Yu, H.; Wu, H. Global robust exponential stability for Hopfield neural networks with non-Lipschitz activation functions. J. Math. Sci. 2012, 187, 511–523.
35. Balasubramaniam, P.; Vembarasan, V.; Rakkiyappan, R. Delay-dependent robust asymptotic state estimation of Takagi-Sugeno fuzzy Hopfield neural networks with mixed interval time-varying delays. Expert Syst. Appl. 2012, 39, 472–481.
36. Balasubramaniam, P.; Vembarasan, V.; Rakkiyappan, R. Global robust asymptotic stability analysis of uncertain switched Hopfield neural networks with time delay in the leakage term. Neural Comput. Appl. 2012, 21, 1593–1616.
37. Li, T.; Guo, L.; Lin, C. A new criterion of delay-dependent stability for uncertain time-delay systems. IET Control Theory Appl. 2007, 1, 611–616.
38. Zhou, S.; Li, T.; Shao, H.Y.; Zheng, W.X. Output feedback H∞ control for uncertain discrete-time hyperbolic fuzzy systems. Eng. Appl. Artif. Intell. 2006, 19, 487–499.
39. Sabatier, J.; Agrawal, O.P.; Machado, J.A.T. Advances in Fractional Calculus; Springer: Dordrecht, The Netherlands, 2007.
40. Syed Ali, M.; Hymavathi, M.; Priya, B.; Asma Kauser, S.; Kumar Thakur, G. Stability analysis of stochastic fractional-order competitive neural networks with leakage delay. AIMS Math. 2021, 6, 3205–3241.
41. Syed Ali, M.; Hymavathi, M.; Sibel, S.; Vineet, S.; Sabri, A. Global asymptotic synchronization of impulsive fractional-order complex-valued memristor-based neural networks with time varying delays. Commun. Nonlinear Sci. Numer. Simul. 2019, 78, 104869.
42. Yakubovich, V.A. The S-procedure in nonlinear control theory. Vestnik Leningrad Univ. Math. 1977, 4, 73–93.
43. Syed Ali, M.; Hymavathi, M. Synchronization of fractional order neutral type fuzzy cellular neural networks with discrete and distributed delays via state feedback control. Neural Process. Lett. 2021, 53, 925–957.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Han, X.; Hymavathi, M.; Sanober, S.; Dhupia, B.; Syed Ali, M. Robust Stability of Fractional Order Memristive BAM Neural Networks with Mixed and Additive Time Varying Delays. Fractal Fract. 2022, 6, 62. https://doi.org/10.3390/fractalfract6020062

