Design and Practical Stability of a New Class of Impulsive Fractional-Like Neural Networks

In this paper, a new class of impulsive neural networks with fractional-like derivatives is defined, and the practical stability properties of the solutions are investigated. The stability analysis exploits a new type of Lyapunov-like functions and their derivatives. Furthermore, the obtained results are applied to a bidirectional associative memory (BAM) neural network model with fractional-like derivatives. Some new results for the introduced neural network models with uncertain values of the parameters are also obtained.


Generalized FLDs and Integrals
In this Section, we will state some main definitions and lemmas following [51][52][53]. Let R_+ = [0, ∞), let R^n be the n-dimensional Euclidean space, and let Ω ⊂ R^n be a bounded domain containing the origin.
For a given t̃ ∈ R_+ and 0 < q ≤ 1, we will consider a generalized qth-order fractional-like derivative D^q_{t̃} x(t) of a function x : [t̃, ∞) → R^n, defined as [52]

D^q_{t̃} x(t) = lim_{θ→0} [x(t + θ(t − t̃)^{1−q}) − x(t)] / θ,  t > t̃.
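Numerically, the limit above can be approximated by a small-θ difference quotient. The following is a minimal sketch (the function name, test function, and test values are our own illustrative choices, not from the paper); it uses the fact that, for a differentiable x, the fractional-like derivative reduces to x'(t)(t − t̃)^{1−q}:

```python
def fld(x, t, q, t_bar=0.0, theta=1e-6):
    # Difference-quotient approximation of the fractional-like derivative
    # D^q_{t_bar} x(t) = lim_{theta -> 0} [x(t + theta*(t - t_bar)**(1-q)) - x(t)] / theta
    return (x(t + theta * (t - t_bar) ** (1.0 - q)) - x(t)) / theta

# For a differentiable x, D^q_{t_bar} x(t) = x'(t) * (t - t_bar)**(1 - q).
# Check with x(t) = t**2 and t_bar = 0: D^q x(t) = 2 * t**(2 - q).
q, t = 0.5, 2.0
approx = fld(lambda s: s * s, t, q)
exact = 2.0 * t ** (2.0 - q)
```

For q = 1 the quotient reduces to the ordinary difference quotient, which matches the requirement 0 < q ≤ 1 with the classical derivative as the limiting case.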

Impulsive Fractional-Like Neural Networks: Main Notions and Definitions
In this paper, we consider the following system of impulsive Hopfield fractional-like neural networks:

C_i(t) D^q_{t_k} x_i(t) = −x_i(t)/R_i(t) + Σ_{j=1}^{n} α_ij f_j(x_j(t)) + γ_i(t),  t ≠ t_k,
x_i(t_k^+) = x_i(t_k^−) + I_ik(x_i(t_k)),    (1)

where 0 < q ≤ 1, i = 1, 2, . . . , n, n ≥ 2, k = 1, 2, . . . . In the above impulsive fractional-like neural network model, x_i(t) represents the state of the ith node at time t, n corresponds to the number of neurons in the neural network, the positive functions C_i, R_i are, respectively, the capacitance and the resistance of node i at time t, α_ij are the connection weights, f_j denotes the activation function that determines the output f_j(x_j(t)) of the jth unit at time t, γ_i denotes the external bias of node i at time t, and t_k, k = 1, 2, . . . are the moments of impulsive perturbations, which satisfy t_0 < t_1 < t_2 < · · · < t_k < t_{k+1} < . . . , lim_{k→∞} t_k = ∞. The numbers x_i(t_k^−) and x_i(t_k^+) are, respectively, the states of the ith node before and after an impulsive perturbation at the moment t_k, and the functions I_ik represent the magnitudes of the impulsive changes of the states x_i(t) at the impulsive moments t_k.
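A trajectory of such a model can be approximated by integrating between impulse moments and applying the jumps I_ik at each t_k. Below is a minimal sketch; all parameter values, activations, and impulse functions are hypothetical illustrations, not taken from the paper. It uses the fact that D^q_{t_k} x = F(t, x) between impulses is equivalent to x'(t) = (t − t_k)^{q−1} F(t, x):

```python
import math

def simulate(q=0.5, h=1e-3, T=3.0):
    # Hypothetical 2-neuron data (NOT from the paper): C_i = R_i = 1,
    # tanh activations, constant biases, impulses that halve each state.
    C = [1.0, 1.0]
    R = [1.0, 1.0]
    alpha = [[0.5, -0.2], [0.1, 0.4]]       # connection weights alpha_ij
    gamma = [0.1, -0.1]                     # external biases gamma_i
    impulse_times = [1.0, 2.0]              # impulse moments t_k
    x = [0.8, -0.6]                         # initial state x_0
    t, t_last = 0.0, 0.0                    # t_last: latest impulse moment (or t_0)
    traj = [(t, list(x))]
    while t < T:
        if impulse_times and abs(t - impulse_times[0]) < h / 2:
            # jump x_i(t_k^+) = x_i(t_k) + I_ik(x_i(t_k)), here with I_ik(x) = -x/2
            x = [xi - 0.5 * xi for xi in x]
            t_last = impulse_times.pop(0)
        # between impulses, D^q_{t_k} x = F(t, x) means x'(t) = (t - t_k)^(q-1) F(t, x)
        s = max(t - t_last, h) ** (q - 1.0)  # cap the singularity at t = t_last
        F = [(-x[i] / R[i]
              + sum(alpha[i][j] * math.tanh(x[j]) for j in range(2))
              + gamma[i]) / C[i] for i in range(2)]
        x = [x[i] + h * s * F[i] for i in range(2)]
        t += h
        traj.append((t, list(x)))
    return traj

traj = simulate()
```

The explicit Euler step is the simplest choice here; near each impulse moment the factor (t − t_k)^{q−1} blows up, so the sketch caps it at h^{q−1}, which keeps the scheme stable for small h.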
Let x_0 = col(x_01, x_02, . . . , x_0n) ∈ Ω. We will denote by x(t) = x(t; t_0, x_0) the solution of the fractional-like impulsive neural network system (1) that satisfies the initial condition

x(t_0^+; t_0, x_0) = x_0.    (2)

Following the theory of impulsive fractional-order neural network systems [27,30] and the new theory of impulsive fractional-like systems [51][52][53], the solutions x(t) of the neural network models (1) are piecewise continuous functions that have points of discontinuity of the first kind at t_k and are left continuous at these moments. For such functions, the following identities are satisfied:

x_i(t_k^−) = x_i(t_k),  x_i(t_k^+) = x_i(t_k) + I_ik(x_i(t_k)),  i = 1, . . . , n, k = 1, 2, . . . .

All such piecewise continuous functions form the space PC^q(R_+, R^n). Let h : [t_0, ∞) × Ω → R be a continuous function. The following sets, defined by the function h, will be called h-manifolds:

M_t = {x ∈ Ω : h(t, x) = 0},  M_t(ε) = {x ∈ Ω : |h(t, x)| < ε},  ε > 0.

To guarantee that the solution x(t; t_0, x_0) of the initial value problem (IVP) (1)-(2) exists on [t_0, ∞), and for the subsequent investigations, we will need the following assumptions.
A1. The function h is continuous on [t 0 , ∞) × Ω and the sets M t , M t (ε) are (n − 1)-dimensional manifolds in R n .
In this paper, we will use the following definition for practical exponential stability of the neural network system (1) with respect to manifolds defined by the function h, given in [51].

Remark 3.
The problems of exponential stability of integer-order neural networks have been investigated by numerous authors [3,4,8,[11][12][13][14]. Indeed, the concept of exponential stability is one of the most important qualitative concepts for such models because it guarantees a fast convergence rate [13]. The notion of exponential stability was generalized in [66] to that of Mittag-Leffler stability for fractional-order systems. For Mittag-Leffler stability results on fractional neural networks see, for example, [27,28,30] and the bibliography therein. With the present research, we complement the existing results and present results on (λ, A)-practical exponential stability for impulsive fractional-like neural network systems.
What follows is the definition of the class V^q_{t_k} of Lyapunov-like functions defined in [52] for any t_k ∈ R_+, k = 0, 1, 2, . . . .
1. V is defined on G, takes non-negative values, and V(t, 0) = 0 for t ≥ t_k;
2. V is continuous on G, q-differentiable in t, and locally Lipschitz continuous with respect to its second argument on each of the sets G_k;
3. for each k = 0, 1, 2, . . . and x ∈ Ω, the finite limits V(t_k^−, x) and V(t_k^+, x) exist.

For a function V ∈ V^q_{t_k}, t > t_k, we define its upper right fractional-like derivative as [52]

⁺D^q_{t_k} V(t, x) = lim sup_{θ→0⁺} [V(t + θ(t − t_k)^{1−q}, x + θ(t − t_k)^{1−q} f(t, x)) − V(t, x)] / θ,

where f(t, x) denotes the right-hand side of (1). Then [46,52], for a function V(t, x) that is q-differentiable in t and continuously differentiable in x, the fractional-like derivative of V(t, x) with respect to the solution x(t) of (1) satisfies

⁺D^q_{t_k} V(t, x(t)) = D^q_{t_k} V(t, x)|_{x = x(t)} + (∂V/∂x) D^q_{t_k} x(t),

where ∂V/∂x is the partial derivative of the function V with respect to its second argument.
From (3) and (4), estimate (5) follows. We will also need the following comparison result from [52].

In what follows, for a bounded continuous function f defined on R_+, we set f^sup = sup_{t ∈ R_+} |f(t)|.

Theorem 1. Assume that 0 < λ < A are given, and:
1. Assumptions A1-A3 hold.
Then the neural network system (1) is (λ, A)-practically exponentially stable with respect to the function h.
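The exponential decay rate in such estimates reflects the fact that, for fractional-like derivatives, the scalar test equation D^q_{t_0} x(t) = −κ x(t) with t_0 = 0 is solved by x(t) = x_0 exp(−κ t^q / q) rather than by an ordinary exponential. A quick numerical sanity check of this (a sketch; κ, x_0, and the evaluation point are arbitrary choices of ours):

```python
import math

def fld(x, t, q, t0=0.0, theta=1e-6):
    # difference-quotient approximation of D^q_{t0} x(t)
    return (x(t + theta * (t - t0) ** (1.0 - q)) - x(t)) / theta

# Candidate solution of D^q_{t0} x = -kappa * x with t0 = 0:
# x(t) = x0 * exp(-kappa * t**q / q)
q, kappa, x0 = 0.7, 1.3, 2.0
sol = lambda t: x0 * math.exp(-kappa * t ** q / q)

t = 1.5
lhs = fld(sol, t, q)       # D^q_{t0} x(t), computed numerically
rhs = -kappa * sol(t)      # right-hand side of the test equation
```

The check works because D^q_{t_0} x(t) = x'(t) t^{1−q} for differentiable x, and differentiating the candidate solution gives x'(t) = −κ t^{q−1} x(t), so the factors t^{1−q} and t^{q−1} cancel.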

Remark 4.
If the assumptions of Theorem 1 hold globally on R^n, i.e., if Ω ≡ R^n, then the system (1) is (λ, A)-globally practically exponentially stable with respect to the function h. Note that, in this case, the condition that x ∈ Ω implies x + I_k(x) ∈ Ω for k = 1, 2, . . . holds trivially.
Remark 5. Theorem 1 offers sufficient conditions for practical exponential stability (global practical exponential stability) with respect to a function h for the designed fractional-like impulsive neural network model. Exponential stability results for single solutions of the model (1) (equilibrium, zero solution, periodic solution) can be obtained as corollaries for particular choices of the function h. For example, in the case when h(t, x) = ||x − x * ||, where x * is a single solution of (1) and ||.|| is the norm in R n , our results extend and improve the existing exponential stability results for integer-order neural networks [3,4,8,[11][12][13][14].

Remark 6.
Our results also complement the existing Mittag-Leffler stability results for fractional neural networks [27,28,30]. The key features of FLDs lead to criteria that are less complicated from a computational point of view. Thus, the new results are more appropriate for the numerous applications of neural network models with derivatives of non-integer order.
The new exponential stability results proved in Theorem 1 can be useful for various classes of fractional-like neural network models. Next, we will apply the obtained criteria to study the practical stability properties of the following system of impulsive Hopfield fractional-like bidirectional associative memory (BAM) neural networks:

C^y_i D^q_{t_k} y_i(t) = −y_i(t)/R^y_i + Σ_{j=1}^{n_1} w_ji f^z_j(z_j(t)) + γ^y_i(t),  t ≠ t_k,
C^z_j D^q_{t_k} z_j(t) = −z_j(t)/R^z_j + Σ_{i=1}^{n_2} h_ij g^y_i(y_i(t)) + γ^z_j(t),  t ≠ t_k,    (8)
y_i(t_k^+) = y_i(t_k) + Q_ik y_i(t_k),  z_j(t_k^+) = z_j(t_k) + T_jk z_j(t_k),

where t_0 ∈ R_+, t_0 < t_1 < t_2 < . . . , j = 1, 2, . . . , n_1, i = 1, 2, . . . , n_2, n = n_1 + n_2, y_i(t) and z_j(t) correspond to the states of the ith unit and the jth unit, respectively, at time t, C^y_i, R^y_i, C^z_j, R^z_j are positive constants, the real constants w_ji, h_ij are the connection weights, f^z_j, g^y_i ∈ C^q[R, R] are the activation functions, γ^y_i, γ^z_j ∈ C^q[R_+, R] denote the external inputs at time t, and the constants Q_ik, T_jk determine the abrupt changes of the states at the impulsive moments t_k.
Note that different types of BAM neural networks of integer order have been intensively investigated due to the great opportunities for their application in many fields, such as pattern recognition and automatic control [11,12]. Results on fractional BAM neural network models with Caputo fractional derivatives have also been published in the recent literature. See, for example, [27] and the references therein. In this Section, we will extend the existing results to the fractional-like case.
and κ* is such that:
4. For the function h(t, y, z) we have:
where Λ(H) ≥ 1 exists for any 0 < H ≤ ∞.
Then (8) is (λ, A)-globally practically exponentially stable with respect to the function h.
Proof. The proof of Theorem 2 follows the steps of the proof of Theorem 1. In this case, we can use the Lyapunov function V(y, z) = Σ_{i=1}^{n_2} |y_i| + Σ_{j=1}^{n_1} |z_j|. Then, inequalities of the form (5) follow from condition A5 and, instead of (7), condition 1 of Theorem 2 yields the corresponding differential estimate. Condition 2 of Theorem 2 implies the existence of a positive number κ* such that

⁺D^q_{t_k} V(y(t), z(t)) ≤ −κ* V(y(t), z(t)) + G(t).
The proof is completed by applying conditions 3 and 4 of Theorem 2.

Example 2. Consider the following impulsive BAM fractional-like Hopfield neural network model
and t k → ∞ as k → ∞.
For the system (12), all conditions of Theorem 2 are satisfied. Indeed, we have L^z_j = M^y_i = 1, i, j = 1, 2, for k = 1, 2, . . . , and 0 < κ* ≤ 0.2. Hence, the fractional-like impulsive BAM neural network system (10) is (λ, A)-globally practically exponentially stable with respect to the function h(y_1, y_2, z_1, z_2). The globally exponentially stable behavior is shown in Figure 2 for λ = 8, A = 11: (a) behavior of the state variable y_1(t); (b) behavior of the state variable y_2(t); (c) behavior of the state variable z_1(t); (d) behavior of the state variable z_2(t).
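Qualitative behavior of this kind can be reproduced with an explicit scheme that integrates the BAM dynamics between impulse moments and halves the states at each t_k. The values below are hypothetical placeholders, not the parameters of Example 2; the quantity tracked is the sum of absolute values of the states:

```python
import math

def simulate_bam(q=0.8, h=1e-3, T=4.0):
    # Hypothetical BAM data with n1 = n2 = 2 (placeholders, NOT the values
    # of Example 2): tanh activations, zero external inputs, and impulses
    # Q_ik = T_jk = -1/2 that halve every state.
    w = [[0.3, -0.1], [0.2, 0.25]]          # connection weights w_ji
    hw = [[0.2, 0.15], [-0.1, 0.3]]         # connection weights h_ij
    y = [1.5, -1.2]
    z = [0.9, -0.7]
    impulse_times = [1.0, 2.0, 3.0]
    t, t_last = 0.0, 0.0
    v = []                                  # values of |y1| + |y2| + |z1| + |z2|
    while t < T:
        if impulse_times and abs(t - impulse_times[0]) < h / 2:
            y = [0.5 * u for u in y]
            z = [0.5 * u for u in z]
            t_last = impulse_times.pop(0)
        # between impulses, D^q_{t_k} (y, z) = F means (y, z)' = (t - t_k)^(q-1) F
        s = max(t - t_last, h) ** (q - 1.0)
        Fy = [-y[i] + sum(w[j][i] * math.tanh(z[j]) for j in range(2)) for i in range(2)]
        Fz = [-z[j] + sum(hw[i][j] * math.tanh(y[i]) for i in range(2)) for j in range(2)]
        y = [y[i] + h * s * Fy[i] for i in range(2)]
        z = [z[j] + h * s * Fz[j] for j in range(2)]
        t += h
        v.append(abs(y[0]) + abs(y[1]) + abs(z[0]) + abs(z[1]))
    return v

v = simulate_bam()
```

With contractive weights and damping impulses, the tracked sum decreases toward zero, mirroring the globally practically exponentially stable behavior illustrated in Figure 2.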
Then we have that, if all uncertain terms are bounded and all conditions of Theorem 3 are satisfied, the system (9) is (λ, A)-globally practically robustly exponentially stable with respect to the function h(x_1, x_2) = |x_1| + |x_2|.
Note that, if any of the uncertain terms is unbounded, Theorem 3 cannot guarantee the robust practical stability of the fractional-like model (9). For example, for P_2k = 2, k = 1, 2, . . . , the unstable behavior of the model (14) is shown in Figure 3 for λ = 5, A = 9.

Conclusions
In this paper, a new class of impulsive neural network systems with FLDs has been proposed. A practical stability analysis is performed, and efficient sufficient conditions are established. With this research, we extend the results on impulsive Hopfield-type neural network models to the fractional-like case. In addition, the obtained results are applied to neural networks with uncertain values of the parameters. Since the use of FLDs overcomes some of the difficulties in evaluating fractional derivatives, the obtained results are more appropriate for applications.