Adaptive Synchronization for a Class of Uncertain Fractional-Order Neural Networks

In this paper, synchronization for a class of uncertain fractional-order neural networks subject to external disturbances and disturbed system parameters is studied. Based on the fractional-order extension of the Lyapunov stability criterion, an adaptive synchronization controller is designed, and a fractional-order adaptation law is proposed to update the controller parameters online. The proposed controller guarantees that the synchronization errors between two uncertain fractional-order neural networks converge to zero asymptotically. With the aid of several proposed lemmas, quadratic Lyapunov functions are employed in the stability analysis. Finally, numerical simulations are presented to confirm the effectiveness of the proposed method.


Introduction
Fractional calculus, as a branch of mathematics, mainly handles the generalization of differentiation and integration to arbitrary orders. Nowadays, fractional-order systems, which are described by fractional-order differential equations, play a momentous role in various control applications [1][2][3][4]. For instance, fractional-order systems have been utilized in controller design, in the mathematical modeling of real-life phenomena, and in the identification of physical systems [5][6][7][8][9][10]. One of the major advantages of fractional-order derivatives is that they furnish a satisfying framework for the description of memory and hereditary properties of certain processes and materials. Consequently, many researchers incorporate fractional calculus into neural networks to construct fractional-order neural networks, which are more appropriate for describing the dynamical behavior of neurons, such as "memory". In [11], Arena et al. first gave a cellular neural network with fractional-order cells. In [12], Petráš introduced a fractional-order 3-cell network which exhibits limit cycles and stable orbits for different parameter values. Besides, fractional-order neural networks may play an important role in the parameter estimation domain [13,14]. Accordingly, the incorporation of memory terms (a fractional-order derivative or integral operator) into neural networks is an important improvement, and it is of great significance to study fractional-order neural networks [3,15].
The synchronization problem has attracted more and more researchers' attention due to its potential applications in secure communication [3,6,7,15]. Many synchronization results are available for integer-order neural networks (see [16,17] and the references therein). Since bifurcations and chaos in fractional-order neural networks first appeared in [18,19], some significant results on fractional-order neural networks have been reported. In [20], a fractional-order Hopfield neural model is discussed, and its stability is studied via energy-like functions. Chaos in fractional-order cellular neural networks is investigated in [21]. By using the Mittag-Leffler function, the M-matrix and linear feedback control, the synchronization problem for a class of fractional-order chaotic neural networks is studied in [3]. For more recent results concerning chaotic synchronization in fractional-order neural networks, one can refer to [22][23][24] and the references therein.
Meanwhile, actual systems are often subject to external disturbances and system uncertainties. These unknown terms, which have to be handled when analyzing and controlling the system, can arise in various forms and are hard to avoid. To the authors' best knowledge, only very few works deal with the robust synchronization of uncertain fractional-order neural networks.
Based on the above discussion, it is still of considerable interest to find direct, systematic approaches for designing robust synchronization controllers for uncertain fractional-order neural networks. Thanks to the work of Li et al. [4], the Lyapunov direct method (also called the Lyapunov second method) has been extended to fractional-order nonlinear systems. In this paper, a robust controller is designed to solve the synchronization problem of fractional-order neural networks with both system uncertainties and unknown Lipschitz constants. The fractional-order Lyapunov approach is used to analyze the stability of the closed-loop system. Three main contributions are worth emphasizing: (1) an adaptive synchronization controller is designed for fractional-order neural networks; (2) quadratic Lyapunov functions are used in the stability analysis of fractional-order neural networks; (3) a fractional-order adaptation law is proposed to update the controller parameters.
The rest of this paper is organized as follows. Some necessary definitions and lemmas are given in Section 2. The fractional-order network model, the design of the synchronization controller, and the stability analysis are presented in Section 3. Section 4 provides a simulation example. Finally, this paper is concluded in Section 5.

Preliminaries
The fractional-order integro-differential operator is the extension of the integer-order one. The definitions commonly used in the literature are the Grünwald-Letnikov, Riemann-Liouville, and Caputo definitions. The main reason Caputo's derivative was introduced for engineering applications is that its Laplace transform requires only integer-order derivatives for the initial conditions. On the contrary, the Laplace transform of the classical Riemann-Liouville definition involves fractional-order initial values that are difficult to interpret physically. We will use Caputo's derivative in this paper, with the lower limit of the fractional operators set to 0. The fractional-order integral with order α can be expressed as:

$$ {}_0I_t^{\alpha} f(t) = \frac{1}{\Gamma(\alpha)} \int_0^t (t-\tau)^{\alpha-1} f(\tau)\,d\tau, \qquad (1) $$

where Γ(•) is Euler's Gamma function. The Caputo fractional derivative is defined as follows:

$$ {}_0^C D_t^{\alpha} f(t) = \frac{1}{\Gamma(n-\alpha)} \int_0^t (t-\tau)^{n-\alpha-1} f^{(n)}(\tau)\,d\tau, \qquad (2) $$

where α is the fractional order, and n is an integer satisfying n − 1 ≤ α < n. For simplicity, we will invariably assume that 0 < α < 1 in the rest of this paper. Consequently, (2) can be written as:

$$ {}_0^C D_t^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^t (t-\tau)^{-\alpha} \dot{f}(\tau)\,d\tau. \qquad (3) $$

In controlling nonlinear systems, the Lyapunov second method gives a way to analyze the stability of the system without explicitly solving the differential equations. Although the Lyapunov stability theory for integer-order systems was proposed in 1892 and has been studied and refined by many researchers, the Lyapunov stability theory for fractional-order systems was developed only recently [4]. One of the main contributions of [4] is the following lemma.

Lemma 1. [4] Let x = 0 be an equilibrium of the following fractional-order nonlinear system:

$$ {}_0^C D_t^{\beta} x(t) = f(x(t)), \qquad (4) $$

where x(t) ∈ R^n is the state vector, and f(x) ∈ R^n is a Lipschitz continuous nonlinear function. If there exists a Lyapunov function V(t, x(t)) such that

$$ g_1 \|x\|^{p_1} \le V(t, x(t)) \le g_2 \|x\|^{p_1 p_2}, \qquad {}_0^C D_t^{\beta} V(t, x(t)) \le -g_3 \|x\|^{p_1 p_2}, $$

where 0 < β < 1, t ≥ 0, and g_1, g_2, g_3, p_1, p_2 are arbitrary positive constants, then the equilibrium point of system (4) is Mittag-Leffler stable.
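As a numerical aside (not part of the original analysis), the Caputo derivative above can be approximated with Grünwald-Letnikov binomial weights; for 0 < α < 1 and f(0) = 0 the two definitions coincide. A minimal sketch, with the step size h and the test function chosen purely for illustration:

```python
import math

def caputo_gl(f, t, alpha, h=1e-3):
    # Grunwald-Letnikov approximation of the order-alpha derivative;
    # for 0 < alpha < 1 and f(0) = 0 it coincides with the Caputo one.
    n = int(round(t / h))
    w = 1.0                        # w_0 = 1
    acc = w * f(t)                 # j = 0 term of the GL sum
    for j in range(1, n + 1):
        w *= 1.0 - (alpha + 1.0) / j   # recursive binomial weights
        acc += w * f(t - j * h)
    return acc / h**alpha

# known closed form: D^alpha t = t^(1-alpha) / Gamma(2-alpha)
alpha = 0.5
approx = caputo_gl(lambda s: s, 1.0, alpha)
exact = 1.0 / math.gamma(2.0 - alpha)
```

The recursion for the weights avoids evaluating binomial coefficients explicitly; the scheme is first-order accurate in h.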
The following lemmas and properties will be used in this paper.
Lemma 3. [25,26] Let x(t) ∈ R^n be a continuous and derivable function. Then, for any t > 0, the following inequality holds:

$$ \frac{1}{2}\,{}_0^C D_t^{\alpha}\left(x^{T}(t)x(t)\right) \le x^{T}(t)\,{}_0^C D_t^{\alpha} x(t). $$

Definition 1.
[1] The Mittag-Leffler function with two parameters can be written as:

$$ E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(k\alpha+\beta)}, $$

where α, β > 0 and z ∈ C. The Laplace transform of the Mittag-Leffler function satisfies

$$ \mathcal{L}\left\{ t^{\beta-1} E_{\alpha,\beta}(-\lambda t^{\alpha}) \right\} = \frac{s^{\alpha-\beta}}{s^{\alpha}+\lambda}. \qquad (8) $$

In this paper, we employ the Caputo definition and solve the fractional-order differential equations numerically with an algorithm that generalizes the Adams-Bashforth-Moulton method.
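The details of the algorithm were lost from this copy of the text; the following is a hedged sketch of the standard fractional Adams-Bashforth-Moulton predictor-corrector (in the spirit of Diethelm's scheme) for a scalar equation C_0D_t^α y = f(t, y). The function names and the test problem are illustrative assumptions, not the paper's code:

```python
import math

def fde_abm(f, y0, alpha, T, N):
    # Fractional Adams-Bashforth-Moulton predictor-corrector for
    #   C_0 D_t^alpha y(t) = f(t, y(t)),  y(0) = y0,  0 < alpha <= 1.
    h = T / N
    y, fv = [y0], [f(0.0, y0)]
    ga1, ga2 = math.gamma(alpha + 1.0), math.gamma(alpha + 2.0)
    for n in range(N):
        tn1 = (n + 1) * h
        # predictor: product-rectangle quadrature of the memory integral
        pred = y0 + h**alpha / ga1 * sum(
            ((n + 1 - j)**alpha - (n - j)**alpha) * fv[j]
            for j in range(n + 1))
        # corrector: product-trapezoidal quadrature
        s = (n**(alpha + 1) - (n - alpha) * (n + 1)**alpha) * fv[0]
        for j in range(1, n + 1):
            s += ((n - j + 2)**(alpha + 1) + (n - j)**(alpha + 1)
                  - 2.0 * (n - j + 1)**(alpha + 1)) * fv[j]
        y_new = y0 + h**alpha / ga2 * (f(tn1, pred) + s)
        y.append(y_new)
        fv.append(f(tn1, y_new))
    return y

# sanity check: for alpha = 1 the scheme reduces to a trapezoidal-type
# rule, so D y = -y, y(0) = 1 should give y(1) close to exp(-1)
ys = fde_abm(lambda t, y: -y, 1.0, 1.0, 1.0, 100)
```

Note that, unlike integer-order solvers, every step sums over the whole history, reflecting the memory of the fractional operator.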

Problem Statement
The dynamics of a class of fractional-order cellular neural networks can be expressed by the following fractional-order differential equations:

$$ {}_0^C D_t^{\alpha} x_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} \bar{a}_{ij}(t) f_j(x_j(t)) + I_i, \quad i = 1, \cdots, n, \qquad (14) $$

where α is the fractional order, n is the number of units in the neural network, x_i(t) represents the state of the ith unit at time t, ā_ij(t), which is assumed to be disturbed, is the connection weight of the jth neuron on the ith neuron, f_j(•) is an unknown nonlinear function, c_i corresponds to the rate with which the ith neuron resets its potential to the resting state when disconnected from the network, and I_i is the external input. Letting x(t) = [x_1(t), ⋯, x_n(t)]^T, C = diag(c_1, ⋯, c_n), Ā(t) = [ā_ij(t)] ∈ R^{n×n}, f(x(t)) = [f_1(x_1(t)), ⋯, f_n(x_n(t))]^T and I = [I_1, ⋯, I_n]^T, the fractional-order neural network (14) can be rewritten in the following compact form:

$$ {}_0^C D_t^{\alpha} x(t) = -C x(t) + \bar{A}(t) f(x(t)) + I. \qquad (15) $$

The following assumptions are needed.
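As an illustrative sketch, the right-hand side of the compact form D^α x = −Cx + Ā(t)f(x) + I can be coded as below. The specific weights and the tanh activation are hypothetical placeholders (the paper leaves f unknown and states the true parameters only for the simulation):

```python
import numpy as np

def fcnn_rhs(x, C, A, I_ext, act=np.tanh):
    # right-hand side of the compact network form:
    #   D^alpha x = -C x + A f(x) + I
    # act is a placeholder activation (the paper's f is unknown)
    return -C @ x + A @ act(x) + I_ext

# hypothetical 3-neuron example; weights are illustrative only
C = np.eye(3)
A = np.array([[ 2.0, -1.2, 0.0],
              [ 1.8,  1.7, 1.1],
              [-4.7,  0.0, 1.0]])
I_ext = np.zeros(3)
x = np.array([0.1, -0.2, 0.3])
dx = fcnn_rhs(x, C, A, I_ext)
```

This right-hand side can be fed to any fractional-order solver, such as a predictor-corrector scheme, to simulate the drive network.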
Assumption 1. The nonlinear function f(x(t)) is fully unknown, and there exists an unknown positive constant L such that ‖f(x_1) − f(x_2)‖ ≤ L‖x_1 − x_2‖ for all x_1, x_2 ∈ R^n.
Remark 1. Assumption 1 can be found in many works [3,13,[15][16][17][18][19][22][23][24]. However, different from most of the above works, in this paper the Lipschitz constant L is assumed to be unknown. Accordingly, Assumption 1 is realistic; in fact, the exact value of the Lipschitz constant of a Lipschitz continuous function is hard to determine.
Assumption 2. The disturbed parameter matrix Ā(t) has a bounded norm with an unknown upper bound; i.e., there exists an unknown positive constant κ such that ‖Ā(t)‖ ≤ κ.
Based on the drive-response concept, we take system (14) as the drive fractional-order neural network and consider a response network characterized as follows:

$$ {}_0^C D_t^{\alpha} y_i(t) = -c_i y_i(t) + \sum_{j=1}^{n} \bar{a}_{ij}(t) f_j(y_j(t)) + I_i + d_i(t) + u_i(t), \quad i = 1, \cdots, n, $$

or, equivalently:

$$ {}_0^C D_t^{\alpha} y(t) = -C y(t) + \bar{A}(t) f(y(t)) + I + d(t) + u(t), \qquad (18) $$

where d(t) = [d_1(t), ⋯, d_n(t)]^T is the external disturbance and u(t) = [u_1(t), ⋯, u_n(t)]^T is the control input, which will be given later.
Assumption 3. The external disturbance d_i(t) is a bounded continuous function; i.e., there exists an unknown positive constant ρ_i such that |d_i(t)| ≤ ρ_i. Remark 2. It should be noted that Assumption 3 is not restrictive, because we only assume the existence of the upper bound of the disturbance d_i(t); its exact value is not needed in the controller design.

Adaptive Synchronization Controller Design and Stability Analysis
Defining the synchronization error e(t) = [e_1(t), ⋯, e_n(t)]^T as e(t) = y(t) − x(t), the error dynamics between the response system (18) and the drive system (15) can be written as:

$$ {}_0^C D_t^{\alpha} e(t) = -C e(t) + \bar{A}(t)\left[f(y(t)) - f(x(t))\right] + d(t) + u(t). \qquad (21) $$

Multiplying both sides of (21) by e^T(t) and using Assumptions 1 and 2, we obtain

$$ e^{T}(t)\,{}_0^C D_t^{\alpha} e(t) \le -e^{T}(t) C e(t) + a\, e^{T}(t) e(t) + e^{T}(t)\left[d(t) + u(t)\right], \qquad (22) $$

where a = κL is an unknown constant. Then the controller u(t) can be designed as in (23), where â(t) and ρ̂_i(t) are the estimates of the unknown positive constants a and ρ_i, respectively. To proceed, let us first give the following lemmas.
Lemma 4. Let y(t) be a continuous and derivable function. If {}_0^C D_t^α y(t) ≤ 0 holds for all t ≥ 0, then y(t) is monotone decreasing (non-increasing).

Proof. It is easy to see that there exists a nonnegative function h(t) such that

$$ {}_0^C D_t^{\alpha} y(t) + h(t) = 0. \qquad (24) $$

Taking the Laplace transform of (24) gives

$$ Y(s) = \frac{y(0)}{s} - \frac{H(s)}{s^{\alpha}}, \qquad (25) $$

where Y(s) and H(s) are the Laplace transforms of y(t) and h(t), respectively.
Taking the inverse Laplace transform of (25) yields

$$ y(t) = y(0) - {}_0D_t^{-\alpha} h(t). \qquad (26) $$

From the definition of the fractional-order integral (1) and (26), we have

$$ y(t) = y(0) - \frac{1}{\Gamma(\alpha)} \int_0^t (t-\tau)^{\alpha-1} h(\tau)\, d\tau. \qquad (27) $$

Noting that h(t) ≥ 0 and (t − τ)^{α−1} ≥ 0, it follows from (27) that y(t) ≤ y(0) for all t > 0. Furthermore, for arbitrary t_1 < t_2, the same argument gives y(t_2) ≤ y(t_1), from which we know that the function y(t) is monotone decreasing.
Lemma 5. Let x(t) ∈ R be a continuous and derivable function. If there exists a positive constant k such that

$$ {}_0^C D_t^{\alpha} x^{2}(t) \le -2k\, x^{2}(t), \qquad (29) $$

then the following inequality holds:

$$ x^{2}(t) \le x^{2}(0)\, E_{\alpha}(-2k t^{\alpha}). \qquad (30) $$

Proof. Applying the fractional integral operator {}_0D_t^{−α} to both sides of (29), it follows from Lemma 4 that x²(t) ≤ x²(0) for all t > 0. Moreover, there exists a nonnegative function m(t) such that

$$ {}_0^C D_t^{\alpha} x^{2}(t) + m(t) = -2k\, x^{2}(t). \qquad (33) $$

Taking the Laplace transform (L{•}) of (33) gives

$$ X_2(s) = \frac{s^{\alpha-1} x^{2}(0) - M(s)}{s^{\alpha} + 2k}, \qquad (34) $$

where X_2(s) and M(s) are the Laplace transforms of x²(t) and m(t), respectively. Using (8), the solution of (34) can be given as

$$ x^{2}(t) = x^{2}(0) E_{\alpha}(-2k t^{\alpha}) - \left[t^{\alpha-1} E_{\alpha,\alpha}(-2k t^{\alpha})\right] * m(t), \qquad (35) $$

where * represents the convolution operator. Noting that t^{α−1}, E_{α,α}(−2kt^α) and m(t) are all nonnegative, it follows from (35) that (30) holds. This ends the proof of Lemma 5.
Lemma 6. Let V_2(t) = ½ x^T(t)x(t) + ½ y^T(t)y(t), where x(t), y(t) ∈ R^n have continuous derivatives. If there exists a constant h_0 > 0 such that

$$ {}_0^C D_t^{\alpha} V_2(t) \le -h_0\, x^{T}(t) x(t), $$

then x(t) and y(t) are bounded for all t > 0, and ‖x(t)‖ converges to zero asymptotically, where ‖•‖ represents the Euclidean norm.
Proof. Lemma 6 is a straightforward corollary of Lemma 5.
Based on the above discussion, we are now ready to give the following results.
Theorem 1. Consider the drive neural network (15) and the response neural network (18), and suppose that Assumptions 1-3 are satisfied. Let the synchronization controller be designed as in (23), and let â(t) and ρ̂_i(t) be updated by the fractional-order adaptation laws (37) and (38), respectively, where σ and γ_i are positive design parameters. Then synchronization between the two neural networks (15) and (18) is achieved.
Proof. Consider the following Lyapunov function candidate:

$$ V(t) = \frac{1}{2} e^{T}(t)e(t) + \frac{1}{2\sigma}\tilde{a}^{2}(t) + \sum_{i=1}^{n}\frac{1}{2\gamma_i}\tilde{\rho}_i^{2}(t), \qquad (39) $$

where ã(t) = â(t) − a and ρ̃_i(t) = ρ̂_i(t) − ρ_i are the estimation errors of the unknown constants a and ρ_i, respectively. Noting that the Caputo derivative of a constant is 0, we have {}_0^C D_t^α ã(t) = {}_0^C D_t^α â(t) and {}_0^C D_t^α ρ̃_i(t) = {}_0^C D_t^α ρ̂_i(t). By applying Lemma 3 to (39), we have

$$ {}_0^C D_t^{\alpha} V(t) \le e^{T}(t)\,{}_0^C D_t^{\alpha} e(t) + \frac{1}{\sigma}\tilde{a}(t)\,{}_0^C D_t^{\alpha}\hat{a}(t) + \sum_{i=1}^{n}\frac{1}{\gamma_i}\tilde{\rho}_i(t)\,{}_0^C D_t^{\alpha}\hat{\rho}_i(t). \qquad (42) $$

Substituting (22), the control input (23) and the fractional-order parameter adaptation laws (37) and (38) into (42) yields {}_0^C D_t^α V(t) ≤ −h_0 e^T(t)e(t) (43), where h_0 is a positive constant. Consequently, from Lemma 6 and (43) we can conclude that all signals in the closed-loop system remain bounded and the synchronization error e(t) converges to zero asymptotically. This ends the proof of Theorem 1.
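Since the explicit expressions of the controller (23) and the adaptation laws (37) and (38) did not survive in this copy, the following sketch shows one plausible realisation consistent with the surrounding text: linear feedback plus adaptive terms with a sign function, and a short-memory one-step discretization of assumed fractional adaptation laws D^α â = σe^Te and D^α ρ̂_i = γ_i|e_i|. Every concrete formula here is an assumption for illustration, not the paper's published design:

```python
import numpy as np

def control(e, a_hat, rho_hat, k=2.0):
    # HYPOTHETICAL controller: linear feedback plus the two adaptive
    # terms; the paper's exact equation (23) is not reproduced here
    return -k * e - a_hat * e - rho_hat * np.sign(e)

def adapt_step(e, a_hat, rho_hat, sigma, gamma, h, alpha):
    # one-step, short-memory Euler-type update of the ASSUMED
    # fractional adaptation laws
    #   D^alpha a_hat = sigma * e'e,  D^alpha rho_hat_i = gamma_i * |e_i|
    # (a faithful fractional integrator would keep the full history)
    a_hat = a_hat + h**alpha * sigma * float(e @ e)
    rho_hat = rho_hat + h**alpha * gamma * np.abs(e)
    return a_hat, rho_hat

e = np.array([1.0, -1.0])
u = control(e, a_hat=0.5, rho_hat=np.array([0.2, 0.2]))
a1, r1 = adapt_step(e, 0.0, np.zeros(2), sigma=2.0,
                    gamma=np.array([3.0, 1.0]), h=0.01, alpha=0.81)
```

As in the paper's laws, the estimates are monotonically non-decreasing because their drive terms e^Te and |e_i| are nonnegative.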
Remark 3. Compared with integer-order calculus, fractional-order integration and differentiation provide a convenient technique for depicting the hereditary and memory properties of actual systems. For fractional-order systems, it is very hard to analyze stability by finding dominant roots or by using the other algebraic methods commonly applied to integer-order systems. Up to now, a direct check of the stability of fractional-order systems using polynomial criteria is still not possible, because the characteristic equation of a fractional-order system is not a polynomial but a pseudo-polynomial function of fractional powers of the complex variable s [7,27,28]. Hence, most control schemes that exist for integer-order neural networks cannot be extended to fractional-order neural networks directly. The work [29] considered α-synchronization for fractional-order neural networks, but, unfortunately, the given result is not correct [30]. In contrast, our work offers a framework for analyzing the stability of fractional-order neural networks. Remark 4. It should be pointed out that the proposed Lemma 5 may be very useful in the stability analysis of fractional-order nonlinear systems via the fractional-order Lyapunov second method.
Remark 5. To update â(t) and ρ̂_i(t), the fractional-order adaptation laws (37) and (38) are introduced in this paper. Compared with a classical integer-order adaptation law, the fractional-order adaptation law improves the parameter-adaptation performance by adding one degree of freedom. From (37) and (38) we can see that the design parameters σ and γ_i should be positive, and their values influence the adaptation speed of the two functions â(t) and ρ̂_i(t). Thus, in the simulation, the synchronization errors converge to the origin more rapidly if larger parameters are chosen. Remark 6. The synchronization problem for a class of fractional-order chaotic neural networks is investigated in [3] and [15]. It should be pointed out that our work is quite different from the results in [3] and [15]. First, in [3] and [15] a simple feedback controller is constructed, whereas in this paper we consider the adaptive synchronization problem and design fractional-order adaptation laws to update the control parameters online. Second, in [3] the stability analysis is carried out based on a conclusion (see [3], Equation (23)) that takes the Caputo derivative of |e_i(t)|. But according to the definition of the Caputo derivative, the function should be differentiable, and |e_i(t)| is not differentiable where e_i(t) changes sign. Thus, the aforementioned conclusion is questionable. In this paper, the underlying closed-loop stability is discussed and strict proofs are given.

Simulation Studies
In the drive neural network (15), the fractional order is chosen as α = 0.81. Under the chosen system parameters, the dynamical behavior of the neural network (15) is depicted in Figure 1. For the response neural network (18), the initial values are chosen as y_1(0) = −2, y_2(0) = 2, y_3(0) = −3. The design parameters are chosen as σ = 2, γ_1 = γ_3 = 3, γ_2 = 1. The external disturbance is chosen as d(t) = [0.5 sin(t), 0.4 cos(t), 0.2 sin(t) + 0.2 cos(t)]^T, so Assumption 3 is satisfied. Besides, the controller has nothing to do with the system nonlinear functions f(x(t)) and f(y(t)) once Assumption 1 is satisfied; the exact models of the two fractional-order neural networks are given here only for simulation purposes.
The simulation results are depicted in Figures 2-7. Figures 2-5 show the synchronization between the two fractional-order neural networks and the time response of the synchronization errors. The time responses of the control inputs and of the updated parameters are shown in Figures 6 and 7, respectively. From the results we can see that the synchronization errors converge to the origin rapidly, and favorable synchronization performance is achieved. To avoid the chattering phenomenon caused by the sign function (see the controller (23)), the sign function can be replaced by some continuous function. For example, replacing sign(•) with arctan(20•), the simulation results are presented in Figures 8 and 9.
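The sign-smoothing trick mentioned above is easy to reproduce; note that arctan(20x) saturates at ±π/2 rather than ±1, so a scaling factor 2/π can be added if unit amplitude matters. A small sketch:

```python
import math

def smooth_sign(x, c=20.0):
    # continuous replacement for sign(x); arctan(c*x) saturates at
    # +/- pi/2, so multiply by 2/pi if unit amplitude is required
    return math.atan(c * x)

# near zero the smooth version varies gently instead of jumping
# between -1 and +1, which is what suppresses control chattering
samples = [smooth_sign(v) for v in (-0.5, -0.01, 0.0, 0.01, 0.5)]
```

The steeper the slope c, the closer the approximation to sign(•), at the cost of reintroducing chattering-like behavior near the origin.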

Conclusions
We have investigated the robust adaptive synchronization of fractional-order neural networks with both system uncertainties and unknown Lipschitz constants. It is shown that fractional-order differential equations can be used to update parameters online and that quadratic Lyapunov functions can be used in the stability analysis of fractional-order neural networks. The proposed methods enable us to establish a fundamental stability-analysis framework for fractional-order neural networks. Relaxing the requirements on the knowledge of the system uncertainties (for example, allowing them to be fully unknown) in the controller design is a direction for our future work.