Mathematics 2018, 6(9), 144; doi:10.3390/math6090144
Stability Analysis of Cohen–Grossberg Neural Networks with Random Impulses
Department of Mathematics, Texas A&M University-Kingsville, Kingsville, TX 78363, USA
Florida Institute of Technology, Melbourne, FL 32901, USA
Faculty of Mathematics, Plovdiv University, Tzar Asen 24, 4000 Plovdiv, Bulgaria
School of Mathematics, Statistics and Applied Mathematics, National University of Ireland, H91 CF50 Galway, Ireland
Author to whom correspondence should be addressed.
Received: 27 July 2018 / Accepted: 17 August 2018 / Published: 21 August 2018
The Cohen–Grossberg neural network model is studied in the case when the neurons are subject to a certain impulsive state displacement at random, exponentially-distributed moments. These impulses significantly change the behavior of the solutions: they become stochastic processes rather than deterministic functions. We examine the stability of the equilibrium of the model. Some sufficient conditions for the mean-square exponential stability and mean exponential stability of the equilibrium of general neural networks are obtained in the case of a time-varying potential (or voltage) of the cells, with time-dependent amplification functions and behaved functions, as well as time-varying strengths of connectivity between cells and variable external bias or input from outside the network to the units. These sufficient conditions are explicitly expressed in terms of the parameters of the system, and hence, they are easily verifiable. The theory relies on a modification of the direct Lyapunov method. We illustrate our theory on a particular nonlinear neural network.
Keywords: Cohen–Grossberg neural networks; random impulses; mean square stability
1. Introduction

Artificial neural networks are important technical tools for solving a variety of problems in various scientific disciplines. Cohen and Grossberg [1] introduced and studied in 1983 a new model of neural networks. This model has been extensively studied and applied in many different fields, such as associative memory, signal processing and optimization problems. Several authors generalized this model [2] by including delays [3,4], impulses at fixed points [5,6] and discontinuous activation functions [7]. Furthermore, a stochastic generalization of this model was studied in [8]. The included impulses model the presence of noise in artificial neural networks. Note that in some cases in artificial neural networks, chaos may work better than noise (see, for example, [9]).
To the best of our knowledge, there is only one published paper studying neural networks with impulses at random times [10]. However, in [10], random variables are incorrectly mixed with deterministic variables; for example, the index function associated with the random impulse moments is not a deterministic function (it is a stochastic process), and its expected value, labeled by E, has to be taken into account on page 13 of [10]; in addition, one has to be careful in [10], since the expected value of a product of random variables equals the product of the expected values only for independent random variables. We define a generalization of the Cohen–Grossberg neural network with impulses at random times, briefly explain why the solutions are stochastic processes, and study stability properties. Note that a brief overview of randomness in neural networks and some methods for their investigation are given in [11], where the models are stochastic ones. Impulsive perturbation is a common phenomenon in real-world systems, so it is also important to consider impulsive systems. Note that the stability of deterministic models with impulses for neural networks was studied in [12,13,14,15,16,17,18]. However, the occurrence of impulses at random times needs to be considered in real-world systems. The stability problem for differential equations with impulses at random times was studied in [19,20,21]. In this paper, we study the general case of the time-varying potential (or voltage) of the cells, with time-dependent amplification functions and behaved functions, as well as time-varying strengths of connectivity between cells and variable external bias or input from outside the network to the units. The study is based on an application of the Lyapunov method. Using Lyapunov functions, sufficient stability criteria are provided and illustrated with examples.
2. System Description
We consider the model proposed by Cohen and Grossberg [1] in the case when the neurons are subject to a certain impulsive state displacement at random moments.
Let t_0 ≥ 0 be a fixed point and the probability space (Ω, F, P) be given. Let a sequence of independent exponentially-distributed random variables {τ_k}, k = 1, 2, …, with the same parameter λ > 0, defined on the sample space Ω, be given. Define the sequence of random variables {ξ_k} by:

ξ_k = t_0 + ∑_{i=1}^{k} τ_i, k = 1, 2, ….   (1)
The random variable τ_k measures the waiting time of the k-th impulse after the (k − 1)-st impulse occurs, and the random variable ξ_k denotes the length of time until k impulses occur, for k = 1, 2, ….
The random variable ξ_k − t_0 is Erlang distributed with parameters k and λ; it has the pdf f(t) = λ^k t^{k−1} e^{−λt}/(k − 1)! and the cdf F(t) = 1 − e^{−λt} ∑_{j=0}^{k−1} (λt)^j/j!, t ≥ 0.
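Because the impulse moments are just cumulative sums of independent exponential waiting times, they are easy to sample. The following sketch (the function name and parameter values are our illustrative choices, not the paper's) generates one realization of ξ_1 < ξ_2 < … < ξ_k:

```python
import random

def impulse_moments(t0, lam, k, rng):
    """Sample one realization of the impulse moments xi_1 < ... < xi_k:
    cumulative sums of k independent Exp(lam) waiting times started at t0.
    By construction, xi_k - t0 is Erlang(k, lam) distributed."""
    moments = []
    t = t0
    for _ in range(k):
        t += rng.expovariate(lam)  # waiting time tau ~ Exp(lam)
        moments.append(t)
    return moments

rng = random.Random(42)
xi = impulse_moments(0.0, 2.0, 5, rng)  # five impulse moments with lam = 2
```

Averaging the final moment over many realizations recovers the Erlang mean k/λ, a quick sanity check on the construction.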
Consider the general model of the Cohen–Grossberg neural networks with impulses occurring at random times (RINN):

x_i′(t) = −a_i(x_i(t)) [ b_i(t, x_i(t)) − ∑_{j=1}^{n} c_{ij}(t) f_j(x_j(t)) − I_i(t) ] for ξ_k < t < ξ_{k+1}, k = 0, 1, 2, … (with ξ_0 = t_0),
x_i(ξ_k + 0) = Φ_{k,i}(x_i(ξ_k − 0)), k = 1, 2, …,
x_i(t_0) = x_i^0, i = 1, 2, …, n,   (2)

where n corresponds to the number of units in the neural network; x_i(t) denotes the potential (or voltage) of cell i at time t; f_j, j = 1, 2, …, n, denote the activation functions of the neurons, and f_j(x_j(t)) represents the response of the j-th neuron to its membrane potential. Now, a_i represents an amplification function; b_i represents an appropriately behaved function; the connection matrix C(t) = (c_{ij}(t)) denotes the strengths of connectivity between cells at time t; if the output from neuron j excites (resp., inhibits) neuron i, then c_{ij}(t) ≥ 0 (resp., c_{ij}(t) ≤ 0); and the functions I_i(t), i = 1, 2, …, n, correspond to the external bias or input from outside the network to unit i at time t.
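Written out in code with one common sign convention for the Cohen–Grossberg right-hand side, the dynamics between impulses can be sketched as follows (the function name `cg_rhs`, the 2-neuron coefficients and the step size are all our illustrative assumptions, not values from the paper):

```python
import math

def cg_rhs(t, x, a, b, c, f, I):
    """One common form of the Cohen-Grossberg right-hand side:
    x_i' = -a_i(x_i) * (b_i(t, x_i) - sum_j c_ij(t)*f_j(x_j) - I_i(t))."""
    n = len(x)
    return [-a[i](x[i]) * (b[i](t, x[i])
                           - sum(c(t)[i][j] * f[j](x[j]) for j in range(n))
                           - I[i](t))
            for i in range(n)]

# tiny 2-neuron instance with dominant self-decay (illustrative only)
a = [lambda u: 1.0] * 2              # constant amplification
b = [lambda t, u: 2.0 * u] * 2       # behaved functions with linear growth
c = lambda t: [[0.0, 0.5], [0.5, 0.0]]   # symmetric coupling
f = [math.tanh] * 2                  # Lipschitz activations, L_i = 1
I = [lambda t: 0.0] * 2              # zero external input

x, h = [1.0, -1.0], 0.01
for step in range(1000):             # forward Euler up to t = 10
    dx = cg_rhs(step * h, x, a, b, c, f, I)
    x = [x[i] + h * dx[i] for i in range(2)]
```

Because the self-decay 2u dominates the 0.5·tanh coupling, the state contracts toward the zero equilibrium, mirroring the kind of dominance that conditions such as (H5) impose.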
We list some assumptions, which will be used in the main results:
(H1) For all i = 1, 2, …, n, the functions a_i ∈ C(ℝ, (0, ∞)), and there exist constants A_i, B_i > 0 such that A_i ≤ a_i(u) ≤ B_i for u ∈ ℝ.
(H2) There exist positive numbers M_{ij}, i, j = 1, 2, …, n, such that |c_{ij}(t)| ≤ M_{ij} for t ≥ t_0.
In the case when the strengths of connectivity between cells are constants, Assumption (H2) is satisfied.
For the activation functions, we assume:
(H3) The neuron activation functions are Lipschitz, i.e., there exist positive numbers L_i such that |f_i(u) − f_i(v)| ≤ L_i |u − v| for u, v ∈ ℝ, i = 1, 2, …, n.
Note that the activation functions satisfying Condition (H3) are more general than the usual sigmoid activation functions.
2.1. Description of the Solutions of Model (2)
Consider the sequence of points {t_k}, where each t_k is an arbitrary value of the corresponding random variable τ_k, k = 1, 2, …. Define the increasing sequence of points {T_k} by:

T_k = t_0 + ∑_{i=1}^{k} t_i, k = 1, 2, ….   (3)
Note that the points T_k, k = 1, 2, …, are values of the random variables ξ_k.
Consider the initial value problem for the corresponding system of differential equations with fixed points of impulses (INN), obtained from RINN (2) by replacing the random impulse moments ξ_k with the fixed points T_k defined in (3); we denote this deterministic impulsive system by (4).
The solution of the differential equation with fixed moments of impulses (4) depends not only on the initial point (t_0, x_0), but also on the moments of impulses T_k, k = 1, 2, …, i.e., the solution depends on the chosen arbitrary values t_k of the random variables τ_k. We denote the solution of the initial value problem (4) by x(t; t_0, x_0, {T_k}). We will assume that:

x(T_k; t_0, x_0, {T_k}) = lim_{t→T_k−0} x(t; t_0, x_0, {T_k}) for each k = 1, 2, …,

i.e., the solutions are left continuous at the points of impulse.
The set of all solutions of the initial value problem for the impulsive differential Equation (4) for any values t_k of the random variables τ_k, k = 1, 2, …, generates a stochastic process with state space ℝ^n. We denote it by x(t; t_0, x_0, {τ_k}), and we will say that it is a solution of RINN (2).
Note that x(t; t_0, x_0, {T_k}) is a deterministic function, but x(t; t_0, x_0, {τ_k}) is a stochastic process.
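This construction, namely, fix values of the waiting times, solve the resulting deterministic impulsive problem, and regard the result as one realization of the stochastic process, can be imitated numerically. The sketch below uses our own toy scalar dynamics and impulse map (halving the state), not the paper's model:

```python
import random

def sample_path(x0, t_end, lam, rhs, impulse, rng, h=0.001):
    """One realization of x' = rhs(t, x) with state reset x -> impulse(k, x)
    at random moments whose gaps are independent Exp(lam) waiting times.
    Forward Euler between impulses; at most one impulse per step."""
    t, x, k = 0.0, x0, 0
    next_imp = t + rng.expovariate(lam)
    while t < t_end:
        if t >= next_imp:                 # k-th impulse moment reached
            k += 1
            x = impulse(k, x)             # impulsive state displacement
            next_imp += rng.expovariate(lam)
        x += h * rhs(t, x)                # Euler step between impulses
        t += h
    return x

rng = random.Random(1)
# illustrative data: x' = -x between impulses; each impulse halves the state
x_T = sample_path(1.0, 5.0, 0.5, lambda t, x: -x, lambda k, x: 0.5 * x, rng)
```

Each fresh realization of the waiting times yields a different deterministic trajectory; the family of all such trajectories is exactly the stochastic-process solution described above.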
2.2. Equilibrium of Model (2)
We define an equilibrium of the model (2) assuming Condition (H1) is satisfied:
A vector x* = (x_1*, x_2*, …, x_n*) ∈ ℝ^n is an equilibrium point of RINN (2) if the equalities:

b_i(t, x_i*) − ∑_{j=1}^{n} c_{ij}(t) f_j(x_j*) − I_i(t) = 0, i = 1, 2, …, n, t ≥ t_0,

and:

Φ_{k,i}(x_i*) = x_i*, i = 1, 2, …, n, k = 1, 2, …,

hold.
We assume the following:
(H4) Let RINN (2) have an equilibrium vector x* = (x_1*, x_2*, …, x_n*).
If Assumption (H4) is satisfied, then we can shift the equilibrium point x* of System (2) to the origin. The transformation u(t) = x(t) − x* puts System (2) in the following form:

u_i′(t) = −A_i(u_i(t)) [ B_i(t, u_i(t)) − ∑_{j=1}^{n} c_{ij}(t) F_j(u_j(t)) ] for ξ_k < t < ξ_{k+1},
u_i(ξ_k + 0) = ϕ_{k,i}(u_i(ξ_k − 0)), k = 1, 2, …,   (8)

where A_i(u) = a_i(u + x_i*), B_i(t, u) = b_i(t, u + x_i*) − b_i(t, x_i*), F_j(u) = f_j(u + x_j*) − f_j(x_j*) and ϕ_{k,i}(u) = Φ_{k,i}(u + x_i*) − x_i*.
If Assumption (H3) is fulfilled, then the functions F_j in RINN (8) satisfy |F_j(u)| ≤ L_j |u| for u ∈ ℝ, j = 1, 2, …, n, and F_j(0) = 0.
3. Some Stability Results for Differential Equations with Impulses at Random Times
Consider the general type of initial value problem (IVP) for a system of nonlinear random impulsive differential equations (RIDE):

x′(t) = g(t, x(t)) for t ≥ t_0, ξ_k < t < ξ_{k+1}, k = 0, 1, 2, …,
x(ξ_k + 0) = ψ_k(x(ξ_k − 0)), k = 1, 2, …,
x(t_0) = x_0,   (9)

with x_0 ∈ ℝ^n, the random variables ξ_k defined by (1), g ∈ C([t_0, ∞) × ℝ^n, ℝ^n) and ψ_k ∈ C(ℝ^n, ℝ^n).
We note that two-moment (p = 2) exponential stability for stochastic equations is known as mean square exponential stability, and in the case of p = 1, it is called mean exponential stability.
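For intuition, E|x(T)|^p can be estimated by Monte Carlo on a toy scalar model (our own illustrative construction, not the paper's network): between impulses x′ = −x, and each impulse multiplies the state by q, so a path is exactly x(T) = e^{−T} q^{N(T)}, where N(T) counts the impulses up to time T:

```python
import math
import random

def moment_at(T, lam, q, p, n_paths, rng):
    """Monte Carlo estimate of E|x(T)|^p for the toy model x' = -x between
    impulses and x -> q*x at each impulse; impulse gaps are Exp(lam).
    Each path is available in closed form: x(T) = exp(-T) * q**N,
    with N = number of impulses in [0, T] (Poisson(lam*T))."""
    acc = 0.0
    for _ in range(n_paths):
        n, t = 0, rng.expovariate(lam)
        while t < T:                  # count impulse moments in [0, T]
            n += 1
            t += rng.expovariate(lam)
        acc += (math.exp(-T) * q ** n) ** p
    return acc / n_paths

rng = random.Random(7)
ms = moment_at(3.0, 1.0, 0.5, 2, 20000, rng)   # p = 2: mean-square moment
```

For this toy model, E|x(T)|^p = exp(−pT + λT(q^p − 1)), so the estimate decays exponentially in T whenever p > λ(q^p − 1); p = 2 corresponds to the mean square notion and p = 1 to the mean notion from the text.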
Note that the p-moment exponential stability of RIDE (9) was studied in [19] by an application of Lyapunov functions V(t, x) from the class Λ(J, Δ) of functions that are continuous on J × Δ ⊂ [t_0, ∞) × ℝ^n and locally Lipschitz with respect to x.

We will use the Dini derivative of the Lyapunov function given by:

D+ V(t, x) = lim sup_{h→0+} (1/h) [ V(t + h, x + h g(t, x)) − V(t, x) ].
Now, we will give a sufficient condition result:
Theorem 1 ([19]). Let the following conditions be satisfied:
1. The function g ∈ C([t_0, ∞) × ℝ^n, ℝ^n), and for any initial values (s, y) ∈ [t_0, ∞) × ℝ^n, the corresponding IVP for the ordinary differential equation x′ = g(t, x), x(s) = y, has a unique solution.
2. There exists a function V ∈ Λ([t_0, ∞), ℝ^n) such that:
(i) there exist positive constants a, b such that a‖x‖^p ≤ V(t, x) ≤ b‖x‖^p for t ≥ t_0, x ∈ ℝ^n;
(ii) there exists a function m ∈ C([t_0, ∞), (0, ∞)) such that the inequality D+ V(t, x) ≤ −m(t) V(t, x) holds for t ≥ t_0, x ∈ ℝ^n;
(iii) for any k = 1, 2, …, there exist constants C_k ∈ (0, 1] such that V(t, ψ_k(x)) ≤ C_k V(t, x) for t ≥ t_0, x ∈ ℝ^n.
Then, the trivial solution of RIDE (9) is p-moment exponentially stable.
4. Stability Analysis of Neural Networks with Random Impulses
We will introduce the following assumptions:
(H5) For i = 1, 2, …, n, the functions b_i ∈ C([t_0, ∞) × ℝ, ℝ), and there exist constants β_i > 0 such that (u − x_i*)(b_i(t, u) − b_i(t, x_i*)) ≥ β_i (u − x_i*)^2 for any u ∈ ℝ, t ≥ t_0, where x* is the equilibrium from Condition (H4).
If Condition (H5) is satisfied, then the shifted functions in RINN (8) satisfy the analogous inequality u B_i(t, u) ≥ β_i u^2 for u ∈ ℝ, t ≥ t_0.
(H6) The inequality:holds.
(H7) For any k = 1, 2, …, there exists a positive number C_k such that the inequalities |Φ_{k,i}(u) − x_i*| ≤ C_k |u − x_i*|, i = 1, 2, …, n, u ∈ ℝ, hold, where x* is the equilibrium from Condition (H4).
If Assumption (H7) is fulfilled, then the impulsive functions in RINN (8) satisfy the inequalities |ϕ_{k,i}(u)| ≤ C_k |u| for u ∈ ℝ, i = 1, 2, …, n, k = 1, 2, ….
Theorem 2. Let Assumptions (H1)–(H7) be satisfied. Then, the equilibrium point x* of RINN (2) is mean square exponentially stable.
Proof. Consider the quadratic Lyapunov function V(x) = ∑_{i=1}^{n} x_i^2, x ∈ ℝ^n. From Remarks 6 and 8 and the inequalities above, we get a bound of the form D+ V(x) ≤ −α V(x), where the positive constant α is defined by (12). Therefore, Condition 2(ii) of Theorem 1 is satisfied. Furthermore, from (H7), it follows that Condition 2(iii) of Theorem 1 is satisfied.
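The mechanism behind this step can be checked numerically on a toy field: for V(x) = ∑ x_i², the derivative along trajectories is dV/dt = 2⟨x, x′⟩, and an estimate of the form dV/dt ≤ −αV is exactly what Condition 2(ii) requires. The field and the constant α below are our illustrative choices, not the paper's system:

```python
import math

def satisfies_decay(x, xdot, alpha):
    """Check dV/dt = 2*<x, x'> <= -alpha * V(x) for V(x) = sum x_i**2."""
    V = sum(xi * xi for xi in x)
    dV = 2.0 * sum(xi * di for xi, di in zip(x, xdot))
    return dV <= -alpha * V

def field(x):
    """Toy stable field: dominant self-decay plus bounded tanh coupling."""
    return [-2.0 * x[0] + 0.5 * math.tanh(x[1]),
            -2.0 * x[1] + 0.5 * math.tanh(x[0])]

points = [[1.0, -1.0], [0.3, 0.7], [-2.0, 0.1]]
all_decay = all(satisfies_decay(p, field(p), 2.0) for p in points)
```

Since |tanh u| ≤ |u|, the cross terms are bounded by V(x), giving dV/dt ≤ −4V + V = −3V, so the check holds with α = 2 at every sample point.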
Example 1. Let n = 2, and let the random variables τ_k, k = 1, 2, …, be exponentially distributed with parameter λ. Consider the following special case of RINN (2), with the coefficient functions and the impulse functions given by:
The point x* is the equilibrium point of RINN (14), i.e., Condition (H4) is satisfied. Now, Assumption (H1) is satisfied, and so is Assumption (H5).
Furthermore, the entries of the connection matrix are bounded; therefore, Assumption (H2) is satisfied. Note that Assumption (H3) is satisfied, since the activation functions are Lipschitz.
Then, the constant defined by (12) is positive. Next, Assumption (H7) is fulfilled for the impulse functions of (14).
Therefore, according to Theorem 2, the equilibrium of RINN (14) is mean square exponentially stable.
Consider the system (14) without any kind of impulses. The equilibrium is asymptotically stable (see Figure 1 and Figure 2). Therefore, an appropriate perturbation of the neural network by impulses at random times preserves the stability properties of the equilibrium.
Note that Condition (H7) is weaker than Condition (3.6) in Theorem 3.2 of [10], and as a special case of Theorem 2, we obtain weaker conditions for exponential stability of the Cohen–Grossberg model without any type of impulses. For example, if we consider (14), Condition (3.6) of [10] is not satisfied, and Theorem 3.2 of [10] does not give us any result about stability (compare with Example 1).
Now, consider the following assumption:
(H8) The inequality:holds.
Theorem 3. Let Assumptions (H1)–(H5), (H7) and (H8) be satisfied. Then, the equilibrium point x* of RINN (2) is mean exponentially stable.
Proof. For any x ∈ ℝ^n, we define the Lyapunov function V(x) = ∑_{i=1}^{n} |x_i|, which satisfies Condition 2(i) of Theorem 1 with p = 1.

Then, for t ≥ t_0, according to Remarks 6 and 8, we obtain the decay estimate required by Condition 2(ii) of Theorem 1.
Furthermore, from (H7) and Remark 9, it follows that Condition 2(iii) of Theorem 1 is satisfied. From Theorem 1, we have that Theorem 3 is true. ☐
Example 2. The point x* is the equilibrium point of RINN (20), i.e., Condition (H4) is satisfied. Now, Assumption (H5) is satisfied as well.
Furthermore, the bound given by (16) applies; therefore, Assumption (H2) is satisfied, and the inequality required by (H8) holds.
According to Theorem 3, the equilibrium of (20) is mean exponentially stable.
5. Conclusions

In this paper, we study stability properties of the equilibrium point of a generalization of the Cohen–Grossberg model of neural networks in the case when:
- the potential (or voltage) of any cell is perturbed instantaneously at random moments, i.e., the neural network is modeled by a deterministic differential equation with impulses at random times. This presence of randomness in the differential equation totally changes the behavior of the solutions (they are not deterministic functions, but stochastic processes).
- the random moments of the impulsive state displacements of neurons are exponentially distributed.
- the connection matrix is not a constant matrix (as is usually assumed in the literature); it depends on time, since the strengths of connectivity between cells may change in time.
- the external bias or input from outside the network to any unit is not a constant (it is variable in time).
In addition, sufficient conditions for the mean-square exponential stability and for the mean exponential stability of the equilibrium are obtained.
All authors contributed equally to the writing of this paper. All four authors read and approved the final manuscript.
The research was partially supported by Fund MU17-FMI-007, University of Plovdiv “Paisii Hilendarski”.
Conflicts of Interest
The authors declare that they have no competing interests.
References

1. Cohen, M.; Grossberg, S. Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Trans. Syst. Man Cybern. 1983, 13, 815–826.
2. Guo, S.; Huang, L. Stability analysis of Cohen–Grossberg neural networks. IEEE Trans. Neural Netw. 2006, 17, 106–117.
3. Bai, C. Stability analysis of Cohen–Grossberg BAM neural networks with delays and impulses. Chaos Solitons Fractals 2008, 35, 263–267.
4. Cao, J.; Liang, J. Boundedness and stability for Cohen–Grossberg neural network with time-varying delays. J. Math. Anal. Appl. 2004, 296, 665–685.
5. Aouiti, C.; Dridi, F. New results on impulsive Cohen–Grossberg neural networks. Neural Process. Lett. 2018, 48, 1–25.
6. Liu, M.; Jiang, H.; Hu, C. Exponential stability of Cohen–Grossberg neural networks with impulse time window. Discret. Dyn. Nat. Soc. 2016, 2016, 2762960.
7. Meng, Y.; Huang, L.; Guo, Z.; Hu, Q. Stability analysis of Cohen–Grossberg neural networks with discontinuous neuron activations. Appl. Math. Model. 2010, 34, 358–365.
8. Huang, C.; Huang, L.; He, Y. Mean square exponential stability of stochastic Cohen–Grossberg neural networks with unbounded distributed delays. Discret. Dyn. Nat. Soc. 2010, 2010, 513218.
9. Bucolo, M.; Caponetto, R.; Fortuna, L.; Frasca, M.; Rizzo, A. Does chaos work better than noise? IEEE Circuits Syst. Mag. 2002, 2, 4–19.
10. Vinodkumar, A.; Rakkiyappan, R. Exponential stability results for fixed and random type impulsive Hopfield neural networks. Int. J. Comput. Sci. Math. 2016, 7, 1–19.
11. Scardapane, S.; Wang, D. Randomness in neural networks: An overview. WIREs Data Min. Knowl. Discov. 2017, 7, 1–18.
12. Gopalsamy, K. Stability of artificial neural networks with impulses. Appl. Math. Comput. 2004, 154, 783–813.
13. Rakkiyappan, R.; Balasubramaiam, P.; Cao, J. Global exponential stability of neutral-type impulsive neural networks. Nonlinear Anal. Real World Appl. 2010, 11, 122–130.
14. Song, X.; Zhao, P.; Xing, Z.; Peng, J. Global asymptotic stability of CNNs with impulses and multi-proportional delays. Math. Methods Appl. Sci. 2016, 39, 722–733.
15. Wu, Z.; Li, C. Exponential stability analysis of delayed neural networks with impulsive time window. In Proceedings of the 2017 Ninth International Conference on Advanced Computational Intelligence (ICACI), Doha, Qatar, 4–6 February 2017; pp. 37–42.
16. Wang, L.; Zou, X. Exponential stability of Cohen–Grossberg neural networks. Neural Netw. 2002, 15, 415–422.
17. Yang, Z.; Xu, D. Stability analysis of delay neural networks with impulsive effects. IEEE Trans. Circuits Syst. II Express Briefs 2005, 52, 517–521.
18. Zhou, Q. Global exponential stability of BAM neural networks with distributed delays and impulses. Nonlinear Anal. Real World Appl. 2009, 10, 144–153.
19. Agarwal, R.; Hristova, S.; O'Regan, D.; Kopanov, P. p-Moment exponential stability of differential equations with random impulses and the Erlang distribution. Mem. Differ. Equ. Math. Phys. 2017, 70, 99–106.
20. Agarwal, R.; Hristova, S.; O'Regan, D. Exponential stability for differential equations with random impulses at random times. Adv. Differ. Equ. 2013, 2013, 372.
21. Agarwal, R.; Hristova, S.; O'Regan, D.; Kopanov, P. Impulsive differential equations with Gamma distributed moments of impulses and p-moment exponential stability. Acta Math. Sci. 2017, 37, 985–997.
Figure 1. Example 1. Graph of the solution of the ODE system corresponding to (14).
Figure 2. Example 1. Graph of the solution of the ODE system corresponding to (14).
Figure 3. Example 2. Graph of the solution of the ODE system corresponding to (20).
Figure 4. Example 2. Graph of the solution of the ODE system corresponding to (20).
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).