Unpredictable Oscillations for Hopfield-Type Neural Networks with Delayed and Advanced Arguments

Abstract: This is the first time that the method for the investigation of unpredictable solutions of differential equations has been extended to unpredictable oscillations of neural networks with a generalized piecewise constant argument, which is both delayed and advanced. The existence and exponential stability of a unique unpredictable oscillation are proven. According to the theory, the presence of unpredictable oscillations is strong evidence for Poincaré chaos. Consequently, the paper is a contribution to chaos applications in neuroscience. The model is inspired by chaotic time-varying stimuli, which allow studying the distribution of chaotic signals in neural networks. Unpredictable inputs create an excitation wave of neurons that transmit chaotic signals. The technique of analysis includes the ideas used for differential equations with a piecewise constant argument. The results are illustrated by examples and simulations, which are carried out in MATLAB Simulink to demonstrate the simplicity of the diagrammatic approach.

It is known that oscillations and periodic motions are frequently observed in the activities of the neurons in the brain. Recent developments in the field of neural networks have led to an increased interest in the complexity of the dynamics. Oscillations and chaos in neural networks are topical phenomena and have stimulated the interest of many scientists [46][47][48][49][50]. They occur in a neural network system due to the properties of single neurons [46,50,51] and of synaptic connections among neurons [52,53]. The neural networks considered in the present research display unpredictable oscillations and chaos. The unpredictable function was introduced in [29] and is based on the dynamics of unpredictable points and Poincaré chaos [30]. More precisely, such a function is an unpredictable point of the Bebutov dynamics and, consequently, a member of the chaotic set [31]. The notion of the unpredictable point extends the frontiers of the classical theory of dynamical systems, and the unpredictable function poses new problems on the existence of unpredictable oscillations in the theory of differential equations [29][30][31][32][33][34][35][36]. These studies have been identified as major contributing factors for the emergence of new types of sophisticated motion. Significant results have been obtained for unpredictable oscillations of Hopfield-type neural networks, shunting inhibitory cellular neural networks, and inertial neural networks [37][38][39].

Preliminaries
Denote by $\mathbb{R}$, $\mathbb{N}$, $\mathbb{Z}$ the sets of all real numbers, natural numbers, and integers, respectively. Introduce a norm for the vector $u = (u_1, \dots, u_m)$, $u_i \in \mathbb{R}$, $i = 1, \dots, m$, as $\|u\| = \max_{1 \le i \le m} |u_i|$, where $|\cdot|$ is the absolute value. Correspondingly, for a square matrix $A = (a_{ij})_{m \times m}$, the induced norm $\|A\| = \max_{1 \le i \le m} \sum_{j=1}^{m} |a_{ij}|$ is used. Fix two real-valued sequences $\theta_k$, $\xi_k$, $k \in \mathbb{Z}$, such that $\theta_k < \theta_{k+1}$ and $\theta_k \le \xi_k \le \theta_{k+1}$ for all $k \in \mathbb{Z}$, with $|\theta_k| \to \infty$ as $|k| \to \infty$. It is assumed that there exists a positive number $\theta$ such that $\theta_{k+1} - \theta_k \le \theta$ for all integers $k$.
The main subject under investigation in this paper is the following Hopfield-type neural network system with a generalized piecewise constant argument:
$$x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{m} b_{ij} f_j(x_j(t)) + \sum_{j=1}^{m} c_{ij} g_j(x_j(\gamma(t))) + \vartheta_i(t), \quad i = 1, \dots, m, \qquad (1)$$
where $t \in \mathbb{R}$ and $\gamma(t) = \xi_k$ if $t \in [\theta_k, \theta_{k+1})$, $k \in \mathbb{Z}$; since $\theta_k \le \xi_k \le \theta_{k+1}$, the argument $\gamma(t)$ is advanced for $t < \xi_k$ and delayed for $t \ge \xi_k$, which is the sense of the title. Here, $a_i > 0$ are the rates with which the units self-regulate or reset their potentials when isolated from other units and inputs; $m$ is the number of neurons in the network; $x_i(t)$ is the state of the $i$th unit at time $t$; $f_j$, $g_j$ are the activation functions of the incoming potentials of the unit $j$; $b_{ij}$, $c_{ij}$ are the synaptic connection weights of the unit $j$ on the unit $i$; and $\vartheta_i(t)$ is the time-varying stimulus, corresponding to the external input from outside the network to the unit $i$.
Throughout this paper, we assume that the parameters $b_{ij}$ and $c_{ij}$ are real and that the activation functions $f_j, g_j : \mathbb{R} \to \mathbb{R}$, $j = 1, 2, \dots, m$, are continuous. Moreover, suppose that there exist positive constants $\lambda$ and $\bar{\lambda}$ such that the inequality $\lambda \le a_i \le \bar{\lambda}$ holds for each $i = 1, 2, \dots, m$.
We present system (1) in the following vector form:
$$x'(t) = A x(t) + B f(x(t)) + C g(x(\gamma(t))) + \vartheta(t), \qquad (2)$$
where $x = \mathrm{colon}(x_1, x_2, \dots, x_m)$ is the state vector, $f(x) = \mathrm{colon}(f_1(x_1), f_2(x_2), \dots, f_m(x_m))$ and $g(x) = \mathrm{colon}(g_1(x_1), g_2(x_2), \dots, g_m(x_m))$ are the activations, and $\vartheta = \mathrm{colon}(\vartheta_1, \vartheta_2, \dots, \vartheta_m)$ is the input vector. Moreover, $A = \mathrm{diag}(-a_1, -a_2, \dots, -a_m)$, $B = (b_{ij})_{m \times m}$, and $C = (c_{ij})_{m \times m}$. As the usual activations for continuous-time neural network dynamics, sigmoidal functions are considered [45], such as the hyperbolic tangent used in the simulations below. They are used in neural networks as activation functions since they amplify weak signals while not becoming saturated by strong signals. The activation function and the output function are referred to jointly by the term transfer functions. While the activation function determines the total signal a neuron receives, the transfer function translates the input signals to the output signals.
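To make the dynamics concrete, the following minimal sketch integrates the vector form (2) with a forward Euler scheme. All weights, switching moments, and the input signal are illustrative placeholders rather than values from the paper, and for simplicity the sketch takes $\xi_k = \theta_k$, so that the piecewise constant argument is purely delayed and $x(\gamma(t))$ can be read from stored values:

```python
import numpy as np

# Sketch of system (2): x'(t) = A x + B f(x) + C g(x(gamma(t))) + v(t).
# Placeholder parameters; xi_k = theta_k is assumed so that gamma(t) <= t.

rng = np.random.default_rng(0)
m = 3
a = np.array([2.0, 2.5, 3.0])           # self-regulation rates a_i > 0
A = -np.diag(a)
B = 0.1 * rng.standard_normal((m, m))   # synaptic weights b_ij (placeholders)
C = 0.1 * rng.standard_normal((m, m))   # synaptic weights c_ij (placeholders)
f = g = np.tanh                         # sigmoidal activation

def v(t):
    # Stand-in for the time-varying stimulus; any bounded input can be used.
    return 0.5 * np.sin(np.array([1.0, 2.0, 3.0]) * t)

h, T, theta_gap = 1e-3, 20.0, 0.5       # Euler step, horizon, theta_{k+1} - theta_k
x = np.zeros(m)
x_gamma = x.copy()                      # x(gamma(t)) = x(theta_k) on [theta_k, theta_{k+1})
k_prev, traj = -1, []
for t in np.arange(0.0, T, h):
    k = int(t // theta_gap)
    if k != k_prev:                     # a switching moment theta_k has been crossed
        x_gamma, k_prev = x.copy(), k
    x = x + h * (A @ x + B @ f(x) + C @ g(x_gamma) + v(t))
    traj.append(x.copy())
traj = np.array(traj)                   # trajectory, e.g., for plotting
```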
The block diagram of the Hopfield-type neural network system with a piecewise constant argument is shown in Figure 1, and the symbols for the diagram are described in Table 1.

Definition 1 ([29]).
A uniformly continuous and bounded function $v : \mathbb{R} \to \mathbb{R}^m$ is unpredictable if there exist positive numbers $\varepsilon_0$, $\delta$ and sequences $t_n$, $u_n$, both of which diverge to infinity, such that $v(t + t_n) \to v(t)$ as $n \to \infty$ uniformly on compact subsets of $\mathbb{R}$ and $\|v(t + t_n) - v(t)\| \ge \varepsilon_0$ for each $t \in [u_n - \delta, u_n + \delta]$ and $n \in \mathbb{N}$.

Table 1. Characteristics of elements of the block diagram in Figure 1.

Symbols (graphical) — Description:
- Integrator block
- Sum block
- Gain blocks, with values A, B, C
- Transfer function block, with nonlinear functions f and g
- MATLAB function block, with the piecewise constant function γ(t)
- Input function
- Output function

Main Results
Let $\Sigma_0$ denote the set of $m$-dimensional vector functions $\varphi : \mathbb{R} \to \mathbb{R}^m$ equipped with the norm $\|\varphi\|_1 = \sup_{t \in \mathbb{R}} \|\varphi(t)\|$. The functions of this space are assumed to satisfy the following properties: (A1) they are uniformly continuous; (A2) there exists a number $H > 0$ such that $\|\varphi\|_1 < H$ for each function $\varphi$; (A3) there exists a sequence $t_n$ that diverges to infinity such that $\varphi(t + t_n) \to \varphi(t)$ uniformly on each closed and bounded interval of the real axis for each function $\varphi$.
The following conditions (C1)–(C8) on the system (2) are assumed, where $t_n$ is the sequence given in Definition 1.

Lemma 1. A function $x(t)$, bounded on the whole real axis, is a solution of the system (1) if and only if it is a solution of the following integral equation:
$$x(t) = \int_{-\infty}^{t} e^{A(t-s)} \big[ B f(x(s)) + C g(x(\gamma(s))) + \vartheta(s) \big] \, ds.$$

Let us introduce the operator $\Pi$ on $\Sigma_0$ such that:
$$\Pi \varphi(t) = \int_{-\infty}^{t} e^{A(t-s)} \big[ B f(\varphi(s)) + C g(\varphi(\gamma(s))) + \vartheta(s) \big] \, ds.$$

Lemma 2. The operator $\Pi$ maps the space $\Sigma_0$ into itself.

Proof. Let us evaluate the derivative of $\Pi\varphi(t)$ with respect to the time variable $t$. Differentiating under the integral sign and using the boundedness of $f$, $g$, $\vartheta$, and $\varphi$, we find that the derivative is bounded for all $t \in \mathbb{R}$. Since the derivative of $\Pi\varphi(t)$ is bounded, $\Pi\varphi$ is uniformly continuous. This means that $\Pi\varphi$ satisfies the property (A1). Moreover, a direct estimate of the integral for $\varphi \in \Sigma_0$ together with the condition (C4) implies that $\|\Pi\varphi\|_1 < H$. Thus, $\Pi\varphi$ satisfies the property (A2). Now, we need to check the last property (A3) for $\Pi\varphi$. In other words, we have to verify that there exists a sequence $t_n$ that diverges to infinity such that for each $\Pi\varphi \in \Sigma_0$, $\Pi\varphi(t + t_n) \to \Pi\varphi(t)$ uniformly on each closed and bounded interval of the real axis. Fix an arbitrary positive number $\varepsilon$ and a closed interval $[a, b]$, where $a, b \in \mathbb{R}$ with $a < b$. It is enough to show that $\|\Pi\varphi(t + t_n) - \Pi\varphi(t)\| < \varepsilon$ for sufficiently large $n$ and $t \in [a, b]$. We choose two numbers $c < a$ and $\xi > 0$ such that the exponentially decaying tail of the integral over $(-\infty, c]$ and the contributions proportional to $\xi$ are each smaller than a fixed fraction of $\varepsilon$. Take $n$ large enough that $\|\varphi(t + t_n) - \varphi(t)\| < \xi$ and $\|\vartheta(t + t_n) - \vartheta(t)\| < \xi$ for $t \in [c, b]$. Then, for $\varphi \in \Sigma_0$, by writing the difference $\Pi\varphi(t + t_n) - \Pi\varphi(t)$ as a single integral, one can see that it splits into an integral over $(-\infty, c]$ and an integral over $[c, t]$, and the first of them is small by the choice of $c$. We need to find an upper bound for the last integral. For this purpose, we shall evaluate it by dividing the interval of integration into subintervals as follows. For a fixed $t \in [a, b]$, we assume without loss of generality that $\theta_i \le \theta_{i - \eta_n} + t_n$, and we divide $[c, t]$ by the points $\theta_{k - \eta_n} + t_n$ and $\theta_{k+1}$, $k = i, i+1, \dots, i+p-1$.
Now, if we denote the integrals over the resulting subintervals by $A_k$ and $B_k$, where $k = i, i+1, \dots, i+p-1$, then the last integral is the sum of these terms. For $t \in [\theta_{k - \eta_n} + t_n, \theta_{k+1})$, $\gamma(t) = \xi_k$, and we have by the condition (C8) that $\gamma(t + t_n) = \xi_{k + \eta_n}$, $k = i, i+1, \dots, i+p-1$. Since the function $\varphi$ is uniformly continuous, for large $n$ and $\xi > 0$, we can find a $\rho > 0$ such that $\|\varphi(\xi_k + t_n + o(1)) - \varphi(\xi_k + t_n)\| < \xi$ if $|\xi_{k + \eta_n} - \xi_k - t_n| < \rho$. As a result of this discussion, we conclude that each integral $A_k$ is small. Moreover, by the condition (C8) and a similar idea applied to the integrals $B_k$, the same smallness is obtained for them. Thus, combining the estimates above, it is true that $\|\Pi\varphi(t + t_n) - \Pi\varphi(t)\| < \varepsilon$ for $t \in [a, b]$, so that $\Pi\varphi$ satisfies the property (A3). The lemma is proven.

Lemma 3.
The operator $\Pi$ is a contraction on $\Sigma_0$.

Proof. Let the functions $\varphi$ and $\psi$ belong to the space $\Sigma_0$. Using the Lipschitz properties of $f$ and $g$, we obtain for all $t \in \mathbb{R}$ that $\|\Pi\varphi(t) - \Pi\psi(t)\|$ is bounded by a constant multiple of $\|\varphi - \psi\|_1$, where the constant is determined by the Lipschitz constants of $f$ and $g$, the norms of $B$ and $C$, and $\lambda$. Consequently, the condition (C5) implies that the operator $\Pi : \Sigma_0 \to \Sigma_0$ is contractive. The lemma is proven.
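The contraction property can be visualized numerically: starting from an arbitrary $\varphi \in \Sigma_0$, the Picard iterates $\Pi^n \varphi$ approach the fixed point geometrically. The sketch below is an illustration under placeholder parameters (weights, window, and input are not from the paper); the improper integral is truncated on the left, which is admissible because $e^{A(t-s)}$ decays exponentially, and $\xi_k = \theta_k$ is taken for simplicity:

```python
import numpy as np

# Picard iteration phi -> Pi(phi) for the operator of Lemmas 2-3, on a
# truncated window. Successive sup-norm distances shrink geometrically,
# illustrating the contraction. All parameters are placeholders.

rng = np.random.default_rng(1)
m = 3
a = np.array([2.0, 2.5, 3.0])            # A = diag(-a_i)
B = 0.05 * rng.standard_normal((m, m))
C = 0.05 * rng.standard_normal((m, m))
f = g = np.tanh
theta_gap = 0.5                          # theta_{k+1} - theta_k

h = 0.02
s = np.arange(-5.0, 5.0, h)              # truncated time window
v = 0.5 * np.sin(np.outer(s, [1.0, 2.0, 3.0]))  # placeholder input, shape (n, m)
gamma_idx = np.maximum(                  # grid index of gamma(s) = theta_k (xi_k = theta_k)
    0, ((theta_gap * np.floor(s / theta_gap) - s[0]) / h).round().astype(int))

phi = np.zeros((len(s), m))
for it in range(15):
    integrand = f(phi) @ B.T + g(phi[gamma_idx]) @ C.T + v
    new_phi = np.zeros_like(phi)
    for i in range(len(s)):              # (Pi phi)(t_i) ~ int_{s_0}^{t_i} e^{A(t_i - s)}[...] ds
        decay = np.exp(-np.outer(s[i] - s[: i + 1], a))
        new_phi[i] = h * np.sum(decay * integrand[: i + 1], axis=0)
    print(it, np.max(np.abs(new_phi - phi)))  # sup-norm distance between iterates
    phi = new_phi
```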
The following assertion is needed in the proof of the stability of the solution.

Lemma 4 ([10]). Assume that the conditions (C1) and (C7) are fulfilled and $z(t)$ is a continuous function with $\|z\|_1 < H$. If $w(t)$ is a solution of the variational system
$$w'(t) = A w(t) + B \big[ f(w(t) + z(t)) - f(z(t)) \big] + C \big[ g(w(\gamma(t)) + z(\gamma(t))) - g(z(\gamma(t))) \big], \qquad (7)$$
then the inequality
$$\|w(\gamma(t))\| \le K \|w(t)\| \qquad (8)$$
holds for all $t \in \mathbb{R}$, where $K$ is as defined in (C6).
Proof. First, we fix an integer $i$ such that $t \in [\theta_i, \theta_{i+1})$ and then consider two alternative cases: (a) $\theta_i \le \xi_i \le t < \theta_{i+1}$ and (b) $\theta_i \le t < \xi_i < \theta_{i+1}$. For (a), $t \ge \xi_i$, and integrating (7) from $\xi_i$ to $t$, we can estimate $\|w(t)\|$ through $\|w(\xi_i)\|$. The Gronwall-Bellman lemma then yields an exponential bound on this interval. Moreover, a symmetric estimate for $t \in [\theta_i, \theta_{i+1})$ bounds $\|w(\xi_i)\|$ through $\|w(t)\|$. Consequently, it follows from the condition (C7) that $\|w(\xi_i)\| \le K \|w(t)\|$. The assertion for case (b) $\theta_i \le t < \xi_i < \theta_{i+1}$, $i \in \mathbb{Z}$, can be proven in the same way. Thus, one can conclude that (8) holds for all $t \in \mathbb{R}$. The lemma is proven.
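For reference, the version of the Gronwall-Bellman lemma invoked above and in the proof of Theorem 1 is the classical one with a constant kernel: if $u$ is continuous and nonnegative, $c \ge 0$, $k \ge 0$, and
$$u(t) \le c + \int_{t_0}^{t} k\, u(s)\, ds, \quad t \ge t_0,$$
then
$$u(t) \le c\, e^{k (t - t_0)}, \quad t \ge t_0.$$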

Theorem 1.
Assume that the conditions (C1)-(C8) hold true. If the function ϑ is unpredictable, then the system (1) has a unique exponentially stable unpredictable solution.
Proof. First, we show that $\Sigma_0$ is a complete space. Let $\phi_k(t)$ be a Cauchy sequence in the space $\Sigma_0$ with limit $\phi(t)$ on $\mathbb{R}$. It can easily be shown that the limit function $\phi(t)$ is uniformly continuous and bounded, and hence it satisfies the properties (A1) and (A2). It remains only to show that $\phi(t)$ satisfies the property (A3). Consider a closed and bounded interval $I \subset \mathbb{R}$. We have:
$$\|\phi(t + t_n) - \phi(t)\| \le \|\phi(t + t_n) - \phi_k(t + t_n)\| + \|\phi_k(t + t_n) - \phi_k(t)\| + \|\phi_k(t) - \phi(t)\|.$$
If one takes sufficiently large $n$ and $k$ such that each term on the right-hand side of the last inequality is less than $\varepsilon/3$ for a small enough $\varepsilon > 0$ and $t \in I$, then the inequality $\|\phi(t + t_n) - \phi(t)\| < \varepsilon$ is satisfied on $I$. This implies that the sequence $\phi(t + t_n)$ converges uniformly to $\phi(t)$ on $I$. Thus, the space $\Sigma_0$ is complete. Since $\Sigma_0$ is invariant under the operator $\Pi$ and $\Pi$ is contractive, according to Lemmas 2 and 3, respectively, it follows from the contraction mapping theorem that the operator $\Pi$ has a unique fixed point $z(t) \in \Sigma_0$, which is the unique solution of the neural network system (1) in $\Sigma_0$. Hence, the existence and uniqueness of the solution are shown.
Next, we verify that this solution is unpredictable. We can find a positive number $\kappa$ and numbers $l, k \in \mathbb{N}$ such that the inequalities (10) and (11) are satisfied. Suppose that the numbers $\kappa$, $l$, $k$, and $n \in \mathbb{N}$ are fixed. Denote
$$\Delta = \|z(u_n + t_n) - z(u_n)\|$$
and consider the cases: (i) $\Delta \ge \varepsilon_0 / l$, (ii) $\Delta < \varepsilon_0 / l$. (i) If $\Delta \ge \varepsilon_0 / l$ holds, then the uniform continuity of $z$ provides the required lower bound for $\|z(t + t_n) - z(t)\|$ on an interval around $u_n$. (ii) If $\Delta < \varepsilon_0 / l$ is true, it follows from (11) that the corresponding lower estimate holds for $t \in [u_n, u_n + \kappa]$. We can see that:
$$z(t) = z(u_n) + \int_{u_n}^{t} \big[ A z(s) + B f(z(s)) + C g(z(\gamma(s))) + \vartheta(s) \big] \, ds$$
and:
$$z(t + t_n) = z(u_n + t_n) + \int_{u_n}^{t} \big[ A z(s + t_n) + B f(z(s + t_n)) + C g(z(\gamma(s + t_n))) + \vartheta(s + t_n) \big] \, ds.$$
Subtracting the first equation from the second one, we get:
$$z(t + t_n) - z(t) = z(u_n + t_n) - z(u_n) + \int_{u_n}^{t} \big[ A (z(s + t_n) - z(s)) + B (f(z(s + t_n)) - f(z(s))) + C (g(z(\gamma(s + t_n))) - g(z(\gamma(s)))) + (\vartheta(s + t_n) - \vartheta(s)) \big] \, ds.$$
Therefore, taking norms and using the lower bound for $\|\vartheta(s + t_n) - \vartheta(s)\|$, we obtain an estimate that contains the positive term $\frac{\kappa}{2} \varepsilon_0$ alongside integrals of $\|z(s + t_n) - z(s)\|$ and $\|z(\gamma(s + t_n)) - z(\gamma(s))\|$, valid for $t \in [u_n + \frac{\kappa}{2}, u_n + \kappa]$. For a fixed $t \in [u_n + \frac{\kappa}{2}, u_n + \kappa]$, we can take $\kappa$ sufficiently small so that $\theta_{i - \eta_n} + t_n \le u_n < u_n + \frac{\kappa}{2} \le t \le u_n + \kappa < \theta_{i+1}$ for some $i \in \mathbb{Z}$. Hence, $\gamma(t) = \xi_i$ for $t \in [u_n + \frac{\kappa}{2}, u_n + \kappa]$, which implies together with the condition (C8) that $\gamma(t + t_n) = \xi_{i + \eta_n}$. Since $z(t) \in \Sigma_0$, the function $z$ is uniformly continuous. Using this fact, for $\varepsilon_0 > 0$ and for large $n$, we can find a $\rho > 0$ such that $\|z(\xi_{i + \eta_n}) - z(\xi_i + t_n)\|$ is small whenever $|\xi_{i + \eta_n} - \xi_i - t_n| < \rho$. At the end, we have by the inequality (10) that $\|z(t + t_n) - z(t)\|$ admits a positive lower bound for $t \in [u_n + \frac{\kappa}{2}, u_n + \kappa]$. Based on the inequalities obtained in cases (i) and (ii), we see that the solution $z(t)$ is unpredictable with the sequences $\bar{u}_n = u_n + \frac{3\kappa}{4}$ and $\bar{\delta} = \frac{\kappa}{4}$. Lastly, let us consider the stability of the solution $z(t)$. Denote $w(t) = y(t) - z(t)$, where $y(t) = \mathrm{colon}(y_1(t), y_2(t), \dots, y_m(t))$ is another solution of the neural network system (1) with the piecewise constant argument of generalized type. Then, $w(t) = \mathrm{colon}(w_1(t), w_2(t), \dots, w_m(t))$ is a solution of (7).
We have that:
$$w(t) = e^{A(t - t_0)} w(t_0) + \int_{t_0}^{t} e^{A(t - s)} \big[ B (f(w(s) + z(s)) - f(z(s))) + C (g(w(\gamma(s)) + z(\gamma(s))) - g(z(\gamma(s)))) \big] \, ds. \qquad (12)$$
By applying the inequality (8) to (12) and using the Lipschitz properties of $f$ and $g$, we obtain a bound on $\|w(t)\|$ in terms of $\|w(t_0)\|$ and an integral of $\|w(s)\|$. Hence, multiplying through by $e^{\lambda t}$, the last inequality can be written as follows:
$$e^{\lambda t} \|w(t)\| \le e^{\lambda t_0} \|w(t_0)\| + \int_{t_0}^{t} k\, e^{\lambda s} \|w(s)\| \, ds,$$
where $k$ is the constant composed of the Lipschitz constants of $f$ and $g$, the norms of $B$ and $C$, and the constant $K$.
If we apply the Gronwall-Bellman lemma to the last inequality, it leads to $e^{\lambda t} \|w(t)\| \le e^{\lambda t_0} \|w(t_0)\|\, e^{k (t - t_0)}$. In other words, we have:
$$\|w(t)\| \le \|w(t_0)\|\, e^{-(\lambda - k)(t - t_0)}, \quad t \ge t_0.$$
Now, based on the condition (C6), we conclude that the solution $z(t)$ of (1) is uniformly exponentially stable. The theorem is proven.
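The exponential convergence asserted by the theorem is easy to observe numerically: two solutions of the placeholder system sketched in the Preliminaries, started from different initial states, merge at an exponential rate. A minimal check under the same illustrative parameters as before:

```python
import numpy as np

# Numerical illustration of exponential stability: two trajectories of the
# placeholder system (2) (same sketch parameters as above, xi_k = theta_k)
# started from different initial states; the sup-norm gap decays exponentially.

rng = np.random.default_rng(0)
m = 3
a = np.array([2.0, 2.5, 3.0])
A = -np.diag(a)
B = 0.1 * rng.standard_normal((m, m))
C = 0.1 * rng.standard_normal((m, m))
f = g = np.tanh

def v(t):
    return 0.5 * np.sin(np.array([1.0, 2.0, 3.0]) * t)

h, T, theta_gap = 1e-3, 10.0, 0.5
x, y = np.zeros(m), np.ones(m)           # two different initial states
x_g, y_g, k_prev = x.copy(), y.copy(), -1
for t in np.arange(0.0, T, h):
    k = int(t // theta_gap)
    if k != k_prev:
        x_g, y_g, k_prev = x.copy(), y.copy(), k
    x = x + h * (A @ x + B @ f(x) + C @ g(x_g) + v(t))
    y = y + h * (A @ y + B @ f(y) + C @ g(y_g) + v(t))
    if abs(t - round(t)) < h / 2:
        print(f"t = {t:4.1f}, gap = {np.max(np.abs(y - x)):.3e}")  # ~ e^{-(lambda-k)t}
```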

Examples and Numerical Simulations
We present two examples in this section. First, we construct an example of an unpredictable function by means of the logistic map considered in [29]. Then, we make use of this function in the second example, which deals with a Hopfield-type neural network system.

Example 1. Consider the discrete logistic map given by:
$$\lambda_{i+1} = \mu \lambda_i (1 - \lambda_i), \qquad (13)$$
where $i \in \mathbb{Z}$. We know that if $\mu \in (0, 4]$, then the interval $[0, 1]$ is invariant under this map, so iterations started there remain in $[0, 1]$ [54]. Moreover, if $\mu \in [3 + (2/3)^{1/2}, 4]$, Equation (13) has an unpredictable solution. Let $\Psi_i$, $i \in \mathbb{Z}$, denote an unpredictable solution of (13) for $\mu = 3.93$. There exist a positive number $\varepsilon_0$ and sequences $p_n$, $q_n$ that diverge to infinity such that $|\Psi_{i + p_n} - \Psi_i| \to 0$ as $n \to \infty$ for each $i$ in bounded intervals of integers and $|\Psi_{p_n + q_n} - \Psi_{q_n}| \ge \varepsilon_0$ for each $n \in \mathbb{N}$.
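A trajectory of (13) can be generated directly; a minimal sketch (the seed is an arbitrary point of (0, 1), since any such choice keeps the iterates inside the invariant interval [0, 1]):

```python
import numpy as np

# Iterates of the logistic map (13) for mu = 3.93; the interval [0, 1] is
# invariant for mu in (0, 4], which the assert below confirms numerically.
mu = 3.93
psi = np.empty(200)
psi[0] = 0.4                               # arbitrary seed in (0, 1)
for i in range(1, len(psi)):
    psi[i] = mu * psi[i - 1] * (1.0 - psi[i - 1])
assert np.all((psi >= 0.0) & (psi <= 1.0))
```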
Consider the integral
$$\Theta(t) = \int_{-\infty}^{t} e^{-4 (t - s)} \Omega(s) \, ds,$$
where $\Omega(t) = \Psi_i$ for $t \in [i, i+1)$, $i \in \mathbb{Z}$. It is worth noting that $\Theta(t)$ is bounded on the whole real axis such that $\sup_{t \in \mathbb{R}} |\Theta(t)| \le 1/4$. In [37], it was proven that the function $\Theta(t)$ is an unpredictable function.
Since we do not know the initial value of the function $\Theta(t)$, its direct simulation and visualization are not possible. For this reason, we represent the function $\Theta(t)$ as follows:
$$\Theta(t) = e^{-4t} \Theta_0 + \int_{0}^{t} e^{-4(t - s)} \Omega(s) \, ds, \qquad (14)$$
where $\Theta_0 = \int_{-\infty}^{0} e^{4s} \Omega(s) \, ds$, a value that is likewise not known exactly.
That is why we simulate a function $\Phi(t)$ that approaches the function $\Theta(t)$ as time increases. Let us determine:
$$\Phi(t) = e^{-4t} \Phi_0 + \int_{0}^{t} e^{-4(t - s)} \Omega(s) \, ds, \qquad (15)$$
where $\Phi_0$ is a fixed number, which is not necessarily equal to $\Theta_0$. Subtracting equality (15) from equality (14), we obtain
$$\Theta(t) - \Phi(t) = e^{-4t} (\Theta_0 - \Phi_0).$$
The last equation demonstrates that the difference $\Theta(t) - \Phi(t)$ diminishes exponentially. This means that the function $\Phi(t)$ tends exponentially to the unpredictable function $\Theta(t)$; i.e., the graphs of these functions approach each other as time increases. The functions $\Phi(t)$ and $\Theta(t)$ are solutions of the differential equation
$$u'(t) = -4 u(t) + \Omega(t),$$
and instead of the curve describing the unpredictable solution $\Theta(t)$, we can take the graph of $\Phi(t)$, which approximates the former asymptotically. In Figure 2, we depict the graph defined by the initial value $\Phi(0) = 0.45$.
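Because the right-hand side is piecewise constant, $\Phi(t)$ can be computed exactly interval by interval: on $[i, i+1)$, the equation $u' = -4u + \Psi_i$ gives $\Phi(t) = \Psi_i/4 + (\Phi(i) - \Psi_i/4)\, e^{-4(t - i)}$. A minimal sketch (the seed of the logistic trajectory is arbitrary; $\Phi(0) = 0.45$ as in Figure 2):

```python
import numpy as np

# Exact piecewise-exponential computation of Phi(t) from (15). On [i, i+1) the
# equation u' = -4u + Psi_i has the closed-form solution written below, so no
# ODE solver is needed. The logistic seed is arbitrary in (0, 1).

mu, n = 3.93, 60
psi = np.empty(n)
psi[0] = 0.4                              # arbitrary seed for the trajectory Psi_i
for i in range(1, n):
    psi[i] = mu * psi[i - 1] * (1.0 - psi[i - 1])

pts = 100                                 # samples per unit interval
t = np.linspace(0.0, n, n * pts, endpoint=False)
phi = np.empty_like(t)
u = 0.45                                  # Phi(0) = 0.45, as in Figure 2
for i in range(n):
    seg = slice(i * pts, (i + 1) * pts)
    phi[seg] = psi[i] / 4 + (u - psi[i] / 4) * np.exp(-4.0 * (t[seg] - i))
    u = psi[i] / 4 + (u - psi[i] / 4) * np.exp(-4.0)  # Phi(i + 1)
# phi now approaches Theta(t) up to the transient e^{-4t}(Theta_0 - Phi_0)
```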

Example 2.
Consider a Hopfield-type neural network of the form (1), referred to below as system (16), with three neurons and inputs built from the unpredictable function $\Theta(t)$ mentioned in Example 1.
The graph of the function ψ(t) approaches the unpredictable solution x(t) of Equation (16) as time increases. That is, instead of the curve describing the unpredictable solution, one can consider the graph of ψ(t). We present the coordinates of the solution ψ(t) in Figure 3. Moreover, Figure 4 shows the trajectory of the solution. Further, we describe a circuit implementation of the proposed Hopfield-type neural network (16) using MATLAB Simulink. The Simulink model of the Hopfield-type neural network is depicted in Figure 5, and the symbols are described in Table 2. In the block diagram, we took the hyperbolic tangent transfer function as the sigmoid functions f and g. To implement the block diagram, we used the transfer function "tansig" from the MATLAB Simulink library. The inputs ϑ₁(t), ϑ₂(t), ϑ₃(t) are unpredictable functions.

Table 2. Characteristics of elements of the block diagram in Figure 5.

Symbols (graphical) — Description:
- Integrator block
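As a side note, the "tansig" block is algebraically identical to the hyperbolic tangent, tansig(n) = 2/(1 + e^{-2n}) - 1 = tanh(n), so the Simulink model and a plain tanh-based simulation realize the same activations. A quick numerical confirmation:

```python
import numpy as np

# Check that MATLAB's tansig transfer function coincides with tanh:
# tansig(n) = 2 / (1 + exp(-2 n)) - 1 = tanh(n).
n = np.linspace(-5.0, 5.0, 1001)
tansig = 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0
assert np.allclose(tansig, np.tanh(n))
```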

Conflicts of Interest:
The authors declare no conflict of interest.