# On Optimization Techniques for the Construction of an Exponential Estimate for Delayed Recurrent Neural Networks


## Abstract


## 1. Introduction

## 2. Exponential Estimate

- (a) convexity: for any ${P}_{1}\in {\Omega}_{n}$, ${P}_{2}\in {\Omega}_{n}$, $x\in {\mathbb{R}}^{n}$, and $\xi \in [0,1]$ we have ${x}^{\top}(\xi {P}_{1}+(1-\xi ){P}_{2})x=\xi {x}^{\top}{P}_{1}x+(1-\xi ){x}^{\top}{P}_{2}x>0$;
- (b) cone: for any $P\in {\Omega}_{n}$, $x\in {\mathbb{R}}^{n}$, and $\eta >0$ we have $\eta {x}^{\top}Px>0$.
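Assuming ${\Omega}_{n}$ denotes the set of symmetric positive definite $n\times n$ matrices (consistent with the quadratic-form conditions above; the notation is not restated in the surviving text), both properties can be verified numerically on random instances:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

def random_spd(n):
    # A @ A.T is positive semidefinite; adding I makes it positive definite
    A = rng.standard_normal((n, n))
    return A @ A.T + np.eye(n)

P1, P2 = random_spd(n), random_spd(n)
x = rng.standard_normal(n)
xi, eta = 0.3, 2.5  # xi in [0, 1], eta > 0

# (a) convexity: x^T (xi*P1 + (1-xi)*P2) x = xi*x^T P1 x + (1-xi)*x^T P2 x > 0
lhs = x @ (xi * P1 + (1 - xi) * P2) @ x
rhs = xi * (x @ P1 @ x) + (1 - xi) * (x @ P2 @ x)
assert np.isclose(lhs, rhs) and lhs > 0

# (b) cone: eta * x^T P x > 0 for any eta > 0
assert eta * (x @ P1 @ x) > 0
```

The equality in (a) is just linearity of the quadratic form in $P$; positivity follows because a convex combination of positive definite matrices is again positive definite.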

**Lemma 1.**

**Lemma 2.**

**Definition 1.**

**Definition 2.**

**Theorem 1.**

**Proof.**

**Corollary 1.**

**Proof.**

## 3. Optimization Method

**Definition 3.**

**Definition 4.**

**Lemma 3.**

**Proof.**

**Theorem 2.**

**Proof.**

## 4. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Abbreviations

| Abbreviation | Definition |
|---|---|
| LSTM | Long Short-Term Memory |
| GRU | Gated Recurrent Unit |
| RNN | Recurrent Neural Network |
| DDE | Delay Differential Equation |


Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Martsenyuk, V.; Rajba, S.; Karpinski, M.
On Optimization Techniques for the Construction of an Exponential Estimate for Delayed Recurrent Neural Networks. *Symmetry* **2020**, *12*, 1731.
https://doi.org/10.3390/sym12101731
