93D05 Lyapunov and other classical stabilities (Lagrange, Poisson, Lp; lp, etc.)
In this paper we study a particular class of \(n\)-node recurrent neural networks (RNNs). In the \(3\)-node case we use monotone dynamical systems theory to show, for a well-defined set of parameters, that, generically, every orbit of the RNN is asymptotic to a periodic orbit. Then, within the usual 'learning' context of neural networks, we investigate whether RNNs of this class can adapt their internal parameters so as to 'learn' and then replicate autonomously certain external periodic signals. Our learning algorithm is similar to identification algorithms in adaptive control theory. The main feature of the adaptation algorithm is that global exponential convergence of parameters is guaranteed. We also obtain partial convergence results in the \(n\)-node case.
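The kind of dynamics discussed above can be illustrated with a minimal sketch: a standard additive \(3\)-node RNN \(\dot{x} = -x + W\tanh(x) + b\) integrated with forward Euler. The coupling matrix \(W\), the bias \(b\), and the initial condition are hypothetical values chosen for demonstration only; they are not the parameter set analysed in the paper.

```python
import numpy as np

# Illustrative 3-node additive RNN: dx/dt = -x + W @ tanh(x) + b.
# W is a hypothetical cyclic (anti-symmetric) coupling chosen to
# produce oscillatory orbits; b is set to zero for simplicity.
W = np.array([[0.0, -2.0,  0.0],
              [2.0,  0.0, -2.0],
              [0.0,  2.0,  0.0]])
b = np.zeros(3)

def simulate(x0, dt=0.01, steps=20000):
    """Integrate the RNN with forward Euler; return the full trajectory."""
    x = np.array(x0, dtype=float)
    traj = np.empty((steps, 3))
    for k in range(steps):
        x = x + dt * (-x + W @ np.tanh(x) + b)
        traj[k] = x
    return traj

traj = simulate([0.5, -0.3, 0.1])
```

Because \(\tanh\) is bounded, every orbit of this model eventually enters a fixed bounded region, which is the setting in which the asymptotic behaviour (convergence to equilibria or periodic orbits) is studied.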
Nonlinear stochastic dynamical systems, in the form of ordinary stochastic differential equations and stochastic difference methods, are at the centre of this presentation, viewed through the asymptotic behaviour of their moments. We study the exponential p-th mean growth behaviour of their solutions as the integration time tends to infinity. For this purpose, the concepts of nonlinear contractivity and stability exponents for moments are introduced as generalizations of the well-known moment Lyapunov exponents of linear systems. Under appropriate monotonicity assumptions we obtain uniform estimates of these exponents from above and below. Finally, these concepts are generalized to describe the exponential growth behaviour along certain Lyapunov-type functionals.
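A minimal sketch of what a p-th mean growth exponent measures (not the paper's method): for scalar geometric Brownian motion \(dX = aX\,dt + \sigma X\,dW\), the moment Lyapunov exponent is known in closed form, \(g(p) = pa + \tfrac{1}{2}p(p-1)\sigma^2\), so a Monte Carlo Euler–Maruyama estimate of the p-th moment growth rate can be checked against it. The coefficients below are hypothetical.

```python
import numpy as np

# Geometric Brownian motion dX = a*X dt + sigma*X dW satisfies
# E[X_t^p] = X_0^p * exp(g(p)*t) with g(p) = p*a + p*(p-1)*sigma^2/2.
a, sigma, p = -0.5, 0.3, 2.0
g_exact = p * a + 0.5 * p * (p - 1) * sigma**2

rng = np.random.default_rng(0)
dt, steps, paths = 1e-3, 2000, 20000
T = dt * steps

# Euler-Maruyama simulation of many independent sample paths.
X = np.ones(paths)
for _ in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=paths)
    X = X + a * X * dt + sigma * X * dW

# Estimate g(p) from the sample p-th moment at the final time T.
g_est = np.log(np.mean(np.abs(X) ** p)) / T
```

Here \(g(2) = -0.91 < 0\), so the second moment decays exponentially; the Monte Carlo estimate should recover this rate up to discretization and sampling error.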