Local uniform stability of competitive neural networks with different time-scales under vanishing perturbations
Introduction
Biologically relevant neural networks represent large-scale, multi-time-scale nonlinear dynamical systems capturing both the neural activity and the synaptic changes. These systems form the basis of every cognitive task, and their complex dynamical behavior has been rigorously analyzed mathematically. In [12], K-monotone theory and singular perturbation theory have been used to prove the existence and local stability of equilibria. In [17], [13], [11], the bound on the fast time-scale has been computed explicitly by using the methods of [19]. In [14], [15], flow-invariance and Lyapunov's indirect method have been used to prove global exponential stability without resorting to singular perturbation theory. In [9], the results of [14] have been improved. Nonsmooth analysis techniques form the basis for the proof of existence and uniqueness of the equilibrium of a multi-time-scale neural network and for the global exponential stability of this unique equilibrium in [7]. Improved stability conditions with and without time-varying delays are given in [9]. In [10], a time delay is introduced to characterize transmission delays in biological neural systems, and global exponential stability is shown based on nonsmooth analysis techniques.
On the other hand, biological networks exhibit a certain degree of uncertainty and undergo many parametric perturbations. Thus, it is imperative to understand the dynamical behavior of the neural network, in which fluctuating activation functions and synaptic weights can lead to spurious states and instabilities. Despite this more realistic view, most analyses have focused on the dynamical behavior of the activity-state equations only. In [22], [3], criteria for the robust stability of Hopfield-type neural networks of continuous and discrete type were determined. Norm-bounded parameter uncertainties in both the feedback and delayed feedback matrices are considered for delayed cellular neural networks in [23], where the uniqueness of the equilibrium and its global asymptotic stability are proven. Nonlinear perturbations of the current and delayed state are considered in [6], and, based on the Lyapunov method, a sufficient delay-dependent criterion for asymptotic stability is derived in terms of a linear matrix inequality. Additional aspects, such as perturbations of the time-constant matrix and of the lateral inhibition matrix, are considered in [4], [5], and novel stability conditions are derived. Perturbations in bidirectional associative memory (BAM) neural networks and their exponential stability have attracted considerable attention and have been investigated in [2], [1].
Advances in the mathematical theory of uncertain nonlinear singularly perturbed systems [18], [20] have recently led to new stability results for parametrically uncertain multi-time-scale neural networks. The robust stability of competitive neural networks with short- and long-term dynamics under perturbations was analyzed in [16]. The derived stability conditions are very restrictive and depend on the chosen bounds for the nonlinear uncertainties. A sufficient condition for the time-scale associated with the short-term memory state was given.
In the present paper, we will derive less restrictive stability conditions for perturbations of the output functions and lateral interconnection weights that can be easily interpreted in terms of the network's architecture. We will show that a time-scale bound can be found even in cases where the method described in [16] yields no estimate.
Section snippets
Problem statement
The general neural network equations describing the temporal evolution of the unperturbed STM and LTM states for the i-th neuron of an N-neuron network are

ε ẋ_i = -a_i x_i + Σ_{j=1}^{N} D_{ij} f(x_j) + B_i S_i,   (STM)

ṁ_{ij} = -m_{ij} + y_j f(x_i),   (LTM)

where x_i is the current activity level, a_i is the time constant of the neuron, B_i S_i is the contribution of the external stimulus term with S_i = Σ_j m_{ij} y_j, f(x_i) is the neuron's output, y_j is the constant external stimulus, and m_{ij} is the synaptic efficiency. ε is the fast time-scale associated with the STM state.
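As a purely illustrative sketch of these dynamics (the tanh output nonlinearity, all numerical parameter values, and the forward-Euler integration below are assumptions, not taken from the paper), the two-time-scale STM/LTM system can be simulated as follows.

import numpy as np

# Minimal simulation sketch of the two-time-scale competitive network.
# Assumed (standard) form:
#   eps * dx_i/dt = -a_i x_i + sum_j D_ij f(x_j) + B_i S_i,  S_i = sum_j m_ij y_j   (STM)
#         dm_ij/dt = -m_ij + y_j f(x_i)                                              (LTM)
# All numerical values are hypothetical and chosen only for illustration.

def f(x):
    # assumed sigmoidal output nonlinearity
    return np.tanh(x)

def simulate(eps=0.05, T=20.0, dt=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    n, p = 3, 2                                   # 3 neurons, 2-dimensional stimulus
    a = np.array([1.0, 1.2, 0.9])                 # neuron time constants
    B = np.array([1.0, 1.0, 1.0])                 # stimulus gains
    D = np.diag([0.5, 0.5, 0.5]) - 0.2            # lateral interconnection weights
    y = np.array([0.6, 0.8])                      # constant external stimulus, ||y|| = 1
    x = 0.1 * rng.standard_normal(n)              # STM state: activity levels
    m = 0.1 * rng.standard_normal((n, p))         # LTM state: synaptic efficiencies
    for _ in range(int(T / dt)):                  # forward-Euler integration
        S = m @ y                                 # stimulus contribution per neuron
        dx = (-a * x + D @ f(x) + B * S) / eps    # fast STM dynamics
        dm = -m + np.outer(f(x), y)               # slow LTM dynamics (rank-one drive)
        x, m = x + dt * dx, m + dt * dm
    return x, m

x_final, m_final = simulate()
print("final activity levels:", x_final)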
Reduction
Since the nonlinearity in the second equation is only a rank-one matrix, we can use the vec notation and the Kronecker product for matrices, together with the relations between them [8], to simplify the equations further. Introducing the variable S_i = Σ_j m_{ij} y_j, (2) is transformed into Ṡ_i = -S_i + ‖y‖² f(x_i). By introducing this dynamic variable, the system is brought into the standard singularly perturbed form.
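A quick numerical check of this reduction is sketched below; it assumes the rank-one LTM form ṁ_ij = -m_ij + y_j f(x_i) stated above and verifies that S_i = Σ_j m_ij y_j then satisfies Ṡ_i = -S_i + ‖y‖² f(x_i). The numerical values are purely illustrative.

import numpy as np

# Sketch of the LTM reduction: with S_i = sum_j m_ij y_j and the assumed LTM law
# dm_ij/dt = -m_ij + y_j f(x_i), one obtains dS_i/dt = -S_i + ||y||^2 f(x_i),
# i.e. one scalar equation per neuron.
rng = np.random.default_rng(1)
n, p = 3, 2
y = np.array([0.6, 0.8])                     # normalized stimulus, ||y||^2 = 1
x = rng.standard_normal(n)
m = rng.standard_normal((n, p))
f = np.tanh

dm = -m + np.outer(f(x), y)                  # right-hand side of the LTM equation
dS_from_m = dm @ y                           # time derivative of S = m y
dS_reduced = -(m @ y) + np.dot(y, y) * f(x)  # right-hand side of the reduced equation
print(np.allclose(dS_from_m, dS_reduced))    # True: the reduction is exact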
Robust stability analysis
We first analyze the stability of the equilibrium using the method of [18]. We consider perturbations of the feedback matrices and of the output functions.
It is well known that the stability of a linear singularly perturbed system can be inferred from the stability of its slow and fast subsystems if ε is small [21]. In [18], a similar result has been proven for linear systems with vanishing perturbations by using the indirect Lyapunov approach. We will apply this result to analyze the perturbed neural system.
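The sketch below illustrates this classical two-time-scale argument on a hypothetical linear system: the reduced (slow) matrix and the boundary-layer (fast) matrix are checked to be Hurwitz, and the full system is then verified to be Hurwitz as ε decreases. All matrices are assumptions chosen for illustration only.

import numpy as np

# Two-time-scale stability check for a linear singularly perturbed system
#   dx/dt = A11 x + A12 z,   eps * dz/dt = A21 x + A22 z.
# If the reduced matrix A0 = A11 - A12 A22^{-1} A21 and the boundary-layer
# matrix A22 are both Hurwitz, the full system is Hurwitz for small eps.
A11 = np.array([[-1.0, 0.5], [0.2, -2.0]])
A12 = np.array([[0.3, 0.0], [0.1, 0.4]])
A21 = np.array([[0.2, 0.1], [0.0, 0.3]])
A22 = np.array([[-3.0, 0.5], [0.2, -4.0]])

hurwitz = lambda M: np.max(np.linalg.eigvals(M).real) < 0
A0 = A11 - A12 @ np.linalg.solve(A22, A21)        # reduced (slow) system matrix
print("reduced system Hurwitz:       ", hurwitz(A0))
print("boundary-layer system Hurwitz:", hurwitz(A22))

for eps in (0.5, 0.1, 0.01):
    A_full = np.block([[A11, A12], [A21 / eps, A22 / eps]])  # composite (x, z) dynamics
    print(f"eps = {eps:4.2f}: full system Hurwitz:", hurwitz(A_full))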
Example
Note that finding an example which is neither stable nor unstable for all ε is not trivial. For the case of one neuron, there is no bound on ε: if the reduced system and the boundary-layer system are stable, then the full system is stable for all ε > 0.

Remark 6. Applying Gershgorin's Theorem to the system matrix, we obtain conditions on the network parameters under which the linear system is stable for all ε, and parameter values for which it is unstable.
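Since the specific system matrix of Remark 6 is not reproduced here, the sketch below only illustrates the generic Gershgorin-type sufficient condition used in such arguments, applied to a hypothetical matrix.

import numpy as np

# Gershgorin-type sufficient condition: every eigenvalue of A lies in a disc
# centered at A[i, i] with radius equal to the sum of the off-diagonal
# magnitudes in row i.  If every disc lies strictly in the open left
# half-plane, A is Hurwitz.  The matrix below is hypothetical.
def gershgorin_hurwitz(A):
    A = np.asarray(A, dtype=float)
    centers = np.diag(A)
    radii = np.abs(A).sum(axis=1) - np.abs(centers)
    return bool(np.all(centers + radii < 0))      # sufficient, not necessary

A = np.array([[-2.0, 0.5, 0.3],
              [ 0.4, -3.0, 0.6],
              [ 0.2, 0.5, -1.5]])
print("Gershgorin certifies Hurwitz:", gershgorin_hurwitz(A))
print("actual spectral abscissa:    ", np.max(np.linalg.eigvals(A).real))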
Comparisons
In this section, we compare various stability results for competitive neural networks with different time scales under norm-bounded parameter uncertainties and discuss stability aspects of other neural systems that model the activity states only.
In [16] we prove the asymptotic stability of the multi-time-scale uncertain neural system by analyzing the neural network as a nominal linear system subject to norm-bounded perturbations describing the output functions and the weights. We derive the necessary stability conditions and the bound on the fast time-scale.
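As a hedged illustration of the norm-bounded philosophy compared here (not the exact derivation of [16]), the sketch below uses the classical Lyapunov-based robustness radius 1/(2‖P‖₂) for a hypothetical Hurwitz nominal matrix and confirms by sampling that perturbations within this radius preserve stability.

import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Classical norm-bounded robustness bound: for a Hurwitz nominal matrix A with
# A^T P + P A = -I, the perturbed matrix A + Delta remains Hurwitz whenever
# ||Delta||_2 < 1 / (2 ||P||_2).  Nominal matrix and perturbations are
# purely illustrative.
A = np.array([[-2.0, 0.5],
              [ 0.3, -1.5]])
P = solve_continuous_lyapunov(A.T, -np.eye(2))   # solves A^T P + P A = -I
radius = 1.0 / (2.0 * np.linalg.norm(P, 2))      # guaranteed robustness radius
print("guaranteed perturbation radius:", radius)

rng = np.random.default_rng(0)
all_stable = True
for _ in range(1000):
    D = rng.standard_normal((2, 2))
    D *= 0.99 * radius / np.linalg.norm(D, 2)    # perturbation just inside the bound
    all_stable &= np.max(np.linalg.eigvals(A + D).real) < 0
print("all sampled perturbations stable:", all_stable)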
Conclusion
We demonstrated local uniform stability for a class of unsupervised competitive neural networks describing coupled perturbed activity and synaptic dynamics. We assumed that the parametric uncertainties of the neuron's output function and of the lateral inhibition terms are bounded. Based on Gershgorin's Theorem, we were able to derive simple stability conditions in terms of the network's architecture and to find an improved time-scale bound associated with the short-term memory state compared with the one obtained based on [16].
References (23)
- et al., Delay-dependent stability for uncertain cellular neural networks with discrete and distributed time-varying delays, Journal of the Franklin Institute (2008)
- et al., New delay-dependent robust stability criterion for uncertain neural networks with time-varying delays, Applied Mathematics and Computation (2008)
- et al., On robust stability criterion for dynamic systems with time-varying delays and nonlinear perturbations, Applied Mathematics and Computation (2008)
- et al., Global exponential stability of delayed competitive neural networks with different time scales, Neural Networks (2005)
- et al., Robustness and perturbation analysis of a class of artificial neural networks, Neural Networks (1994)
- et al., Global robust exponential stability of interval general BAM neural network with delays, Neural Processing Letters (2006)
- et al., Global robust exponential stability of interval BAM neural network with mixed delays under uncertainty, Neural Processing Letters (2007)
- et al., Robustness analysis of a class of discrete-time recurrent neural networks under perturbations, IEEE Transactions on Circuits and Systems I (1999)
- et al., Global exponential stability of multitime-scale competitive neural networks with nonsmooth functions, IEEE Transactions on Neural Networks (2006)
- Matrix Analysis for Scientists and Engineers (2005)
- Global exponential convergence of multitime-scale neural networks, IEEE Transactions on Circuits and Systems II
Anke Meyer-Baese is Associate Professor in the Department of Scientific Computing at Florida State University. Her research areas include the theory and application of neural networks, medical image processing, pattern recognition, and parallel processing. She has published over 150 papers in areas ranging from intelligent systems, medical image processing, and speech recognition to neural networks. She is the author of the book "Pattern Recognition in Medical Imaging", which appeared with Elsevier/Academic Press in 2003, and of "Biomedical Signal Processing: Advanced Methods and Applications" with MIT Press, 2010.
Rodney G. Roberts received B.S. degrees in Electrical Engineering and Mathematics from Rose-Hulman Institute of Technology in 1987 and an MSEE and a Ph.D. in Electrical Engineering from Purdue University in 1988 and 1992, respectively. From 1992 until 1994, he was a National Research Council Fellow at Wright-Patterson Air Force Base in Dayton, Ohio. Since 1994 he has been at the Florida A&M University—Florida State University College of Engineering, where he is currently a Professor of Electrical and Computer Engineering. His research interests are in the areas of robotics and image processing.
Vera Thuemmler received her Ph.D. in Mathematics from Bielefeld University, Germany, in 2005. She won the Bielefeld University Dissertation Award in 2006 for her thesis “Numerical analysis of the method of freezing traveling waves”. Her research interests are mainly in the numerical analysis of dynamical systems and applications to neural networks and systems biology.