Research Article  Open Access
New Improved Exponential Stability Criteria for Discrete-Time Neural Networks with Time-Varying Delay
Abstract
The robust stability of uncertain discrete-time recurrent neural networks with time-varying delay is investigated. By decomposing some connection weight matrices, new Lyapunov-Krasovskii functionals are constructed, and a series of new, improved stability criteria is derived. These criteria are formulated in the form of linear matrix inequalities (LMIs). Compared with some previous results, the new results are less conservative. Three numerical examples are provided to demonstrate the reduced conservatism and the effectiveness of the proposed method.
1. Introduction
In recent years, recurrent neural networks (see [1–7]), such as Hopfield neural networks and cellular neural networks, have been widely investigated and successfully applied in many areas of science, such as pattern recognition, image processing, and fixed-point computation. However, because of the finite switching speed of neurons and amplifiers, time delay is unavoidable in nature and technology, and it can have important effects on the stability of dynamic systems. Thus, studies on stability are of great significance. There has been growing research interest in the stability analysis problems for delayed neural networks, and many excellent papers and monographs are available. On the other hand, during the design of a neural network and its hardware implementation, the convergence of the network may often be destroyed by its unavoidable uncertainty due to the existence of modeling error, the deviation of vital data, and so on. Therefore, the study of robust convergence of delayed neural networks has become a hot research direction. Up to now, many sufficient conditions, either delay-dependent or delay-independent, have been proposed to guarantee global robust asymptotic or exponential stability for different classes of delayed neural networks (see [8–13]).
It is worth pointing out that most neural networks have been assumed to operate in continuous time, but few in discrete time. In practice, discrete-time neural networks are more applicable to problems that are inherently temporal in nature or related to biological realities, and they can ideally preserve the dynamic characteristics, functional similarity, and even the physical or biological reality of the continuous-time networks under mild restrictions. Thus, the stability analysis problem for discrete-time neural networks has received more and more interest, and some stability criteria have been proposed in the literature (see [14–25]). In [14], Liu et al. studied a class of discrete-time RNNs with time-varying delay and proposed a delay-dependent condition guaranteeing global exponential stability. By using a technique similar to that in [21], the result obtained in [14] was improved by Song and Wang in [15]. The results in [15] were further improved by Zhang et al. in [16] by introducing some useful terms. In [17], Yu et al. proposed a result less conservative than that obtained in [16] by constructing a new augmented Lyapunov-Krasovskii functional.
In this paper, the connection weight matrix is decomposed, and some new Lyapunov-Krasovskii functionals are constructed. Combined with the linear matrix inequality (LMI) technique, a series of new, improved stability criteria is derived. Numerical examples show that these new criteria are less conservative than those obtained in [14–17].
Notation 1. The following notation is used throughout the paper except where otherwise specified. $\|\cdot\|$ denotes a vector or matrix norm; $\mathbb{R}$ and $\mathbb{R}^n$ are the sets of real numbers and $n$-dimensional real vectors, respectively; $\mathbb{N}$ is the set of nonnegative integers; $I$ is the identity matrix; $*$ represents the elements below the main diagonal of a symmetric block matrix; for a real symmetric matrix $X$, $X > 0$ ($X < 0$) denotes that $X$ is a positive definite (negative definite) matrix; $\lambda_{\min}(\cdot)$ and $\lambda_{\max}(\cdot)$ denote the minimum and maximum eigenvalues of a real symmetric matrix.
2. Preliminaries
Consider a discrete-time recurrent neural network with time-varying delay [17] described by
$$x(k+1) = (A + \Delta A(k))\,x(k) + (B + \Delta B(k))\,f(x(k)) + (C + \Delta C(k))\,g(x(k - \tau(k))) + J, \quad (2.1)$$
where $x(k) \in \mathbb{R}^n$ denotes the neural state vector; $f(\cdot)$ and $g(\cdot)$ are the neuron activation functions; $J$ is the external input vector. The positive integer $\tau(k)$ represents the transmission delay and satisfies $\tau_m \le \tau(k) \le \tau_M$, where $\tau_m$ and $\tau_M$ are known positive integers representing the lower and upper bounds of the delay. $A = \operatorname{diag}(a_1, \dots, a_n)$ with $0 < a_i < 1$ describes the rate with which the $i$th neuron resets its potential to the resting state in isolation when disconnected from the network and external inputs; $B$ and $C$ represent the weighting matrices; $\Delta A(k)$, $\Delta B(k)$, and $\Delta C(k)$ denote the time-varying structured uncertainties, which are of the following form:
$$[\Delta A(k) \ \ \Delta B(k) \ \ \Delta C(k)] = M F(k) [N_1 \ \ N_2 \ \ N_3], \quad (2.2)$$
where $M$, $N_1$, $N_2$, $N_3$ are known real constant matrices with appropriate dimensions and $F(k)$ is an unknown time-varying matrix function satisfying $F^T(k) F(k) \le I$.
The nominal system of (2.1) can be defined as
$$x(k+1) = A x(k) + B f(x(k)) + C g(x(k - \tau(k))) + J. \quad (2.3)$$
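As a numerical illustration, the nominal iteration above can be simulated directly. This is a minimal sketch: the matrices, delay function, and tanh activations below are placeholder assumptions for the example, not the paper's data.

```python
import numpy as np

def simulate_drnn(A, B, C, J, f, g, tau, x_hist, steps):
    """Iterate x(k+1) = A x(k) + B f(x(k)) + C g(x(k - tau(k))) + J.

    x_hist: initial history x(-tau_max), ..., x(0), most recent last.
    tau:    function k -> delay tau(k), a positive integer.
    """
    hist = [np.asarray(x, dtype=float) for x in x_hist]
    for k in range(steps):
        x_k = hist[-1]
        x_del = hist[-1 - tau(k)]          # delayed state x(k - tau(k))
        hist.append(A @ x_k + B @ f(x_k) + C @ g(x_del) + J)
    return hist

# Illustrative 2-neuron network (placeholder parameters).
A = np.diag([0.4, 0.3])                    # 0 < a_i < 1: self-feedback rates
B = np.array([[0.10, -0.05], [0.02, 0.10]])
C = np.array([[-0.10, 0.02], [0.05, -0.10]])
J = np.zeros(2)
tau = lambda k: 2 + (k % 2)                # time-varying delay in [2, 3]
hist = simulate_drnn(A, B, C, J, np.tanh, np.tanh, tau,
                     [np.ones(2)] * 4, steps=50)
```

With these (deliberately small) weights the state converges toward the zero equilibrium, consistent with the exponential stability the paper studies.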
To obtain our main results, we need to introduce the following assumption, definition, and lemmas.
Assumption 1. For any $s_1, s_2 \in \mathbb{R}$ with $s_1 \neq s_2$,
$$l_i^- \le \frac{f_i(s_1) - f_i(s_2)}{s_1 - s_2} \le l_i^+, \quad \sigma_i^- \le \frac{g_i(s_1) - g_i(s_2)}{s_1 - s_2} \le \sigma_i^+, \quad i = 1, 2, \dots, n, \quad (2.4)$$
where $l_i^-$, $l_i^+$, $\sigma_i^-$, $\sigma_i^+$ are known constant scalars.
As pointed out in [16], under Assumption 1, system (2.3) has at least one equilibrium point. Assume that $x^*$ is an equilibrium point of (2.3), and let $y(k) = x(k) - x^*$, $\phi(y(k)) = f(y(k) + x^*) - f(x^*)$, $\psi(y(k - \tau(k))) = g(y(k - \tau(k)) + x^*) - g(x^*)$. Then system (2.3) can be transformed into the following form:
$$y(k+1) = A y(k) + B \phi(y(k)) + C \psi(y(k - \tau(k))), \quad (2.5)$$
where $\phi(\cdot)$ and $\psi(\cdot)$ are the shifted activation functions with $\phi(0) = \psi(0) = 0$. From Assumption 1, for any $s \neq 0$ and $i = 1, 2, \dots, n$, the functions $\phi_i$ and $\psi_i$ satisfy
$$l_i^- \le \frac{\phi_i(s)}{s} \le l_i^+, \quad \sigma_i^- \le \frac{\psi_i(s)}{s} \le \sigma_i^+. \quad (2.6)$$
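Sector conditions of this kind can be checked numerically for common activations. The sketch below verifies that $\tanh$, a standard example of such an activation, satisfies Assumption 1 with bounds $0$ and $1$; the grid range is an illustrative choice.

```python
import numpy as np

# Assumption 1 requires, for each activation f_i and all s1 != s2,
#   l_i^- <= (f_i(s1) - f_i(s2)) / (s1 - s2) <= l_i^+.
# For f(s) = tanh(s) the difference quotient lies in (0, 1], so the
# sector bounds l^- = 0, l^+ = 1 apply.
s = np.linspace(-5.0, 5.0, 201)
s1, s2 = np.meshgrid(s, s)
mask = s1 != s2                             # exclude s1 == s2
q = (np.tanh(s1[mask]) - np.tanh(s2[mask])) / (s1[mask] - s2[mask])

assert np.all(q > 0.0) and np.all(q <= 1.0)  # all quotients in (0, 1]
```

Note that $l^- = 0$ is attainable only in the limit $|s| \to \infty$, which is why Assumption 1 permits the bounds themselves to be zero (cf. Remark 2.1).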
Remark 2.1. Assumption 1 is widely used in dealing with the stability problem for neural networks. As pointed out in [13, 14, 16, 17, 26, 27], the constants $l_i^-$, $l_i^+$, $\sigma_i^-$, $\sigma_i^+$ can be positive, negative, or zero. Thus, this assumption is less restrictive than the traditional Lipschitz condition.
Definition 2.2. The delayed discrete-time recurrent neural network (2.5) is said to be globally exponentially stable if there exist two scalars $\alpha > 0$ and $0 < \beta < 1$ such that
$$\|y(k)\| \le \alpha \beta^{k} \sup_{-\tau_M \le s \le 0} \|y(s)\|, \quad k \in \mathbb{N}.$$
Lemma 2.3 (Tchebychev inequality [28]). For any given vectors $v_1, v_2, \dots, v_n \in \mathbb{R}^m$, the following inequality holds:
$$\Bigl(\sum_{i=1}^{n} v_i\Bigr)^{T} \Bigl(\sum_{i=1}^{n} v_i\Bigr) \le n \sum_{i=1}^{n} v_i^{T} v_i.$$

Lemma 2.4 (see [29]). For given matrices $D$, $E$, and a symmetric matrix $Y$ of appropriate dimensions, $Y + DFE + E^{T} F^{T} D^{T} < 0$ holds for all $F$ satisfying $F^{T} F \le I$ if and only if there is an $\varepsilon > 0$ such that $Y + \varepsilon D D^{T} + \varepsilon^{-1} E^{T} E < 0$.

Lemma 2.5 (see [16]). If Assumption 1 holds, then for any positive-definite diagonal matrix $D = \operatorname{diag}(d_1, \dots, d_n)$, the following inequality holds:
$$\sum_{i=1}^{n} d_i \bigl(\phi_i(y_i) - l_i^- y_i\bigr)\bigl(\phi_i(y_i) - l_i^+ y_i\bigr) \le 0,$$
where $y = (y_1, \dots, y_n)^{T} \in \mathbb{R}^n$.

Lemma 2.6 (see [30]). Given constant symmetric matrices $S_{11}$, $S_{12}$, $S_{22}$, where $S_{11} = S_{11}^{T}$ and $S_{22} = S_{22}^{T} < 0$, then $S_{11} - S_{12} S_{22}^{-1} S_{12}^{T} < 0$ if and only if
$$\begin{bmatrix} S_{11} & S_{12} \\ * & S_{22} \end{bmatrix} < 0.$$

Lemma 2.7 (see [13]). Let $D$, $E$, and $F$ be real matrices with appropriate dimensions, with $F$ satisfying $F^{T} F \le I$. Then, for any $\varepsilon > 0$, $DFE + E^{T} F^{T} D^{T} \le \varepsilon D D^{T} + \varepsilon^{-1} E^{T} E$.
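Lemma 2.6 is the well-known Schur complement lemma; a quick numerical check illustrates the equivalence it states. The block sizes and the randomly generated entries below are placeholder choices for the illustration.

```python
import numpy as np

def is_neg_def(M):
    """Negative definiteness via eigenvalues of the symmetric part."""
    return bool(np.all(np.linalg.eigvalsh((M + M.T) / 2) < 0))

rng = np.random.default_rng(0)
# Build a symmetric block matrix S = [[S11, S12], [S12^T, S22]] with S22 < 0.
S11 = -3 * np.eye(2) + 0.1 * rng.standard_normal((2, 2))
S11 = (S11 + S11.T) / 2
S22 = -2 * np.eye(2) + 0.1 * rng.standard_normal((2, 2))
S22 = (S22 + S22.T) / 2
S12 = 0.5 * rng.standard_normal((2, 2))
S = np.block([[S11, S12], [S12.T, S22]])

# Schur complement of S22 in S.
schur = S11 - S12 @ np.linalg.inv(S22) @ S12.T

# Lemma 2.6: S < 0  <=>  S22 < 0 and S11 - S12 S22^{-1} S12^T < 0.
assert is_neg_def(S) == (is_neg_def(S22) and is_neg_def(schur))
```

This equivalence is what lets the nonlinear matrix condition of the main theorems be recast as the linear matrix inequality in Theorem 3.1.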
3. Main Results
To obtain our main results, we decompose the connection weight matrix as follows: Then, we can get the following stability results.
Theorem 3.1. For any given positive scalars, under Assumption 1, system (2.5) without uncertainty is globally exponentially stable for any time-varying delay satisfying $\tau_m \le \tau(k) \le \tau_M$ if there exist positive-definite matrices, positive-definite diagonal matrices, and arbitrary matrices with appropriate dimensions such that the following LMI holds: where
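The full LMI of Theorem 3.1 involves the delay bounds and the decomposed weight matrices. As a minimal sketch of the underlying idea, assuming the simplified delay-free system $x(k+1) = A x(k)$ (a placeholder $A$, not the paper's example data), the discrete Lyapunov condition $A^T P A - P < 0$ can be verified numerically with an explicitly constructed $P$:

```python
import numpy as np

# For a Schur-stable A (spectral radius < 1), a positive-definite P with
# A^T P A - P < 0 can be built explicitly as P = sum_k (A^T)^k Q A^k for
# any Q > 0; the series converges geometrically.
A = np.array([[0.40, 0.10],
              [0.05, 0.30]])               # placeholder, rho(A) < 1
Q = np.eye(2)

P = np.zeros((2, 2))
Ak = np.eye(2)
for _ in range(200):                       # truncated geometric series
    P += Ak.T @ Q @ Ak
    Ak = Ak @ A

lmi = A.T @ P @ A - P                      # equals -Q up to truncation error
```

Theorem 3.1's LMI plays the same role for the delayed system (2.5): feasibility of the inequality certifies a Lyapunov-Krasovskii functional that decreases along trajectories.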
Proof. Construct a new augmented Lyapunov-Krasovskii functional candidate as follows: where ; 0 is the zero matrix with appropriate dimensions. Set , , , , , . Define . Then, along the solution of system (2.5), we have On the other hand, since , , we have Noting that Combining (3.12)–(3.17) and (3.36)–(3.40), similar to the proof of Theorem 3.1, one can easily obtain this result, which completes the proof.
Remark 3.10. Compared with the augmented Lyapunov functional constructed in Theorem 3.1, the new augmented Lyapunov functional includes an additional term, which further reduces the conservatism of the stability criterion (for details, see Example 4.2).
4. Numerical Examples
In this section, three numerical examples will be presented to show the validity of the main results derived above.
Example 4.1. For the convenience of comparison, let us consider a delayed discrete-time recurrent neural network in (2.5) with parameters given by The activation functions are given by . It is easy to see that the activation functions satisfy Assumption 1 with , . For , references [15–17] gave the allowable upper bound of the time-varying delay, respectively. Decompose the matrix as , where Table 1 shows that our results are less conservative than those previous results.
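As a complement to the table comparison, the exponential decay rate $\beta$ of Definition 2.2 can also be estimated from a simulated trajectory of (2.5). The parameters below are illustrative placeholders, not Example 4.1's data.

```python
import numpy as np

# Simulate y(k+1) = A y(k) + B phi(y(k)) + C psi(y(k - tau(k))) with
# placeholder matrices and tanh-type shifted activations, then fit
# ||y(k)|| ~ alpha * beta^k, i.e. log ||y(k)|| affine in k.
A = np.diag([0.3, 0.2])
B = np.array([[0.05, -0.02], [0.01, 0.05]])
C = np.array([[-0.05, 0.01], [0.02, -0.05]])
tau_M = 3
hist = [np.array([1.0, -1.0])] * (tau_M + 1)     # constant initial history
for k in range(60):
    y, y_del = hist[-1], hist[-1 - (2 + k % 2)]  # tau(k) varying in [2, 3]
    hist.append(A @ y + B @ np.tanh(y) + C @ np.tanh(y_del))

norms = [np.linalg.norm(y) for y in hist[tau_M:]]  # ||y(0)||, ||y(1)||, ...
slope = np.polyfit(range(len(norms)), np.log(norms), 1)[0]
beta = float(np.exp(slope))                        # estimated decay rate
```

A fitted $\beta < 1$ is consistent with global exponential stability; it does not replace the LMI test, which certifies stability for the whole delay interval and all admissible uncertainties.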
Example 4.2. Consider an uncertain delayed discrete-time recurrent neural network in (2.1) with parameters given by The activation functions are given by , . It is easy to see that the activation functions satisfy Assumption 1 with , . For , references [16, 17] gave the allowable upper bound of the time-varying delay, respectively. Decompose the matrix as , where Set ; by using the MATLAB toolbox, the allowable upper bounds for given are shown in Table 2. Obviously, our results are less conservative than those previous results.
Example 4.3. Consider an uncertain delayed discrete-time recurrent neural network in (2.1) with parameters given by
The activation functions are the same as those given in Example 4.2. Decompose the matrix as , where
Set ; by using the MATLAB toolbox, the allowable upper bounds for given are shown in Table 3.
The free-weighting matrices obtained when are as follows:

5. Conclusion
By decomposing some connection weight matrices and combining the linear matrix inequality (LMI) technique, some new augmented Lyapunov-Krasovskii functionals are constructed, and a series of new, improved sufficient conditions ensuring exponential stability or robust exponential stability is obtained. Numerical examples show that the new criteria derived in this paper are less conservative than some previous results in the cited references.
Acknowledgments
This work was supported by the program for New Century Excellent Talents in University (NCET060811) and the Research Fund for the Doctoral Program of Guizhou College of Finance and Economics (200702).
References
[1] J. Yu, K. Zhang, S. Fei, and T. Li, “Simplified exponential stability analysis for recurrent neural networks with discrete and distributed time-varying delays,” Applied Mathematics and Computation, vol. 205, no. 1, pp. 465–474, 2008.
[2] J. Wang, L. Huang, and Z. Guo, “Dynamical behavior of delayed Hopfield neural networks with discontinuous activations,” Applied Mathematical Modelling, vol. 33, no. 4, pp. 1793–1802, 2009.
[3] Y. Xia, Z. Huang, and M. Han, “Exponential $p$-stability of delayed Cohen-Grossberg-type BAM neural networks with impulses,” Chaos, Solitons & Fractals, vol. 38, no. 3, pp. 806–818, 2008.
[4] M. S. Ali and P. Balasubramaniam, “Stability analysis of uncertain fuzzy Hopfield neural networks with time delays,” Communications in Nonlinear Science and Numerical Simulation, vol. 14, no. 6, pp. 2776–2783, 2009.
[5] H. Wu and C. Shan, “Stability analysis for periodic solution of BAM neural networks with discontinuous neuron activations and impulses,” Applied Mathematical Modelling, vol. 33, pp. 2564–2574, 2009.
[6] Y. Li and X. Fan, “Existence and globally exponential stability of almost periodic solution for Cohen-Grossberg BAM neural networks with variable coefficients,” Applied Mathematical Modelling, vol. 33, no. 4, pp. 2114–2120, 2009.
[7] O. M. Kwon and J. H. Park, “Improved delay-dependent stability criterion for neural networks with time-varying delays,” Physics Letters A, vol. 373, no. 5, pp. 529–535, 2009.
[8] W. Xiong, L. Song, and J. Cao, “Adaptive robust convergence of neural networks with time-varying delays,” Nonlinear Analysis: Real World Applications, vol. 9, no. 4, pp. 1283–1291, 2008.
[9] H. J. Cho and J. H. Park, “Novel delay-dependent robust stability criterion of delayed cellular neural networks,” Chaos, Solitons & Fractals, vol. 32, no. 3, pp. 1194–1200, 2007.
[10] Q. Song and J. Cao, “Global robust stability of interval neural networks with multiple time-varying delays,” Mathematics and Computers in Simulation, vol. 74, no. 1, pp. 38–46, 2007.
[11] T. Li, L. Guo, and C. Sun, “Robust stability for neural networks with time-varying delays and linear fractional uncertainties,” Neurocomputing, vol. 71, pp. 421–427, 2007.
[12] V. Singh, “Improved global robust stability of interval delayed neural networks via split interval: generalizations,” Applied Mathematics and Computation, vol. 206, no. 1, pp. 290–297, 2008.
[13] Y. Liu, Z. Wang, and X. Liu, “Robust stability of discrete-time stochastic neural networks with time-varying delays,” Neurocomputing, vol. 71, pp. 823–833, 2008.
[14] Y. Liu et al., “Discrete-time recurrent neural networks with time-varying delays: exponential stability analysis,” Physics Letters A, vol. 362, pp. 480–488, 2007.
[15] Q. Song and Z. Wang, “A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays,” Physics Letters A, vol. 368, pp. 134–145, 2007.
[16] B. Zhang, S. Xu, and Y. Zou, “Improved delay-dependent exponential stability criteria for discrete-time recurrent neural networks with time-varying delays,” Neurocomputing, vol. 72, pp. 321–330, 2008.
[17] J. Yu, K. Zhang, and S. Fei, “Exponential stability criteria for discrete-time recurrent neural networks with time-varying delay,” Nonlinear Analysis: Real World Applications. In press.
[18] H. Zhao and L. Wang, “Stability and bifurcation for discrete-time Cohen-Grossberg neural network,” Applied Mathematics and Computation, vol. 179, no. 2, pp. 787–798, 2006.
[19] X. Liu et al., “Discrete-time BAM neural networks with variable delays,” Physics Letters A, vol. 367, pp. 322–330, 2007.
[20] H. Zhao, L. Wang, and C. Ma, “Hopf bifurcation and stability analysis on discrete-time Hopfield neural network with delay,” Nonlinear Analysis: Real World Applications, vol. 9, no. 1, pp. 103–113, 2008.
[21] H. Gao and T. Chen, “New results on stability of discrete-time systems with time-varying state delay,” IEEE Transactions on Automatic Control, vol. 52, no. 2, pp. 328–334, 2007.
[22] X. Liao and S. Guo, “Delay-dependent asymptotic stability of Cohen-Grossberg models with multiple time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2007, Article ID 28960, 17 pages, 2007.
[23] Q. Zhang, X. Wei, and J. Xu, “On global exponential stability of discrete-time Hopfield neural networks with variable delays,” Discrete Dynamics in Nature and Society, vol. 2007, Article ID 67675, 9 pages, 2007.
[24] Y. Chen, W. Bi, and Y. Wu, “Delay-dependent exponential stability for discrete-time BAM neural networks with time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2008, Article ID 421614, 14 pages, 2008.
[25] S. Stević, “Permanence for a generalized discrete neural network system,” Discrete Dynamics in Nature and Society, vol. 2007, Article ID 89413, 9 pages, 2007.
[26] Z. Wang, H. Shu, Y. Liu, D. W. C. Ho, and X. Liu, “Robust stability analysis of generalized neural networks with discrete and distributed time delays,” Chaos, Solitons & Fractals, vol. 30, no. 4, pp. 886–896, 2006.
[27] Y. Liu, Z. Wang, and X. Liu, “Global exponential stability of generalized recurrent neural networks with discrete and distributed delays,” Neural Networks, vol. 19, pp. 667–675, 2006.
[28] T. N. Lee and U. L. Radovic, “General decentralized stabilization of large-scale linear continuous and discrete time-delay systems,” International Journal of Control, vol. 46, no. 6, pp. 2127–2140, 1987.
[29] L. Xie, “Output feedback ${H}_{\infty}$ control of systems with parameter uncertainty,” International Journal of Control, vol. 63, no. 4, pp. 741–750, 1996.
[30] S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory, vol. 15 of SIAM Studies in Applied Mathematics, SIAM, Philadelphia, Pa, USA, 1994.
Copyright
Copyright © 2009 Zixin Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.