Discrete Dynamics in Nature and Society
Volume 2009, Article ID 874582, 23 pages
http://dx.doi.org/10.1155/2009/874582
Research Article

New Improved Exponential Stability Criteria for Discrete-Time Neural Networks with Time-Varying Delay

1School of Applied Mathematics, University of Electronic Science and Technology of China, Chengdu 610054, China
2School of Mathematics and Statistics, Guizhou College of Finance and Economics, Guiyang 550004, China
3School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China

Received 13 March 2009; Accepted 11 May 2009

Academic Editor: Manuel De La Sen

Copyright © 2009 Zixin Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The robust stability of uncertain discrete-time recurrent neural networks with time-varying delay is investigated. By decomposing some connection weight matrices, new Lyapunov-Krasovskii functionals are constructed, and a series of new improved stability criteria is derived. These criteria are formulated in the form of linear matrix inequalities (LMIs). Compared with some previous results, the new results are less conservative. Three numerical examples are provided to demonstrate the reduced conservatism and the effectiveness of the proposed method.

1. Introduction

In recent years, recurrent neural networks (see [1–7]), such as Hopfield neural networks, cellular neural networks, and other networks, have been widely investigated and successfully applied in many areas of science, such as pattern recognition, image processing, and fixed-point computation. However, because of the finite switching speed of neurons and amplifiers, time delay is unavoidable in nature and technology, and it can significantly affect the stability of dynamical systems. Thus, studies on stability are of great significance. There has been growing research interest in stability analysis problems for delayed neural networks, and many excellent papers and monographs are available. On the other hand, during the design of a neural network and its hardware implementation, the convergence of the network may be destroyed by unavoidable uncertainty arising from modeling errors, the deviation of vital data, and so on. Therefore, the study of the robust convergence of delayed neural networks has become a hot research direction. Up to now, many sufficient conditions, either delay-dependent or delay-independent, have been proposed to guarantee global robust asymptotic or exponential stability for different classes of delayed neural networks (see [8–13]).

It is worth pointing out that most neural networks have been assumed to be in continuous time, but few in discrete time. In practice, discrete-time neural networks are more applicable to problems that are inherently temporal in nature or related to biological realities, and they can ideally keep the dynamic characteristics, functional similarity, and even the physical or biological reality of the continuous-time networks under mild restrictions. Thus, stability analysis problems for discrete-time neural networks have received more and more interest, and some stability criteria have been proposed in the literature (see [14–25]). In [14], Liu et al. studied a class of discrete-time RNNs with time-varying delay and proposed a delay-dependent condition guaranteeing global exponential stability. By using a technique similar to that in [21], the result obtained in [14] was improved by Song and Wang in [15]. The results in [15] were further improved by Zhang et al. in [16] by introducing some useful terms. In [17], Yu et al. proposed a result less conservative than that obtained in [16] by constructing a new augmented Lyapunov-Krasovskii functional.

In this paper, the connection weight matrix is decomposed, and some new Lyapunov-Krasovskii functionals are constructed. Combined with the linear matrix inequality (LMI) technique, a series of new improved stability criteria is derived. Numerical examples show that these new criteria are less conservative than those obtained in [14–17].

Notation 1. The following notation is used throughout the paper except where otherwise specified. $\|\cdot\|$ denotes a vector or matrix norm; $\mathbb{R}$ and $\mathbb{R}^n$ denote the set of real numbers and the $n$-dimensional real vector space, respectively; $\mathbb{N}$ denotes the set of nonnegative integers; $I$ is the identity matrix; $*$ represents the elements below the main diagonal of a symmetric block matrix; for a real matrix $X$, $X > 0$ ($X < 0$) denotes that $X$ is a positive definite (negative definite) matrix; $X^T$ denotes the transpose of $X$; $\lambda_{\min}(\cdot)$ and $\lambda_{\max}(\cdot)$ denote the minimum and maximum eigenvalues of a real symmetric matrix.

2. Preliminaries

Consider a discrete-time recurrent neural network with time-varying delay [17] described by

$$u(k+1) = (C + \Delta C(k))\,u(k) + (A + \Delta A(k))\,f(u(k)) + (B + \Delta B(k))\,g(u(k - \tau(k))) + J, \tag{2.1}$$

where $u(k) = (u_1(k), \ldots, u_n(k))^T$ denotes the neural state vector; $f(u(k)) = (f_1(u_1(k)), \ldots, f_n(u_n(k)))^T$ and $g(u(k)) = (g_1(u_1(k)), \ldots, g_n(u_n(k)))^T$ are the neuron activation functions; $J$ is the external input vector. The positive integer $\tau(k)$ represents the transmission delay and satisfies $\tau_m \le \tau(k) \le \tau_M$, where $\tau_m$ and $\tau_M$ are known positive integers representing the lower and upper bounds of the delay. $C = \operatorname{diag}(c_1, c_2, \ldots, c_n)$ with $|c_i| < 1$ describes the rate with which the $i$th neuron resets its potential to the resting state in isolation when disconnected from the network and external inputs; $A$ and $B$ represent the weighting matrices; $\Delta C(k)$, $\Delta A(k)$, and $\Delta B(k)$ denote the time-varying structured uncertainties, which are of the following form:

$$[\Delta C(k) \;\; \Delta A(k) \;\; \Delta B(k)] = M F(k) [N_1 \;\; N_2 \;\; N_3], \tag{2.2}$$

where $M$, $N_1$, $N_2$, $N_3$ are known real constant matrices with appropriate dimensions and $F(k)$ is an unknown time-varying matrix function satisfying $F^T(k) F(k) \le I$.
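
To make the model concrete, here is a minimal simulation sketch of the nominal dynamics, that is, (2.1) with the uncertainties set to zero. All parameter values, the delay bounds, and the choice $f = g = \tanh$ are illustrative assumptions, not the parameters of the examples in Section 4.

```python
# A minimal simulation sketch of the nominal model (2.3); the matrices C, A, B,
# the input J, the delay bounds, and f = g = tanh are illustrative assumptions,
# not the parameters used in the paper's examples.
import numpy as np

rng = np.random.default_rng(0)
n = 2
C = np.diag([0.8, 0.7])                 # |c_i| < 1: reset rates of the isolated neurons
A = np.array([[0.10, -0.05],
              [0.02,  0.10]])           # connection weight matrix (assumed)
B = np.array([[0.05,  0.02],
              [-0.03, 0.04]])           # delayed connection weight matrix (assumed)
J = np.array([0.1, -0.2])               # external input vector (assumed)
tau_m, tau_M = 2, 5                     # lower/upper delay bounds (assumed)
f = np.tanh                             # activation function satisfying Assumption 1

T = 200
u = np.zeros((T + tau_M + 1, n))
u[:tau_M + 1] = rng.uniform(-1, 1, (tau_M + 1, n))   # initial sequence on [-tau_M, 0]

for k in range(tau_M, T + tau_M):
    tau_k = int(rng.integers(tau_m, tau_M + 1))      # time-varying delay tau(k)
    u[k + 1] = C @ u[k] + A @ f(u[k]) + B @ f(u[k - tau_k]) + J

print("state after", T, "steps:", u[-1])             # settles near an equilibrium
```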

The nominal system of (2.1) can be defined as

$$u(k+1) = C u(k) + A f(u(k)) + B g(u(k - \tau(k))) + J. \tag{2.3}$$

To obtain our main results, we need to introduce the following assumption, definition and lemmas.

Assumption 1. For any $s_1, s_2 \in \mathbb{R}$ with $s_1 \neq s_2$,

$$l_i^- \le \frac{f_i(s_1) - f_i(s_2)}{s_1 - s_2} \le l_i^+, \qquad \sigma_i^- \le \frac{g_i(s_1) - g_i(s_2)}{s_1 - s_2} \le \sigma_i^+, \qquad i = 1, 2, \ldots, n, \tag{2.4}$$

where $l_i^-$, $l_i^+$, $\sigma_i^-$, $\sigma_i^+$ are known constant scalars.
As pointed out in [16], under Assumption 1 system (2.3) has an equilibrium point. Assume that $u^* = (u_1^*, \ldots, u_n^*)^T$ is an equilibrium point of (2.3) and let $x(k) = u(k) - u^*$. Then system (2.3) can be transformed into the following form:

$$x(k+1) = C x(k) + A \tilde{f}(x(k)) + B \tilde{g}(x(k - \tau(k))), \tag{2.5}$$

where $\tilde{f}(x(k)) = f(x(k) + u^*) - f(u^*)$ and $\tilde{g}(x(k - \tau(k))) = g(x(k - \tau(k)) + u^*) - g(u^*)$. From Assumption 1, $\tilde{f}_i(0) = \tilde{g}_i(0) = 0$ and, for any $s \neq 0$, the functions $\tilde{f}_i$ and $\tilde{g}_i$ satisfy

$$l_i^- \le \frac{\tilde{f}_i(s)}{s} \le l_i^+, \qquad \sigma_i^- \le \frac{\tilde{g}_i(s)}{s} \le \sigma_i^+.$$
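
As a quick illustration of this change of variables, the following sketch computes an equilibrium $u^*$ of (2.3) by fixed-point iteration and forms the shifted nonlinearity of (2.5); it reuses the assumed parameters of the earlier simulation sketch, with $f = g = \tanh$.

```python
# Computing an equilibrium u* of (2.3) by fixed-point iteration and forming the
# shifted nonlinearity of (2.5); parameters are the assumed ones from the sketch
# above, not the paper's example data.
import numpy as np

C = np.diag([0.8, 0.7])
A = np.array([[0.10, -0.05], [0.02, 0.10]])
B = np.array([[0.05, 0.02], [-0.03, 0.04]])
J = np.array([0.1, -0.2])

u_star = np.zeros(2)
for _ in range(1000):                        # the map is a contraction for these values
    u_star = C @ u_star + (A + B) @ np.tanh(u_star) + J

def f_tilde(x):
    # shifted activation: vanishes at the origin and inherits the sector bounds
    return np.tanh(x + u_star) - np.tanh(u_star)

print("u* =", u_star)
print("f~(0) =", f_tilde(np.zeros(2)))       # exactly zero at the equilibrium
```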

Remark 2.1. Assumption 1 is widely used in dealing with the stability problem for neural networks. As pointed out in [13, 14, 16, 17, 26, 27], the constants $l_i^-$, $l_i^+$, $\sigma_i^-$, $\sigma_i^+$ can be positive, negative, or zero. Thus, this assumption is less restrictive than the traditional Lipschitz condition.
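
For instance, $f(s) = \tanh(s)$ satisfies Assumption 1 with $l_i^- = 0$ and $l_i^+ = 1$; the following brief numeric spot-check confirms this on a grid of sample points.

```python
# Numeric spot-check that tanh satisfies Assumption 1 with sector bounds [0, 1]:
# every difference quotient (f(s1) - f(s2)) / (s1 - s2) lies in that interval.
import numpy as np

s = np.linspace(-5.0, 5.0, 401)
s1, s2 = np.meshgrid(s, s)
mask = s1 != s2
q = (np.tanh(s1[mask]) - np.tanh(s2[mask])) / (s1[mask] - s2[mask])
print("min quotient:", q.min(), "max quotient:", q.max())  # both within [0, 1]
```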

Definition 2.2. The delayed discrete-time recurrent neural network (2.5) is said to be globally exponentially stable if there exist two positive scalars $\alpha > 0$ and $0 < \beta < 1$ such that

$$\|x(k)\| \le \alpha\,\beta^{k} \sup_{-\tau_M \le s \le 0} \|x(s)\|, \qquad \forall k \ge 0.$$
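
The pair $(\alpha, \beta)$ can be estimated empirically from a simulated trajectory by a least-squares fit of $\log\|x(k)\|$; the sketch below does this for the transformed system (2.5), again under the assumed parameters used in the earlier sketches.

```python
# Empirically estimating the decay constants (alpha, beta) of Definition 2.2
# from a trajectory of (2.5); parameters are the assumed ones used earlier.
import numpy as np

rng = np.random.default_rng(1)
C = np.diag([0.8, 0.7])
A = np.array([[0.10, -0.05], [0.02, 0.10]])
B = np.array([[0.05, 0.02], [-0.03, 0.04]])
tau_m, tau_M, T = 2, 5, 120

x = np.zeros((T + tau_M + 1, 2))
x[:tau_M + 1] = rng.uniform(-1, 1, (tau_M + 1, 2))
for k in range(tau_M, T + tau_M):
    tau_k = int(rng.integers(tau_m, tau_M + 1))
    x[k + 1] = C @ x[k] + A @ np.tanh(x[k]) + B @ np.tanh(x[k - tau_k])

norms = np.linalg.norm(x[tau_M:], axis=1)
k = np.arange(len(norms))
# fit log ||x(k)|| ~ log(alpha) + k log(beta) by least squares
slope, intercept = np.polyfit(k, np.log(norms + 1e-300), 1)
print("alpha ~", np.exp(intercept), "beta ~", np.exp(slope))  # beta < 1: exponential decay
```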

Lemma 2.3 (Tchebychev inequality [28]). For any given vectors $v_1, \ldots, v_m \in \mathbb{R}^n$, the following inequality holds:

$$\left(\sum_{i=1}^{m} v_i\right)^{T} \left(\sum_{i=1}^{m} v_i\right) \le m \sum_{i=1}^{m} v_i^{T} v_i.$$

Lemma 2.4 (see [29]). For given matrices $\Omega_1 = \Omega_1^T$, $\Omega_2$, and $\Omega_3$ of appropriate dimensions,

$$\Omega_1 + \Omega_2 F(k) \Omega_3 + \Omega_3^T F^T(k) \Omega_2^T < 0$$

for all $F(k)$ satisfying $F^T(k) F(k) \le I$, if and only if there is an $\varepsilon > 0$ such that

$$\Omega_1 + \varepsilon\, \Omega_2 \Omega_2^T + \varepsilon^{-1} \Omega_3^T \Omega_3 < 0.$$

Lemma 2.5 (see [16]). If Assumption 1 holds, then for any positive-definite diagonal matrix $D = \operatorname{diag}(d_1, \ldots, d_n)$, the following inequality holds:

$$\begin{bmatrix} x(k) \\ \tilde{f}(x(k)) \end{bmatrix}^{T} \begin{bmatrix} D L_1 & -D L_2 \\ -D L_2 & D \end{bmatrix} \begin{bmatrix} x(k) \\ \tilde{f}(x(k)) \end{bmatrix} \le 0,$$

where $L_1 = \operatorname{diag}(l_1^- l_1^+, \ldots, l_n^- l_n^+)$, $L_2 = \operatorname{diag}\!\left(\frac{l_1^- + l_1^+}{2}, \ldots, \frac{l_n^- + l_n^+}{2}\right)$.

Lemma 2.6 (see [30]). Given constant symmetric matrices $\Sigma_1$, $\Sigma_2$, $\Sigma_3$, where $\Sigma_1 = \Sigma_1^T$ and $0 < \Sigma_2 = \Sigma_2^T$, then $\Sigma_1 + \Sigma_3^T \Sigma_2^{-1} \Sigma_3 < 0$ if and only if

$$\begin{bmatrix} \Sigma_1 & \Sigma_3^T \\ \Sigma_3 & -\Sigma_2 \end{bmatrix} < 0.$$

Lemma 2.7 (see [13]). Let $M$ and $E$ be real constant matrices with appropriate dimensions, and let $F(k)$ satisfy $F^T(k) F(k) \le I$. Then, for any $\varepsilon > 0$,

$$M F(k) E + E^T F^T(k) M^T \le \varepsilon\, M M^T + \varepsilon^{-1} E^T E.$$
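
Lemma 2.6 is the well-known Schur complement. The following small numeric check illustrates the stated equivalence; the test matrices are arbitrary and chosen only for illustration.

```python
# Numeric spot-check of the Schur complement (Lemma 2.6): for Sigma_2 > 0,
#   Sigma_1 + Sigma_3^T Sigma_2^{-1} Sigma_3 < 0
# holds iff the block matrix [[Sigma_1, Sigma_3^T], [Sigma_3, -Sigma_2]] < 0.
import numpy as np

rng = np.random.default_rng(2)
n = 3
S1 = -2.0 * np.eye(n)                       # Sigma_1 = Sigma_1^T
S2 = np.eye(n) + 0.1 * np.ones((n, n))      # Sigma_2 = Sigma_2^T > 0
S3 = 0.3 * rng.standard_normal((n, n))      # Sigma_3, arbitrary

lhs = S1 + S3.T @ np.linalg.solve(S2, S3)
block = np.block([[S1, S3.T], [S3, -S2]])

print(np.linalg.eigvalsh(lhs).max() < 0,
      np.linalg.eigvalsh(block).max() < 0)  # the two answers always agree
```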

3. Main Results

To obtain our main results, we decompose the connection weight matrix $A$ as follows:

$$A = A_1 + A_2.$$

Then, we can get the following stability results.

Theorem 3.1. For given positive integers $\tau_m \le \tau_M$, under Assumption 1, system (2.5) without uncertainty is globally exponentially stable for any time-varying delay $\tau(k)$ satisfying $\tau_m \le \tau(k) \le \tau_M$, if there exist positive-definite matrices, positive-definite diagonal matrices, and arbitrary matrices with appropriate dimensions such that the following LMI holds:
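
In practice, an LMI of this kind is checked numerically with a semidefinite-programming solver; in Section 4 the MATLAB LMI toolbox plays this role. Since the full block LMI of Theorem 3.1 is not reproduced here, the sketch below illustrates the workflow on a much simpler, classical delay-independent condition for the linear comparison system $x(k+1) = Cx(k) + Dx(k-\tau)$, with explicit candidate matrices verified by eigenvalue tests. This simplified condition and the matrices $C$, $D$ are assumptions for illustration, not the criterion of Theorem 3.1.

```python
# Verifying a simple, classical delay-independent stability LMI for
#   x(k+1) = C x(k) + D x(k - tau):
# find P > 0, Q > 0 such that
#   M = [[C'PC - P + Q, C'PD], [D'PC, D'PD - Q]] < 0.
# Explicit candidates are built and the eigenvalues checked numerically; this
# illustrates the feasibility test only and is NOT the LMI of Theorem 3.1.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

C = np.diag([0.8, 0.7])                          # assumed, as in the earlier sketches
D = np.array([[0.05, 0.02], [-0.03, 0.04]])      # lumped delayed-term matrix (assumed)
n = C.shape[0]

# Candidate P solves C'PC - P = -I, a natural Lyapunov choice for the delay-free part.
P = solve_discrete_lyapunov(C.T, np.eye(n))
Q = 0.5 * np.eye(n)                              # candidate weight on the delayed state

M = np.block([[C.T @ P @ C - P + Q, C.T @ P @ D],
              [D.T @ P @ C,         D.T @ P @ D - Q]])

print("P > 0:", np.linalg.eigvalsh(P).min() > 0)
print("M < 0:", np.linalg.eigvalsh(M).max() < 0)  # True => stability certified
```

In a real feasibility test one would search for $P$ and $Q$ with an SDP solver rather than fixing candidates by hand; the eigenvalue check above only verifies a given candidate pair.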

Proof. Construct a new augmented Lyapunov-Krasovskii functional candidate as follows, where 0 is the zero matrix with appropriate dimensions. Define the augmented state vector accordingly. Then, along the solution of system (2.5), compute the forward difference of the functional. On the other hand, since $\tau_m \le \tau(k) \le \tau_M$, the delayed terms can be bounded. Combining (3.12)–(3.17) and (3.36)–(3.40), similarly to the proof of Theorem 3.1, one can easily obtain this result, which completes the proof.

Remark 3.10. Compared with the augmented Lyapunov functional constructed in Theorem 3.1, this new augmented Lyapunov functional includes an additional term, which further reduces the conservatism of the stability criterion (for more details, see Example 4.2).

4. Numerical Examples

In this section, three numerical examples will be presented to show the validity of the main results derived above.

Example 4.1. For the convenience of comparison, let us consider a delayed discrete-time recurrent neural network (2.5) with parameters given by . The activation functions are given by ; it is easy to see that they satisfy Assumption 1 with , . For different values of $\tau_m$, references [15–17] reported the allowable upper bounds of the time-varying delay, respectively. Decompose the matrix $A$ as $A = A_1 + A_2$, where . Table 1 shows that our results are less conservative than these previous results.

Table 1: Allowable upper bounds of $\tau_M$ for given $\tau_m$.

Example 4.2. Consider an uncertain delayed discrete-time recurrent neural network (2.1) with parameters given by . The activation functions are given by , ; it is easy to see that they satisfy Assumption 1 with , . For different values of $\tau_m$, references [16, 17] reported the allowable upper bounds of the time-varying delay, respectively. Decompose the matrix $A$ as $A = A_1 + A_2$, where . Setting and using the MATLAB LMI toolbox, the allowable upper bounds for given $\tau_m$ are shown in Table 2. Obviously, our results are less conservative than these previous results.

Table 2: Allowable upper bounds of $\tau_M$ for given $\tau_m$.

Example 4.3. Consider an uncertain delayed discrete-time recurrent neural network (2.1) with parameters given by , and the activation functions are the same as those in Example 4.2. Decompose the matrix $A$ as $A = A_1 + A_2$, where . Setting and using the MATLAB LMI toolbox, the allowable upper bounds for given $\tau_m$ are shown in Table 3.
The free-weighting matrices are obtained as follows when :

Table 3: Allowable upper bounds of $\tau_M$ for given $\tau_m$.

5. Conclusion

By decomposing some connection weight matrices and combining the linear matrix inequality (LMI) technique, some new augmented Lyapunov-Krasovskii functionals are constructed, and a series of new improved sufficient conditions ensuring exponential stability or robust exponential stability is obtained. Numerical examples show that the new criteria derived in this paper are less conservative than some previous results in the cited literature.

Acknowledgments

This work was supported by the program for New Century Excellent Talents in University (NCET-06-0811) and the Research Fund for the Doctoral Program of Guizhou College of Finance and Economics (200702).

References

  1. J. Yu, K. Zhang, S. Fei, and T. Li, “Simplified exponential stability analysis for recurrent neural networks with discrete and distributed time-varying delays,” Applied Mathematics and Computation, vol. 205, no. 1, pp. 465–474, 2008.
  2. J. Wang, L. Huang, and Z. Guo, “Dynamical behavior of delayed Hopfield neural networks with discontinuous activations,” Applied Mathematical Modelling, vol. 33, no. 4, pp. 1793–1802, 2009.
  3. Y. Xia, Z. Huang, and M. Han, “Exponential p-stability of delayed Cohen-Grossberg-type BAM neural networks with impulses,” Chaos, Solitons & Fractals, vol. 38, no. 3, pp. 806–818, 2008.
  4. M. S. Ali and P. Balasubramaniam, “Stability analysis of uncertain fuzzy Hopfield neural networks with time delays,” Communications in Nonlinear Science and Numerical Simulation, vol. 14, no. 6, pp. 2776–2783, 2009.
  5. H. Wu and C. Shan, “Stability analysis for periodic solution of BAM neural networks with discontinuous neuron activations and impulses,” Applied Mathematical Modelling, vol. 33, pp. 2564–2574, 2009.
  6. Y. Li and X. Fan, “Existence and globally exponential stability of almost periodic solution for Cohen-Grossberg BAM neural networks with variable coefficients,” Applied Mathematical Modelling, vol. 33, no. 4, pp. 2114–2120, 2009.
  7. O. M. Kwon and J. H. Park, “Improved delay-dependent stability criterion for neural networks with time-varying delays,” Physics Letters A, vol. 373, no. 5, pp. 529–535, 2009.
  8. W. Xiong, L. Song, and J. Cao, “Adaptive robust convergence of neural networks with time-varying delays,” Nonlinear Analysis: Real World Applications, vol. 9, no. 4, pp. 1283–1291, 2008.
  9. H. J. Cho and J. H. Park, “Novel delay-dependent robust stability criterion of delayed cellular neural networks,” Chaos, Solitons & Fractals, vol. 32, no. 3, pp. 1194–1200, 2007.
  10. Q. Song and J. Cao, “Global robust stability of interval neural networks with multiple time-varying delays,” Mathematics and Computers in Simulation, vol. 74, no. 1, pp. 38–46, 2007.
  11. T. Li, L. Guo, and C. Sun, “Robust stability for neural networks with time-varying delays and linear fractional uncertainties,” Neurocomputing, vol. 71, pp. 421–427, 2007.
  12. V. Singh, “Improved global robust stability of interval delayed neural networks via split interval: generalizations,” Applied Mathematics and Computation, vol. 206, no. 1, pp. 290–297, 2008.
  13. Y. Liu, Z. Wang, and X. Liu, “Robust stability of discrete-time stochastic neural networks with time-varying delays,” Neurocomputing, vol. 71, pp. 823–833, 2008.
  14. Y. Liu et al., “Discrete-time recurrent neural networks with time-varying delays: exponential stability analysis,” Physics Letters A, vol. 362, pp. 480–488, 2007.
  15. Q. Song and Z. Wang, “A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays,” Physics Letters A, vol. 368, pp. 134–145, 2007.
  16. B. Zhang, S. Xu, and Y. Zou, “Improved delay-dependent exponential stability criteria for discrete-time recurrent neural networks with time-varying delays,” Neurocomputing, vol. 72, pp. 321–330, 2008.
  17. J. Yu, K. Zhang, and S. Fei, “Exponential stability criteria for discrete-time recurrent neural networks with time-varying delay,” Nonlinear Analysis: Real World Applications, in press.
  18. H. Zhao and L. Wang, “Stability and bifurcation for discrete-time Cohen-Grossberg neural network,” Applied Mathematics and Computation, vol. 179, no. 2, pp. 787–798, 2006.
  19. X. Liu et al., “Discrete-time BAM neural networks with variable delays,” Physics Letters A, vol. 367, pp. 322–330, 2007.
  20. H. Zhao, L. Wang, and C. Ma, “Hopf bifurcation and stability analysis on discrete-time Hopfield neural network with delay,” Nonlinear Analysis: Real World Applications, vol. 9, no. 1, pp. 103–113, 2008.
  21. H. Gao and T. Chen, “New results on stability of discrete-time systems with time-varying state delay,” IEEE Transactions on Automatic Control, vol. 52, no. 2, pp. 328–334, 2007.
  22. X. Liao and S. Guo, “Delay-dependent asymptotic stability of Cohen-Grossberg models with multiple time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2007, Article ID 28960, 17 pages, 2007.
  23. Q. Zhang, X. Wei, and J. Xu, “On global exponential stability of discrete-time Hopfield neural networks with variable delays,” Discrete Dynamics in Nature and Society, vol. 2007, Article ID 67675, 9 pages, 2007.
  24. Y. Chen, W. Bi, and Y. Wu, “Delay-dependent exponential stability for discrete-time BAM neural networks with time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2008, Article ID 421614, 14 pages, 2008.
  25. S. Stević, “Permanence for a generalized discrete neural network system,” Discrete Dynamics in Nature and Society, vol. 2007, Article ID 89413, 9 pages, 2007.
  26. Z. Wang, H. Shu, Y. Liu, D. W. C. Ho, and X. Liu, “Robust stability analysis of generalized neural networks with discrete and distributed time delays,” Chaos, Solitons & Fractals, vol. 30, no. 4, pp. 886–896, 2006.
  27. Y. Liu, Z. Wang, and X. Liu, “Global exponential stability of generalized recurrent neural networks with discrete and distributed delays,” Neural Networks, vol. 19, pp. 667–675, 2006.
  28. T. N. Lee and U. L. Radovic, “General decentralized stabilization of large-scale linear continuous and discrete time-delay systems,” International Journal of Control, vol. 46, no. 6, pp. 2127–2140, 1987.
  29. L. Xie, “Output feedback H∞ control of systems with parameter uncertainty,” International Journal of Control, vol. 63, no. 4, pp. 741–750, 1996.
  30. S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory, vol. 15 of SIAM Studies in Applied Mathematics, SIAM, Philadelphia, Pa, USA, 1994.