Abstract and Applied Analysis
Volume 2013 (2013), Article ID 540951, 7 pages
http://dx.doi.org/10.1155/2013/540951
Research Article

Novel Global Exponential Stability Criterion for Recurrent Neural Networks with Time-Varying Delay

1School of Electrical and Information Engineering, Guangxi University of Science and Technology, Liuzhou 545006, China
2Guangxi Key Laboratory of Automobile Components and Vehicle Technology, Guangxi University of Science and Technology, Liuzhou 545006, China
3School of Computer Engineering, Guangxi University of Science and Technology, Liuzhou 545006, China

Received 15 October 2012; Accepted 3 January 2013

Academic Editor: Massimo Furi

Copyright © 2013 Wenguang Luo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The problem of global exponential stability for recurrent neural networks with time-varying delay is investigated. By dividing the time-delay interval into dynamical subintervals, a new Lyapunov-Krasovskii functional is introduced; then, a novel delay-dependent exponential stability criterion based on linear matrix inequalities (LMIs) is derived, which is less conservative than some previous results (Zhang et al., 2005; He et al., 2006; Wu et al., 2008). An illustrative example is finally provided to show the effectiveness and the advantage of the proposed result.

1. Introduction

In the past decades, recurrent neural networks (RNNs) have been extensively investigated because of their applications, such as combinatorial optimization [1, 2], associative memories [3–5], signal processing [6], image processing [7], pattern recognition [8, 9], and so forth. Some of these applications often require that the equilibrium points of the designed networks be stable. Meanwhile, in the hardware implementation of RNNs, time delay commonly occurs due to the finite switching speed of amplifiers or the finite speed of signal processing, and its existence is often a source of oscillation, divergence, and instability in neural networks. Therefore, the stability of RNNs with time delay has received much attention, and many results have been proposed to ensure the asymptotic or exponential stability of delayed neural networks [10–21].

So far, a principal means of handling the stability of delayed neural networks is the free-weighting-matrix approach [22–26]. Recently, a novel method was proposed for Hopfield neural networks with constant delay in [27], which introduces more free-weighting matrices by dividing the constant time-delay interval equally into subintervals. Furthermore, by dividing the time-delay interval into dynamical subintervals, Zhang et al. [28] generalized this method to study the global asymptotic stability of RNNs with time-varying delay. This method mainly utilizes the information within the time-delay interval, which brings more degrees of freedom and can reduce conservativeness.

Motivated by the above discussions, in this paper, we consider the global exponential stability of RNNs with time-varying delay. By dividing the time-delay interval into dynamical subintervals, we construct a new Lyapunov-Krasovskii functional (LKF) and derive a novel sufficient condition, which is presented in terms of a linear matrix inequality (LMI). The obtained stability result is less conservative than some existing results [22, 23, 29]. Finally, an illustrative example is given to verify the effectiveness and the advantage of the proposed result.

The rest of this paper is organized as follows. In Section 2, the problem of exponential stability analysis for RNNs with time-varying delay is formulated. Section 3 presents our main results. An illustrative example is provided in Section 4. The conclusion is stated in Section 5.

Throughout this paper, ℝ^(n×n) denotes the set of n × n real matrices. For a matrix A, A^T, ‖A‖, λ_min(A), and λ_max(A) represent the transpose, the Euclidean norm, the minimum eigenvalue, and the maximum eigenvalue of A, respectively. P > 0 (P < 0) denotes that P is a positive (negative) definite matrix. I denotes an identity matrix with compatible dimensions, and * denotes the symmetric terms in a symmetric matrix.

2. Problem Formulation

Consider the following RNNs with time-varying delay:

ẋ(t) = −Cx(t) + Af(x(t)) + Bf(x(t − τ(t))) + J, (1)

where x(t) is the state vector, f(x(t)) denotes the neuron activation function, and J is a bias value vector. C = diag(c_1, …, c_n) is a diagonal matrix with c_i > 0, i = 1, 2, …, n. A and B are the connection weight matrix and the delayed connection weight matrix, respectively. The initial condition φ(s), s ∈ [−τ̄, 0], is a continuous and differentiable vector-valued function. The time delay τ(t) is a differentiable function that satisfies 0 ≤ τ(t) ≤ τ̄ and τ̇(t) ≤ μ, where τ̄ > 0 and μ are constants.
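To make the model concrete, the following minimal sketch simulates a delayed RNN of this form by Euler integration. The tanh activation, the constant delay, and all weights below are hypothetical illustration values, not parameters taken from this paper.

```python
import numpy as np

# Hypothetical parameters for a 2-neuron network of the form
#   x'(t) = -C x(t) + A f(x(t)) + B f(x(t - tau(t))) + J,  with f = tanh.
C = np.diag([1.0, 1.0])
A = np.array([[0.2, -0.1], [0.1, 0.2]])
B = np.array([[-0.3, 0.1], [0.2, -0.3]])
J = np.zeros(2)

dt, T, tau = 0.01, 20.0, 0.5           # step size, horizon, constant delay
steps, d = int(T / dt), int(tau / dt)  # number of steps, delay in steps

# constant initial history on [-tau, 0]
hist = [np.array([0.5, -0.5])] * (d + 1)
for _ in range(steps):
    x, x_del = hist[-1], hist[-1 - d]          # current and delayed states
    dx = -C @ x + A @ np.tanh(x) + B @ np.tanh(x_del) + J
    hist.append(x + dt * dx)

print(np.linalg.norm(hist[-1]))  # the state decays toward the origin
```

With these illustrative weights the delayed coupling is weak relative to the decay term −Cx(t), so the trajectory converges; the stability criteria in this paper certify such behavior without simulation.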

To obtain the proposed result, we assume that each activation function f_i(·) is bounded and satisfies

σ_i⁻ ≤ (f_i(x) − f_i(y))/(x − y) ≤ σ_i⁺, for all x ≠ y, (2)

where σ_i⁻ and σ_i⁺ are some constants, i = 1, 2, …, n.
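The sector condition above can be checked numerically for a common activation. Assuming f_i = tanh (an illustrative choice, not specified by the text), its difference quotients all lie in the sector [0, 1], i.e., σ_i⁻ = 0 and σ_i⁺ = 1:

```python
import numpy as np

# tanh is monotone and 1-Lipschitz, so its difference quotients
# (f(x) - f(y)) / (x - y) all lie in the sector [0, 1].
rng = np.random.default_rng(0)
x, y = rng.uniform(-5, 5, 1000), rng.uniform(-5, 5, 1000)
mask = x != y
q = (np.tanh(x[mask]) - np.tanh(y[mask])) / (x[mask] - y[mask])
print(q.min(), q.max())  # both values lie within [0, 1]
```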

By an appropriate change of variables, system (1) can be written in the equivalent form (3).

Noting assumption (2), we obtain inequality (4). Assume that system (3) has an equilibrium point x*. Then, letting z(t) = x(t) − x* and defining g(z(t)) = f(z(t) + x*) − f(x*), system (3) is transformed into form (5), where g(·) satisfies the same sector bounds as in (2) with g(0) = 0.

In the derivation of the main results, we need the following lemmas and definitions.

Definition 1 (global exponential stability). System (5) is said to be globally exponentially stable with convergence rate λ > 0 if there exist constants k > 0 and λ > 0 such that ‖z(t)‖ ≤ k e^(−λt) sup_{−τ̄ ≤ s ≤ 0} ‖φ(s)‖ for all t ≥ 0, where φ(·) is the initial condition.
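As a quick numerical illustration of this definition (on a hypothetical scalar system, not system (5)), the trajectory of z′(t) = −z(t) satisfies the required bound with k = 1 and convergence rate λ = 1:

```python
import numpy as np

# z'(t) = -z(t) has the explicit solution z(t) = e^{-t} z(0), so the
# bound ||z(t)|| <= k e^{-lambda t} ||z(0)|| holds with k = 1, lambda = 1.
t = np.linspace(0.0, 10.0, 101)
z0 = 2.0
z = z0 * np.exp(-t)
k, lam = 1.0, 1.0
ok = bool(np.all(np.abs(z) <= k * np.exp(-lam * t) * abs(z0) + 1e-12))
print(ok)  # True
```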

Lemma 2. Let x(·) be a vector-valued function with continuously differentiable entries. Then, for a positive definite matrix, any matrices of appropriate dimensions, and any two scalar functions, the following integral inequality holds:

Proof. The proof can be completed in a manner similar to that in [30].

Lemma 3 (see [31]). For any two vectors x and y, any matrix A, any positive definite symmetric matrix P of the same dimensions, and any two positive constants m and n, the following inequality holds:

3. Main Results

In this section, we consider the delay interval [0, τ̄], which is divided into m dynamical subintervals [δ_{k−1}, δ_k], k = 1, 2, …, m. That is to say, there is a parameter sequence 0 = δ_0 < δ_1 < ⋯ < δ_m = τ̄ that satisfies the following conditions, where m is a positive integer:
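As a hedged illustration of such a partition, the helper below generates a sequence 0 = δ_0 < ⋯ < δ_m = τ̄ from tunable interior ratios; the function name and the ratio parameterization are assumptions for illustration, not the paper's construction:

```python
# Generate 0 = delta_0 < delta_1 < ... < delta_m = tau_bar, splitting the
# delay interval [0, tau_bar] into m subintervals. The interior ratios
# rho_k are tunable degrees of freedom rather than a fixed uniform grid.
def partition(tau_bar, rhos):
    """rhos: strictly increasing ratios in (0, 1); returns the sequence."""
    assert all(0.0 < r < 1.0 for r in rhos)
    assert list(rhos) == sorted(rhos)
    return [0.0] + [r * tau_bar for r in rhos] + [tau_bar]

print(partition(1.0, [0.3, 0.7]))  # [0.0, 0.3, 0.7, 1.0]
```

Treating the interior points as free parameters, rather than fixing an equal split as in [27], is what yields the extra degrees of freedom exploited below.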

Utilizing the useful information from these dynamical subintervals, a novel LKF is constructed, and then a new LMI-based delay-dependent sufficient condition is proposed to guarantee the global exponential stability of RNNs with time-varying delay.

Theorem 4. The equilibrium point of system (5) is globally exponentially stable with convergence rate λ if there exist a parameter sequence satisfying the conditions above, some positive definite symmetric matrices, some positive definite diagonal matrices, and matrices of appropriate dimensions, where m is a positive integer, such that the following LMI has a feasible solution:
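In practice, LMI conditions of this kind are verified with a semidefinite-programming solver; at bottom, the feasibility test reduces to matrix definiteness checks. The sketch below shows such a check with NumPy eigenvalues; the matrices here are illustrative and are not the theorem's LMI blocks:

```python
import numpy as np

def is_positive_definite(M, tol=1e-9):
    """Check P > 0 for a symmetric matrix via its smallest eigenvalue."""
    M = np.asarray(M, dtype=float)
    assert np.allclose(M, M.T), "definiteness test assumes a symmetric matrix"
    return bool(np.linalg.eigvalsh(M).min() > tol)

print(is_positive_definite([[2.0, 0.5], [0.5, 1.0]]))  # True
print(is_positive_definite([[1.0, 2.0], [2.0, 1.0]]))  # False (indefinite)
```

A negative definite LMI block Ξ < 0 would be tested the same way on −Ξ; dedicated solvers additionally search over the free matrix variables rather than checking a fixed matrix.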

Proof. Construct the following Lyapunov-Krasovskii functional candidate:
Calculating the time derivatives along the trajectories of system (5) yields (15). It is clear that inequality (16) is true. According to (4), for some positive definite diagonal matrices, we have (17). Using Lemma 2, we have (18).
From (15)–(17), and using (18), we finally obtain (19), where the block matrix is defined in (11). Obviously, if LMI (11) holds, then (19) is negative for any nonzero state.
On the other hand, the following inequalities hold. Combining them with Lemma 3 gives an exponential decay estimate on the functional, and hence on the state. Therefore, according to Definition 1, we can conclude that the equilibrium point of system (5) is globally exponentially stable. This completes the proof.

Remark 5. Different from the results in [22, 23, 29], we divide the time-delay interval into dynamical subintervals and introduce a novel Lyapunov-Krasovskii functional. This brings more degrees of freedom to ensure global exponential stability. Therefore, Theorem 4 is less conservative than some previous results.

Remark 6. By setting certain matrices in Theorem 4 to zero and arguing as in the proof of Theorem 4, we can derive a criterion guaranteeing the global exponential stability of RNNs with time-varying delay when the bound on the delay derivative is unknown or the delay is not differentiable.

4. Illustrating Example

In this section, an illustrative example is given to verify the effectiveness and advantage of the criteria proposed in this paper.

Example 7. Consider system (5) with the following parameters:

First, we suppose that the time-delay interval is divided into 2 subintervals. When the upper bound of the delay is fixed, the exponential convergence rates obtained from Theorem 4 and from [22, 23, 29] are listed in Table 1. In addition, when the exponential convergence rate is fixed, the allowable upper bounds of the delay obtained from Theorem 4 and from [22, 23, 29] are listed in Table 2.

Table 1: Allowable exponential convergence rates.
Table 2: Allowable upper bounds of the delay.

Thus, Tables 1 and 2 show that the result in this paper is more effective and less conservative than those in [22, 23, 29]. Figure 1 shows the state response of Example 7 with a constant delay for a given initial value.

Figure 1: State response curves for Example 7.

5. Conclusion

In this paper, we have considered the global exponential stability of RNNs with time-varying delay. By dividing the time-delay interval into dynamical subintervals, a novel Lyapunov-Krasovskii functional is introduced, and a less conservative LMI-based delay-dependent stability criterion is derived based on Lyapunov stability theory. Furthermore, an illustrative example is given to show the effectiveness of the proposed result.

Acknowledgments

This work is supported by Guangxi Science Foundation Grant (0832067), the Foundation Grant of Guangxi Key Laboratory of Automobile Components and Vehicle Technology (13-A-03-01), and the Opening Project of Guangxi Key Laboratory of Automobile Components and Vehicle Technology (2012KFZD03).

References

  1. Y. H. Chen and S. C. Fang, “Solving convex programming problems with equality constraints by neural networks,” Computers and Mathematics with Applications, vol. 36, no. 7, pp. 41–68, 1998.
  2. Y. H. Chen and S. C. Fang, “Neurocomputing with time delay analysis for solving convex quadratic programming problems,” IEEE Transactions on Neural Networks, vol. 11, no. 1, pp. 230–240, 2000.
  3. J. A. Farrell and A. N. Michel, “A synthesis procedure for Hopfield's continuous-time associative memory,” IEEE Transactions on Circuits and Systems, vol. 37, no. 7, pp. 877–884, 1990.
  4. A. N. Michel, J. A. Farrell, and H. F. Sun, “Analysis and synthesis techniques for Hopfield type synchronous discrete time neural networks with application to associative memory,” IEEE Transactions on Circuits and Systems, vol. 37, no. 11, pp. 1356–1366, 1990.
  5. T. Nishikawa, Y. C. Lai, and F. C. Hoppensteadt, “Capacity of oscillatory associative-memory networks with error-free retrieval,” Physical Review Letters, vol. 92, no. 10, Article ID 108101, 4 pages, 2004.
  6. L. O. Chua and L. Yang, “Cellular neural networks: applications,” IEEE Transactions on Circuits and Systems, vol. 35, no. 10, pp. 1273–1290, 1988.
  7. R. Cancelliere, M. Gai, and A. Slavova, “Application of polynomial cellular neural networks in diagnosis of astrometric chromaticity,” Applied Mathematical Modelling, vol. 34, no. 12, pp. 4243–4252, 2010.
  8. S. S. Young, P. D. Scott, and N. M. Nasrabadi, “Object recognition using multilayer Hopfield neural network,” IEEE Transactions on Image Processing, vol. 6, no. 3, pp. 357–372, 1997.
  9. L. Chen, W. Xue, and N. Tokuda, “Classification of 2-dimensional array patterns: assembling many small neural networks is better than using a large one,” Neural Networks, vol. 23, no. 6, pp. 770–781, 2010.
  10. L. Olien and J. Bélair, “Bifurcations, stability, and monotonicity properties of a delayed neural network model,” Physica D, vol. 102, no. 3-4, pp. 349–363, 1997.
  11. S. Arik, “An analysis of global asymptotic stability of delayed cellular neural networks,” IEEE Transactions on Neural Networks, vol. 13, no. 5, pp. 1239–1242, 2002.
  12. J. Cao and D. Zhou, “Stability analysis of delayed cellular neural networks,” Neural Networks, vol. 11, no. 9, pp. 1601–1605, 1998.
  13. J. Wei and S. Ruan, “Stability and bifurcation in a neural network model with two delays,” Physica D, vol. 130, no. 3-4, pp. 255–272, 1999.
  14. X. Liao, G. Chen, and E. N. Sanchez, “Delay-dependent exponential stability analysis of delayed neural networks: an LMI approach,” Neural Networks, vol. 15, no. 7, pp. 855–866, 2002.
  15. H. Huang, J. Cao, and J. Wang, “Global exponential stability and periodic solutions of recurrent neural networks with delays,” Physics Letters A, vol. 298, no. 5-6, pp. 393–404, 2002.
  16. B. Chen and J. Wang, “Global exponential periodicity and global exponential stability of a class of recurrent neural networks with various activation functions and time-varying delays,” Neural Networks, vol. 20, no. 10, pp. 1067–1080, 2007.
  17. H. Jiang and Z. Teng, “Global exponential stability of cellular neural networks with time-varying coefficients and delays,” Neural Networks, vol. 17, no. 10, pp. 1415–1425, 2004.
  18. S. Senan and S. Arik, “New results for exponential stability of delayed cellular neural networks,” IEEE Transactions on Circuits and Systems II, vol. 52, no. 3, pp. 154–158, 2005.
  19. Q. Zhang, X. Wei, and J. Xu, “An analysis on the global asymptotic stability for neural networks with variable delays,” Physics Letters A, vol. 328, no. 2-3, pp. 163–169, 2004.
  20. H. Zhao and G. Wang, “Delay-independent exponential stability of recurrent neural networks,” Physics Letters A, vol. 333, no. 5-6, pp. 399–407, 2004.
  21. T. L. Liao, J. J. Yan, C. J. Cheng, and C. C. Hwang, “Globally exponential stability condition of a class of neural networks with time-varying delays,” Physics Letters A, vol. 339, no. 3-5, pp. 333–342, 2005.
  22. Q. Zhang, X. Wei, and J. Xu, “Delay-dependent exponential stability of cellular neural networks with time-varying delays,” Chaos, Solitons & Fractals, vol. 23, no. 4, pp. 1363–1369, 2005.
  23. Y. He, M. Wu, and J. H. She, “Delay-dependent exponential stability of delayed neural networks with time-varying delay,” IEEE Transactions on Circuits and Systems II, vol. 53, no. 7, pp. 553–557, 2006.
  24. Y. He, G. P. Liu, D. Rees, and M. Wu, “Stability analysis for neural networks with time-varying interval delay,” IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1850–1854, 2007.
  25. T. Li, L. Guo, C. Sun, and C. Lin, “Further results on delay-dependent stability criteria of neural networks with time-varying delays,” IEEE Transactions on Neural Networks, vol. 19, no. 4, pp. 726–730, 2008.
  26. C. D. Zheng, H. G. Zhang, and Z. S. Wang, “Improved robust stability criteria for delayed cellular neural networks via the LMI approach,” IEEE Transactions on Circuits and Systems II, vol. 57, no. 1, pp. 41–45, 2010.
  27. S. Mou, H. Gao, J. Lam, and W. Qiang, “A new criterion of delay-dependent asymptotic stability for Hopfield neural networks with time delay,” IEEE Transactions on Neural Networks, vol. 19, no. 3, pp. 532–535, 2008.
  28. H. Zhang, Z. Liu, G. B. Huang, and Z. Wang, “Novel weighting-delay-based stability criteria for recurrent neural networks with time-varying delay,” IEEE Transactions on Neural Networks, vol. 21, no. 1, pp. 91–106, 2010.
  29. M. Wu, F. Liu, P. Shi, Y. He, and R. Yokoyama, “Exponential stability analysis for neural networks with time-varying delay,” IEEE Transactions on Systems, Man, and Cybernetics B, vol. 38, no. 4, pp. 1152–1156, 2008.
  30. X. M. Zhang, M. Wu, J. H. She, and Y. He, “Delay-dependent stabilization of linear systems with time-varying state and input delays,” Automatica, vol. 41, no. 8, pp. 1405–1412, 2005.
  31. H. Zhang, Z. Wang, and D. Liu, “Robust exponential stability of recurrent neural networks with multiple time-varying delays,” IEEE Transactions on Circuits and Systems II, vol. 54, no. 8, pp. 730–734, 2007.