Mathematical Problems in Engineering

Volume 2013, Article ID 639219, 12 pages

http://dx.doi.org/10.1155/2013/639219

## Stability Analysis for Delayed Neural Networks: Reciprocally Convex Approach

^{1}Space Control and Inertial Technology Research Center, Harbin Institute of Technology, Harbin 150001, China

^{2}Designing Institute of Hubei Space Technology Academy, Wuhan 430034, China

Received 25 November 2012; Accepted 14 January 2013

Academic Editor: Ligang Wu

Copyright © 2013 Hongjun Yu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

This paper is concerned with global stability analysis for a class of continuous neural networks with time-varying delay. The lower and upper bounds of the delay and the upper bound of its first derivative are assumed to be known. By introducing a novel Lyapunov-Krasovskii functional (LKF), some delay-dependent stability criteria are derived in terms of linear matrix inequalities, which guarantee that the considered neural networks are globally stable. When estimating the derivative of the LKF, instead of applying Jensen’s inequality directly, a substep is taken and a slack variable is introduced via the reciprocally convex combination approach; as a result, the criteria are shown to be less conservative than those in the available literature. Numerical examples are given to demonstrate the effectiveness and merits of the proposed method.

#### 1. Introduction

In recent years, important applications in fields such as pattern recognition, signal processing, optimization, and associative memories have drawn considerable attention to neural networks (NNs). Stability analysis of NNs with time-varying delay has, as a result, received much attention, because time delay is frequently encountered in NNs, owing to the finite switching speed of amplifiers in the communication and response of neurons, and can cause instability and oscillations in the system. Both delay-independent and delay-dependent stability criteria have been proposed. Since delay-independent criteria tend to be more conservative, more attention has been given to delay-dependent criteria, which can make use of the length of the delay.

Global stability of various recurrent neural networks has been investigated in [1–4]. In stability analysis of neural networks, the qualitative properties of primary concern are the uniqueness, global asymptotic stability, and global exponential stability of their equilibria. When the system with time delay is described by the following dynamic equation, exponential and asymptotic stability analyses have been carried out in [5–10]. The stability criteria in [5, 9, 10] are delay independent, while those in [6, 8, 11] are delay dependent, and [12] includes both. Also, [8–10] adopted the delay-partitioning approach. After some changes are made to the system description, such as setting , adding a new term , and introducing a new activation function to substitute for , asymptotic stability criteria are derived in [13–17], and uniqueness analysis is done in [5, 18].

Various methods have been proposed to reduce conservatism when deriving stability criteria. For example, the free-weighting matrix approach adopted in [6, 7, 19–22] has proved very effective, since bounding techniques on some cross-product terms are avoided. Stability analyses of NNs with multiple and single time-varying delays are carried out in [17, 23], respectively. Moreover, a new free-weighting matrix approach is proposed in [24] to estimate the derivative of the Lyapunov functional without missing any negative quadratic terms, and thus improved delay-dependent stability criteria are established. Along with the free-weighting matrix approach, [14, 25, 26] adopted the delay-partitioning idea to solve the delay-dependent stability problem, and the proposed methods significantly reduce conservatism.

When deriving stability criteria for delayed systems, two kinds of approaches are usually used, namely, Lyapunov function approaches [27–29] and Lyapunov-Krasovskii functional (LKF) approaches [17, 23, 24, 30–33]. The former makes no restriction on the derivative of the time delay and usually gives a simpler stability criterion or a delay-independent criterion, while the latter, expressed in the form of LMIs, takes the derivative of the time delay into account and gives a delay-dependent criterion, which can be less conservative since the LKF makes use of more information about the system. The discretized LKF method developed in [34] is another method for stability analysis, and [13] made some necessary adjustments to make the method compatible with the robust stability problem for NNs with uncertain delays.

In addition, the range of the time-varying delay for NNs is mostly considered to have a lower bound of zero, as in [7, 11, 23, 24], while in practice the lower bound may not be zero, and setting it to zero would result in increased conservatism. The same holds for the derivative of the time delay. In many papers, the time delay of NNs is either constant or unknown, as in [14, 25, 26], while an upper or lower bound could be assumed.

In this paper, the stability problem for continuous NNs with time-varying delay is considered. A novel LKF is proposed, and modifications are made to deal with different cases concerning the time delay and its derivative. When estimating the derivative of the LKF, instead of applying Jensen’s inequality directly, a substep is taken and a slack variable is introduced; consequently, the resulting criteria are shown to be less conservative than existing results. Numerical examples and analysis are given to demonstrate the effectiveness and merits of the proposed method.

In Section 1, a brief introduction is presented, and some notations are defined. In Section 2, the stability problem is formulated, and some preliminaries are given. In Section 3, new criteria in the form of one theorem and three corollaries for NNs with time-varying delay are presented. In Section 4, numerical examples are presented, along with results from the other literature. The paper is concluded in Section 5.

*Notations*. The notations used throughout the paper are standard. The superscript “” stands for matrix transposition; denotes the -dimensional Euclidean space; the notation means that is a real positive definite matrix; and represent the identity matrix and a zero matrix, respectively; stands for a block-diagonal matrix; denotes the minimum (maximum) eigenvalue of a symmetric matrix ; denotes the Euclidean norm of a vector and the induced norm of a matrix. In symmetric block matrices, we use an asterisk () to represent a term that is induced by symmetry. Matrices, if their dimensions are not explicitly stated, are assumed to be compatible for algebraic operations.

#### 2. Model Descriptions and Preliminaries

The dynamic behavior of a continuous-time neural network with time delay can be described by the following state equation: where is the state vector of the neural network; is a positive matrix; and are the connection weight and the delayed connection weight matrices, respectively; represents the activation function vector of the neurons; and is a constant external bias vector. denotes the axonal signal transmission delay, which is nonnegative and bounded, satisfies , , and will be written as for short throughout the paper. The initial conditions associated with system (2) are of the form where is the Banach space of continuous functions mapping the interval into .
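A delayed neural network of the form (2) can be simulated directly by discretizing the delay. The following sketch uses a forward-Euler scheme with purely illustrative parameter values (they are not taken from the paper); the small connection weights chosen here make the network a contraction, so the trajectory decays to the equilibrium.

```python
import numpy as np

# Forward-Euler simulation of a delayed neural network of the form (2):
#   x'(t) = -C x(t) + A g(x(t)) + B g(x(t - tau)) + u
# All parameter values below are illustrative, not taken from the paper.
C = np.eye(2)                  # positive self-feedback matrix
A = 0.2 * np.eye(2)            # connection weight matrix
B = 0.2 * np.eye(2)            # delayed connection weight matrix
u = np.zeros(2)                # external bias
g = np.tanh                    # bounded, monotonically nondecreasing activation

dt, tau, T = 0.01, 0.5, 20.0   # step size, constant delay, horizon
d = int(round(tau / dt))       # delay expressed in Euler steps
steps = int(round(T / dt))

# constant initial condition on [-tau, 0]
hist = [np.array([1.0, -1.0])] * (d + 1)
for _ in range(steps):
    x, x_del = hist[-1], hist[-1 - d]       # current and delayed state
    hist.append(x + dt * (-C @ x + A @ g(x) + B @ g(x_del) + u))

final_norm = np.linalg.norm(hist[-1])
print(final_norm)   # decays toward the origin (the shifted equilibrium)
```

Because the Lipschitz bound of `tanh` is 1 and the weights satisfy a contraction condition here, the origin is globally attractive for this particular parameter choice.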

The following assumptions are made on system (2) throughout this paper.

(H1) The activation functions are bounded and monotonically nondecreasing on .

(H2) The activation functions satisfy

It is known that bounded activation functions always guarantee the existence of an equilibrium point for model (2). For convenience of exposition, in the following we shift the equilibrium point of model (2) to the origin. The transformation puts system (2) into the following form: where is the state vector of the transformed system, and with , , , . Obviously, the equilibrium point of system (2) is globally stable if and only if the origin of system (5) is globally stable. Assume that ; then the functions satisfy Rewriting (6), we get where and .
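The symbols of conditions (6) and (7) are not legible above. In this literature, the shifted activation functions typically satisfy a sector condition of the following standard form (generic notation, assumed rather than taken from the paper):

```latex
0 \le \frac{f_j(\xi)}{\xi} \le l_j, \qquad \xi \ne 0, \quad f_j(0) = 0, \quad j = 1, 2, \ldots, n,
```

which is commonly rewritten in the quadratic form $f_j(\xi)\bigl(f_j(\xi) - l_j \xi\bigr) \le 0$ so that it can be absorbed into an LMI via the S-procedure.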

Lemma 1 (see [35]). *For positive definite , scalars and , and a vector function , the following inequality holds:
*
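The statement of Lemma 1 is truncated above. In its standard form, the Jensen-type integral inequality cited from [35] reads as follows for a positive definite matrix $M$, scalars $b > a$, and a vector function $\omega : [a, b] \to \mathbb{R}^n$ (generic symbols, not necessarily the paper's):

```latex
\left( \int_a^b \omega(s)\, \mathrm{d}s \right)^{\!T} M \left( \int_a^b \omega(s)\, \mathrm{d}s \right)
\le (b - a) \int_a^b \omega^T(s)\, M\, \omega(s)\, \mathrm{d}s .
```

This inequality is what allows the integral terms produced by differentiating the LKF to be bounded by quadratic forms in the augmented state vector.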

#### 3. Main Results

In this section, some delay-dependent sufficient conditions of the global stability for the neural networks with time-varying delay in (5) are derived. First, we consider the case where the upper bound of is known, and correspondingly, the global stability condition is given as follows.

Theorem 2. *Suppose that in reference system (5), the time delay satisfies , . Under the condition given in (6), if there exist matrices , , , , , , , , , , and such that
**
where , , , and are defined as
**
where , , , , , , and are defined as
**
then system (5) is globally stable. Moreover,
**
where is defined as
*

*Proof. *We choose an LKF as
where
The derivatives of , , are given, respectively, by
By Lemma 1, we can get
By (16) and (17), we have
where is defined as
Then by (7), (18), and (19), we have
It is clear that if (9) and (20) hold, then, for any , we have . It follows that

From (15), we get
By a similar method in [9], we have
Thus, , where is defined in (13).

Moreover,

Therefore, we have
which shows that system (5) is globally stable. This completes the proof.

*Remark 3. *Theorem 2 presents a stability criterion for the delayed neural network. When coping with , instead of using Jensen’s inequality directly, we take a substep that makes the method less conservative; this is known as the reciprocally convex combination approach [36]. It follows that
If no substep is taken, it will follow that

We can see that, compared to (27), (26) is less restrictive, since the slack variable is not restricted to be nonnegative, and consequently, its LMI can tolerate a larger delay. For the same reason, if a middle term could be found between and 0, conservatism would be further reduced.
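The comparison above rests on the lemma of Park et al. [36]: for $\alpha \in (0,1)$, $R \succ 0$, and any slack $S$ with $\begin{bmatrix} R & S \\ S^T & R \end{bmatrix} \succeq 0$, the bound $\frac{1}{\alpha} x^T R x + \frac{1}{1-\alpha} y^T R y \ge \begin{bmatrix} x \\ y \end{bmatrix}^T \begin{bmatrix} R & S \\ S^T & R \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$ holds. The following numerical spot-check (randomly generated matrices, not the paper's) verifies this over many samples:

```python
import numpy as np

# Numerical spot-check of the reciprocally convex combination bound [36]:
# for alpha in (0,1), R > 0, and any slack S with [[R, S], [S^T, R]] >= 0,
#   (1/alpha) x^T R x + (1/(1-alpha)) y^T R y >= [x; y]^T [[R, S], [S^T, R]] [x; y].
# All matrices below are randomly generated for illustration only.
rng = np.random.default_rng(0)
n = 3

M0 = rng.standard_normal((n, n))
R = M0 @ M0.T + n * np.eye(n)      # R positive definite
S = 0.5 * R                        # with this symmetric S, R - S and R + S stay >= 0
block = np.block([[R, S], [S.T, R]])
block_ok = np.linalg.eigvalsh(block).min() > -1e-9   # block matrix is PSD

gaps = []
for _ in range(1000):
    a = rng.uniform(0.01, 0.99)            # the convex-combination weight alpha
    x = rng.standard_normal(n)
    y = rng.standard_normal(n)
    lhs = x @ R @ x / a + y @ R @ y / (1.0 - a)
    z = np.concatenate([x, y])
    gaps.append(lhs - z @ block @ z)

min_gap = min(gaps)
print(block_ok, min_gap)   # the gap never goes negative
```

Setting `S = 0` recovers the cruder bound of type (27); letting the LMI solver pick `S` is exactly the extra freedom that reduces conservatism.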

*Remark 4. *Based on Theorem 2, we can determine the maximum admissible delay and at a known upper bound of . Moreover, the relationship between and could be specified using and . As to the case when is unknown, we can refer to Corollary 5, where some changes are made on the LKF.

Corollary 5. *Suppose that the time delay in reference system (5) satisfies . Under the condition given by (6) and (7), if there exist matrices , , , , , , , , , and such that the following matrix inequalities hold:
**
where , , , and are defined as
**
where , , , , , , and are defined in (11), then system (5) is globally stable. Moreover,
**
where is defined as
*

*Proof. *We choose an LKF as
where

Since the result can be obtained directly from Theorem 2, the rest of the proof for Corollary 5 is omitted.

*Remark 6. *Corollary 5 presents the stability criterion when the upper bound of is unknown. Since is unknown and, under the current conditions, cannot be estimated or substituted, the first term in of should be changed or eliminated from the LKF. In Corollary 5, the term is eliminated because the other two terms in serve the same function, and it is unnecessary to keep any extra or . The remaining terms are retained because they do not generate any -related terms when estimating the derivative of the LKF.

Corollary 7. *Suppose that the time delay in reference system (5) satisfies and . Under the condition given by (6) and (7), if there exist matrices , , , , ,, , , , and such that the following matrix inequalities hold:
**
where , , , and are defined as
**
where , , , , , and are defined as
**
then system (5) is globally stable. Moreover,
**
where is defined as
*

*Proof. *We choose an LKF as
where

Since the result can be obtained directly from Theorem 2, the rest of the proof for Corollary 7 is omitted.

*Remark 8. *Corollary 7 presents the stability criterion when the lower bound is zero. If is zero, the second term in of should be changed or eliminated from the LKF. In Corollary 7, the term is eliminated because there is no need to introduce an extra variable , while and the other matrices serve the same function. Moreover, and in can be merged because, when is zero, they have the same form. However, and are retained because, when estimating the upper bound of the LKF, it is still useful to introduce a as in Theorem 2.

Corollary 9. *Suppose that the time delay in reference system (5) satisfies . Under the condition given by (6) and (7), if there exist matrices , , , , , , , , , , and such that the following matrix inequalities hold:
**
where , , , and are defined as
**
where , , , , , and are defined in (16), then system (5) is globally stable. Moreover,
**
where is defined as
*

*Proof. *We choose an LKF as
where

Since the result can be obtained directly from Theorem 2, the rest of the proof for Corollary 9 is omitted.

*Remark 10. *Corollary 9 presents the stability criterion when is zero and the upper bound of the time delay’s derivative, , is unknown. If is zero and is unknown, the first and second terms of and in should be eliminated from the LKF. There is no need to introduce an extra variable or to keep any or , for the same reason as in Corollaries 5 and 7. Moreover, the terms in can be merged, while and should be retained for the same reason as in Corollary 7.

#### 4. Numerical Examples

In this section, examples are provided to demonstrate the advantages of the proposed stability criteria.

*Example 11. *Consider the delayed neural network in (5) with the following parameters, which has also been investigated by [13, 39]:
Our objective is to find the allowable maximum time delay such that the system is stable under different and . Simulation results from the available literature are shown in Table 1, along with results from Theorem 2 and Corollaries 5, 7, and 9. It is clear that the proposed criteria are less conservative than those in [23, 24, 37, 38].

In addition, when , , , and take the same values as in (48) and takes a different value as in (49), results for the same system are shown in Table 2. Since is specified in (49), the allowable maximum time delay is expected to differ from that of (48). As shown in Table 2, as gets bigger, the difference between and becomes smaller, to ensure stability of (49). Moreover, the allowable maximum of (49) is apparently smaller than its counterpart for (48) because in (49) is positive definite, while its counterpart in (48) is zero, which means that in (49) is more closely related to than in (48).

*Example 12. *Consider the delayed system in (5) with

This example is used to demonstrate the effectiveness of Theorem 2 in Table 3 and Corollaries 5, 7, and 9 in Table 4. Both and have taken positive definite values, and allowable maximum time delay under different and is presented in Tables 3 and 4.

As seen in Table 3, the values of change periodically with different under the same , but the difference between and decreases steadily with increasing under the same . It can be expected that, whatever value takes, when is big enough, and will converge to a single point. Moreover, it can be seen in Table 3 that this point coincides with the convergence point under , which is also the allowable maximum constant time delay.

Table 4 is used to demonstrate the effectiveness of Corollaries 5, 7, and 9. The allowable maximum time delay is presented under unknown for Corollary 5, for Corollary 7, and both unknown and for Corollary 9. It can be seen that when is unknown, is apparently smaller than otherwise. Moreover, the sums of and are about the same as those under but smaller than those under , which signifies that and have a near-linear relationship under .

#### 5. Conclusion

This paper has investigated the global stability of neural networks with time-varying delay. By introducing a novel LKF, delay-dependent global stability criteria have been obtained. Using the reciprocally convex combination approach, a substep is taken and a slack variable is introduced to estimate the derivative of the LKF; as a result, the proposed method is less conservative than those in the available literature. The proposed criteria have been formulated in terms of linear matrix inequalities and thus can be readily solved by standard computing software. Numerical examples are given, and analysis is made under different ranges and derivatives of the time delay. The proposed criteria have been shown to be less conservative than existing results.
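The LMI conditions above are typically checked with semidefinite-programming software (e.g. the MATLAB LMI toolbox). As a minimal sketch of the kind of feasibility check such a solver performs, one can verify positive definiteness of a Lyapunov matrix for a simple delay-free comparison system; the matrices below are illustrative and are not the block matrices of Theorem 2, which are omitted in the text above.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Minimal sketch of an LMI-style feasibility check (NOT the LMIs of Theorem 2).
# For a delay-free comparison system x'(t) = M x(t), with
#   M = -C + (A + B)   (worst case for a unit-Lipschitz activation),
# solve the Lyapunov equation M^T P + P M = -I; P > 0 certifies stability.
C = np.eye(2)
A = 0.2 * np.eye(2)          # illustrative weights, as in the example above
B = 0.2 * np.eye(2)
M = -C + (A + B)             # Hurwitz for these values

P = solve_continuous_lyapunov(M.T, -np.eye(2))
eigs = np.linalg.eigvalsh(P)
print(eigs)                  # all eigenvalues positive for a Hurwitz M
```

A full implementation of the paper's criteria would instead declare the LKF matrices as decision variables and hand the block LMIs to an SDP solver, which searches for a feasible point rather than checking a fixed candidate.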

#### Acknowledgments

This work was partially supported by the National Natural Science Foundation of China (61174126, 61222301), the Program for New Century Excellent Talents in University (NCET-09-0063), Key Laboratory Opening Funding of Technology of Micro-Spacecraft (HIT.KLOF.2009099) and the Fundamental Research Funds for the Central Universities (HIT.BRET2.2010011).

#### References

1. M. W. Hirsch, “Convergent activation dynamics in continuous time networks,” *Neural Networks*, vol. 2, no. 5, pp. 331–349, 1989.
2. S. Hu and J. Wang, “Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks,” *IEEE Transactions on Automatic Control*, vol. 47, no. 5, pp. 802–807, 2002.
3. H. Huang, D. W. C. Ho, and J. Lam, “Stochastic stability analysis of fuzzy Hopfield neural networks with time-varying delays,” *IEEE Transactions on Circuits and Systems II*, vol. 52, no. 5, pp. 251–255, 2005.
4. D. G. Kelly, “Stability in contractive nonlinear neural networks,” *IEEE Transactions on Biomedical Engineering*, vol. 37, no. 3, pp. 231–242, 1990.
5. S. Arik, “Global asymptotic stability of a larger class of neural networks with constant time delay,” *Physics Letters A*, vol. 311, no. 6, pp. 504–511, 2003.
6. Y. Ge, T. Li, and S. Fei, “Master-slave synchronization of stochastic neural networks with mixed time-varying delays,” *Mathematical Problems in Engineering*, vol. 2012, Article ID 730941, 18 pages, 2012.
7. Y. Li, J. Tian, J. Zhao, L. Zhang, and T. Li, “Improved stability analysis for neural networks with time-varying delay,” *Mathematical Problems in Engineering*, vol. 2012, Article ID 950269, 19 pages, 2012.
8. S. Mou, H. Gao, W. Qiang, and K. Chen, “New delay-dependent exponential stability for neural networks with time delay,” *IEEE Transactions on Systems, Man, and Cybernetics B*, vol. 38, no. 2, pp. 571–576, 2008.
9. X. Su, Z. Li, Y. Feng, and L. Wu, “New global exponential stability criteria for interval-delayed neural networks,” *Proceedings of the Institution of Mechanical Engineers I*, vol. 225, no. 1, pp. 125–136, 2011.
10. L. Wu, Z. Feng, and W. X. Zheng, “Exponential stability analysis for delayed neural networks with switching parameters: average dwell time approach,” *IEEE Transactions on Neural Networks*, vol. 21, no. 9, pp. 1396–1407, 2010.
11. S. Lv, J. Tian, and S. Zhong, “Delay-dependent stability analysis for recurrent neural networks with time-varying delays,” *Mathematical Problems in Engineering*, vol. 2012, Article ID 910140, 14 pages, 2012.
12. L. Wu and W. X. Zheng, “Weighted ${\mathscr{H}}_{\infty}$ model reduction for linear switched systems with time-varying delay,” *Automatica*, vol. 45, no. 1, pp. 186–193, 2009.
13. W. H. Chen and W. X. Zheng, “Improved delay-dependent asymptotic stability criteria for delayed neural networks,” *IEEE Transactions on Neural Networks*, vol. 19, no. 12, pp. 2154–2161, 2008.
14. S. Mou, H. Gao, J. Lam, and W. Qiang, “A new criterion of delay-dependent asymptotic stability for Hopfield neural networks with time delay,” *IEEE Transactions on Neural Networks*, vol. 19, no. 3, pp. 532–535, 2008.
15. R. Yang, Z. Zhang, and P. Shi, “Exponential stability on stochastic neural networks with discrete interval and distributed delays,” *IEEE Transactions on Neural Networks*, vol. 21, no. 1, pp. 169–175, 2010.
16. G. Zhang, T. Li, and S. Fei, “Further stability criterion on delayed recurrent neural networks based on reciprocal convex technique,” *Mathematical Problems in Engineering*, vol. 2012, Article ID 829037, 14 pages, 2012.
17. H. Zhang, Z. Wang, and D. Liu, “Global asymptotic stability of recurrent neural networks with multiple time-varying delays,” *IEEE Transactions on Neural Networks*, vol. 19, no. 5, pp. 855–873, 2008.
18. B. Ammar, F. Cherif, and A. M. Alimi, “Existence and uniqueness of pseudo almost-periodic solutions of recurrent neural networks with time-varying coefficients and mixed delays,” *IEEE Transactions on Neural Networks and Learning Systems*, vol. 23, no. 1, pp. 109–118, 2012.
19. Y. Chen, W. Bi, and W. Li, “Stability analysis for neural networks with time-varying delay: a more general delay decomposition approach,” *Neurocomputing*, vol. 73, no. 4–6, pp. 853–857, 2010.
20. Z. Wang, Y. Liu, and X. Liu, “State estimation for jumping recurrent neural networks with discrete and distributed delays,” *Neural Networks*, vol. 22, no. 1, pp. 41–48, 2009.
21. L. Wu, X. Su, P. Shi, and J. Qiu, “A new approach to stability analysis and stabilization of discrete-time T-S fuzzy time-varying delay systems,” *IEEE Transactions on Systems, Man, and Cybernetics B*, vol. 41, no. 1, pp. 273–286, 2011.
22. H. Zhang, Z. Liu, G. B. Huang, and Z. Wang, “Novel weighting-delay-based stability criteria for recurrent neural networks with time-varying delay,” *IEEE Transactions on Neural Networks*, vol. 21, no. 1, pp. 91–106, 2010.
23. C. Hua, C. Long, and X. Guan, “New results on stability analysis of neural networks with time-varying delays,” *Physics Letters A*, vol. 352, no. 4, pp. 335–340, 2006.
24. Y. He, G. Liu, and D. Rees, “New delay-dependent stability criteria for neural networks with time-varying delay,” *IEEE Transactions on Neural Networks*, vol. 18, no. 1, pp. 310–314, 2007.
25. L. Hu, H. Gao, and W. X. Zheng, “Novel stability of cellular neural networks with interval time-varying delay,” *Neural Networks*, vol. 21, no. 10, pp. 1458–1463, 2008.
26. R. Yang, H. Gao, and P. Shi, “Novel robust stability criteria for stochastic Hopfield neural networks with time delays,” *IEEE Transactions on Systems, Man, and Cybernetics B*, vol. 39, no. 2, pp. 467–474, 2009.
27. L. Wu, X. Su, and P. Shi, “Sliding mode control with bounded ${\mathcal{L}}_{2}$ gain performance of Markovian jump singular time-delay systems,” *Automatica*, vol. 48, no. 8, pp. 1929–1933, 2012.
28. H. Ye, A. N. Michel, and K. Wang, “Robust stability of nonlinear time-delay systems with applications to neural networks,” *IEEE Transactions on Circuits and Systems I*, vol. 43, no. 7, pp. 532–543, 1996.
29. Z. Zeng and J. Wang, “Improved conditions for global exponential stability of recurrent neural networks with time-varying delays,” *IEEE Transactions on Neural Networks*, vol. 17, no. 3, pp. 623–635, 2006.
30. J. H. Park, “A new stability analysis of delayed cellular neural networks,” *Applied Mathematics and Computation*, vol. 181, no. 1, pp. 200–205, 2006.
31. X. Su, P. Shi, L. Wu, and S. Nguang, “Induced *ℓ*_{2} filtering of fuzzy stochastic systems with time-varying delays,” *IEEE Transactions on Systems, Man, and Cybernetics B*, no. 19, pp. 1–14.
32. L. Wu and W. X. Zheng, “Passivity-based sliding mode control of uncertain singular time-delay systems,” *Automatica*, vol. 45, no. 9, pp. 2120–2127, 2009.
33. R. Yang, P. Shi, G.-P. Liu, and H. Gao, “Network-based feedback control for systems with mixed delays based on quantization and dropout compensation,” *Automatica*, vol. 47, no. 12, pp. 2805–2809, 2011.
34. K. Gu, “Discretized LMI set in the stability problem of linear uncertain time-delay systems,” *International Journal of Control*, vol. 68, no. 4, pp. 923–934, 1997.
35. S. Boyd, V. Balakrishnan, E. Feron, and L. El Ghaoui, *Linear Matrix Inequalities in Systems and Control*, SIAM, Philadelphia, Pa, USA, 1994.
36. P. Park, J. W. Ko, and C. Jeong, “Reciprocally convex approach to stability of systems with time-varying delays,” *Automatica*, vol. 47, no. 1, pp. 235–238, 2011.
37. Y. He, G. P. Liu, D. Rees, and M. Wu, “Stability analysis for neural networks with time-varying interval delay,” *IEEE Transactions on Neural Networks*, vol. 18, no. 6, pp. 1850–1854, 2007.
38. O. M. Kwon and J. H. Park, “Improved delay-dependent stability criterion for neural networks with time-varying delays,” *Physics Letters A*, vol. 373, no. 5, pp. 529–535, 2009.
39. H. Shao and Q. L. Han, “New delay-dependent stability criteria for neural networks with two additive time-varying delay components,” *IEEE Transactions on Neural Networks*, vol. 22, no. 5, pp. 812–818, 2011.