Research Article | Open Access

M. J. Park, O. M. Kwon, E. J. Cha, "On Stability Analysis for Generalized Neural Networks with Time-Varying Delays", *Mathematical Problems in Engineering*, vol. 2015, Article ID 387805, 11 pages, 2015. https://doi.org/10.1155/2015/387805

# On Stability Analysis for Generalized Neural Networks with Time-Varying Delays

**Academic Editor:** Jun Cheng

#### Abstract

This paper deals with the problem of stability analysis for generalized neural networks with time-varying delays. With a suitable Lyapunov-Krasovskii functional (LKF) and the Wirtinger-based integral inequality, sufficient conditions guaranteeing the asymptotic stability of the concerned networks are derived in terms of linear matrix inequalities (LMIs). By applying the proposed methods to two numerical examples which have been utilized in many works to check the conservatism of stability criteria, it is shown that the obtained results are significantly improved compared with those previously published in the literature.

#### 1. Introduction

Neural networks have been successfully applied to various science and engineering problems such as pattern recognition, optimization, medical diagnosis, and image and signal processing [1–4]. In addition, neural network models can take a variety of forms, such as bidirectional associative memory (BAM) [5] and Cohen-Grossberg [6] neural networks. The stability analysis of neural networks has therefore been extensively studied, because the application of neural networks heavily depends on the dynamic behavior of their equilibrium points, and checking the stability of the concerned networks is a prerequisite. Recently, models of neural networks with time-varying delays have been considered and their asymptotic stability has been extensively investigated, since time delays, which are caused by the inherent communication time among neurons and the finite switching speed of amplifiers in hardware implementations of the networks, can harm system performance and stability. In this regard, much attention has been paid to the stability analysis of local field and static neural networks with time-varying delays. For local field neural networks, a new activation condition, not considered previously, was proposed in [7] to reduce the conservatism of stability criteria. Two delay-partitioning approaches were utilized in [8–11] to obtain further enhanced delay-dependent stability criteria for neural networks with time-varying delays. Very recently, a new augmented Lyapunov-Krasovskii functional was proposed to enlarge the feasible region of stability conditions for delayed neural networks [12]. For static neural networks, improved delay-dependent stability criteria were presented in [13, 14]. By constructing an augmented Lyapunov-Krasovskii functional, both delay-independent and delay-dependent stability criteria were proposed in [15] with some new techniques.
The robust global asymptotic stability of generalized static neural networks with linear fractional uncertainties and time-varying delays was studied in [16] within a novel input-output framework. Very recently, interval time-varying delays were considered in the delay-dependent stability analysis of static neural networks [17].

The main issue in delay-dependent stability analysis for dynamic systems with time delays is to enlarge the feasible region compared with existing results. In delay-dependent stability analysis, the maximum delay bound guaranteeing asymptotic stability is regarded as one of the indices for checking the conservatism of a stability condition. The reduction of conservatism in delay-dependent stability criteria mainly depends on the construction of the LKF and on the techniques used to estimate its time-derivative. From the LKF point of view, the discretized form [18], triple integral forms [19–21], and augmented LKFs [12, 22] and [23, 24] have been proposed to reduce the conservatism of stability conditions. Among the techniques for estimating the time-derivative of LKFs, Park's inequality [25], Jensen's inequality [18], model transformation [26, 27], free-weighting matrix techniques [28, 29], the convex combination technique [30], reciprocally convex optimization [31], and the Wirtinger-based integral inequality [32] are remarkable results which have promoted the development of delay-dependent stability analysis. Among these, the Wirtinger-based integral inequality, which provides a tighter lower bound on integral terms than Jensen's inequality, has very recently been applied to the stability analysis of delayed neural networks [33, 34]. However, in the stability analysis of neural networks with time-varying delays, the application of the Wirtinger-based integral inequality with an augmented LKF has not yet been fully investigated, and thus there is still room for further improvement in the reduction of conservatism.
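For reference, the two integral inequalities contrasted above can be stated as follows, in the standard single-interval form of [18, 32]; the notation here is generic rather than the paper's own:

```latex
% Jensen's inequality: for R \succ 0 and x differentiable on [a,b],
\int_{a}^{b} \dot{x}^{T}(s)\, R\, \dot{x}(s)\, ds
  \;\ge\; \frac{1}{b-a}\, \Omega_{1}^{T} R\, \Omega_{1},
\qquad \Omega_{1} = x(b) - x(a).

% Wirtinger-based integral inequality (Seuret--Gouaisbaut):
\int_{a}^{b} \dot{x}^{T}(s)\, R\, \dot{x}(s)\, ds
  \;\ge\; \frac{1}{b-a}\, \Omega_{1}^{T} R\, \Omega_{1}
        + \frac{3}{b-a}\, \Omega_{2}^{T} R\, \Omega_{2},
\qquad \Omega_{2} = x(b) + x(a) - \frac{2}{b-a}\int_{a}^{b} x(s)\, ds.
```

Since the additional term involving $\Omega_{2}$ is nonnegative, the Wirtinger-based bound is never looser than Jensen's, which is the source of the reduced conservatism.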

With the above motivation, the problem of stability analysis for generalized neural networks with time-varying delays is investigated in this paper. Inspired by the works [35, 36], a generalized neural network which contains static neural networks and local field neural networks as special cases is considered in this work. Thus, the general model of neural networks used here is a superordinate concept to the models utilized in most of the literature, including the works [37–41]. A modified Wirtinger-based integral inequality, which changes the intervals of the integral terms, will be introduced as Lemma 3. Then, by constructing a suitable augmented Lyapunov-Krasovskii functional and utilizing Lemma 3, an improved stability condition ensuring that the considered neural networks are asymptotically stable is derived in terms of linear matrix inequalities (LMIs). When the information about the time-derivative of the time-varying delays is unknown, a stability condition will be presented as Corollary 10. The advantage and superiority of the main results will be shown via two numerical examples which have been utilized in many previous works to check the conservatism of stability criteria.

*Notation*. and denote the -dimensional Euclidean space with vector norm and the set of all real matrices, respectively. and are the sets of symmetric and positive definite matrices, respectively. , , and denote the identity matrix and zero matrices, respectively. (<0) means a symmetric positive (negative) definite matrix. stands for a basis for the null space of . represents the block diagonal matrix. For any square matrix and any vectors , we define and . means that the elements of matrix include the scalar value of affinely.

#### 2. Problem Statement and Preliminaries

Consider the following generalized neural networks with time-varying delays: where denotes the number of neurons in a neural network, is the state vector of neuron, denotes the activation function of neuron, is the system matrix, are the interconnection weight matrices, and is a constant input vector.
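The display equation for the model is missing from this extraction. For orientation, the generalized model of [35, 36] is commonly written in a form such as the following (the symbols below are illustrative, not necessarily the paper's):

```latex
\dot{x}(t) = -A\, x(t) + W_{1}\, f\big(W x(t)\big)
             + W_{2}\, f\big(W x(t - h(t))\big) + b,
```

which recovers a local field neural network when $W = I$ and a static neural network when $W_{1} = 0$ and $W_{2} = I$.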

is the time-varying delay satisfying where is a known positive scalar and is a constant.

The activation functions of neuron satisfy the following assumption.

*Assumption 1. *The activation functions of neuron with are continuous, bounded, and satisfy where and are known real constants.

*Remark 2. *In Assumption 1, and can be positive, negative, or zero. As mentioned in [40], Assumption 1 describes the class of globally Lipschitz continuous and monotone nondecreasing activation functions when and , and the class of globally Lipschitz continuous and monotone increasing activation functions when .
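The sector condition referenced by Assumption 1 (whose display equation is missing from this extraction) is typically stated in this literature as:

```latex
k_{i}^{-} \;\le\; \frac{f_{i}(u) - f_{i}(v)}{u - v} \;\le\; k_{i}^{+},
\qquad u \ne v,\quad i = 1, \dots, n,
```

where $k_{i}^{-}$ and $k_{i}^{+}$ are known real constants that may be positive, negative, or zero, consistent with Remark 2.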

For the stability analysis of the neural networks (1), the equilibrium point, whose uniqueness has been reported in [41], is shifted to the origin by the transformation , which transforms system (1) into the following form: where is the state vector of the transformed system, with , , and with .

It should be noted that the activation functions satisfy the following condition [42]:

The aim of this paper is to investigate the delay-dependent stability of system (4), which will be addressed in the next section. The following lemmas will be utilized in deriving the main results.

Lemma 3 (Modified Wirtinger-based integral inequality). *Consider a given matrix . Then, for all continuous function in , the following inequality holds: *

*Proof. *From the term in the original Wirtinger-based integral inequality [32], since the inequality (7) holds.

Lemma 4 (see [43]). *Let , , and such that . Then, the following statements are equivalent:*(1)*,*(2)*, where is a right orthogonal complement of .*
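Lemma 4 is a form of Finsler's lemma; in generic notation (the paper's symbols are missing from this extraction), the equivalence reads:

```latex
x^{T} \Theta\, x < 0 \ \ \text{for all } x \ne 0 \text{ with } \Gamma x = 0
\quad \Longleftrightarrow \quad
\left(\Gamma^{\perp}\right)^{T} \Theta\, \Gamma^{\perp} \prec 0,
```

where $\Gamma^{\perp}$ denotes a right orthogonal complement of $\Gamma$, i.e., $\Gamma\, \Gamma^{\perp} = 0$. This allows the delay-dependent condition to be checked on the null space of the system constraint without introducing the constraint explicitly.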

Lemma 5 (see [44]). *For symmetric appropriately dimensional matrices , , matrix , the following two statements are equivalent:*(1)*,*(2)*there exists a matrix of appropriate dimension such that *

#### 3. Main Results

In this section, two delay-dependent stability criteria for system (4) will be proposed. For the sake of simplicity of matrix and vector representation, which will be used in Theorem 6 and Corollary 10 are defined as block entry matrices. (e.g., ). The other notations for some vectors and matrices are defined in the appendix.

Now, the following theorem is given as a main result.

Theorem 6. *For given scalars , , and diagonal matrices and , system (4) is asymptotically stable for and , if there exist matrices , , , , , , , , , , satisfying the following LMIs: *
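The LMIs of Theorem 6 are not reproduced in this extraction, but the feasibility-checking workflow they imply can be illustrated on the simplest delay-free analogue, the Lyapunov inequality $A^{T}P + PA \prec 0$, $P \succ 0$. This is a minimal sketch, not the paper's conditions; the matrix `A` below is an arbitrary Hurwitz example chosen for illustration:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def lyapunov_feasible(A, tol=1e-9):
    """Check the delay-free Lyapunov LMI A^T P + P A < 0, P > 0.

    Instead of a general-purpose LMI solver, this exploits the fact that
    for Q > 0 the Lyapunov equation A^T P + P A = -Q has a positive
    definite solution P if and only if A is Hurwitz.
    """
    n = A.shape[0]
    Q = np.eye(n)
    # solve_continuous_lyapunov(a, q) solves a X + X a^H = q,
    # so pass a = A^T and q = -Q to obtain A^T P + P A = -Q.
    P = solve_continuous_lyapunov(A.T, -Q)
    P = (P + P.T) / 2  # symmetrize against round-off
    return bool(np.all(np.linalg.eigvalsh(P) > tol)), P

# A Hurwitz (stable) example matrix, chosen only for illustration.
A = np.array([[-2.0, 0.5],
              [0.1, -3.0]])
feasible, P = lyapunov_feasible(A)
print("feasible:", feasible)
```

In practice, the actual LMIs of Theorem 6 would be checked the same way with a semidefinite programming solver (e.g., via YALMIP with SeDuMi in MATLAB, as is standard in this literature), scanning the delay bound upward until the LMIs become infeasible to obtain the maximum admissible delay.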

*Proof. *Consider the following candidate for an appropriate Lyapunov-Krasovskii functional: where The time-derivatives of are as follows: By applying the Leibniz integral rule to , it follows that Applying the Leibniz integral rule and the two zero equalities inspired by the work of [45] to leads to where and are any symmetric matrices.

It should be noted that By applying Lemma 3 and using (17), we have By a similar process to that presented in (18), it can be obtained that By combining (18) and (19) and utilizing reciprocally convex optimization [31], if , an upper bound of can be obtained as In addition, from (6), for any positive diagonal matrices , the following inequality holds: Also, inspired by the authors' work [7], from (5), the following conditions hold: Therefore, for any positive diagonal matrices , the following inequality holds: From (12)–(23) and by application of the S-procedure [46], an upper bound of can be written as By Lemma 4, the condition with is equivalent to Then, by Lemma 5, condition (26) is equivalent to the following inequality with any matrix : The above condition is affinely dependent on . Therefore, if inequalities (10) and (11) hold, then inequality (27) is satisfied, which means that system (4) is asymptotically stable for and . It should be noted that holds if inequalities (11) and (12) are feasible. This completes the proof.
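The reciprocally convex optimization step invoked in the proof combines the two bounded integral terms via the lemma of [31], which (again in generic notation, since the paper's displays are missing here) reads:

```latex
% For R \succ 0, vectors x, y, and \alpha \in (0,1):
\frac{1}{\alpha}\, x^{T} R\, x + \frac{1}{1-\alpha}\, y^{T} R\, y
  \;\ge\;
\begin{bmatrix} x \\ y \end{bmatrix}^{T}
\begin{bmatrix} R & S \\ S^{T} & R \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix},
\qquad \text{provided } \
\begin{bmatrix} R & S \\ S^{T} & R \end{bmatrix} \succeq 0 .
```

This removes the delay-dependent weights $1/\alpha$ and $1/(1-\alpha)$ at the cost of one slack matrix $S$, keeping the resulting stability condition an LMI.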

*Remark 7. *Theorem 6 utilized , which was inspired by the authors' work [22]. In [22], the problem of delay-dependent stability for neural networks with interval time-varying delays was addressed. However, when estimating the time-derivative of the value obtained by , the Wirtinger-based integral inequality was not used in [22], which means that there is still room for further improvement in enlarging the feasible region of the stability condition. In (18) and (19), the integral terms were estimated by the use of Lemma 3 for the first time. Thus, more cross terms such as were utilized in the stability condition of Theorem 6.

*Remark 8. *It should be noted that when is larger than one, the condition presented in Theorem 6 is infeasible since the term cannot be negative. When is unknown or larger than one, then, by not considering the functional in (12), the following corollary can be obtained.

*Remark 9. *In Lemma 3, the term is included in (7) instead of in [32]. Thus, the integral terms such as and of Theorem 6 can be utilized as the element of the augmented vector .

Corollary 10. *For given scalars , and diagonal matrices and , system (4) is asymptotically stable for , if there exist matrices , , , , , , , , satisfying the following LMIs: **where and are defined in the appendix with .*

*Proof. *The proof of Corollary 10 is very similar to the proof of Theorem 6. Thus, it is omitted.

#### 4. Numerical Examples

In this section, two numerical examples are introduced to show the improvements of the proposed methods.

*Example 1. *Consider the neural networks (4) with the parameters This example has been utilized in many previous works to check the conservatism of delay-dependent stability criteria. In Table 1, our results are compared with the previous works of [7–9] and [10–12]. When is 0.8 or 0.9, Theorem 6 provides significantly larger delay bounds than those listed in Table 1. Also, when is larger than one or unknown, the result of Corollary 10 is larger than those in [7, 8, 10–34]. Therefore, Table 1 shows that the proposed Theorem 6 and Corollary 10 effectively reduce the conservatism of the stability criteria.

*Example 2. *Consider the static recurrent neural networks where The maximum delay bounds obtained by Theorem 6 and Corollary 10, together with the results of [13–33], are listed in Table 2. From this table, it can be confirmed that the proposed stability conditions also give larger delay bounds, which supports the reduced conservatism of Theorem 6 and Corollary 10.

#### 5. Conclusion

In this paper, two improved delay-dependent stability criteria for generalized neural networks with time-varying delays have been proposed by the use of the Lyapunov stability theorem and Lemma 3. In Theorem 6, by constructing a suitable augmented Lyapunov-Krasovskii functional and utilizing Lemma 3, a delay-dependent sufficient condition for asymptotic stability of the concerned networks was derived. When is larger than one or unknown, a stability condition was also presented as Corollary 10 based on the result of Theorem 6. Via two numerical examples which have been dealt with in many previous works to check the conservatism of stability criteria, it was shown that Theorem 6 and Corollary 10 provide larger delay bounds than those of recent works.

#### Appendix

Consider