Abstract

The robust stability of uncertain discrete-time recurrent neural networks with time-varying delay is investigated. By decomposing some of the connection weight matrices, new Lyapunov-Krasovskii functionals are constructed, and a series of new, improved stability criteria are derived. These criteria are formulated as linear matrix inequalities (LMIs). Compared with some previous results, the new results are less conservative. Three numerical examples are provided to demonstrate the reduced conservatism and the effectiveness of the proposed method.

1. Introduction

In recent years, recurrent neural networks (see [1–7]), such as Hopfield neural networks and cellular neural networks, have been widely investigated and successfully applied in many scientific areas, such as pattern recognition, image processing, and fixed-point computation. However, because of the finite switching speed of neurons and amplifiers, time delay is unavoidable in nature and technology, and it can strongly affect the stability of dynamic systems. Thus, studies of stability are of great significance. There has been growing research interest in stability analysis problems for delayed neural networks, and many excellent papers and monographs have become available. On the other hand, during the design of a neural network and its hardware implementation, the convergence of the network may be destroyed by its unavoidable uncertainty due to the existence of modeling error, the deviation of vital data, and so on. Therefore, the study of the robust convergence of delayed neural networks has become a hot research direction. Up to now, many sufficient conditions, either delay-dependent or delay-independent, have been proposed to guarantee the global robust asymptotic or exponential stability of different classes of delayed neural networks (see [8–13]).

It is worth pointing out that most studied neural networks have been assumed to operate in continuous time, with far fewer results in discrete time. In practice, discrete-time neural networks are more applicable to problems that are inherently temporal in nature or related to biological realities, and they can ideally preserve the dynamic characteristics, functional similarity, and even the physical or biological reality of their continuous-time counterparts under mild restrictions. Thus, stability analysis problems for discrete-time neural networks have received more and more interest, and some stability criteria have been proposed in the literature (see [14–25]). In [14], Liu et al. studied a class of discrete-time RNNs with time-varying delay and proposed a delay-dependent condition guaranteeing global exponential stability. By using a technique similar to that in [21], Song and Wang [15] improved the result obtained in [14]. The results in [15] were further improved by Zhang et al. [16] through the introduction of some useful terms. In [17], Yu et al. proposed a result less conservative than that obtained in [16] by constructing a new augmented Lyapunov-Krasovskii functional.

In this paper, the connection weight matrix is decomposed, and some new Lyapunov-Krasovskii functionals are constructed. Combined with the linear matrix inequality (LMI) technique, a series of new, improved stability criteria are derived. Numerical examples show that these new criteria are less conservative than those obtained in [14–17].

Notation 1. The following notations are used throughout the paper except where otherwise specified. $\|\cdot\|$ denotes a vector or matrix norm; $\mathbb{R}$ and $\mathbb{R}^{n}$ denote the sets of real numbers and $n$-dimensional real vectors, respectively; $\mathbb{N}$ denotes the set of nonnegative integers; $I$ is the identity matrix; $*$ represents the elements below the main diagonal of a symmetric block matrix; for a real matrix $P$, $P>0$ ($P<0$) denotes that $P$ is a positive definite (negative definite) matrix; $\lambda_{\min}(\cdot)$ and $\lambda_{\max}(\cdot)$ denote the minimum and maximum eigenvalues of a real symmetric matrix, respectively.

2. Preliminaries

Consider a discrete-time recurrent neural network with time-varying delay [17] described by
$$x(k+1)=(C+\Delta C(k))x(k)+(A+\Delta A(k))f(x(k))+(B+\Delta B(k))f(x(k-\tau(k)))+J, \tag{2.1}$$
where $x(k)=(x_1(k),x_2(k),\ldots,x_n(k))^{T}\in\mathbb{R}^{n}$ denotes the neural state vector; $f(x(k))=(f_1(x_1(k)),\ldots,f_n(x_n(k)))^{T}$, where $f_i(\cdot)$ are the neuron activation functions; $J\in\mathbb{R}^{n}$ is the external input vector. The positive integer $\tau(k)$ represents the transmission delay and satisfies $\tau_m\le\tau(k)\le\tau_M$, where $\tau_m$ and $\tau_M$ are known positive integers representing the lower and upper bounds of the delay. $C=\operatorname{diag}(c_1,c_2,\ldots,c_n)$ with $0<c_i<1$ describes the rate with which the $i$th neuron resets its potential to the resting state in isolation when disconnected from the network and external inputs; $A$ and $B$ represent the connection and delayed-connection weighting matrices, respectively; $\Delta C(k)$, $\Delta A(k)$, and $\Delta B(k)$ denote the time-varying structured uncertainties, which are of the following form:
$$\big[\Delta C(k)\;\;\Delta A(k)\;\;\Delta B(k)\big]=MF(k)\big[E_{c}\;\;E_{a}\;\;E_{b}\big], \tag{2.2}$$
where $M$, $E_{c}$, $E_{a}$, and $E_{b}$ are known real constant matrices with appropriate dimensions, and $F(k)$ is an unknown time-varying matrix function satisfying $F^{T}(k)F(k)\le I$.
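To make the model concrete, the following minimal Python sketch simulates the nominal part of (2.1) (uncertainties set to zero) with tanh activations. All parameter values, the network size, and the delay bounds are hypothetical placeholders chosen for illustration, not data from this paper.

```python
import numpy as np

# Minimal simulation sketch of the nominal system
#   x(k+1) = C x(k) + A f(x(k)) + B f(x(k - tau(k))) + J.
# All parameter values below are hypothetical placeholders.
n = 2
C = np.diag([0.5, 0.6])                       # 0 < c_i < 1 (self-feedback rates)
A = np.array([[0.05, -0.10], [0.05, 0.05]])   # connection weights
B = np.array([[-0.05, 0.05], [0.10, -0.05]])  # delayed-connection weights
J = np.zeros(n)                               # external input
f = np.tanh                                   # activation satisfying Assumption 1
tau_m, tau_M = 2, 5                           # delay bounds

rng = np.random.default_rng(0)
steps = 200
x = np.zeros((steps + tau_M + 1, n))
x[: tau_M + 1] = rng.uniform(-1.0, 1.0, (tau_M + 1, n))  # initial sequence

for k in range(tau_M, tau_M + steps):
    tau_k = rng.integers(tau_m, tau_M + 1)    # time-varying delay in [tau_m, tau_M]
    x[k + 1] = C @ x[k] + A @ f(x[k]) + B @ f(x[k - tau_k]) + J

print("state after", steps, "steps:", x[-1])  # decays toward the origin here
```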

The nominal system of (2.1) can be defined as
$$x(k+1)=Cx(k)+Af(x(k))+Bf(x(k-\tau(k)))+J. \tag{2.3}$$

To obtain our main results, we need to introduce the following assumption, definition, and lemmas.

Assumption 1. For any $s_1,s_2\in\mathbb{R}$ with $s_1\neq s_2$,
$$k_i^{-}\le\frac{f_i(s_1)-f_i(s_2)}{s_1-s_2}\le k_i^{+},\quad i=1,2,\ldots,n,$$
where $k_i^{-}$ and $k_i^{+}$ are known constant scalars.
As pointed out in [16], under Assumption 1, system (2.3) has equilibrium points. Assume that $x^{*}$ is an equilibrium point of (2.3), and let $y(k)=x(k)-x^{*}$ and $g(y(k))=f(y(k)+x^{*})-f(x^{*})$. Then system (2.1) can be transformed into the following form:
$$y(k+1)=(C+\Delta C(k))y(k)+(A+\Delta A(k))g(y(k))+(B+\Delta B(k))g(y(k-\tau(k))), \tag{2.5}$$
where $y(k)=(y_1(k),\ldots,y_n(k))^{T}$ and $g(y(k))=(g_1(y_1(k)),\ldots,g_n(y_n(k)))^{T}$. From Assumption 1, for any $s\neq 0$ and $i=1,2,\ldots,n$, the functions $g_i(\cdot)$ satisfy $g_i(0)=0$ and
$$k_i^{-}\le\frac{g_i(s)}{s}\le k_i^{+}.$$
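As a quick numerical illustration of the sector condition above, assuming (as the examples in Section 4 suggest, though the exact functions are not recoverable here) tanh-type activations, the difference quotients of tanh indeed stay within $[k_i^{-},k_i^{+}]=[0,1]$:

```python
import numpy as np

# Check that f = tanh satisfies Assumption 1 with k_minus = 0, k_plus = 1:
# every difference quotient (f(s1)-f(s2))/(s1-s2) lies in [0, 1].
s = np.linspace(-5.0, 5.0, 801)
s1, s2 = np.meshgrid(s, s)
mask = s1 != s2
q = (np.tanh(s1[mask]) - np.tanh(s2[mask])) / (s1[mask] - s2[mask])
print("min:", q.min(), "max:", q.max())  # expected: 0 < min, max <= 1
```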

Remark 2.1. Assumption 1 is widely used in dealing with the stability problem for neural networks. As pointed out in [13, 14, 16, 17, 26, 27], the constants $k_i^{-}$ and $k_i^{+}$ can be positive, negative, or zero. Thus, this assumption is less restrictive than the traditional Lipschitz condition.

Definition 2.2. The delayed discrete-time recurrent neural network in (2.5) is said to be globally exponentially stable if there exist two positive scalars $\alpha>0$ and $0<\beta<1$ such that
$$\|y(k)\|\le\alpha\beta^{k}\sup_{-\tau_M\le s\le 0}\|y(s)\|,\quad \forall k\ge 0.$$

Lemma 2.3 (Tchebychev inequality [28]). For any given vectors $v_i\in\mathbb{R}^{m}$, $i=1,2,\ldots,n$, the following inequality holds:
$$\Big(\sum_{i=1}^{n}v_i\Big)^{T}\Big(\sum_{i=1}^{n}v_i\Big)\le n\sum_{i=1}^{n}v_i^{T}v_i.$$

Lemma 2.4 (see [29]). For given matrices $Q=Q^{T}$, $H$, and $E$ of appropriate dimensions,
$$Q+HF(k)E+E^{T}F^{T}(k)H^{T}<0$$
for all $F(k)$ satisfying $F^{T}(k)F(k)\le I$, if and only if there is an $\varepsilon>0$ such that
$$Q+\varepsilon HH^{T}+\varepsilon^{-1}E^{T}E<0.$$

Lemma 2.5 (see [16]). If Assumption 1 holds, then for any positive-definite diagonal matrix $D=\operatorname{diag}(d_1,\ldots,d_n)$, the following inequality holds:
$$\begin{bmatrix} y(k) \\ g(y(k)) \end{bmatrix}^{T}\begin{bmatrix} DK_1 & -DK_2 \\ -K_2D & D \end{bmatrix}\begin{bmatrix} y(k) \\ g(y(k)) \end{bmatrix}\le 0,$$
where $K_1=\operatorname{diag}(k_1^{+}k_1^{-},\ldots,k_n^{+}k_n^{-})$ and $K_2=\operatorname{diag}\big(\frac{k_1^{+}+k_1^{-}}{2},\ldots,\frac{k_n^{+}+k_n^{-}}{2}\big)$.

Lemma 2.6 (see [30]). Given constant symmetric matrices $\Sigma_1$, $\Sigma_2$, $\Sigma_3$, where $\Sigma_1=\Sigma_1^{T}$ and $0<\Sigma_2=\Sigma_2^{T}$, then $\Sigma_1+\Sigma_3^{T}\Sigma_2^{-1}\Sigma_3<0$ if and only if
$$\begin{bmatrix} \Sigma_1 & \Sigma_3^{T} \\ \Sigma_3 & -\Sigma_2 \end{bmatrix}<0.$$

Lemma 2.7 (see [13]). Let $D$, $S$, and $F$ be real constant matrices with appropriate dimensions, with $F$ satisfying $F^{T}F\le I$. Then, for any scalar $\varepsilon>0$,
$$DFS+S^{T}F^{T}D^{T}\le\varepsilon^{-1}DD^{T}+\varepsilon S^{T}S.$$
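Lemma 2.6 (the Schur complement) is the workhorse that turns the nonlinear matrix conditions arising in the proofs into LMIs. A small numerical sanity check of the stated equivalence, on randomly chosen matrices, might look as follows:

```python
import numpy as np

# Numerical illustration of the Schur complement (Lemma 2.6):
# S1 + S3' S2^{-1} S3 < 0  <=>  [[S1, S3'], [S3, -S2]] < 0, with S2 > 0.
rng = np.random.default_rng(1)
n = 3
S2 = np.eye(n) + 0.1 * rng.standard_normal((n, n))
S2 = S2 @ S2.T                      # S2 > 0 by construction
S3 = 0.1 * rng.standard_normal((n, n))
S1 = -np.eye(n)                     # chosen so the reduced form is negative definite

reduced = S1 + S3.T @ np.linalg.solve(S2, S3)
block = np.block([[S1, S3.T], [S3, -S2]])

print(np.linalg.eigvalsh(reduced).max() < 0)  # True
print(np.linalg.eigvalsh(block).max() < 0)    # True (equivalent condition)
```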

3. Main Results

To obtain our main results, we decompose the connection weight matrix $A$ as follows:
$$A=A_{1}+A_{2}. \tag{3.1}$$
Then we can obtain the following stability results.
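The decomposition (3.1) is not unique; any splitting into two summands is admissible, and the choice affects the conservatism of the resulting criterion. A trivial sketch of one hypothetical choice (splitting off the diagonal part of an illustrative matrix) is:

```python
import numpy as np

# One hypothetical choice of the (non-unique) decomposition A = A1 + A2:
# split off the diagonal part. The matrix A below is illustrative only.
A = np.array([[0.10, -0.20], [0.05, 0.10]])
A1 = np.diag(np.diag(A))   # diagonal part
A2 = A - A1                # off-diagonal remainder
assert np.allclose(A1 + A2, A)
print(A1, A2, sep="\n")
```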

Theorem 3.1. For any given positive scalars $\tau_m$ and $\tau_M$, under Assumption 1, system (2.5) without uncertainty is globally exponentially stable for any time-varying delay satisfying $\tau_m\le\tau(k)\le\tau_M$ if there exist positive-definite matrices, positive-definite diagonal matrices, and arbitrary matrices of appropriate dimensions such that the following LMI holds:

Proof. Construct a new augmented Lyapunov-Krasovskii functional candidate as follows: where $0$ is the zero matrix with appropriate dimensions. Then, along the solution of system (2.5), we compute the forward difference of the functional. On the other hand, since $\tau_m\le\tau(k)\le\tau_M$, the delay-dependent terms can be bounded. Combining (3.12)–(3.17), one can easily obtain the result, which completes the proof.
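The exact LMI of Theorem 3.1 cannot be reproduced here, but the way such conditions are checked in practice can be sketched. The snippet below uses cvxpy (as a stand-in for the MATLAB LMI toolbox the authors use in Section 4) to test a much simpler, classical delay-independent Lyapunov-Krasovskii condition for a linear delayed system; both the condition and the matrices are illustrative assumptions, not Theorem 3.1 itself.

```python
import numpy as np
import cvxpy as cp

# Illustrative LMI feasibility check (NOT the LMI of Theorem 3.1).
# For x(k+1) = C x(k) + B x(k - tau) with constant delay, the functional
# V(k) = x(k)' P x(k) + sum_{i=k-tau}^{k-1} x(i)' Q x(i) yields the
# classical delay-independent condition
#   [[C'PC - P + Q, C'PB], [B'PC, B'PB - Q]] < 0,  P > 0,  Q > 0.
n = 2
C = np.diag([0.5, 0.6])
B = np.array([[-0.05, 0.05], [0.05, -0.05]])

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
block = cp.bmat([
    [C.T @ P @ C - P + Q, C.T @ P @ B],
    [B.T @ P @ C,         B.T @ P @ B - Q],
])
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               block << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve(solver=cp.SCS)
print("LMI feasible:", prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE))
```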

Remark 3.10. Compared with the augmented Lyapunov functional constructed in Theorem 3.1, the new augmented Lyapunov functional includes an additional term, which further reduces the conservatism of the stability criterion (for more details, see Example 4.2).

4. Numerical Examples

In this section, three numerical examples will be presented to show the validity of the main results derived above.

Example 4.1. For the convenience of comparison, let us consider the delayed discrete-time recurrent neural network in (2.5) with a given set of parameters whose activation functions satisfy Assumption 1 with known constants $k_i^{-}$, $k_i^{+}$. For given $\tau_m$, references [15–17] reported the allowable upper bounds of the time-varying delay, respectively. Decomposing the weight matrix as in (3.1), Table 1 shows that our results are less conservative than these previous results.

Example 4.2. Consider the uncertain delayed discrete-time recurrent neural network in (2.1) with a given set of parameters whose activation functions satisfy Assumption 1 with known constants $k_i^{-}$, $k_i^{+}$. For given $\tau_m$, references [16, 17] reported the allowable upper bounds of the time-varying delay, respectively. Decomposing the weight matrix as in (3.1) and using the MATLAB LMI toolbox, the allowable upper bounds $\tau_M$ for given $\tau_m$ are shown in Table 2. Obviously, our results are less conservative than these previous results.
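The allowable upper bounds reported in Tables 1–3 are typically found by fixing $\tau_m$ and increasing $\tau_M$ until the LMI solver reports infeasibility. A sketch of that search loop, where `lmi_feasible` is a hypothetical placeholder for the actual solver call (e.g., MATLAB's `feasp` or the cvxpy pattern sketched in Section 3), is:

```python
def max_allowable_delay(tau_m, lmi_feasible, tau_cap=200):
    """Largest tau_M >= tau_m for which the stability LMI is feasible.

    `lmi_feasible(tau_m, tau_M) -> bool` is a hypothetical placeholder for
    a call into an LMI solver; tau_cap guards against an unbounded search.
    """
    best = -1  # -1 signals that even tau_M = tau_m is infeasible
    for tau_M in range(tau_m, tau_cap + 1):
        if lmi_feasible(tau_m, tau_M):
            best = tau_M
        else:
            break  # feasibility is typically monotone in tau_M
    return best
```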

Example 4.3. Consider an uncertain delayed discrete-time recurrent neural network in (2.1) with a given set of parameters and the same activation functions as in Example 4.2. Decomposing the weight matrix as in (3.1) and using the MATLAB LMI toolbox, the allowable upper bounds $\tau_M$ for given $\tau_m$ are shown in Table 3. The corresponding free-weighting matrices are also obtained in this case.

5. Conclusion

By decomposing some of the connection weight matrices and combining this with the linear matrix inequality (LMI) technique, some new augmented Lyapunov-Krasovskii functionals have been constructed, and a series of new, improved sufficient conditions ensuring exponential stability or robust exponential stability have been obtained. Numerical examples show that the new criteria derived in this paper are less conservative than some previous results in the cited references.

Acknowledgments

This work was supported by the Program for New Century Excellent Talents in University (NCET-06-0811) and the Research Fund for the Doctoral Program of Guizhou College of Finance and Economics (200702).