Abstract

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to consider the relationship between the time-varying delay and its varying interval, some improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the merits and the effectiveness of the proposed methods.

1. Introduction

During the past decades, recurrent neural networks (RNNs) have been successfully applied in many fields, such as signal processing, pattern classification, associative memory design, and optimization. Therefore, the study of RNNs has attracted considerable attention, and various issues of neural networks have been investigated (see, e.g., [1–4] and the references therein). Since integration and communication delays are unavoidably encountered in the implementation of RNNs and are often the main source of instability and oscillation, much effort has been devoted to the problem of stability of RNNs with time delays (see, e.g., [5–14]).

RNNs can be classified as local field networks or static neural networks according to whether the basic variables are local field states or neuron states [15]. Recently, the stability of static recurrent neural networks (SRNNs) with a time-varying delay was investigated in [16], where sufficient conditions guaranteeing the global asymptotic stability of the network were obtained. Nevertheless, some negative semidefinite terms were ignored in [16], which leads to conservatism in the derived result. By retaining these terms and considering the lower bound of the delay, some improved stability conditions were derived for SRNNs with an interval time-varying delay in [17]. In [18], an input-output framework was proposed to investigate the stability of SRNNs with linear fractional uncertainties and delays. Based on the augmented Lyapunov-Krasovskii functional approach, some new conditions assuring the stability of SRNNs were derived in [19–22], but these results can be further improved.

In this paper, the problem of stability of SRNNs with a time-varying delay is investigated based on the complete delay-decomposing approach [12]. By employing a reciprocally convex technique, some sufficient conditions are derived in the form of linear matrix inequalities (LMIs). The effectiveness and merits of the proposed conditions are illustrated by a numerical example.

Notations. Throughout this paper, $A^T$ and $A^{-1}$ stand for the transpose and the inverse of a matrix $A$, respectively; $P > 0$ ($P \ge 0$) means that the matrix $P$ is symmetric and positive definite (positive semidefinite); $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space; $\mathrm{diag}\{\cdots\}$ denotes a block-diagonal matrix; $\|\cdot\|$ is the Euclidean norm of a vector; the symbol $*$ within a matrix represents the symmetric terms of the matrix; for example, $\begin{bmatrix} X & Y \\ * & Z \end{bmatrix} = \begin{bmatrix} X & Y \\ Y^T & Z \end{bmatrix}$. Matrices, if not explicitly stated, are assumed to have compatible dimensions.

2. System Description

Consider the following delayed static neural network:
$$\dot{y}(t) = -A y(t) + f\big(W y(t - h(t)) + J\big), \qquad y(t) = \phi(t), \quad t \in [-h, 0], \eqno(1)$$
where $y(t) = [y_1(t), y_2(t), \ldots, y_n(t)]^T$ and $J$ denote the neuron state vector and the input vector, respectively; $f(\cdot) = [f_1(\cdot), f_2(\cdot), \ldots, f_n(\cdot)]^T$ is the neuron activation function; $\phi(t)$ is the initial condition; $A$ and $W$ are known interconnection weight matrices; and $h(t)$ is the time-varying delay, which satisfies
$$0 \le h(t) \le h, \eqno(2)$$
$$\dot{h}(t) \le \mu. \eqno(3)$$
Furthermore, the neuron activation functions satisfy the following assumption.

Assumption 1. The neuron activation functions $f_i(\cdot)$ are bounded and satisfy
$$0 \le \frac{f_i(s_1) - f_i(s_2)}{s_1 - s_2} \le k_i, \qquad \forall s_1, s_2 \in \mathbb{R}, \ s_1 \ne s_2, \eqno(4)$$
where $k_i > 0$ for $i = 1, 2, \ldots, n$. For simplicity, denote $K = \mathrm{diag}\{k_1, k_2, \ldots, k_n\}$.
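For instance, the widely used activation $f_i(s) = \tanh(s)$ satisfies Assumption 1 with $k_i = 1$: it is bounded, and by the mean value theorem
$$\frac{\tanh(s_1) - \tanh(s_2)}{s_1 - s_2} = \operatorname{sech}^2(\xi) \in (0, 1]$$
for some $\xi$ between $s_1$ and $s_2$.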

Under Assumption 1, there exists an equilibrium point $y^*$ of (1). Hence, by the transformation $x(\cdot) = y(\cdot) - y^*$, system (1) can be transformed into
$$\dot{x}(t) = -A x(t) + g\big(W x(t - h(t))\big), \qquad x(t) = \varphi(t), \quad t \in [-h, 0], \eqno(5)$$
where $x(t)$ is the state vector; $\varphi(t) = \phi(t) - y^*$ is the initial condition; and the transformed neuron activation function $g(Wx(\cdot)) = f(Wx(\cdot) + Wy^* + J) - f(Wy^* + J)$ satisfies
$$g_i(0) = 0, \qquad 0 \le \frac{g_i(s)}{s} \le k_i, \quad \forall s \ne 0, \ i = 1, 2, \ldots, n. \eqno(6)$$

Notice that $x = 0$ is an equilibrium point of neural network (5), corresponding to the initial condition $\varphi(t) \equiv 0$. Based on the analysis above, the problem of analyzing the stability of system (1) at the equilibrium $y^*$ is changed into the problem of analyzing the stability of the zero solution of system (5).

Before presenting our main results, we first introduce two lemmas, which are useful in the stability analysis of the considered neural network.

Lemma 2 (see [23]). Let $M$ be a constant real matrix with $M = M^T > 0$, and suppose there exist a scalar $r > 0$ and a vector function $x : [t - r, t] \to \mathbb{R}^n$ such that the subsequent integration is well defined. Then, one has
$$-r \int_{t-r}^{t} x^T(s) M x(s) \, ds \le -\omega^T M \omega, \quad \text{where } \omega = \int_{t-r}^{t} x(s) \, ds.$$
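Applied to the integral terms generated by the decomposed functional of Section 3, with $\dot{x}$ in place of $x$ and $r = \delta$ (the weighting matrix $R_j$ below is notation assumed here purely for illustration), Lemma 2 yields bounds of the form
$$-\delta \int_{t - j\delta}^{t - (j-1)\delta} \dot{x}^T(s) R_j \dot{x}(s) \, ds \le -\big(x(t - (j-1)\delta) - x(t - j\delta)\big)^T R_j \big(x(t - (j-1)\delta) - x(t - j\delta)\big),$$
which replaces integrals of the state derivative by quadratic forms in the sampled states.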

Lemma 3 (see [24]). Let $f_1, f_2, \ldots, f_N : \mathbb{R}^m \to \mathbb{R}$ be given functions that have positive values in an open subset $D$ of $\mathbb{R}^m$. Then, the reciprocally convex combination of the $f_i$ over $D$ satisfies
$$\min_{\{\alpha_i \mid \alpha_i > 0, \, \sum_i \alpha_i = 1\}} \sum_i \frac{1}{\alpha_i} f_i(t) = \sum_i f_i(t) + \max_{g_{i,j}(t)} \sum_{i \ne j} g_{i,j}(t)$$
subject to
$$\left\{ g_{i,j} : \mathbb{R}^m \to \mathbb{R}, \; g_{j,i}(t) \triangleq g_{i,j}(t), \; \begin{bmatrix} f_i(t) & g_{i,j}(t) \\ g_{j,i}(t) & f_j(t) \end{bmatrix} \ge 0 \right\}.$$
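In particular, for $N = 2$, which is the form typically invoked for an interval time-varying delay, Lemma 3 implies that for any $\alpha \in (0, 1)$, vectors $\omega_1, \omega_2$, matrix $R > 0$, and any matrix $S$ satisfying $\begin{bmatrix} R & S \\ * & R \end{bmatrix} \ge 0$,
$$\frac{1}{\alpha} \omega_1^T R \omega_1 + \frac{1}{1 - \alpha} \omega_2^T R \omega_2 \ge \begin{bmatrix} \omega_1 \\ \omega_2 \end{bmatrix}^T \begin{bmatrix} R & S \\ * & R \end{bmatrix} \begin{bmatrix} \omega_1 \\ \omega_2 \end{bmatrix};$$
this is how the relationship between the time-varying delay and its varying interval can be exploited without enlarging the delay-dependent coefficients to their worst-case values.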

3. Main Results

In the sequel, following the method proposed in [13], we decompose the delay interval $[0, h]$ into $m$ equidistant subintervals, where $m$ is a given positive integer; that is, $[0, h] = \bigcup_{j=1}^{m} [(j-1)\delta, j\delta]$ with $\delta = h/m$. Thus, for any $t \ge 0$, there exists an integer $k \in \{1, 2, \ldots, m\}$ such that $h(t) \in [(k-1)\delta, k\delta]$. Then the Lyapunov-Krasovskii functional candidate is chosen as in (10), where the weighting matrices and scalars appearing in (10) are to be determined and $W_i$ denotes the $i$th row of the matrix $W$.
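Since the displayed functional (10) is not reproduced here, the following is only a representative complete delay-decomposing functional of the type used in [12, 13]; the matrices $P > 0$, $Q_j > 0$, $R_j > 0$ and scalars $\lambda_i \ge 0$ are illustrative notation rather than the exact ingredients of (10):
$$V(t) = x^T(t) P x(t) + \sum_{j=1}^{m} \int_{t - j\delta}^{t - (j-1)\delta} x^T(s) Q_j x(s) \, ds + \sum_{j=1}^{m} \delta \int_{-j\delta}^{-(j-1)\delta} \int_{t + \theta}^{t} \dot{x}^T(s) R_j \dot{x}(s) \, ds \, d\theta + 2 \sum_{i=1}^{n} \lambda_i \int_{0}^{W_i x(t)} g_i(s) \, ds.$$
Each subinterval $[(j-1)\delta, j\delta]$ carries its own weighting matrices, which is the essence of the complete delay-decomposing approach.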

Remark 4. Notice that a novel term that remains continuous as $h(t)$ crosses the subinterval boundaries $h(t) = j\delta$ is included in the Lyapunov-Krasovskii functional (10), which plays an important role in reducing the conservativeness of the derived result.

Next, we develop some new delay-dependent stability criteria for the delayed neural network described by (5) and (6) with $h(t)$ satisfying (2) and (3). By employing the Lyapunov-Krasovskii functional (10), the following theorem is obtained.

Theorem 5. For a given positive integer $m$ and scalars $h > 0$ and $\mu$, the origin of system (5) with the activation function satisfying (6) and a time-varying delay satisfying conditions (2) and (3) is globally asymptotically stable if there exist matrices with appropriate dimensions such that, for $k = 1, 2, \ldots, m$, LMIs (14) and (15) hold.
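Conditions of this kind are checked numerically by semidefinite programming. Since the block matrices of (14) and (15) are not reproduced here, the following Python sketch instead verifies a classical Jensen-based Lyapunov-Krasovskii LMI for the linear delayed system $\dot{x}(t) = A_0 x(t) + A_1 x(t - h)$ as a simplified stand-in; the function name and the cvxpy/SCS toolchain are illustrative choices, not part of the paper.

```python
import numpy as np
import cvxpy as cp

def lmi_feasible(A0, A1, h, eps=1e-6):
    """Check a Jensen-based delay-dependent stability LMI for
    x'(t) = A0 x(t) + A1 x(t - h) (a simplified stand-in for (14)-(15))."""
    n = A0.shape[0]
    P = cp.Variable((n, n), symmetric=True)  # Lyapunov matrix
    Q = cp.Variable((n, n), symmetric=True)  # weights the delayed-state term
    R = cp.Variable((n, n), symmetric=True)  # weights the derivative term
    AA = np.hstack([A0, A1])                 # [A0, A1]
    # After Jensen's bound, dV/dt <= z^T M z with z = [x(t); x(t-h)].
    M = cp.bmat([[A0.T @ P + P @ A0 + Q - R, P @ A1 + R],
                 [A1.T @ P + R,              -Q - R]]) \
        + h**2 * AA.T @ R @ AA
    M = (M + M.T) / 2                        # enforce exact symmetry for cvxpy
    cons = [P >> eps * np.eye(n), Q >> eps * np.eye(n),
            R >> eps * np.eye(n), M << -eps * np.eye(2 * n)]
    prob = cp.Problem(cp.Minimize(0), cons)  # pure feasibility problem
    prob.solve(solver=cp.SCS)                # any SDP-capable solver works
    return prob.status in ("optimal", "optimal_inaccurate")
```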

Proof. From Assumption 1 and (6), it can be deduced that, for any diagonal matrices $T_1 \ge 0$ and $T_2 \ge 0$,
$$2 g^T\big(W x(t)\big) T_1 \big[K W x(t) - g\big(W x(t)\big)\big] \ge 0, \qquad 2 g^T\big(W x(t - h(t))\big) T_2 \big[K W x(t - h(t)) - g\big(W x(t - h(t))\big)\big] \ge 0. \eqno(18)$$
Now, calculating the derivative of $V(t)$ along the solutions of neural network (5) yields the expression denoted by (19). By Lemmas 2 and 3, the integral terms in (19) can be bounded as in (21); in particular, on the subinterval containing the delay, Lemma 3 treats the two pieces $[t - k\delta, t - h(t)]$ and $[t - h(t), t - (k-1)\delta]$ as a reciprocally convex combination instead of enlarging each to the full subinterval length $\delta$.
Next, we introduce an augmented vector $\xi(t)$ that collects $x(t)$, the sampled states $x(t - j\delta)$, $j = 1, 2, \ldots, m$, the delayed state $x(t - h(t))$, and the activation terms. Then, system (5) can be rewritten as a linear relation among the entries of $\xi(t)$.
Adding the right sides of (18) to (19) and applying (21) yield an upper bound on $\dot{V}(t)$ that is a quadratic form in $\xi(t)$. For all $h(t) \in [(k-1)\delta, k\delta]$, if this quadratic form is negative definite, which is equivalent to LMIs (14) in the sense of the Schur complement [25], then $\dot{V}(t) < 0$ for any $\xi(t) \ne 0$. Note that $V(t)$ is continuous as $h(t)$ crosses the subinterval boundaries, so system (5) is globally asymptotically stable. This completes the proof.
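The Schur complement step invoked above is the standard equivalence: for symmetric matrices $X$ and $Z$,
$$\begin{bmatrix} X & Y \\ * & -Z \end{bmatrix} < 0 \iff Z > 0 \ \text{and} \ X + Y Z^{-1} Y^T < 0,$$
which converts the delay-squared quadratic terms produced by the functional into a form linear in the decision matrices.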

Remark 6. In the proof of Theorem 5, the delay-dependent lengths $h(t) - (k-1)\delta$ and $k\delta - h(t)$ are not simply enlarged to $\delta$, as is done in [16]. By employing the reciprocally convex approach to exploit this information, Theorem 5 may be less conservative, which will be verified by the simulation results in the next section.

Remark 7. In previous works such as [16, 19], considerable attention has been paid to the case in which the derivative of the time-varying delay satisfies (3). However, when $\dot{h}(t)$ does not satisfy (3), the treatment in [16, 19] amounts to enlarging the delay-derivative-dependent term in (27) to its worst-case value, which inevitably leads to conservativeness. By contrast, this case can be taken fully into account in Theorem 5 by adjusting the corresponding term directly.

For the case in which the time-varying delay is nondifferentiable or its derivative is unknown, by setting the matrices associated with the delay-derivative bound to zero in Theorem 5, a delay-dependent and rate-independent criterion is easily derived as follows.

Corollary 8. For a given positive integer $m$ and a scalar $h > 0$, the origin of system (5) with the activation function satisfying (6) and a time-varying delay satisfying condition (2) is globally asymptotically stable if there exist matrices with appropriate dimensions such that, for $k = 1, 2, \ldots, m$, the LMIs in (15) and (29) hold, where the common blocks are defined as in Theorem 5.

4. Numerical Examples

In this section, we will provide a numerical example to show the effectiveness of the presented criteria.

Example 1. Consider neural network (1) with the following parameters, the activation functions being assumed to satisfy (6) with the corresponding bound matrix $K$.

This example has been discussed in [16–22]. By using Theorem 5 and Corollary 8 with a given decomposition number $m$, for various $\mu$, the upper bounds $h$ that guarantee the global asymptotic stability of neural network (1) are computed and listed in Table 1. It can be concluded that the upper bounds obtained by our method are much better than those in [16–22]; the conditions proposed in this paper thus improve on the existing ones.
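Maximum admissible delay bounds of the kind reported in Table 1 are typically obtained by a bisection search over $h$, solving one LMI feasibility problem per trial value. A minimal sketch, reusing the hypothetical lmi_feasible helper from Section 3 (again a stand-in for the actual LMIs of Theorem 5):

```python
def max_admissible_delay(A0, A1, h_hi=10.0, tol=1e-3):
    """Bisect for the largest h in [0, h_hi] with lmi_feasible(A0, A1, h)
    still succeeding, assuming feasibility is monotone in h."""
    if lmi_feasible(A0, A1, h_hi):
        return h_hi                      # feasible on the whole search bracket
    h_lo = 0.0
    while h_hi - h_lo > tol:
        mid = 0.5 * (h_lo + h_hi)
        if lmi_feasible(A0, A1, mid):
            h_lo = mid                   # stable up to mid: raise the lower bound
        else:
            h_hi = mid                   # infeasible at mid: lower the upper bound
    return h_lo
```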

5. Conclusions

This paper has studied the stability of SRNNs by constructing a complete delay-decomposing Lyapunov-Krasovskii functional. Some improved delay-dependent stability conditions, formulated in terms of linear matrix inequalities (LMIs), have been derived by utilizing a reciprocally convex technique to account for the relationship between the time-varying delay and its varying interval. Finally, a numerical example has been provided to show the effectiveness of the proposed methods.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (nos. 61304064, 61363073, 61273157, and 61262032), the Hunan Provincial Natural Science Foundation of China (no. 13JJ6058), the Research Foundation of Education Bureau of Hunan Province (no. 13A075), and the Natural Science Foundation of Hunan University of Technology (no. 2012HZX06).