Abstract
This paper addresses the delay-dependent stability of uncertain periodic switched recurrent neural networks with time-varying delays. When an uncertain discrete-time recurrent neural network is periodic, it can be expressed as a switched neural network with a finite number of switching states. Based on the switched quadratic Lyapunov functional (SQLF) approach and the free-weighting matrix (FWM) approach, linear matrix inequality (LMI) criteria are derived that guarantee the delay-dependent asymptotic stability of these systems. Two examples illustrate the effectiveness of the proposed criteria.
1. Introduction
Recurrent neural networks (RNNs) are an important tool in many application areas, such as associative memory, pattern recognition, signal processing, model identification, and combinatorial optimization. As research on RNNs has developed in theory and application, the models have become increasingly complex. When continuous-time RNNs are simulated on a computer, they must be discretized into discrete-time RNNs [1–3]. At the same time, in implementations of artificial neural networks, time-varying delays may occur due to the finite switching speeds of amplifiers and to communication time [4, 5]. Therefore, researchers have studied discrete-time RNNs in which time-varying delays are incorporated in the processing and/or transmission parts of the network architecture [6–9]. Parameter uncertainties and nonautonomous phenomena often exist in real systems due to modeling inaccuracies [4]. In particular, when we consider the long-term dynamical behavior of a system and the seasonality of a changing environment, the parameters of the system usually change with time [10–14]. To model such systems with neural networks, uncertain (or switched, or jumping) neural networks with time-varying delays have appeared in many papers [6, 15–24]. So in this paper we consider the stability of the following discrete-time recurrent neural network with time-varying delay: where is the state vector associated with the neurons, is a diagonal matrix with positive entries, and are, respectively, the connection weight matrix and the delayed connection weight matrix, is the input vector, and are the neuron activation function vectors, and is a nonnegative, differentiable, time-varying function that denotes the time delay and satisfies
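The class of models above can be illustrated with a small numerical sketch. Everything below — the matrices `C`, `A`, `B`, the `tanh` activation, the input `u`, and the delay pattern `tau` — is an illustrative assumption for a two-neuron network, not the paper's data; the state update follows the standard delayed discrete-time RNN form x(k+1) = C x(k) + A f(x(k)) + B f(x(k − τ(k))) + u.

```python
import numpy as np

# Hypothetical discrete-time RNN with a bounded time-varying delay:
#   x(k+1) = C x(k) + A f(x(k)) + B f(x(k - tau(k))) + u
# All matrices and the delay pattern are illustrative choices.
C = np.diag([0.5, 0.4])                 # self-feedback (diagonal, positive entries)
A = 0.1 * np.array([[1.0, -1.0],
                    [0.0,  1.0]])       # connection weight matrix
B = 0.1 * np.array([[0.5, 0.0],
                    [0.3, 0.2]])        # delayed connection weight matrix
u = np.zeros(2)                         # external input
f = np.tanh                             # bounded, Lipschitz activation

def tau(k, tau_min=1, tau_max=3):
    """Time-varying delay satisfying tau_min <= tau(k) <= tau_max."""
    return tau_min + (k % (tau_max - tau_min + 1))

def simulate(x0, steps=200):
    """Iterate the delayed update; the history buffer covers tau_max steps."""
    hist = [np.asarray(x0, dtype=float)] * 4
    for k in range(steps):
        x = hist[-1]
        x_delayed = hist[-1 - tau(k)]
        hist.append(C @ x + A @ f(x) + B @ f(x_delayed) + u)
    return hist[-1]

x_final = simulate([1.0, -0.8])
```

With these (deliberately contractive) weights the state converges toward the zero equilibrium regardless of the delay pattern, which is the kind of behavior the stability criteria later in the paper certify.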
In most of the literature, it is required that the parameter uncertainty matrices, such as , , , and , take the form where , , , and are given constant matrices of appropriate dimensions and is an uncertain matrix such that In practice, however, it is generally difficult to obtain such a decomposition for , , , , and . In addition, periodic oscillation in recurrent neural networks is an interesting dynamic behavior, since many biological and cognitive activities involve repetition [7, 10, 11, 25]. Moreover, periodic oscillations in recurrent neural networks have found many applications, such as associative memories, pattern recognition, machine learning, and robot motion control [25]. So, if (1.1) is an uncertain periodic neural network whose period is less than a constant , then (1.1) can be expressed as a switched neural network with a finite number of switching states; that is, if (1.1) is a neural network with period (), then , and , which corresponds to a switched neural network set: , where , , , , and . Suppose is the number of elements of ; then (1.1) can be rewritten as where is a switching rule defined by with . Moreover, means that the sub-recurrent neural network (sub-RNN) , which corresponds to , is active.
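The reformulation of a periodic network as a switched one can be sketched numerically. The period `omega` and the parameter tuples in `sub_rnns` below are hypothetical placeholders; the point is that a network whose parameters repeat with period ω is equivalent to ω sub-RNNs selected by the periodic switching rule σ(k) = k mod ω.

```python
import numpy as np

# A network with period omega has omega distinct parameter tuples; the
# switching rule sigma(k) = k mod omega selects which sub-RNN is active
# at time k.  The parameter values here are placeholders.
omega = 3
sub_rnns = [
    {"C": np.diag([0.5, 0.5]), "A": 0.1 * np.eye(2)},
    {"C": np.diag([0.4, 0.6]), "A": 0.2 * np.eye(2)},
    {"C": np.diag([0.3, 0.3]), "A": 0.1 * np.eye(2)},
]

def sigma(k):
    """Periodic switching rule: the active sub-RNN repeats every omega steps."""
    return k % omega

# The activation pattern over two periods cycles through all sub-RNNs.
active = [sigma(k) for k in range(2 * omega)]
```

This construction is what allows the periodic uncertain network (1.1) to be analyzed as a switched system with finitely many modes.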
The dynamic behaviors of these models are the foundation for their applications. Under (1.5), most papers discuss the stability of uncertain neural networks with the common Lyapunov function approach [5, 6, 15–23, 25]. To the best of the authors' knowledge, no paper has yet studied uncertain periodic neural networks using the SQLF. This situation motivates the present research.
Motivated by the above discussions, the authors study the delay-dependent stability of uncertain discrete-time recurrent neural networks with time-varying delays, where the uncertain network consists of a finite number of sub-RNNs that may change from one to another under arbitrary or restricted switching. The contributions of this paper are as follows: (1) using a switching graph, uncertain periodic recurrent neural networks with time-varying delays are transformed into switched recurrent neural networks; (2) the derivative of the SQLF (3.7) of [8] is improved in (3.11); see Remark 4.3 and Table 3; (3) based on the switching graph, delay-dependent stability criteria for switched recurrent neural networks are derived via the FWM and SQLF. An effective LMI approach is then developed to solve the problem.
This paper is organized as follows. Section 2 gives some basic definitions. Section 3 analyzes the stability of system (2.2) with the SQLF and FWM. Section 4 presents some examples, and Section 5 concludes the paper.
2. Preliminaries
In many electronic circuits, nonmonotonic functions can be more appropriate for describing the neuron activation when designing and implementing an artificial neural network [7]; hence, we adopt the following assumption.
For any , there exist constants , , , and such that
Under this assumption, the equilibrium points of UDNN (1.1) exist by the fixed-point theorem [1]. In the following, let be the equilibrium point of (1.1); then . Systems (1.1) and (1.5) are, respectively, shifted to the following forms: For convenience, we define the switching graph.
Definition 2.1. Let be a switching graph, where is the set of sub-RNNs and is the set of weighted arcs . (or 0) indicates that sub-RNN can (or cannot) switch to sub-RNN .
Remark 2.2. When , if , that is, if sub-RNN cannot switch to any other sub-RNN, we suppose that the uncertain neural network stays in that sub-RNN forever, which means that and .
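Definition 2.1 and Remark 2.2 can be made concrete with a small adjacency-matrix sketch. The three-node graph below is an invented example, not one from the paper; node 2 is absorbing in the sense of Remark 2.2.

```python
# Switching graph G = (V, E) encoded as an adjacency matrix:
# edge[i][j] = 1 means sub-RNN i may switch to sub-RNN j.
edge = [
    [0, 1, 0],   # sub-RNN 0 may switch only to sub-RNN 1
    [0, 0, 1],   # sub-RNN 1 may switch only to sub-RNN 2
    [0, 0, 0],   # sub-RNN 2 is absorbing (Remark 2.2): it never switches
]

def successors(i):
    """Sub-RNNs reachable from sub-RNN i in one switch."""
    return [j for j, e in enumerate(edge[i]) if e == 1]

def is_absorbing(i):
    """True if sub-RNN i cannot switch to any other sub-RNN."""
    return sum(edge[i]) == 0

def admissible(sequence):
    """A switching sequence is admissible if every actual switch
    follows an arc of the graph (remaining in a mode is allowed)."""
    return all(a == b or edge[a][b] == 1
               for a, b in zip(sequence, sequence[1:]))
```

Restricted switching, as used later in the theorem, corresponds to checking candidate switching sequences with `admissible` rather than permitting every mode transition.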
Throughout this paper, the superscript denotes the transpose of a matrix, means that the matrix is positive definite, and the symmetric terms in a symmetric matrix are denoted by ; for example,
3. Asymptotical Stability of Uncertain Periodic Switched Recurrent Neural Networks
Theorem 3.1. Let and be positive integers such that . Based on a switching graph , the system (2.2) is asymptotically stable if, when , there exist corresponding symmetric matrices , , , , , , , , , , and matrices , , and of appropriate dimensions such that the following LMIs hold: where
Proof. Suppose that ; then we have and .
We consider the following SQLF:
It is clear that the following equations are true:
Firstly, we prove that, under the SQLF, is less than 0. Suppose that and , which means that sub-RNN switches to sub-RNN ; we obtain
To strictly guarantee stability, should be less than 0. If in the switching graph there exists a sub-RNN satisfying , which means that cannot switch to any other sub-RNN, then the equation holds; otherwise, the switching sequence must be , and there exist such that . Because the sub-RNNs and affect the whole system only before time , after the sequence becomes a periodic sequence . Suppose that ; then along this switching sequence the following LMIs all hold:
then the solution of (3.13) is . Thus, we suppose that
Similar to , together with , , , and , we suppose that
On the other hand, for any appropriately dimensioned matrices , , and the following equations are true:
where .
In addition, for any positive semidefinite matrices and , the following equations hold:
From the assumption, we have
Similar to the conclusion in [8], for , the following inequalities also hold:
Adding the terms on the right-hand sides of (3.16)–(3.23) yields
where .
Here is defined in (3.18), and , , , and are defined in (3.1)–(3.4). Therefore, when the corresponding LMIs , , , and are satisfied, .
Secondly, based on the switching graph , when , all the corresponding are less than 0, which means that the system (2.2) is asymptotically stable. This completes the proof of Theorem 3.1.
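The core mode-dependent condition behind the proof — along each arc (i, j) of the switching graph, the Lyapunov matrix of the destination mode must certify a decrease — can be checked numerically for a delay-free toy system. The system matrices, candidate Lyapunov matrices, and arc set below are illustrative assumptions; the full delay-dependent LMIs of Theorem 3.1, with their free-weighting matrices, are not reproduced here.

```python
import numpy as np

# Switched quadratic Lyapunov condition (delay-free sketch): for every
# arc (i, j) of the switching graph, require
#     A_i^T P_j A_i - P_i < 0   (negative definite),
# with one symmetric P_i > 0 per sub-RNN.  These matrices are
# illustrative, not taken from the paper's LMIs.
A_list = [0.5 * np.eye(2),
          np.array([[0.3, 0.2],
                    [0.0, 0.4]])]
P_list = [np.eye(2), np.eye(2)]          # candidate Lyapunov matrices
arcs = [(0, 0), (0, 1), (1, 0), (1, 1)]  # arbitrary switching allowed

def max_eig(M):
    """Largest eigenvalue of a (symmetrized) matrix."""
    return float(np.linalg.eigvalsh((M + M.T) / 2).max())

def sqlf_holds(A_list, P_list, arcs, tol=0.0):
    """True if A_i^T P_j A_i - P_i is negative definite on every arc."""
    return all(
        max_eig(A_list[i].T @ P_list[j] @ A_list[i] - P_list[i]) < -tol
        for i, j in arcs
    )

ok = sqlf_holds(A_list, P_list, arcs)
```

In practice the matrices P_i are not guessed but obtained from an LMI solver; the check above only verifies a given candidate, which is the role the SQLF plays inside the proof.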
Remark 3.2. Using the method in [8], it is easy to show that the system is globally exponentially stable.
Remark 3.3. In of [8], may lead to considerable conservativeness. It is improved here as ; please see Table 3.
Combining Theorem 3.1 with the common Lyapunov function approach, we obtain the following corollary.
Corollary 3.4. Let and be positive integers such that . The system (2.2) is asymptotically stable if there exist symmetric matrices , , , , , , , , , , and matrices , , and of appropriate dimensions such that the following LMIs hold: where
4. Examples
Example 4.1. Consider the discrete-time recurrent neural network (2.2) with
Then we have
Employing the LMIs in Theorem 3.1 yields upper bounds on that guarantee the stability of system (1.1) for various lower bounds ; these are listed in Table 1. When and , it can be seen from Figure 1 that all the state solutions corresponding to 10 random initial points converge asymptotically to the unique equilibrium , and according to Theorem 3.1, the LMIs (3.1)–(3.4) are solvable in MATLAB 7.0.1.
Example 4.2. Consider the discrete-time recurrent neural network (2.2) with
Then we have
Employing the LMIs in [8] and those in Corollary 3.4 yields upper bounds on that guarantee the stability of the system for various lower bounds ; these are listed in Table 2. Clearly, the upper bounds obtained in this paper are better than those of [8]. Figure 2 shows that, when and , all the state solutions corresponding to 10 random initial points converge asymptotically to the unique equilibrium .
Remark 4.3. Employing the LMIs in [8] and those in Corollary 3.4 yields upper bounds on that guarantee the stability of system (1.1) for Example 1 of [8], for various lower bounds ; these are listed in Table 3.
5. Conclusions
This paper addressed the delay-dependent stability of uncertain periodic switched recurrent neural networks with time-varying delay. A less conservative LMI-based global stability criterion was obtained with the switched quadratic Lyapunov functional approach and the free-weighting matrix approach for periodic uncertain discrete-time recurrent neural networks with a time-varying delay. One example illustrates the effectiveness of the proposed criterion, and another demonstrates that the proposed method improves on an existing one.
Acknowledgments
This work was supported by the Sichuan Science and Technology Department under Grant 2011JY0114. The authors would like to thank the Associate Editor and the anonymous reviewers for their detailed comments and valuable suggestions which greatly contributed to this paper.