Discrete Dynamics in Nature and Society
Volume 2011 (2011), Article ID 325371, 14 pages
http://dx.doi.org/10.1155/2011/325371
Research Article

Delay-Dependent Stability Criteria of Uncertain Periodic Switched Recurrent Neural Networks with Time-Varying Delays

1College of Computer Science, Panzhihua University, Panzhihua 617000, China
2School of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing 219004, China

Received 20 June 2011; Revised 29 August 2011; Accepted 17 September 2011

Academic Editor: Baodong Zheng

Copyright © 2011 Xing Yin et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This paper deals with the problem of delay-dependent stability criteria for uncertain periodic switched recurrent neural networks with time-varying delays. When an uncertain discrete-time recurrent neural network is periodic, it can be expressed as a switched neural network with a finite number of switching states. Based on the switched quadratic Lyapunov functional (SQLF) approach and the free-weighting matrix (FWM) approach, some linear matrix inequality (LMI) criteria are found to guarantee the delay-dependent asymptotic stability of these systems. Two examples illustrate the effectiveness of the proposed criteria.

1. Introduction

Recurrent neural networks (RNNs) are an important tool in many application areas such as associative memory, pattern recognition, signal processing, model identification, and combinatorial optimization. With the development of research on RNNs in theory and application, the models have become increasingly complex. When continuous-time RNNs are simulated on a computer, they must be discretized into discrete-time RNNs [1–3]. At the same time, in implementations of artificial neural networks, time-varying delays may occur due to the finite switching speeds of the amplifiers and to communication time [4, 5]. Therefore, researchers have considered discrete-time RNNs in which time-varying delays are incorporated in the processing and/or transmission parts of the network architecture [6–9]. Parameter uncertainties and nonautonomous phenomena often exist in real systems due to modeling inaccuracies [4]. In particular, when we consider the long-term dynamical behavior of a system together with the seasonality of a changing environment, the parameters of the system usually change with time [10–14]. In order to model such systems with neural networks, uncertain (or switched, or jumping) neural networks with time-varying delays appear in many papers [6, 15–24]. So in this paper we consider the stability of the following discrete-time recurrent neural network with time-varying delay:
$$u(k+1)=(A+\Delta A(k))u(k)+(W_1+\Delta W_1(k))g(u(k))+(W_2+\Delta W_2(k))f(u(k-d(k)))+(I+\Delta I(k)),\tag{1.1}$$
where $u(k)=(u_1(k),u_2(k),\dots,u_n(k))^T\in R^n$ is the state vector associated with the $n$ neurons, $A=\operatorname{diag}\{a_1,a_2,\dots,a_n\}$ is a diagonal matrix with positive entries, $W_1$ and $W_2$ are, respectively, the connection weight matrix and the delayed connection weight matrix, $I$ is the input vector, $g(u)=(g_1(u),g_2(u),\dots,g_n(u))^T$ and $f(u)=(f_1(u),f_2(u),\dots,f_n(u))^T$ are the neuron activation function vectors, and $d(k)$ is a nonnegative time-varying function denoting the time delay, satisfying
$$0\le d_1\le d(k)\le d_2.\tag{1.2}$$
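To make the dynamics of (1.1) concrete, the following minimal NumPy sketch simulates the nominal model, with all uncertainty terms $\Delta A$, $\Delta W_1$, $\Delta W_2$, $\Delta I$ set to zero; the matrices, delay pattern, and initial history below are illustrative assumptions, not data from the paper.

```python
import numpy as np

def simulate_rnn(A, W1, W2, I, g, f, d, x0_hist, steps):
    """Simulate u(k+1) = A u(k) + W1 g(u(k)) + W2 f(u(k - d(k))) + I.

    x0_hist: initial history u(-d_max), ..., u(0) as a list of vectors.
    d:       function k -> delay d(k), with d(k) <= len(x0_hist) - 1.
    """
    hist = [np.asarray(v, dtype=float) for v in x0_hist]
    for k in range(steps):
        u_k = hist[-1]
        u_delayed = hist[-1 - d(k)]
        hist.append(A @ u_k + W1 @ g(u_k) + W2 @ f(u_delayed) + I)
    return hist

if __name__ == "__main__":
    # Illustrative 2-neuron example (assumed values, not from the paper).
    A = np.diag([0.5, 0.3])
    W1 = np.array([[0.1, 0.2], [0.0, 0.1]])
    W2 = np.array([[0.1, 0.0], [0.2, 0.1]])
    I = np.zeros(2)
    d = lambda k: 1 + (k % 2)  # delay varying in {1, 2}
    hist = simulate_rnn(A, W1, W2, I, np.tanh, np.tanh, d,
                        [np.ones(2)] * 3, 200)
    # The chosen weights are contractive, so the state decays toward 0.
    print(np.linalg.norm(hist[-1]))
```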

In most of the literature it is required that the parameter uncertainty matrices $\Delta A(k)$, $\Delta W_1(k)$, $\Delta W_2(k)$, and $\Delta I(k)$ take the form
$$\bigl[\Delta A(k)\ \ \Delta W_1(k)\ \ \Delta W_2(k)\ \ \Delta I(k)\bigr]=DF(k)\bigl[E_a\ \ E_{w1}\ \ E_{w2}\ \ E_i\bigr],\tag{1.3}$$
where $E_a$, $E_{w1}$, $E_{w2}$, and $E_i$ are given constant matrices of appropriate dimensions and $F(k)$ is an uncertain matrix such that
$$F^T(k)F(k)\le I.\tag{1.4}$$
In practice, however, it is generally difficult to decompose $\Delta A(k)$, $\Delta W_1(k)$, $\Delta W_2(k)$, and $\Delta I(k)$ into such factors $D$, $E_a$, $E_{w1}$, $E_{w2}$, and $E_i$. In addition, periodic oscillation in recurrent neural networks is an interesting dynamic behavior, as many biological and cognitive activities require repetition [7, 10, 11, 25]. Periodic oscillations in recurrent neural networks have also been found in many applications such as associative memories, pattern recognition, machine learning, and robot motion control [25]. So, if (1.1) is an uncertain periodic neural network whose period is less than a constant $b$, then (1.1) can be expressed as a switched neural network with a finite number of switching states. That is, if (1.1) is a neural network with period $a$ ($0<a\le b$), then $\Delta A(k)=\Delta A(k+a)$, $\Delta W_1(k)=\Delta W_1(k+a)$, $\Delta W_2(k)=\Delta W_2(k+a)$, and $\Delta I(k)=\Delta I(k+a)$, which corresponds to the switched neural network set $\Omega=\bigcup_{a=1}^{b}U_a$, where $U_a=\{S_{ai}=(A_{ai},W_{1ai},W_{2ai},I_{ai})\mid 0<i\le a\}$, $A_{ai}=A+\Delta A(k+i)$, $W_{1ai}=W_1+\Delta W_1(k+i)$, $W_{2ai}=W_2+\Delta W_2(k+i)$, and $I_{ai}=I+\Delta I(k+i)$. Suppose $N$ is the number of elements of $\Omega$; then (1.1) is actually rewritten as
$$u(k+1)=A_{r(k)}u(k)+W_{1r(k)}g(u(k))+W_{2r(k)}f(u(k-d(k)))+I_{r(k)},\tag{1.5}$$
where $r(k)$ is a switching rule taking values in $\Omega=\{S_1,S_2,\dots,S_N\}$. Moreover, $r(k)=j$ means that the sub-recurrent neural network (sub-RNN) $S_j$, which corresponds to $(A_j,W_{1j},W_{2j},I_j)$, is active.
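The reduction from a period-$a$ system to a switched system can be sketched as follows: the switching rule simply cycles through the $a$ sub-RNNs. This is a hypothetical illustration of the indexing, not code from the paper.

```python
def periodic_switching_rule(a):
    """For a period-a uncertain network, r(k) cycles through the a sub-RNNs
    S_1, ..., S_a: at time k the active sub-RNN index is (k mod a) + 1."""
    return lambda k: (k % a) + 1

r = periodic_switching_rule(3)
print([r(k) for k in range(7)])  # [1, 2, 3, 1, 2, 3, 1]
```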

The dynamic behaviors of these models are the foundation for applications. For systems of the form (1.5), most papers discuss the stability of uncertain neural networks with the common Lyapunov function approach [5, 6, 15–23, 25]. To the best of the authors' knowledge, up to now there is scarcely any paper that studies uncertain periodic neural networks using the SQLF. This situation motivates the present research.

Motivated by the above discussion, the authors study the delay-dependent stability of uncertain discrete-time recurrent neural networks with time-varying delays in which the uncertain network has a finite number of sub-RNNs, and the sub-RNNs may change from one to another according to arbitrary or restricted switching. The contributions of this paper are the following: (1) using a switching graph, uncertain periodic recurrent neural networks with time-varying delays are transformed into switched recurrent neural networks; (2) the difference of the Lyapunov functional term (3.7) used in [8] is improved in (3.11), please see Remark 4.3 and Table 3; (3) based on the switching graph, delay-dependent stability criteria for switched recurrent neural networks are derived via the FWM and SQLF approaches. An effective LMI approach is then developed to solve the problem.

This paper is organized as follows. In Section 2, we give some basic definitions. We analyze the stability of the system (2.2) with the SQLF and FWM in Section 3. Some examples are given in Section 4. Section 5 offers the conclusions of this paper.

2. Preliminaries

In many electronic circuits, nonmonotonic functions can be more appropriate to describe the neuron activation in designing and implementing an artificial neural network [7]; hence, we have the following assumption.

For any $j\in\{1,2,\dots,n\}$, there exist constants $g_j^-$, $g_j^+$, $f_j^-$, and $f_j^+$ such that
$$g_j^-\le\frac{g_j(\theta_1)-g_j(\theta_2)}{\theta_1-\theta_2}\le g_j^+,\qquad f_j^-\le\frac{f_j(\theta_1)-f_j(\theta_2)}{\theta_1-\theta_2}\le f_j^+,\qquad\forall\theta_1,\theta_2\in R,\ \theta_1\ne\theta_2.\tag{2.1}$$
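For an activation of the form $g_j(x)=\tanh(cx)$, the difference quotient in (2.1) lies in the sector $[0,c]$, so one may take $g_j^-=0$ and $g_j^+=c$. A quick numerical check of this sector bound (illustrative, assuming $c=0.4$ as used later in Example 4.1):

```python
import numpy as np

c = 0.4
g = lambda x: np.tanh(c * x)

# Sample difference quotients (g(t1) - g(t2)) / (t1 - t2) over a grid.
ts = np.linspace(-5.0, 5.0, 201)
t1, t2 = np.meshgrid(ts, ts)
mask = t1 != t2
q = (g(t1[mask]) - g(t2[mask])) / (t1[mask] - t2[mask])

# All quotients stay inside the sector [0, c].
print(q.min() >= 0.0, q.max() <= c)
```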

Under this assumption, the equilibrium points of the uncertain discrete-time neural network (1.1) exist by the fixed point theorem [1]. In the following, let $u^*=(u_1^*,u_2^*,\dots,u_n^*)^T$ be an equilibrium point of (1.1), and set $x(\cdot)=u(\cdot)-u^*$. The systems (1.1) and (1.5) are then, respectively, shifted to the form
$$x(k+1)=(A+\Delta A(k))x(k)+(W_1+\Delta W_1(k))g(x(k))+(W_2+\Delta W_2(k))f(x(k-d(k))),$$
$$x(k+1)=A_{r(k)}x(k)+W_{1r(k)}g(x(k))+W_{2r(k)}f(x(k-d(k))),\qquad r(k)\in\Omega.\tag{2.2}$$
For convenience, the switching graph is now defined.

Definition 2.1. Let $\Gamma=(\Omega,\mathbf{W})$ be a switching graph, where $\Omega$ is the set of sub-RNNs $S_i$ and $\mathbf{W}$ is the set of weighted arcs $\mathbf{w}_{ij}\in\{0,1\}$. Here $\mathbf{w}_{ij}=1$ (or $0$) indicates that the sub-RNN $S_i$ does (or does not) switch to the sub-RNN $S_j$.

Remark 2.2. When $\mathbf{w}_{ij}=1$, if $\sum_{l=1}^{N}\mathbf{w}_{jl}=0$, that is, the sub-RNN $S_j$ cannot switch to any other sub-RNN, we suppose that the uncertain neural network always stays in the sub-RNN $S_i$, which means $\mathbf{w}_{ii}=1$ and $\mathbf{w}_{ij}=0$.
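The convention of Remark 2.2 can be applied mechanically to an arc matrix. The sketch below implements one reading of the remark (an illustrative assumption, not the paper's code): an arc into a dead-end sub-RNN is replaced by a self-loop on the source.

```python
def normalize_graph(W):
    """One reading of Remark 2.2: if w_ij = 1 but S_j has no outgoing arc,
    drop the arc into S_j and let the system stay in S_i instead
    (set w_ii = 1 and w_ij = 0)."""
    W = [row[:] for row in W]  # copy so the input is not mutated
    n = len(W)
    for i in range(n):
        for j in range(n):
            if W[i][j] == 1 and sum(W[j]) == 0:
                W[i][j] = 0
                W[i][i] = 1
    return W

# S1 -> S2, but S2 is a dead end: the arc becomes a self-loop on S1.
print(normalize_graph([[0, 1], [0, 0]]))  # [[1, 0], [0, 0]]
```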

Throughout this paper, the superscript $T$ stands for the transpose of a matrix, $P>0$ means that the matrix $P$ is positive definite, and the symmetric terms in a symmetric matrix are denoted by $*$, for example,
$$\begin{bmatrix}O&M\\ *&N\end{bmatrix}=\begin{bmatrix}O&M\\ M^T&N\end{bmatrix}.$$
We also define
$$G_1=\operatorname{diag}\{g_1^-g_1^+,\,g_2^-g_2^+,\dots,\,g_n^-g_n^+\},\qquad G_2=\operatorname{diag}\Bigl\{\frac{g_1^-+g_1^+}{2},\,\frac{g_2^-+g_2^+}{2},\dots,\,\frac{g_n^-+g_n^+}{2}\Bigr\},$$
$$F_1=\operatorname{diag}\{f_1^-f_1^+,\,f_2^-f_2^+,\dots,\,f_n^-f_n^+\},\qquad F_2=\operatorname{diag}\Bigl\{\frac{f_1^-+f_1^+}{2},\,\frac{f_2^-+f_2^+}{2},\dots,\,\frac{f_n^-+f_n^+}{2}\Bigr\}.\tag{2.3}$$

3. Asymptotical Stability of Uncertain Periodic Switched Recurrent Neural Networks

Theorem 3.1. Let $d_1$ and $d_2$ be positive integers such that $0\le d_1\le d_2$. Based on a switching graph $\Gamma$, the system (2.2) is asymptotically stable if, whenever $\mathbf{w}_{ij}=1$ ($S_i,S_j\in\Omega$), there exist corresponding symmetric matrices $P_i=P_i^T>0$, $Q_1=Q_1^T>0$, $Q_2=Q_2^T>0$, $R=R^T>0$, $Z_1=Z_1^T>0$, $Z_2=Z_2^T>0$, $X^{ij}=(X^{ij})^T\ge 0$, $U^{ij}=(U^{ij})^T\ge 0$, $H^{ij}=\operatorname{diag}\{h_1^{ij},h_2^{ij},\dots,h_n^{ij}\}\ge 0$, $O^{ij}=\operatorname{diag}\{o_1^{ij},o_2^{ij},\dots,o_n^{ij}\}\ge 0$, and matrices $N^{ij}$, $M^{ij}$, and $T^{ij}$ of appropriate dimensions such that the following LMIs hold:
$$\Phi^{ij}=\begin{bmatrix}\Phi^{ij}_{11}&\Phi^{ij}_{12}&\Phi^{ij}_{13}&\Phi^{ij}_{14}&\Phi^{ij}_{15}&\Phi^{ij}_{16}\\ *&\Phi^{ij}_{22}&\Phi^{ij}_{23}&\Phi^{ij}_{24}&0&0\\ *&*&\Phi^{ij}_{33}&\Phi^{ij}_{34}&0&0\\ *&*&*&\Phi^{ij}_{44}&0&F_2O^{ij}\\ *&*&*&*&\Phi^{ij}_{55}&\Phi^{ij}_{56}\\ *&*&*&*&*&\Phi^{ij}_{66}\end{bmatrix}<0,\tag{3.1}$$
$$\Lambda_1^{ij}=\begin{bmatrix}X^{ij}&N^{ij}\\ *&Z_1\end{bmatrix}\ge 0,\tag{3.2}\qquad
\Lambda_2^{ij}=\begin{bmatrix}U^{ij}&T^{ij}\\ *&Z_2\end{bmatrix}\ge 0,\tag{3.3}\qquad
\Lambda_3^{ij}=\begin{bmatrix}X^{ij}+U^{ij}&M^{ij}\\ *&Z_1+Z_2\end{bmatrix}\ge 0,\tag{3.4}$$
where, with $E$ denoting the identity matrix,
$$\begin{aligned}
\Phi^{ij}_{11}&=A_i^TP_jA_i-P_i+Q_1+Q_2+(d_2-d_1+1)R+(A_i-E)^T\bigl(d_2Z_1+(d_2-d_1)Z_2\bigr)(A_i-E)\\
&\quad+(N_1^{ij})^T+N_1^{ij}+d_2X_{11}^{ij}+(d_2-d_1)U_{11}^{ij}-G_1H^{ij},\\
\Phi^{ij}_{12}&=(N_2^{ij})^T+T_1^{ij}+d_2X_{12}^{ij}+(d_2-d_1)U_{12}^{ij},\\
\Phi^{ij}_{13}&=(N_3^{ij})^T-M_1^{ij}+d_2X_{13}^{ij}+(d_2-d_1)U_{13}^{ij},\\
\Phi^{ij}_{14}&=-N_1^{ij}+(N_4^{ij})^T+M_1^{ij}-T_1^{ij}+d_2X_{14}^{ij}+(d_2-d_1)U_{14}^{ij},\\
\Phi^{ij}_{15}&=A_i^TP_jW_{1i}+(A_i-E)^T\bigl(d_2Z_1+(d_2-d_1)Z_2\bigr)W_{1i}+G_2H^{ij},\\
\Phi^{ij}_{16}&=A_i^TP_jW_{2i}+(A_i-E)^T\bigl(d_2Z_1+(d_2-d_1)Z_2\bigr)W_{2i},\\
\Phi^{ij}_{22}&=-Q_1-R+(T_2^{ij})^T+T_2^{ij}+d_2X_{22}^{ij}+(d_2-d_1)U_{22}^{ij},\\
\Phi^{ij}_{23}&=-M_2^{ij}+(T_3^{ij})^T+d_2X_{23}^{ij}+(d_2-d_1)U_{23}^{ij},\\
\Phi^{ij}_{24}&=-N_2^{ij}+(T_4^{ij})^T+M_2^{ij}-T_2^{ij}+d_2X_{24}^{ij}+(d_2-d_1)U_{24}^{ij},\\
\Phi^{ij}_{33}&=-Q_2-R-M_3^{ij}-(M_3^{ij})^T+d_2X_{33}^{ij}+(d_2-d_1)U_{33}^{ij},\\
\Phi^{ij}_{34}&=-N_3^{ij}-(M_4^{ij})^T+M_3^{ij}-T_3^{ij}+d_2X_{34}^{ij}+(d_2-d_1)U_{34}^{ij},\\
\Phi^{ij}_{44}&=-N_4^{ij}-(N_4^{ij})^T+M_4^{ij}+(M_4^{ij})^T-T_4^{ij}-(T_4^{ij})^T-R+d_2X_{44}^{ij}+(d_2-d_1)U_{44}^{ij}-F_1O^{ij},\\
\Phi^{ij}_{55}&=W_{1i}^TP_jW_{1i}+W_{1i}^T\bigl(d_2Z_1+(d_2-d_1)Z_2\bigr)W_{1i}-H^{ij},\\
\Phi^{ij}_{56}&=W_{1i}^TP_jW_{2i}+W_{1i}^T\bigl(d_2Z_1+(d_2-d_1)Z_2\bigr)W_{2i},\\
\Phi^{ij}_{66}&=W_{2i}^TP_jW_{2i}+W_{2i}^T\bigl(d_2Z_1+(d_2-d_1)Z_2\bigr)W_{2i}-O^{ij}.
\end{aligned}\tag{3.5}$$

Proof. Suppose that $y(l)=x(l+1)-x(l)$; then we have $x(k+1)=x(k)+y(k)$ and $x(k)=x(k-d(k))+\sum_{l=k-d(k)}^{k-1}y(l)$.
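The proof rests on the telescoping identity $x(k)=x(k-d(k))+\sum_{l=k-d(k)}^{k-1}y(l)$. A quick numerical sanity check on an arbitrary trajectory (illustrative values):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((20, 3))   # arbitrary trajectory x(0), ..., x(19)
y = x[1:] - x[:-1]                 # y(l) = x(l+1) - x(l)

k, d_k = 15, 4
# Check x(k) == x(k - d(k)) + sum_{l = k-d(k)}^{k-1} y(l).
lhs = x[k]
rhs = x[k - d_k] + y[k - d_k:k].sum(axis=0)
print(np.allclose(lhs, rhs))  # True
```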
We consider the following SQLF:
$$V_{r(k)}(k)=V_{1r(k)}(k)+V_{2r(k)}(k)+V_{3r(k)}(k)+V_{4r(k)}(k),$$
$$V_{1r(k)}(k)=x^T(k)P_{r(k)}x(k),\qquad V_{2r(k)}(k)=\sum_{l=k-d_1}^{k-1}x^T(l)Q_{1r(k)}x(l)+\sum_{l=k-d_2}^{k-1}x^T(l)Q_{2r(k)}x(l),\tag{3.6}$$
$$V_{3r(k)}(k)=\sum_{\theta=k-d_2}^{k-d_1}\sum_{l=\theta}^{k-1}x^T(l)R_{r(k)}x(l),\tag{3.7}$$
$$V_{4r(k)}(k)=\sum_{\theta=-d_2}^{-1}\sum_{l=k+\theta}^{k-1}y^T(l)Z_{1r(k)}y(l)+\sum_{\theta=-d_2}^{-d_1-1}\sum_{l=k+\theta}^{k-1}y^T(l)Z_{2r(k)}y(l).\tag{3.8}$$
It is clear that the following equations are true:
$$\sum_{l=k-d_2}^{k-1}y^T(l)Z_{1r(k)}y(l)=\sum_{l=k-d_2}^{k-d(k)-1}y^T(l)Z_{1r(k)}y(l)+\sum_{l=k-d(k)}^{k-1}y^T(l)Z_{1r(k)}y(l),$$
$$\sum_{l=k-d_2}^{k-d_1-1}y^T(l)Z_{2r(k)}y(l)=\sum_{l=k-d_2}^{k-d(k)-1}y^T(l)Z_{2r(k)}y(l)+\sum_{l=k-d(k)}^{k-d_1-1}y^T(l)Z_{2r(k)}y(l).\tag{3.9}$$
Firstly, we prove that under $\mathbf{w}_{ij}=1$ the difference of the SQLF is negative. Suppose that $r(k)=i$ and $r(k+1)=j$, which means that the sub-RNN $S_i$ switches to the sub-RNN $S_j$; we obtain
$$\begin{aligned}\Delta V_{1i}(k)&=V_{1j}(k+1)-V_{1i}(k)\\&=\bigl[A_ix(k)+W_{1i}g(x(k))+W_{2i}f(x(k-d(k)))\bigr]^TP_j\bigl[A_ix(k)+W_{1i}g(x(k))+W_{2i}f(x(k-d(k)))\bigr]-x^T(k)P_ix(k),\end{aligned}$$
$$\begin{aligned}\Delta V_{2i}(k)&=V_{2j}(k+1)-V_{2i}(k)\\&=x^T(k)(Q_{1j}+Q_{2j})x(k)-x^T(k-d_1)Q_{1i}x(k-d_1)-x^T(k-d_2)Q_{2i}x(k-d_2)\\&\quad+\sum_{l=k+1-d_1}^{k-1}x^T(l)(Q_{1j}-Q_{1i})x(l)+\sum_{l=k+1-d_2}^{k-1}x^T(l)(Q_{2j}-Q_{2i})x(l),\end{aligned}\tag{3.10}$$
$$\begin{aligned}\Delta V_{3i}(k)&=V_{3j}(k+1)-V_{3i}(k)\\&\le(d_2-d_1+1)x^T(k)R_jx(k)-x^T(k-d_1)R_ix(k-d_1)-x^T(k-d_2)R_ix(k-d_2)-x^T(k-d(k))R_ix(k-d(k))\\&\quad+\sum_{l=k+1-d_1}^{k-1}x^T(l)(R_j-R_i)x(l)+\sum_{\theta=k+1-d_2}^{k-d_1}\sum_{l=\theta}^{k-1}x^T(l)(R_j-R_i)x(l),\end{aligned}\tag{3.11}$$
$$\begin{aligned}\Delta V_{4i}(k)&=V_{4j}(k+1)-V_{4i}(k)\\&=d_2\,y^T(k)Z_{1j}y(k)+(d_2-d_1)y^T(k)Z_{2j}y(k)-\sum_{l=k-d(k)}^{k-1}y^T(l)Z_{1i}y(l)-\sum_{l=k-d(k)}^{k-d_1-1}y^T(l)Z_{2i}y(l)\\&\quad-\sum_{l=k-d_2}^{k-d(k)-1}y^T(l)(Z_{1i}+Z_{2i})y(l)+\sum_{\theta=-d_2}^{-1}\sum_{l=k+1+\theta}^{k-1}y^T(l)(Z_{1j}-Z_{1i})y(l)+\sum_{\theta=-d_2}^{-d_1-1}\sum_{l=k+1+\theta}^{k-1}y^T(l)(Z_{2j}-Z_{2i})y(l).\end{aligned}\tag{3.12}$$
In order to strictly guarantee $\Delta V_{2i}(k)<0$, the term $\sum_{l=k+1-d_1}^{k-1}x^T(l)(Q_{1j}-Q_{1i})x(l)$ should be nonpositive. In the switching graph $\Gamma$, if there exists a sub-RNN $S_i$ ($S_i\in\Omega$) satisfying $\sum_{j=1}^{N}\mathbf{w}_{ij}=0$, that is, $S_i$ cannot switch to any other sub-RNN, then the equation $Q_{1i}=Q_{1j}$ holds trivially. Otherwise the switching sequence must be $\beta=\{S_{\alpha_1},S_{\alpha_2},\dots,S_{\alpha_j},\dots,S_{\alpha_L},S_{\alpha_{L+1}}\}$ ($S_{\alpha_j}\in\Omega$), and there exists $l$ such that $S_{\alpha_{L+1}}=S_{\alpha_l}$ ($1\le l<L$). Because the sub-RNNs $S_{\alpha_1},S_{\alpha_2},\dots,S_{\alpha_{l-1}}$ affect the whole system only before time $\alpha_l$, after $\alpha_l$ the sequence $\beta$ becomes the periodic sequence $\beta'=\{S_{\alpha_l},\dots,S_{\alpha_L},S_{\alpha_{L+1}}\}$. Suppose that $i=\alpha_l=\alpha_{L+1}$; then along the switching sequence $\beta'$ the following LMIs all hold:
$$Q_{1\alpha_l}-Q_{1\alpha_{l+1}}\le 0,\quad Q_{1\alpha_{l+1}}-Q_{1\alpha_{l+2}}\le 0,\quad\dots,\quad Q_{1\alpha_L}-Q_{1\alpha_{L+1}}=Q_{1\alpha_L}-Q_{1i}\le 0;\tag{3.13}$$
the only solution of (3.13) is $Q_{1i}=Q_{1\alpha_l}=Q_{1\alpha_{l+1}}=\cdots=Q_{1\alpha_L}$. Thus, we suppose that
$$Q_1=Q_{1i}=Q_{1j}.\tag{3.14}$$
Treating $\sum_{l=k+1-d_2}^{k-1}x^T(l)(Q_{2j}-Q_{2i})x(l)$, the $(R_j-R_i)$ sums in (3.11), and the $(Z_{1j}-Z_{1i})$ and $(Z_{2j}-Z_{2i})$ double sums in (3.12) in the same way, we suppose that
$$Q_2=Q_{2i}=Q_{2j},\qquad R=R_i=R_j,\tag{3.15}$$
$$Z_1=Z_{1i}=Z_{1j},\tag{3.16}$$
$$Z_2=Z_{2i}=Z_{2j}.\tag{3.17}$$
On the other hand, for any appropriately dimensioned matrices $N^{ij}$, $M^{ij}$, and $T^{ij}$ the following equations are true:
$$2\xi^T(k)N^{ij}\Bigl[x(k)-x(k-d(k))-\sum_{l=k-d(k)}^{k-1}y(l)\Bigr]=0,\tag{3.18}$$
$$2\xi^T(k)M^{ij}\Bigl[x(k-d(k))-x(k-d_2)-\sum_{l=k-d_2}^{k-d(k)-1}y(l)\Bigr]=0,\qquad 2\xi^T(k)T^{ij}\Bigl[x(k-d_1)-x(k-d(k))-\sum_{l=k-d(k)}^{k-d_1-1}y(l)\Bigr]=0,\tag{3.19}$$
where $\xi(k)=[x^T(k),\,x^T(k-d_1),\,x^T(k-d_2),\,x^T(k-d(k))]^T$.
In addition, for any positive semidefinite matrices $X^{ij}=(X^{ij})^T\ge 0$ and $U^{ij}=(U^{ij})^T\ge 0$, the following equations hold:
$$d_2\,\xi^T(k)X^{ij}\xi(k)-\sum_{l=k-d_2}^{k-d(k)-1}\xi^T(k)X^{ij}\xi(k)-\sum_{l=k-d(k)}^{k-1}\xi^T(k)X^{ij}\xi(k)=0,$$
$$(d_2-d_1)\,\xi^T(k)U^{ij}\xi(k)-\sum_{l=k-d_2}^{k-d(k)-1}\xi^T(k)U^{ij}\xi(k)-\sum_{l=k-d(k)}^{k-d_1-1}\xi^T(k)U^{ij}\xi(k)=0.\tag{3.20}$$
From the assumption, we have
$$\bigl[g_l(x_l(k))-g_l^-x_l(k)\bigr]\bigl[g_l(x_l(k))-g_l^+x_l(k)\bigr]\le 0,\qquad l=1,2,\dots,n,$$
$$\bigl[f_l(x_l(k-d(k)))-f_l^-x_l(k-d(k))\bigr]\bigl[f_l(x_l(k-d(k)))-f_l^+x_l(k-d(k))\bigr]\le 0,\qquad l=1,2,\dots,n.\tag{3.21}$$
Similar to the conclusion in [8], for $H^{ij}=\operatorname{diag}\{h_1^{ij},h_2^{ij},\dots,h_n^{ij}\}\ge 0$ and $O^{ij}=\operatorname{diag}\{o_1^{ij},o_2^{ij},\dots,o_n^{ij}\}\ge 0$ the following inequalities are also true:
$$\begin{bmatrix}x(k)\\ g(x(k))\end{bmatrix}^T\begin{bmatrix}G_1H^{ij}&-G_2H^{ij}\\ *&H^{ij}\end{bmatrix}\begin{bmatrix}x(k)\\ g(x(k))\end{bmatrix}\le 0,\tag{3.22}$$
$$\begin{bmatrix}x(k-d(k))\\ f(x(k-d(k)))\end{bmatrix}^T\begin{bmatrix}F_1O^{ij}&-F_2O^{ij}\\ *&O^{ij}\end{bmatrix}\begin{bmatrix}x(k-d(k))\\ f(x(k-d(k)))\end{bmatrix}\le 0.\tag{3.23}$$
Then we add the terms on the right-hand sides of (3.16)–(3.23) to yield
$$\begin{aligned}\Delta V_i(k)&=\Delta V_{1i}(k)+\Delta V_{2i}(k)+\Delta V_{3i}(k)+\Delta V_{4i}(k)\\&\le\eta_1^T(k)\Phi^{ij}\eta_1(k)-\sum_{l=k-d(k)}^{k-1}\eta_2^T(k,l)\Lambda_1^{ij}\eta_2(k,l)-\sum_{l=k-d(k)}^{k-d_1-1}\eta_2^T(k,l)\Lambda_2^{ij}\eta_2(k,l)-\sum_{l=k-d_2}^{k-d(k)-1}\eta_2^T(k,l)\Lambda_3^{ij}\eta_2(k,l),\end{aligned}\tag{3.24}$$
where $\eta_1(k)=[\xi^T(k),\,g^T(x(k)),\,f^T(x(k-d(k)))]^T$ and $\eta_2(k,l)=[\xi^T(k),\,y^T(l)]^T$.
Here $\xi(k)$ is defined in (3.18), and $\Phi^{ij}$, $\Lambda_1^{ij}$, $\Lambda_2^{ij}$, and $\Lambda_3^{ij}$ are defined in (3.1)–(3.4). Therefore, when the corresponding LMIs satisfy $\Phi^{ij}<0$, $\Lambda_1^{ij}\ge 0$, $\Lambda_2^{ij}\ge 0$, and $\Lambda_3^{ij}\ge 0$ for all $S_i,S_j\in\Omega$ with $\mathbf{w}_{ij}=1$, we have $\Delta V_i(k)<0$.
Secondly, based on the switching graph $\Gamma$, when $\mathbf{w}_{ij}=1$ ($S_i,S_j\in\Omega$), all corresponding $\Delta V_i(k)$ are negative, which means that the system (2.2) is asymptotically stable. This completes the proof of Theorem 3.1.

Remark 3.2. Using the method in [8], it is easy to show that the system is globally exponentially stable.

Remark 3.3. In $V_{3i}(k)$ of [8], the bound $\sum_{l=k-d_2}^{k-d(k)}x^T(l)R_ix(l)\ge x^T(k-d(k))R_ix(k-d(k))$ may lead to considerable conservativeness. Here it is improved to the tighter bound $\sum_{l=k-d_2}^{k-d_1}x^T(l)R_ix(l)\ge x^T(k-d_2)R_ix(k-d_2)+x^T(k-d(k))R_ix(k-d(k))+x^T(k-d_1)R_ix(k-d_1)$. Please see Table 3.
Combining Theorem 3.1 with the common Lyapunov function approach, we have the following.

Corollary 3.4. Let $d_1$ and $d_2$ be positive integers such that $0\le d_1\le d_2$. The system (2.2) is asymptotically stable if there exist symmetric matrices $P=P^T>0$, $Q_1=Q_1^T>0$, $Q_2=Q_2^T>0$, $R=R^T>0$, $Z_1=Z_1^T>0$, $Z_2=Z_2^T>0$, $X=X^T\ge 0$, $U=U^T\ge 0$, $H=\operatorname{diag}\{h_1,h_2,\dots,h_n\}\ge 0$, $O=\operatorname{diag}\{o_1,o_2,\dots,o_n\}\ge 0$, and matrices $N$, $M$, and $T$ of appropriate dimensions such that the following LMIs hold for every $S_i\in\Omega$:
$$\Phi^{i}=\begin{bmatrix}\Phi_{11}&\Phi_{12}&\Phi_{13}&\Phi_{14}&\Phi_{15}&\Phi_{16}\\ *&\Phi_{22}&\Phi_{23}&\Phi_{24}&0&0\\ *&*&\Phi_{33}&\Phi_{34}&0&0\\ *&*&*&\Phi_{44}&0&F_2O\\ *&*&*&*&\Phi_{55}&\Phi_{56}\\ *&*&*&*&*&\Phi_{66}\end{bmatrix}<0,$$
$$\Lambda_1=\begin{bmatrix}X&N\\ *&Z_1\end{bmatrix}\ge 0,\qquad \Lambda_2=\begin{bmatrix}U&T\\ *&Z_2\end{bmatrix}\ge 0,\qquad \Lambda_3=\begin{bmatrix}X+U&M\\ *&Z_1+Z_2\end{bmatrix}\ge 0,\tag{3.25}$$
where the blocks $\Phi_{pq}$ (3.26) are obtained from the blocks $\Phi^{ij}_{pq}$ in (3.5) by replacing both $P_i$ and $P_j$ with the common matrix $P$ and dropping the superscripts $ij$ on $X$, $U$, $H$, $O$, $N$, $M$, and $T$.

4. Examples

Example 4.1. Consider the discrete-time recurrent neural network (2.2) with
$$A_1=\operatorname{diag}\{0.7,\,0.2,\,0.5\},\quad W_{11}=\begin{bmatrix}0.1&0.2&0.1\\0.3&0.1&0\\0&0.1&0.4\end{bmatrix},\quad W_{21}=\begin{bmatrix}0.3&0.1&0.2\\0.2&0.1&0.1\\0&0.02&0.07\end{bmatrix},$$
$$A_2=\operatorname{diag}\{0.1,\,0.2,\,0.4\},\quad W_{12}=\begin{bmatrix}0.4&0.2&0\\0.2&0.2&0\\0.3&0.1&0.5\end{bmatrix},\quad W_{22}=\begin{bmatrix}0.3&0.1&0.2\\0&0.03&0\\0.5&0.2&0.5\end{bmatrix},$$
$$A_3=\operatorname{diag}\{0.1,\,0.4,\,0.4\},\quad W_{13}=\begin{bmatrix}0.2&0&0.3\\0.04&0.3&0\\0.1&0.6&0.1\end{bmatrix},\quad W_{23}=\begin{bmatrix}0.08&0.1&0.2\\0.2&0.4&0.05\\0&0.2&0.1\end{bmatrix},$$
the arc matrix of $\Gamma$ given by
$$\mathbf{W}=\begin{bmatrix}0&1&0\\0&0&1\\1&0&0\end{bmatrix},$$
and the activation functions
$$g_1(x)=\tanh(0.4x),\quad g_2(x)=\tanh(0.2x),\quad g_3(x)=\tanh(0.8x),$$
$$f_1(x)=\tanh(0.6x),\quad f_2(x)=\tanh(0.4x),\quad f_3(x)=\tanh(0.2x).\tag{4.1}$$
Then we have
$$G_1=F_1=\mathbf{0},\qquad G_2=\operatorname{diag}\{0.2,\,0.1,\,0.4\},\qquad F_2=\operatorname{diag}\{0.3,\,0.2,\,0.1\}.\tag{4.2}$$
Employing the LMIs in Theorem 3.1 yields the upper bounds on $d_2$ that guarantee the stability of system (1.1) for various lower bounds $d_1$; these are listed in Table 1. When $d_1=1$ and $d_2=3$, it can be seen from Figure 1 that all the state trajectories corresponding to 10 random initial points converge asymptotically to the unique equilibrium $x^*=(0,0,0)^T$; according to Theorem 3.1, the LMIs (3.1)–(3.4) are solvable in MATLAB 7.0.1.

tab1
Table 1: Allowable upper bound of 𝑑2 with given 𝑑1.
325371.fig.001
Figure 1: Global convergence of states 𝑥1, 𝑥2, and 𝑥3 in Example 4.1, when 𝑑1=1 and 𝑑2=3.
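The convergence shown in Figure 1 can be reproduced by directly simulating (2.2) under the cyclic switching $S_1\to S_2\to S_3\to S_1$ encoded by $\Gamma$. This is a sketch: the delay sequence $d(k)$ alternating in $\{1,2,3\}$, the random initial history, and the $3\times 3$ row grouping of each weight matrix are my own reading of the example data.

```python
import numpy as np

# Sub-RNN data of Example 4.1 (row grouping inferred from the source).
A = [np.diag([0.7, 0.2, 0.5]), np.diag([0.1, 0.2, 0.4]), np.diag([0.1, 0.4, 0.4])]
W1 = [np.array([[0.1, 0.2, 0.1], [0.3, 0.1, 0.0], [0.0, 0.1, 0.4]]),
      np.array([[0.4, 0.2, 0.0], [0.2, 0.2, 0.0], [0.3, 0.1, 0.5]]),
      np.array([[0.2, 0.0, 0.3], [0.04, 0.3, 0.0], [0.1, 0.6, 0.1]])]
W2 = [np.array([[0.3, 0.1, 0.2], [0.2, 0.1, 0.1], [0.0, 0.02, 0.07]]),
      np.array([[0.3, 0.1, 0.2], [0.0, 0.03, 0.0], [0.5, 0.2, 0.5]]),
      np.array([[0.08, 0.1, 0.2], [0.2, 0.4, 0.05], [0.0, 0.2, 0.1]])]
cg = np.array([0.4, 0.2, 0.8])   # slopes of g_1, g_2, g_3
cf = np.array([0.6, 0.4, 0.2])   # slopes of f_1, f_2, f_3

def g(x): return np.tanh(cg * x)
def f(x): return np.tanh(cf * x)

rng = np.random.default_rng(1)
hist = list(rng.uniform(-1, 1, size=(4, 3)))   # history x(-3), ..., x(0); d2 = 3
for k in range(300):
    i = k % 3                                  # cyclic rule: S1 -> S2 -> S3 -> S1
    d_k = 1 + (k % 3)                          # delay varying in {1, 2, 3}
    x_k, x_delayed = hist[-1], hist[-1 - d_k]
    hist.append(A[i] @ x_k + W1[i] @ g(x_k) + W2[i] @ f(x_delayed))

# Norm of the final state: decays toward 0, matching Figure 1.
print(np.linalg.norm(hist[-1]))
```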

Example 4.2. Consider the discrete-time recurrent neural network (2.2) with
$$A=\operatorname{diag}\{0.1,\,0.2,\,0.3\},\qquad W_1=\begin{bmatrix}0.5&0.2&0.2\\0.1&0.1&0\\0.4&0.1&0.3\end{bmatrix},\qquad W_2=\begin{bmatrix}0.2&0.1&0.2\\0&0.3&0.1\\0&0.3&0.03\end{bmatrix},$$
$$g_1(x)=f_1(x)=\tanh(0.6x),\quad g_2(x)=f_2(x)=\tanh(0.4x),\quad g_3(x)=f_3(x)=\tanh(0.2x).\tag{4.3}$$
Then we have
$$G_1=F_1=\mathbf{0},\qquad G_2=F_2=\operatorname{diag}\{0.3,\,0.2,\,0.1\}.\tag{4.4}$$
Employing the LMIs in [8] and those in Corollary 3.4 yields the upper bounds on $d_2$ that guarantee the stability of the system for various lower bounds $d_1$; these are listed in Table 2. It is clear that the upper bounds obtained in this paper are better than those of [8]. It can be seen from Figure 2 that, when $d_1=1$ and $d_2=3$, all the state trajectories corresponding to 10 random initial points converge asymptotically to the unique equilibrium $x^*=(0,0,0)^T$.

tab2
Table 2: Allowable upper bound of 𝑑2 with given 𝑑1.
tab3
Table 3: Allowable upper bound of 𝑑2 with given 𝑑1.
325371.fig.002
Figure 2: Global convergence of states 𝑥1, 𝑥2, and 𝑥3 in Example 4.2, when 𝑑1=1 and 𝑑2=3.

Remark 4.3. Employing the LMIs in [8] and those in Corollary 3.4 for Example 1 of [8] yields the upper bounds on $d_2$ that guarantee the stability of system (1.1) for various lower bounds $d_1$; these are listed in Table 3.

5. Conclusions

This paper was dedicated to the delay-dependent stability of uncertain periodic switched recurrent neural networks with time-varying delays. A less conservative LMI-based global stability criterion was obtained with the switched quadratic Lyapunov functional approach and the free-weighting matrix approach for periodic uncertain discrete-time recurrent neural networks with a time-varying delay. One example illustrates the effectiveness of the proposed criterion; another demonstrates that the proposed method improves on an existing one.

Acknowledgments

This work was supported by the Sichuan Science and Technology Department under Grant 2011JY0114. The authors would like to thank the Associate Editor and the anonymous reviewers for their detailed comments and valuable suggestions which greatly contributed to this paper.

References

  1. Y. Liu, Z. Wang, A. Serrano, and X. Liu, “Discrete-time recurrent neural networks with time-varying delays: exponential stability analysis,” Physics Letters Section A, vol. 362, no. 5-6, pp. 480–488, 2007.
  2. Q. Song and Z. Wang, “A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays,” Physics Letters Section A, vol. 368, no. 1-2, pp. 134–145, 2007.
  3. B. Zhang, S. Xu, and Y. Zou, “Improved delay-dependent exponential stability criteria for discrete-time recurrent neural networks with time-varying delays,” Neurocomputing, vol. 72, no. 1–3, pp. 321–330, 2008.
  4. H. Huang, G. Feng, and J. Cao, “Robust state estimation for uncertain neural networks with time-varying delay,” IEEE Transactions on Neural Networks, vol. 19, no. 8, pp. 1329–1339, 2008.
  5. Y. Shen and J. Wang, “An improved algebraic criterion for global exponential stability of recurrent neural networks with time-varying delays,” IEEE Transactions on Neural Networks, vol. 19, no. 3, pp. 528–531, 2008.
  6. Q. Song and J. Cao, “Global dissipativity on uncertain discrete-time neural networks with time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2010, Article ID 810408, 19 pages, 2010.
  7. Y. Liu, Z. Wang, and X. Liu, “Asymptotic stability for neural networks with mixed time-delays: the discrete-time case,” Neural Networks, vol. 22, no. 1, pp. 67–74, 2009.
  8. M. Wu, F. Liu, P. Shi, Y. He, and R. Yokoyama, “Improved free-weighting matrix approach for stability analysis of discrete-time recurrent neural networks with time-varying delay,” IEEE Transactions on Circuits and Systems II, vol. 55, no. 7, pp. 690–694, 2008.
  9. C. Li, S. Wu, G. G. Feng, and X. Liao, “Stabilizing effects of impulses in discrete-time delayed neural networks,” IEEE Transactions on Neural Networks, vol. 22, no. 2, pp. 323–329, 2011.
  10. H. Xiang, K. M. Yan, and B. Y. Wang, “Existence and global exponential stability of periodic solution for delayed high-order Hopfield-type neural networks,” Physics Letters Section A, vol. 352, no. 4-5, pp. 341–349, 2006.
  11. H. Xiang, K. M. Yan, and B. Y. Wang, “Existence and global stability of periodic solution for delayed discrete high-order Hopfield-type neural networks,” Discrete Dynamics in Nature and Society, vol. 2005, no. 3, pp. 281–297, 2005.
  12. G. Shaikhet and L. Shaikhet, “Stability of stochastic linear difference equations with varying delay,” in Advances in Systems, Signals, Control and Computers, V. Bajic, Ed., pp. 101–104, IAAMSAD and SA branch of the Academy of Nonlinear Sciences, Durban, South Africa, 1998.
  13. J. Li, Y. Diao, M. Li, and X. Yin, “Stability analysis of discrete Hopfield neural networks with the nonnegative definite monotone increasing weight function matrix,” Discrete Dynamics in Nature and Society, vol. 2009, Article ID 673548, 2009.
  14. J. Li, Y. Diao, J. Mao, Y. Zhang, and X. Yin, “Stability analysis of discrete Hopfield neural networks with weight function matrix,” Lecture Notes in Computer Science, vol. 5370, pp. 760–768, 2008.
  15. X. Wu, Y. Wang, L. Huang, and Y. Zuo, “Robust exponential stability criterion for uncertain neural networks with discontinuous activation functions and time-varying delays,” Neurocomputing, vol. 73, no. 7–9, pp. 1265–1271, 2010.
  16. H. Wu, W. Feng, and X. Liang, “New stability criteria for uncertain neural networks with interval time-varying delays,” Cognitive Neurodynamics, vol. 2, no. 4, pp. 363–370, 2008.
  17. C. Y. Lu, H. H. Tsai, T. J. Su, J. S. H. Tsai, and C. W. Liao, “A delay-dependent approach to passivity analysis for uncertain neural networks with time-varying delay,” Neural Processing Letters, vol. 27, no. 3, pp. 237–246, 2008.
  18. X. Lou and B. Cui, “Delay-dependent criteria for global robust periodicity of uncertain switched recurrent neural networks with time-varying delay,” IEEE Transactions on Neural Networks, vol. 19, no. 4, pp. 549–557, 2008.
  19. J. Liang, Z. Wang, Y. Liu, and X. Liu, “Robust synchronization of an array of coupled stochastic discrete-time delayed neural networks,” IEEE Transactions on Neural Networks, vol. 19, no. 11, pp. 1910–1921, 2008.
  20. H. Zhang and Y. Wang, “Stability analysis of Markovian jumping stochastic Cohen-Grossberg neural networks with mixed time delays,” IEEE Transactions on Neural Networks, vol. 19, no. 2, pp. 366–370, 2008.
  21. Z. Wang, Y. Liu, and X. Liu, “State estimation for jumping recurrent neural networks with discrete and distributed delays,” Neural Networks, vol. 22, no. 1, pp. 41–48, 2009.
  22. J. Zhou, Q. Song, and J. Yang, “Stochastic passivity of uncertain neural networks with time-varying delays,” Abstract and Applied Analysis, vol. 2009, Article ID 725846, 16 pages, 2009.
  23. E. Zhu, Y. Wang, Y. Wang, H. Zhang, and J. Zou, “Stability analysis of recurrent neural networks with random delay and Markovian switching,” Journal of Inequalities and Applications, vol. 2010, Article ID 191546, 2010.
  24. J. Li, W. Wu, J. Yuan, Q. Tan, and X. Yin, “Delay-dependent stability criterion of arbitrary switched linear systems with time-varying delay,” Discrete Dynamics in Nature and Society, vol. 2010, Article ID 347129, 16 pages, 2010.
  25. B. Chen and J. Wang, “Global exponential periodicity of a class of recurrent neural networks with oscillating parameters and time-varying delays,” IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1440–1448, 2005.