Advances in Artificial Neural Systems
Volume 2011 (2011), Article ID 941426, 7 pages
http://dx.doi.org/10.1155/2011/941426
Research Article

On the Global Dissipativity of a Class of Cellular Neural Networks with Multipantograph Delays

Liqun Zhou

Science of Mathematics College, Tianjin Normal University, Tianjin 300387, China

Received 9 May 2011; Revised 23 September 2011; Accepted 8 October 2011

Academic Editor: Tingwen Huang

Copyright © 2011 Liqun Zhou. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The global dissipativity of a class of cellular neural networks with multipantograph delays is studied for the first time. On the one hand, delay-dependent sufficient conditions are obtained by directly constructing suitable Lyapunov functionals; on the other hand, the networks with multipantograph delays are first transformed into cellular neural networks with constant delays and variable coefficients, and delay-independent sufficient conditions are then derived by constructing Lyapunov functionals for the transformed systems. These sufficient conditions guarantee global dissipativity, specify the corresponding attractive sets, can be used to design globally dissipative cellular neural networks with multipantograph delays, and are easily checked in practice by simple algebraic methods. An example is given to illustrate the correctness of the results.

1. Introduction

In recent years, cellular neural networks (CNNs) and delayed cellular neural networks (DCNNs) have been investigated widely because of their extensive applications in pattern recognition, image processing, association, synchronization, and many other fields. In such applications, it is of prime importance to ensure that the designed CNNs are stable. Therefore, the stability of CNNs with or without delay has received much attention (see [1–10]). As pointed out in [11, 12], global dissipativity is also an important concept in dynamical neural networks. The concept of global dissipativity in dynamical systems is more general than stability and has found applications in areas such as stability theory, chaos and synchronization theory, system norm estimation, and robust control [11, 12]. Liao and Wang [11] addressed the global dissipativity of a general class of continuous-time recurrent neural networks and derived some sufficient conditions for global dissipativity and global exponential dissipativity. Arik [12] analyzed the global dissipativity of several classes of neural networks and derived corresponding sufficient conditions. To date, most research on DCNNs has been restricted to the simple case of constant delays [1–4]. Papers that consider variable or distributed delays include [5–7, 9, 10, 13–15], and the variable delays there are usually assumed to be bounded. To the best of our knowledge, few authors have considered the dynamical behavior of CNNs with pantograph delays. Pantograph delays arise in practice, for example in Web quality of service (QoS), and pantograph delay systems are important mathematical models in fields such as physics, biological systems, and control theory. In this paper, our focus is on the global dissipativity of multipantograph delayed cellular neural networks.
Lipschitz continuous activation functions are considered, and by constructing suitable Lyapunov functionals and applying matrix theory, some delay-dependent and delay-independent sufficient conditions are obtained. The main contributions of this paper are the derivation of new globally attractive sets and the characterization of global dissipativity. These properties play important roles in the design and application of globally dissipative CNNs with multipantograph delays and are of great interest in many applications, such as chaos and synchronization theory and robust control.

2. Model and Preliminaries

Consider the model of multipantograph delayed CNNs described by the following functional differential equations:
$$\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} \left[ a_{ij} f_j(x_j(t)) + b_{ij} f_j(x_j(q_1 t)) + c_{ij} f_j(x_j(q_2 t)) \right] + I_i, \qquad x_i(0) = x_{i0}, \quad i = 1, 2, \ldots, n, \tag{1}$$
for $t > 0$, where $x_i(t)$ denotes the potential (or voltage) of cell $i$ at time $t$; $f_i(\cdot)$ denotes a nonlinear activation function; $I_i$ denotes the $i$th component of an external input source introduced from outside the network to cell $i$; $d_i > 0$ denotes the rate with which cell $i$ resets its potential to the resting state when isolated from other cells and inputs; $a_{ij}$, $b_{ij}$, and $c_{ij}$ are constants denoting the strengths of connectivity between cells $j$ and $i$ at times $t$, $q_1 t$, and $q_2 t$, respectively; $q_1$, $q_2$ are pantograph constants satisfying $0 < q_1, q_2 < 1$, so that $q_1 t = t - (1 - q_1)t$ and $q_2 t = t - (1 - q_2)t$, in which $(1 - q_1)t$ and $(1 - q_2)t$ correspond to the (time-proportional, hence unbounded) delays required in processing and transmitting a signal from the $j$th cell to the $i$th cell; $x_{i0}$, $i = 1, 2, \ldots, n$, is a constant denoting the initial value of $x_i(t)$ at initial time $t = 0$. In this paper, we consider the following Lipschitz continuous activation functions:
$$0 \le \frac{f_i(x_i) - f_i(y_i)}{x_i - y_i} \le l_i, \quad l_i > 0, \qquad f_i(0) = 0, \qquad |f_i(x_i)| \to \infty \ \text{as} \ |x_i| \to \infty, \tag{2}$$
for all $x_i, y_i \in \mathbb{R}$, $i = 1, 2, \ldots, n$, and denote $L = \operatorname{diag}(l_1, l_2, \ldots, l_n)$.

The model (1) can be written in the vector form
$$\dot{x}(t) = -Dx(t) + A f(x(t)) + B f(x(q_1 t)) + C f(x(q_2 t)) + I, \qquad x(0) = x_0, \tag{3}$$
where $x = (x_1, x_2, \ldots, x_n)^T$ is the neuron state vector, $I = (I_1, I_2, \ldots, I_n)^T$ is the bias vector, $D = \operatorname{diag}(d_1, d_2, \ldots, d_n)$, $A = (a_{ij})_{n \times n}$, $B = (b_{ij})_{n \times n}$, and $C = (c_{ij})_{n \times n}$ are connection weight matrices, $f(\cdot) = (f_1(\cdot), f_2(\cdot), \ldots, f_n(\cdot))^T$ is the vector-valued activation function, and $x_0 = (x_{10}, x_{20}, \ldots, x_{n0})^T$ is the initial value vector at initial time $t = 0$.
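To make the role of the pantograph arguments concrete, here is a minimal numerical sketch of model (3) for a hypothetical two-neuron network; all parameter values below are illustrative, not taken from the paper. Because $q_1 t \le t$ for $t \ge 0$, the delayed states can always be read back from the stored trajectory:

```python
import math

# Illustrative parameters for a 2-neuron instance of model (3) (hypothetical).
D = [1.0, 1.5]
A = [[-2.0, 0.3], [0.2, -2.5]]
B = [[0.4, -0.1], [0.2, 0.3]]
C = [[0.1, 0.2], [-0.3, 0.1]]
I = [0.5, -0.4]
q1, q2 = 0.5, 0.75
f = math.tanh  # Lipschitz activation with f(0) = 0

def simulate(x0, T=20.0, h=0.01):
    """Forward Euler for x'(t) = -D x + A f(x(t)) + B f(x(q1 t)) + C f(x(q2 t)) + I."""
    n = len(x0)
    xs = [list(x0)]
    for k in range(int(T / h)):
        t, x = k * h, xs[-1]
        xq1 = xs[min(int(q1 * t / h), k)]  # state at pantograph time q1*t (grid lookup)
        xq2 = xs[min(int(q2 * t / h), k)]
        xs.append([
            x[i] + h * (-D[i] * x[i] + I[i]
                        + sum(A[i][j] * f(x[j]) + B[i][j] * f(xq1[j])
                              + C[i][j] * f(xq2[j]) for j in range(n)))
            for i in range(n)
        ])
    return xs

traj = simulate([3.0, -2.0])
print(traj[-1])  # the trajectory stays bounded
```

Since the activation is bounded here, each component eventually satisfies $|x_i| \lesssim (\sum_j(|a_{ij}| + |b_{ij}| + |c_{ij}|) + |I_i|)/d_i$, which the simulation reflects.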

Definition 1 (see [11]). The cellular neural network (1) is said to be a dissipative system if there exists a compact set $S \subset \mathbb{R}^n$ such that for every $x_0 \in \mathbb{R}^n$ there exists $t_0 > 0$ with $x(t, x_0) \in S$ for all $t \ge t_0$, where $x(t, x_0)$ denotes the solution of (1) with initial state $x_0$ at initial time $t = 0$. In this case, $S$ is called a globally attractive set. A set $S$ is called positively invariant if $x_0 \in S$ implies $x(t, x_0) \in S$ for all $t \ge 0$.

The induced matrix norms $\|\cdot\|_p$ used below are
$$\|A\|_1 = \max_j \sum_{i=1}^{n} |a_{ij}|, \qquad \|A\|_\infty = \max_i \sum_{j=1}^{n} |a_{ij}|. \tag{4}$$
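As a quick illustration, the norms in (4) are the maximum absolute column sum and the maximum absolute row sum, respectively:

```python
# Induced 1- and infinity-norms from (4), for a small illustrative matrix.
def norm_1(A):
    # max over columns j of sum_i |a_ij|
    return max(sum(abs(row[j]) for row in A) for j in range(len(A[0])))

def norm_inf(A):
    # max over rows i of sum_j |a_ij|
    return max(sum(abs(v) for v in row) for row in A)

A = [[1.0, -2.0], [3.0, 4.0]]
print(norm_1(A))    # max(1+3, 2+4) = 6.0
print(norm_inf(A))  # max(1+2, 3+4) = 7.0
```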

The transformation $y_i(t) = x_i(e^t)$ [16] transforms the neural networks (1) and (3), respectively, into the following cellular neural networks with constant time delays and variable coefficients:
$$\dot{y}_i(t) = e^t \Big[ -d_i y_i(t) + \sum_{j=1}^{n} \big( a_{ij} f_j(y_j(t)) + b_{ij} f_j(y_j(t - \tau_1)) + c_{ij} f_j(y_j(t - \tau_2)) \big) + I_i \Big], \qquad y_i(s) = \varphi_i(s), \quad -\tau \le s \le 0, \quad i = 1, 2, \ldots, n, \tag{5}$$
$$\dot{y}(t) = e^t \big[ -D y(t) + A f(y(t)) + B f(y(t - \tau_1)) + C f(y(t - \tau_2)) + I \big], \qquad y(s) = \phi(s), \quad -\tau \le s \le 0, \tag{6}$$
where $\tau_1 = -\log q_1 > 0$, $\tau_2 = -\log q_2 > 0$, $\tau = \max\{\tau_1, \tau_2\}$, and $\phi(s) = (\varphi_1(s), \varphi_2(s), \ldots, \varphi_n(s))^T$, in which $\varphi_i \in C([-\tau, 0], \mathbb{R})$, $i = 1, 2, \ldots, n$, are continuous functions.
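The mechanics of this change of variables can be checked numerically: for any signal $x$, the pantograph argument $q_1 s$ in the original time scale $s = e^t$ becomes the constant shift $\tau_1 = -\log q_1$ in the new time scale (the signal $x$ below is an arbitrary illustrative choice):

```python
import math

q1 = 0.5
tau1 = -math.log(q1)           # tau1 = ln 2 > 0

def x(s):                      # any scalar signal in the original time s > 0
    return math.sin(s) / (1 + s)

def y(t):                      # transformed signal y(t) = x(e^t)
    return x(math.exp(t))

t = 1.7
s = math.exp(t)
# x at the pantograph argument q1*s equals y at the constant-shifted time t - tau1,
# since y(t - tau1) = x(e^t * e^{-tau1}) = x(q1 * s)
assert abs(x(q1 * s) - y(t - tau1)) < 1e-9
print("pantograph delay q1*s  <->  constant delay tau1 =", tau1)
```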

Remark 2. The models (1), (3), (5), and (6) of this paper are different from the models in [11, 12]. The models in [11, 12] are neural networks with constant delays; in this paper, model (1) (equivalently (3)) is a cellular neural network with multipantograph delays, which are unbounded functions of time, while model (5) (equivalently (6)) is a cellular neural network with constant delays and variable coefficients. The results in [11, 12] cannot be applied to the models (1), (3), (5), and (6) of this paper. Therefore, our results establish new criteria for the global dissipativity of cellular neural networks with multipantograph delays.

Definition 3 (see [17]). The cellular neural network (5) is said to be a dissipative system if there exists a compact set $S \subset \mathbb{R}^n$ such that for any initial function $\phi \in C([-\tau, 0], \mathbb{R}^n)$ there exists $t_0 = t_0(\phi)$ with $y(t, \phi) \in S$ for all $t \ge t_0$, where $y(t, \phi)$ denotes the solution of (5) with initial function $\phi$. In this case, $S$ is called a globally attractive set. A set $S$ is called positively invariant if $\phi \subset S$ implies $y(t, \phi) \in S$ for all $t \ge 0$.

Lemma 4. For any vectors $a, b \in \mathbb{R}^n$ and any positive definite matrix $X$, the inequality
$$2 a^T b \le a^T X a + b^T X^{-1} b \tag{7}$$
holds.
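A quick numerical spot-check of Lemma 4, using a diagonal positive definite $X$ so that $X^{-1}$ is just the entrywise reciprocal; componentwise the inequality follows from $(\sqrt{x_i}\, a_i - b_i/\sqrt{x_i})^2 \ge 0$:

```python
import random

# Verify 2 a^T b <= a^T X a + b^T X^{-1} b on random samples,
# with X = diag(x_1, ..., x_n), x_i > 0.
random.seed(0)
for _ in range(1000):
    n = 3
    a = [random.uniform(-5, 5) for _ in range(n)]
    b = [random.uniform(-5, 5) for _ in range(n)]
    x = [random.uniform(0.1, 5) for _ in range(n)]   # diagonal of X, all positive
    lhs = 2 * sum(ai * bi for ai, bi in zip(a, b))
    rhs = sum(xi * ai * ai for xi, ai in zip(x, a)) + \
          sum(bi * bi / xi for xi, bi in zip(x, b))
    assert lhs <= rhs + 1e-9
print("2 a^T b <= a^T X a + b^T X^{-1} b holds on all samples")
```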

3. Main Results

Theorem 5. If condition (2) is satisfied and the following condition holds:
$$A + A^T + B B^T + C C^T + \left( \frac{1}{q_1} + \frac{1}{q_2} \right) I \le 0, \tag{8}$$
then the cellular neural network (3) is a dissipative system, and the set $S_1 = \{ x : |f_i(x_i(t))| \le l_i |I_i| / d_i,\ i = 1, 2, \ldots, n \}$ is a positively invariant and globally attractive set.
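Condition (8) is easy to verify algebraically. The sketch below checks it for a hypothetical two-neuron network (all matrices are illustrative, not the paper's example), using the fact that a symmetric $2 \times 2$ matrix is negative semidefinite iff its diagonal entries are $\le 0$ and its determinant is $\ge 0$:

```python
# Hedged sketch: checking condition (8) for illustrative 2x2 matrices.
def sym_2x2_neg_semidefinite(M):
    # A symmetric 2x2 matrix is <= 0 iff both diagonal entries are <= 0
    # and the determinant is >= 0.
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return M[0][0] <= 0 and M[1][1] <= 0 and det >= 0

A = [[-4.0, 0.2], [0.3, -4.0]]
B = [[0.5, 0.0], [0.0, 0.5]]
C = [[0.3, 0.1], [-0.1, 0.3]]
q1, q2 = 0.5, 0.8

def mmT(M):  # M @ M^T for a square list-of-lists matrix
    n = len(M)
    return [[sum(M[i][k] * M[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]

BBt, CCt = mmT(B), mmT(C)
shift = 1.0 / q1 + 1.0 / q2
# condition (8): A + A^T + B B^T + C C^T + (1/q1 + 1/q2) I <= 0
M = [[A[i][j] + A[j][i] + BBt[i][j] + CCt[i][j] + (shift if i == j else 0.0)
      for j in range(2)] for i in range(2)]
print(M, sym_2x2_neg_semidefinite(M))
```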

Proof. The following positive definite and radially unbounded Lyapunov functional will be used:
$$V(x(t)) = 2 \sum_{i=1}^{n} \int_0^{x_i(t)} f_i(s)\,ds + \sum_{i=1}^{n} \int_{q_1 t}^{t} \frac{1}{q_1} f_i^2(x_i(\zeta))\,d\zeta + \sum_{i=1}^{n} \int_{q_2 t}^{t} \frac{1}{q_2} f_i^2(x_i(\eta))\,d\eta. \tag{9}$$
The time derivative of $V(x(t))$ along the trajectories of system (1) satisfies
$$\begin{aligned}
\dot{V}(x(t)) &= -2 \sum_{i=1}^{n} d_i f_i(x_i(t)) x_i(t) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} f_i(x_i(t)) f_j(x_j(t)) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} b_{ij} f_i(x_i(t)) f_j(x_j(q_1 t)) \\
&\quad + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} c_{ij} f_i(x_i(t)) f_j(x_j(q_2 t)) + 2 \sum_{i=1}^{n} f_i(x_i(t)) I_i + \sum_{i=1}^{n} \frac{1}{q_1} f_i^2(x_i(t)) - \sum_{i=1}^{n} f_i^2(x_i(q_1 t)) \\
&\quad + \sum_{i=1}^{n} \frac{1}{q_2} f_i^2(x_i(t)) - \sum_{i=1}^{n} f_i^2(x_i(q_2 t)) \\
&\le -2 \sum_{i=1}^{n} \frac{d_i}{l_i} f_i^2(x_i(t)) + 2 \sum_{i=1}^{n} |f_i(x_i(t))| |I_i| + f^T(x(t)) (A + A^T) f(x(t)) + 2 f^T(x(q_1 t)) B^T f(x(t)) \\
&\quad + 2 f^T(x(q_2 t)) C^T f(x(t)) + \left( \frac{1}{q_1} + \frac{1}{q_2} \right) f^T(x(t)) f(x(t)) - f^T(x(q_1 t)) f(x(q_1 t)) - f^T(x(q_2 t)) f(x(q_2 t)).
\end{aligned} \tag{10}$$
In view of Lemma 4 (with $X = I$), we obtain
$$\begin{aligned}
- f^T(x(q_1 t)) f(x(q_1 t)) + 2 f^T(x(q_1 t)) B^T f(x(t)) &\le f^T(x(t)) B B^T f(x(t)), \\
- f^T(x(q_2 t)) f(x(q_2 t)) + 2 f^T(x(q_2 t)) C^T f(x(t)) &\le f^T(x(t)) C C^T f(x(t)).
\end{aligned} \tag{11}$$
Using these inequalities in (10) results in
$$\dot{V}(x(t)) \le -2 \sum_{i=1}^{n} \frac{d_i}{l_i} f_i^2(x_i(t)) + 2 \sum_{i=1}^{n} |f_i(x_i(t))| |I_i| + f^T(x(t)) \left( A + A^T + B B^T + C C^T + \left( \frac{1}{q_1} + \frac{1}{q_2} \right) I \right) f(x(t)) < 0, \tag{12}$$
for $x \in \mathbb{R}^n \setminus S_1$, implying that the set $S_1$ is a positively invariant and globally attractive set.

Theorem 6. If condition (2) is satisfied and the following condition holds:
$$A + A^T + B B^T + C C^T + 2I \le 0, \tag{13}$$
then the cellular neural network (6) is a dissipative system, and the set $S_2 = \{ y : |f_i(y_i(t))| \le l_i |I_i| / d_i,\ i = 1, 2, \ldots, n \}$ is a positively invariant and globally attractive set.

Proof. The following positive definite and radially unbounded Lyapunov functional will be used:
$$V(t, y(t)) = 2 e^{-t} \sum_{i=1}^{n} \int_0^{y_i(t)} f_i(s)\,ds + \sum_{i=1}^{n} \int_{t - \tau_1}^{t} f_i^2(y_i(\zeta))\,d\zeta + \sum_{i=1}^{n} \int_{t - \tau_2}^{t} f_i^2(y_i(\eta))\,d\eta. \tag{14}$$
The time derivative of $V(t, y(t))$ along the trajectories of system (5) satisfies
$$\begin{aligned}
\dot{V}(t, y(t)) &= -2 e^{-t} \sum_{i=1}^{n} \int_0^{y_i(t)} f_i(s)\,ds - 2 \sum_{i=1}^{n} d_i f_i(y_i(t)) y_i(t) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} f_i(y_i(t)) f_j(y_j(t)) \\
&\quad + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} b_{ij} f_i(y_i(t)) f_j(y_j(t - \tau_1)) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} c_{ij} f_i(y_i(t)) f_j(y_j(t - \tau_2)) + 2 \sum_{i=1}^{n} f_i(y_i(t)) I_i \\
&\quad + \sum_{i=1}^{n} f_i^2(y_i(t)) - \sum_{i=1}^{n} f_i^2(y_i(t - \tau_1)) + \sum_{i=1}^{n} f_i^2(y_i(t)) - \sum_{i=1}^{n} f_i^2(y_i(t - \tau_2)) \\
&\le -2 \sum_{i=1}^{n} \frac{d_i}{l_i} f_i^2(y_i(t)) + 2 \sum_{i=1}^{n} |f_i(y_i(t))| |I_i| + f^T(y(t)) (A + A^T) f(y(t)) + 2 f^T(y(t - \tau_1)) B^T f(y(t)) \\
&\quad + 2 f^T(y(t - \tau_2)) C^T f(y(t)) + 2 f^T(y(t)) f(y(t)) - f^T(y(t - \tau_1)) f(y(t - \tau_1)) - f^T(y(t - \tau_2)) f(y(t - \tau_2)) \\
&\le -2 \sum_{i=1}^{n} \frac{d_i}{l_i} f_i^2(y_i(t)) + 2 \sum_{i=1}^{n} |f_i(y_i(t))| |I_i| + f^T(y(t)) \left( A + A^T + B B^T + C C^T + 2I \right) f(y(t)) < 0,
\end{aligned} \tag{15}$$
for $y \in \mathbb{R}^n \setminus S_2$, where the last step uses Lemma 4 as in (11). So the set $S_2$ is a positively invariant and globally attractive set.

Corollary 7. If condition (2) is satisfied and the following condition holds:
$$A + A^T + B B^T + C C^T + 2I \le 0, \tag{16}$$
then the cellular neural network (3) is a dissipative system, and the set $S_3 = \{ x : |f_i(x_i(e^t))| \le l_i |I_i| / d_i,\ i = 1, 2, \ldots, n \}$ is a positively invariant and globally attractive set.

Theorem 8. If condition (2) is satisfied and the matrix
$$Q = P \left( A - L^{-1} D \right) + \left( A - L^{-1} D \right)^T P + P B B^T P + P C C^T P + \left( \frac{1}{q_1} + \frac{1}{q_2} \right) I \tag{17}$$
is negative definite, then the cellular neural network (3) is a dissipative system, and the set
$$S_4 = \left\{ x : \sum_{i=1}^{n} \left( f_i(x_i(t)) + \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \le \sum_{i=1}^{n} \left( \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \right\} \tag{18}$$
is a positively invariant and globally attractive set, where $L = \operatorname{diag}(l_1, l_2, \ldots, l_n)$, $P = \operatorname{diag}(p_1, p_2, \ldots, p_n)$ is a positive diagonal matrix, and $\lambda_M(Q)$ is the maximum eigenvalue of the matrix $Q$.
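The following sketch assembles $Q$ from (17) with $P = I$ for a hypothetical two-neuron network (all numerical values are illustrative) and evaluates $\lambda_M(Q)$ and the squared radius of $S_4$ via the closed-form eigenvalues of a symmetric $2 \times 2$ matrix:

```python
import math

# Hypothetical 2-neuron data; P = I so that p_i = 1.
d = [1.0, 1.5]; l = [1.0, 1.0]; p = [1.0, 1.0]; I = [0.5, -0.4]
A = [[-3.0, 0.2], [0.1, -3.5]]
B = [[0.3, 0.0], [0.0, 0.3]]
C = [[0.2, 0.1], [-0.1, 0.2]]
q1, q2 = 0.5, 0.75
n = 2

AL = [[A[i][j] - (d[i] / l[i] if i == j else 0.0) for j in range(n)]
      for i in range(n)]                              # A - L^{-1} D

def mmT(M):  # M @ M^T
    return [[sum(M[i][k] * M[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]

BBt, CCt = mmT(B), mmT(C)
shift = 1 / q1 + 1 / q2
# with P = I: Q = (A - L^{-1}D) + (A - L^{-1}D)^T + BB^T + CC^T + (1/q1 + 1/q2) I
Q = [[AL[i][j] + AL[j][i] + BBt[i][j] + CCt[i][j] + (shift if i == j else 0.0)
      for j in range(n)] for i in range(n)]

# largest eigenvalue of the symmetric 2x2 matrix Q
lam_M = (Q[0][0] + Q[1][1]) / 2 + math.sqrt(
    ((Q[0][0] - Q[1][1]) / 2) ** 2 + ((Q[0][1] + Q[1][0]) / 2) ** 2)
assert lam_M < 0                      # Q negative definite: Theorem 8 applies
# S4 is the ball: sum_i (f_i + p_i I_i / lam_M)^2 <= sum_i (p_i I_i / lam_M)^2
radius2 = sum((p[i] * I[i] / lam_M) ** 2 for i in range(n))
print("lambda_M(Q) =", lam_M, "squared radius of S_4 =", radius2)
```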

Proof. Let us employ the following positive definite and radially unbounded Lyapunov functional:
$$V(x(t)) = 2 \sum_{i=1}^{n} p_i \int_0^{x_i(t)} f_i(s)\,ds + \sum_{i=1}^{n} \int_{q_1 t}^{t} \frac{1}{q_1} f_i^2(x_i(\zeta))\,d\zeta + \sum_{i=1}^{n} \int_{q_2 t}^{t} \frac{1}{q_2} f_i^2(x_i(\eta))\,d\eta. \tag{19}$$
The time derivative of $V(x(t))$ along the trajectories of system (1) satisfies
$$\begin{aligned}
\dot{V}(x(t)) &= -2 \sum_{i=1}^{n} p_i d_i f_i(x_i(t)) x_i(t) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} p_i a_{ij} f_i(x_i(t)) f_j(x_j(t)) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} p_i b_{ij} f_i(x_i(t)) f_j(x_j(q_1 t)) \\
&\quad + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} p_i c_{ij} f_i(x_i(t)) f_j(x_j(q_2 t)) + 2 \sum_{i=1}^{n} p_i f_i(x_i(t)) I_i + \sum_{i=1}^{n} \frac{1}{q_1} f_i^2(x_i(t)) - \sum_{i=1}^{n} f_i^2(x_i(q_1 t)) \\
&\quad + \sum_{i=1}^{n} \frac{1}{q_2} f_i^2(x_i(t)) - \sum_{i=1}^{n} f_i^2(x_i(q_2 t)) \\
&\le 2 \sum_{i=1}^{n} p_i f_i(x_i(t)) I_i + f^T(x(t)) \left[ P \left( A - L^{-1} D \right) + \left( A - L^{-1} D \right)^T P \right] f(x(t)) + 2 f^T(x(q_1 t)) B^T P f(x(t)) \\
&\quad + 2 f^T(x(q_2 t)) C^T P f(x(t)) + \left( \frac{1}{q_1} + \frac{1}{q_2} \right) f^T(x(t)) f(x(t)) - f^T(x(q_1 t)) f(x(q_1 t)) - f^T(x(q_2 t)) f(x(q_2 t)).
\end{aligned} \tag{20}$$
By Lemma 4, we obtain
$$\begin{aligned}
- f^T(x(q_1 t)) f(x(q_1 t)) + 2 f^T(x(q_1 t)) B^T P f(x(t)) &\le f^T(x(t)) P B B^T P f(x(t)), \\
- f^T(x(q_2 t)) f(x(q_2 t)) + 2 f^T(x(q_2 t)) C^T P f(x(t)) &\le f^T(x(t)) P C C^T P f(x(t)).
\end{aligned} \tag{21}$$
Using these inequalities in (20) results in
$$\begin{aligned}
\dot{V}(x(t)) &\le 2 \sum_{i=1}^{n} p_i f_i(x_i(t)) I_i + f^T(x(t)) Q f(x(t)) \le 2 \sum_{i=1}^{n} p_i f_i(x_i(t)) I_i + \lambda_M(Q) \sum_{i=1}^{n} f_i^2(x_i(t)) \\
&= \lambda_M(Q) \sum_{i=1}^{n} \left[ \left( f_i(x_i(t)) + \frac{p_i I_i}{\lambda_M(Q)} \right)^2 - \left( \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \right] < 0, \quad \text{for } x \in \mathbb{R}^n \setminus S_4,
\end{aligned} \tag{22}$$
implying that the set $S_4$ is a positively invariant and globally attractive set.

Theorem 9. If condition (2) is satisfied and the matrix
$$Q = P \left( A - L^{-1} D \right) + \left( A - L^{-1} D \right)^T P + P B B^T P + P C C^T P + 2I \tag{23}$$
is negative definite, then the cellular neural network (6) is a dissipative system, and the set
$$S_5 = \left\{ y : \sum_{i=1}^{n} \left( f_i(y_i(t)) + \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \le \sum_{i=1}^{n} \left( \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \right\} \tag{24}$$
is a positively invariant and globally attractive set, where $P = \operatorname{diag}(p_1, p_2, \ldots, p_n)$ is a positive diagonal matrix and $\lambda_M(Q)$ is the maximum eigenvalue of the matrix $Q$.

Proof. Let us employ the following positive definite and radially unbounded Lyapunov functional:
$$V(t, y(t)) = 2 e^{-t} \sum_{i=1}^{n} p_i \int_0^{y_i(t)} f_i(s)\,ds + \sum_{i=1}^{n} \int_{t - \tau_1}^{t} f_i^2(y_i(\zeta))\,d\zeta + \sum_{i=1}^{n} \int_{t - \tau_2}^{t} f_i^2(y_i(\eta))\,d\eta. \tag{25}$$
The time derivative of $V(t, y(t))$ along the trajectories of system (5) satisfies
$$\begin{aligned}
\dot{V}(t, y(t)) &= -2 e^{-t} \sum_{i=1}^{n} p_i \int_0^{y_i(t)} f_i(s)\,ds - 2 \sum_{i=1}^{n} p_i d_i f_i(y_i(t)) y_i(t) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} p_i a_{ij} f_i(y_i(t)) f_j(y_j(t)) \\
&\quad + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} p_i b_{ij} f_i(y_i(t)) f_j(y_j(t - \tau_1)) + 2 \sum_{i=1}^{n} \sum_{j=1}^{n} p_i c_{ij} f_i(y_i(t)) f_j(y_j(t - \tau_2)) + 2 \sum_{i=1}^{n} p_i f_i(y_i(t)) I_i \\
&\quad + \sum_{i=1}^{n} f_i^2(y_i(t)) - \sum_{i=1}^{n} f_i^2(y_i(t - \tau_1)) + \sum_{i=1}^{n} f_i^2(y_i(t)) - \sum_{i=1}^{n} f_i^2(y_i(t - \tau_2)) \\
&\le 2 \sum_{i=1}^{n} p_i f_i(y_i(t)) I_i + f^T(y(t)) \left[ P \left( A - L^{-1} D \right) + \left( A - L^{-1} D \right)^T P \right] f(y(t)) + 2 f^T(y(t - \tau_1)) B^T P f(y(t)) \\
&\quad + 2 f^T(y(t - \tau_2)) C^T P f(y(t)) + 2 f^T(y(t)) f(y(t)) - f^T(y(t - \tau_1)) f(y(t - \tau_1)) - f^T(y(t - \tau_2)) f(y(t - \tau_2)) \\
&\le 2 \sum_{i=1}^{n} p_i f_i(y_i(t)) I_i + f^T(y(t)) Q f(y(t)) \le 2 \sum_{i=1}^{n} p_i f_i(y_i(t)) I_i + \lambda_M(Q) \sum_{i=1}^{n} f_i^2(y_i(t)) \\
&= \lambda_M(Q) \sum_{i=1}^{n} \left[ \left( f_i(y_i(t)) + \frac{p_i I_i}{\lambda_M(Q)} \right)^2 - \left( \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \right] < 0, \quad \text{for } y \in \mathbb{R}^n \setminus S_5,
\end{aligned} \tag{26}$$
where Lemma 4 is used as in (21). So the set $S_5$ is a positively invariant and globally attractive set.

Corollary 10. If condition (2) is satisfied and the matrix
$$Q = P \left( A - L^{-1} D \right) + \left( A - L^{-1} D \right)^T P + P B B^T P + P C C^T P + 2I \tag{27}$$
is negative definite, then the cellular neural network (3) is a dissipative system, and the set
$$S_6 = \left\{ x : \sum_{i=1}^{n} \left( f_i(x_i(e^t)) + \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \le \sum_{i=1}^{n} \left( \frac{p_i I_i}{\lambda_M(Q)} \right)^2 \right\} \tag{28}$$
is a positively invariant and globally attractive set, where $L = \operatorname{diag}(l_1, l_2, \ldots, l_n)$, $P = \operatorname{diag}(p_1, p_2, \ldots, p_n)$, and $\lambda_M(Q)$ is the maximum eigenvalue of the matrix $Q$.

Remark 11. The above theorems and corollaries imply that any equilibrium of the network lies in the corresponding positively invariant and globally attractive set, that any activation function satisfying (2) can be utilized, and that the conditions are compatible with an application of the LaSalle invariance principle.

4. Examples

Example 12. Consider the following system:
$$\begin{pmatrix} \dot{x}_1(t) \\ \dot{x}_2(t) \end{pmatrix} = - \begin{pmatrix} 1/3 & 0 \\ 0 & 1/4 \end{pmatrix} \begin{pmatrix} x_1(t) \\ x_2(t) \end{pmatrix} + \begin{pmatrix} -5 & 1/10 \\ 1/10 & -5 \end{pmatrix} \begin{pmatrix} f_1(x_1(t)) \\ f_2(x_2(t)) \end{pmatrix} + \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} f_1(x_1(q_1 t)) \\ f_2(x_2(q_1 t)) \end{pmatrix} + \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} f_1(x_1(q_2 t)) \\ f_2(x_2(q_2 t)) \end{pmatrix} + \begin{pmatrix} 2 \\ 3 \end{pmatrix}. \tag{29}$$
The activation functions $f_1(x_1) = \sin((1/3) x_1) + (1/3) x_1$ and $f_2(x_2) = \cos((1/2) x_2) + (1/4) x_2$ are obviously Lipschitz continuous with Lipschitz constants $l_1 = 2/3$, $l_2 = 3/4$, and the pantograph coefficients are $q_1 = 1/2$, $q_2 = 3/4$. By some simple calculations, we obtain that
$$A + A^T + B B^T + C C^T + \left( \frac{1}{q_1} + \frac{1}{q_2} \right) I = \begin{pmatrix} -8/3 & 1/5 \\ 1/5 & -8/3 \end{pmatrix} \tag{30}$$
is negative definite. According to Theorem 5, the pantograph delay cellular neural network (29) is a dissipative system with the globally attractive set
$$S = \left\{ x : |f_1(x_1)| \le \frac{l_1 |I_1|}{d_1} = 4,\ |f_2(x_2)| \le \frac{l_2 |I_2|}{d_2} = 9 \right\}. \tag{31}$$
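The algebra of Example 12 can be spot-checked numerically. The matrices $A$, $B$, $C$ below are read from display (29) and should be treated as a plausible reading rather than authoritative; the scalars $l_i$, $d_i$, $q_i$, $I_i$ are stated explicitly in the text:

```python
# Spot-check of the condition of Theorem 5 for Example 12.
A = [[-5.0, 0.1], [0.1, -5.0]]
B = [[1.0, 1.0], [-1.0, 1.0]]
C = [[1.0, -1.0], [1.0, 1.0]]
q1, q2 = 0.5, 0.75

def mmT(M):  # M @ M^T
    n = len(M)
    return [[sum(M[i][k] * M[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]

BBt, CCt = mmT(B), mmT(C)          # both equal 2 I here
shift = 1 / q1 + 1 / q2            # 2 + 4/3 = 10/3
M = [[A[i][j] + A[j][i] + BBt[i][j] + CCt[i][j] + (shift if i == j else 0.0)
      for j in range(2)] for i in range(2)]
# M should be [[-8/3, 1/5], [1/5, -8/3]]: negative definite
# (diagonal entries < 0 and determinant > 0)
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
assert M[0][0] < 0 and det > 0

# attractive-set bounds from Theorem 5: l_i |I_i| / d_i
l = [2 / 3, 3 / 4]; d = [1 / 3, 1 / 4]; I = [2.0, 3.0]
bounds = [l[i] * abs(I[i]) / d[i] for i in range(2)]
print(bounds)  # the bounds 4 and 9 of the set S in (31)
```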

5. Conclusions

Pantograph delay systems are important mathematical models that often arise in fields such as physics, biological systems, and control theory. In this paper, we have studied for the first time the global dissipativity of a class of cellular neural networks with multipantograph delays. By using Lyapunov functionals and matrix theory, we obtain some delay-dependent and delay-independent sufficient conditions which characterize global dissipation together with the corresponding attractive sets; these results may be useful in studying the uniqueness of equilibria, global asymptotic stability, instability, and the existence of periodic solutions. Moreover, these sufficient conditions are easily checked in practice by simple algebraic methods. Remarkably, our results hold for classes of neural networks that differ from those considered in [11, 12]. An example is given to illustrate the correctness of our results.

Acknowledgments

This work is supported by the NSF of China (no. 60974144), Science and Technology Development Foundation of Tianjin Colleges and Universities (no. 20100813) and Foundation for Doctors of Tianjin Normal University (no. 52LX34).

References

  1. L. O. Chua and L. Yang, “Cellular neural networks: theory and applications,” IEEE Transactions on Circuits and Systems I, vol. 35, no. 10, pp. 1257–1290, 1988.
  2. T. Roska and L. O. Chua, “Cellular neural networks with non-linear and delay-type template elements and non-uniform grids,” International Journal of Circuit Theory and Applications, vol. 20, no. 4, pp. 469–481, 1992.
  3. N. Takahashi, “A new sufficient condition for complete stability of cellular neural networks with delay,” IEEE Transactions on Circuits and Systems I, vol. 47, no. 6, pp. 793–799, 2000.
  4. S. Mohamad and K. Gopalsamy, “Exponential stability of continuous-time and discrete-time cellular neural networks with delays,” Applied Mathematics and Computation, vol. 135, no. 1, pp. 17–38, 2003.
  5. Y. Zhang, A. H. Pheng, and S. L. Kwong, “Convergence analysis of cellular neural networks with unbounded delay,” IEEE Transactions on Circuits and Systems I, vol. 48, no. 6, pp. 680–687, 2001.
  6. C. Sun and L. Li, “Dynamics of general neural networks with distributed delays,” in Proceedings of the 3rd International Symposium on Neural Networks, vol. 3971, part I of Lecture Notes in Computer Science, pp. 135–140, Chengdu, China, May 2006.
  7. X. X. Liao and J. Wang, “Algebraic criteria for global exponential stability of cellular neural networks with multiple time delays,” IEEE Transactions on Circuits and Systems I, vol. 50, no. 2, pp. 268–275, 2003.
  8. S. Arik and V. Tavsanoglu, “On the global asymptotic stability of delayed cellular neural networks,” IEEE Transactions on Circuits and Systems I, vol. 47, no. 4, pp. 571–574, 2000.
  9. Y. Yang and J. Cao, “Stability and periodicity in delayed cellular neural networks with impulsive effects,” Nonlinear Analysis, vol. 8, no. 1, pp. 362–374, 2007.
  10. C. Huang, L. Huang, and Z. Yuan, “Global stability analysis of a class of delayed cellular neural networks,” Mathematics and Computers in Simulation, vol. 70, no. 3, pp. 133–148, 2005.
  11. X. Liao and J. Wang, “Global dissipativity of continuous-time recurrent neural networks with time delay,” Physical Review E, vol. 68, no. 1, Article ID 016118, pp. 1–7, 2003.
  12. S. Arik, “On the global dissipativity of dynamical neural networks with time delays,” Physics Letters A, vol. 326, no. 1-2, pp. 126–132, 2004.
  13. Z. Zeng and J. Wang, “Multiperiodicity and exponential attractivity evoked by periodic external inputs in delayed cellular neural networks,” Neural Computation, vol. 18, no. 4, pp. 848–870, 2006.
  14. W. Su and Y. Chen, “Global asymptotic stability analysis for neutral stochastic neural networks with time-varying delays,” Communications in Nonlinear Science and Numerical Simulation, vol. 14, no. 4, pp. 1576–1581, 2009.
  15. L. Zhou and G. Hu, “Global exponential periodicity and stability of cellular neural networks with variable and distributed delays,” Applied Mathematics and Computation, vol. 195, no. 2, pp. 402–411, 2008.
  16. Y. K. Liu, “Asymptotic behavior of functional differential equations with proportional time delays,” European Journal of Applied Mathematics, vol. 7, pp. 11–30, 1996.
  17. C. Huang, “Dissipativity of Runge-Kutta methods for dynamical systems with delays,” IMA Journal of Numerical Analysis, vol. 20, no. 1, pp. 153–166, 2000.