Abstract
For the first time, the global dissipativity of a class of cellular neural networks with multipantograph delays is studied. On the one hand, some delay-dependent sufficient conditions are obtained by directly constructing suitable Lyapunov functionals; on the other hand, a transformation first converts the cellular neural networks with multipantograph delays into cellular neural networks with constant delays and variable coefficients, and then, by constructing Lyapunov functionals, some delay-independent sufficient conditions are derived. These new sufficient conditions ensure global dissipativity together with the corresponding sets of attraction, can be applied to design globally dissipative cellular neural networks with multipantograph delays, and are easily checked in practice by simple algebraic methods. An example is given to illustrate the correctness of the results.
1. Introduction
In recent years, cellular neural networks (CNNs) and delayed cellular neural networks (DCNNs) have been investigated widely because of their extensive applications in pattern recognition, image processing, association, synchronization problems, and many other fields. In such applications, it is of prime importance to ensure that the designed CNNs are stable. Therefore, the stability of CNNs with or without delay has received much attention (see [1–10]). As pointed out in [11, 12], global dissipativity is also an important concept in dynamical neural networks. The concept of global dissipativity in dynamical systems is a more general one and has found applications in areas such as stability theory, chaos and synchronization theory, system norm estimation, and robust control [11, 12]. Liao and Wang in [11] addressed the global dissipativity of a general class of continuous-time recurrent neural networks and derived some sufficient conditions for global dissipativity and global exponential dissipativity. Arik in [12] analyzed the global dissipativity of several classes of neural networks and derived some sufficient conditions for the global dissipativity of neural network systems. To date, most research on DCNNs has been restricted to the simple case of constant delays [1–4]. Papers that considered variable or distributed delays include [5–7, 9, 10, 13–15], and the variable delays there are usually assumed to be bounded. To the best of our knowledge, few authors have considered the dynamical behavior of CNNs with pantograph delays. Pantograph delays also arise in practice, for example in Web quality of service (QoS). Pantograph delay systems, as important mathematical models, often arise in fields such as physics, biological systems, and control theory. In this paper, our focus is on the global dissipativity of multipantograph delayed cellular neural networks.
Lipschitz continuous activation functions are considered, and by constructing suitable Lyapunov functionals and applying matrix theory, some delay-dependent and delay-independent sufficient conditions are obtained. The main contributions of this paper include the derivation of new globally attractive sets and the characterization of global dissipativity. These properties play important roles in the design and applications of globally dissipative CNNs with multipantograph delays, and are of great interest in many applications, such as chaos and synchronization theory and robust control.
2. Model and Preliminaries
Consider the model of multipantograph delayed CNNs described by the following functional differential equations: for , where denotes the potential (or voltage) of the cell at time ; denotes a nonlinear activation function; denotes the th component of an external input source introduced from outside the network to the cell at time ; denotes the rate with which the cell resets its potential to the resting state when isolated from other cells and inputs at time ; , and are constants which denote the strengths of connectivity between the cells and at time and , respectively; are pantograph constants satisfying , and , in which correspond to the time delays required in processing and transmitting a signal from the th cell to the th cell; is a constant which denotes the initial value of at the initial time . In this paper, we consider the following Lipschitz continuous activation functions: for all , and denote .
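A minimal numerical sketch of a multipantograph-delay CNN of the form described above, integrated by forward Euler. All concrete values here (the matrices `C`, `A`, `B`, the input `I`, the pantograph constant `q`, and the choice `f = tanh`) are illustrative assumptions, not the parameters of model (1); the point is only that the pantograph argument `q*t` always lies in the already-computed past, so the scheme needs no future values:

```python
import numpy as np

# Hypothetical 2-neuron multipantograph CNN, forward-Euler integration of
# x'(t) = -C x(t) + A f(x(t)) + B f(x(q t)) + I,  with f = tanh (Lipschitz, L = 1).
C = np.diag([2.0, 2.0])                    # self-feedback (reset) rates
A = np.array([[0.1, -0.2], [0.3, 0.1]])    # instantaneous connection weights
B = np.array([[0.2, 0.1], [-0.1, 0.2]])    # pantograph-delay connection weights
I = np.array([0.5, -0.5])                  # external inputs
q = 0.5                                    # pantograph constant, 0 < q < 1
f = np.tanh

h, T = 1e-3, 10.0
n = int(T / h)
ts = np.linspace(0.0, T, n + 1)
xs = np.zeros((n + 1, 2))
xs[0] = [1.0, -1.0]                        # initial value at the initial time
for k in range(n):
    # since 0 < q < 1, the index of q*ts[k] never exceeds k
    xq = xs[int(round(q * ts[k] / h))]
    xs[k + 1] = xs[k] + h * (-C @ xs[k] + A @ f(xs[k]) + B @ f(xq) + I)

print(xs[-1])
```

With these (contractive) illustrative parameters the trajectory remains bounded, which is the behavior the dissipativity results of Section 3 guarantee under their respective conditions.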
The model (1) can be described in a vector form: where is the neuron state vector, is the bias vector, , , , are connection weight matrices, is a vector-valued activation function, and denotes initial value vector at initial time .
Definition 1 (see [11]). The cellular neural network (1) is said to be a dissipative system, if there exists a compact set , such that for all , there exist , when , where denotes the solution of (1) from initial state and initial time . In this case, is called a globally attractive set. A set is called positive invariant if for all implies for .
The induced matrix norms are displayed as follows:
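For concreteness, the standard induced matrix norms can be computed as below; `W` is an arbitrary example matrix, and the closed forms (maximum absolute column sum, maximum absolute row sum, largest singular value) are the usual characterizations of the induced 1-, infinity-, and 2-norms:

```python
import numpy as np

W = np.array([[1.0, -2.0], [3.0, 4.0]])         # arbitrary example matrix

norm_1 = np.abs(W).sum(axis=0).max()            # induced 1-norm: max column sum
norm_inf = np.abs(W).sum(axis=1).max()          # induced inf-norm: max row sum
norm_2 = np.linalg.svd(W, compute_uv=False)[0]  # induced 2-norm: largest singular value

# agree with NumPy's built-in induced norms
assert np.isclose(norm_1, np.linalg.norm(W, 1))
assert np.isclose(norm_inf, np.linalg.norm(W, np.inf))
assert np.isclose(norm_2, np.linalg.norm(W, 2))
```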
The transformation [16] transforms the neural networks (1) and (3), respectively, into the following cellular neural networks with constant time delays and variable coefficients: where , , , , in which are continuous functions.
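The key identity behind this transformation (assuming it is the standard exponential substitution of [16], i.e. evaluating the state along t = e^s) can be spot-checked numerically: a pantograph argument q·t becomes a constant shift by ln q in the new time variable, at the cost of a variable coefficient e^s from the chain rule. The sample trajectory `x` and the point `s` below are arbitrary illustrative choices:

```python
import numpy as np

# Substituting y(s) = x(e^s) turns the pantograph delay q*t into the
# constant delay tau = -ln(q):  x(q e^s) = y(s + ln q) = y(s - tau).
q = 0.5
tau = -np.log(q)
x = lambda t: np.sin(t) + 0.1 * t      # arbitrary sample trajectory
y = lambda s: x(np.exp(s))

s = 1.3                                 # arbitrary test point
t = np.exp(s)
assert np.isclose(x(q * t), y(s - tau))

# The chain rule dy/ds = e^s * x'(e^s) is what introduces the variable
# (continuous, time-dependent) coefficients in the transformed networks.
```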
Remark 2. The models (1), (3), (5), and (6) of this paper are different from those in [11, 12]. The models in [11, 12] are neural networks with constant delays, whereas in this paper the model (1) or (3) is a cellular neural network with multipantograph delays, which are unbounded, and the model (5) or (6) is a cellular neural network with constant delays and variable coefficients. The results in [11, 12] therefore cannot be applied to the models (1), (3), (5), and (6) of this paper, and our results establish new criteria for the global dissipativity of cellular neural networks with multipantograph delays.
Definition 3 (see [17]). The cellular neural network (5) is said to be a dissipative system, if there exists a compact set such that for any compact set , there exist , when , where denotes the solution of (5) from initial state . In this case, is called a globally attractive set. A set is called positive invariant if for all implies for .
Lemma 4. For any vectors , the inequality holds, in which is any matrix with .
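Assuming Lemma 4 is the usual Young-type matrix inequality 2aᵀb ≤ aᵀXa + bᵀX⁻¹b for symmetric positive definite X (its statement is garbled above), it can be spot-checked over random vectors and matrices; the dimension 3 and the random seed are arbitrary:

```python
import numpy as np

# Spot-check: 2 a^T b <= a^T X a + b^T X^{-1} b for symmetric positive definite X.
# This follows from (X^{1/2} a - X^{-1/2} b)^T (X^{1/2} a - X^{-1/2} b) >= 0.
rng = np.random.default_rng(0)
for _ in range(100):
    a = rng.standard_normal(3)
    b = rng.standard_normal(3)
    M = rng.standard_normal((3, 3))
    X = M @ M.T + np.eye(3)            # symmetric positive definite by construction
    lhs = 2 * a @ b
    rhs = a @ X @ a + b @ np.linalg.inv(X) @ b
    assert lhs <= rhs + 1e-10
```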
3. Main Results
Theorem 5. If condition (2) is satisfied and the following condition holds: then the cellular neural network (3) is a dissipative system and the set is a positive invariant and globally attractive set.
Proof. The following positive definite and radially unbounded Lyapunov functional will be used: The time derivative of along the trajectories of the system (1) is obtained as follows: In view of Lemma 4, we obtain Using the above inequality in (10) results in for , implying that the set is a positive invariant and globally attractive set.
Theorem 6. If condition (2) is satisfied and the following condition holds: then the cellular neural network (6) is a dissipative system and the set is a positive invariant and globally attractive set.
Proof. The following positive definite and radially unbounded Lyapunov functional will be used: The time derivative of along the trajectories of the system (5) is obtained as follows: So the set is a positive invariant and globally attractive set.
Corollary 7. If condition (2) is satisfied and the following condition holds: then the cellular neural network (3) is a dissipative system and the set is a positive invariant and globally attractive set.
Theorem 8. If condition (2) is satisfied and the matrix given by is negative definite, then the cellular neural network (3) is a dissipative system and the set is a positive invariant and globally attractive set, where , , and is the maximum eigenvalue of the matrix .
Proof. Let us employ the following positive definite and radially unbounded Lyapunov functional: The time derivative of along the trajectories of the system (1) is obtained as follows: By Lemma 4, we obtain Using the above inequality in (20) results in implying that the set is a positive invariant and globally attractive set.
Theorem 9. If condition (2) is satisfied and the matrix given by is negative definite, then the cellular neural network (6) is a dissipative system and the set is a positive invariant and globally attractive set where , and is the maximum eigenvalue of the matrix .
Proof. Let us employ the following positive definite and radially unbounded Lyapunov functional: The time derivative of along the trajectories of the system (5) is obtained as follows: So the set is a positive invariant and globally attractive set.
Corollary 10. If condition (2) is satisfied and the matrix given by is negative definite, then the cellular neural network (3) is a dissipative system and the set is a positive invariant and globally attractive set, where , , and is the maximum eigenvalue of the matrix .
Remark 11. The above theorems and corollaries imply that any equilibrium of the neural network lies in the positive invariant and globally attractive set; any activation function satisfying (2) can be utilized, and the conditions are obtained by means of the LaSalle invariance principle.
4. Examples
Example 12. Consider the following system: The activation functions , are obviously Lipschitz continuous with Lipschitz constant , and the pantograph coefficient is . By some simple calculations, we obtain is negative definite. According to Theorem 5, the pantograph delay cellular neural network (29) is a dissipative system with a globally attractive set
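In practice, the negative-definiteness condition invoked above is checked by computing the eigenvalues of the (symmetric) condition matrix and verifying that they are all negative. The matrix `Q` below is an illustrative placeholder, not the matrix of Example 12, whose entries are not reproduced here:

```python
import numpy as np

# Checking negative definiteness of a symmetric matrix Q via its eigenvalues.
# Q is an illustrative placeholder (NOT the matrix of Example 12).
Q = np.array([[-3.0, 0.5],
              [0.5, -2.0]])
eigs = np.linalg.eigvalsh(Q)       # eigenvalues of a symmetric matrix, ascending
assert np.all(eigs < 0)            # all negative  =>  Q is negative definite
print(eigs)
```

Equivalently, by Sylvester's criterion, Q is negative definite when its leading principal minors alternate in sign starting with a negative one, which is the "simple algebraic method" referred to in the abstract.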
5. Conclusions
Pantograph delay systems, as important mathematical models, often arise in fields such as physics, biological systems, and control theory. In this paper, we have studied, for the first time, the global dissipativity of a class of cellular neural networks with multipantograph delays. By constructing Lyapunov functionals and using matrix theory, we have obtained some delay-dependent and delay-independent sufficient conditions which characterize global dissipativity together with the corresponding sets of attraction; these results might have an impact on studying the uniqueness of equilibria, global asymptotic stability, instability, and the existence of periodic solutions. Moreover, these sufficient conditions are easily checked in practice by simple algebraic methods. Remarkably, our results hold for classes of neural networks different from those considered in [11, 12]. One example is given to illustrate the correctness of our results.
Acknowledgments
This work is supported by the NSF of China (no. 60974144), the Science and Technology Development Foundation of Tianjin Colleges and Universities (no. 20100813), and the Foundation for Doctors of Tianjin Normal University (no. 52LX34).