Abstract

This paper studies the dynamics of switched Cohen-Grossberg neural networks with mixed delays by using the Lyapunov functional method, the average dwell time (ADT) method, and the linear matrix inequality (LMI) technique. Conditions for the uniform ultimate boundedness, the existence of an attractor, and the global exponential stability of the switched Cohen-Grossberg neural networks are developed. Our results extend and complement some earlier publications.

1. Introduction

In recent years, much attention has been devoted to the study of neural networks because they have been fruitfully applied in pattern classification, associative memory, image processing, parallel computation, optimization, and so on [1–3]. These applications rely crucially on the analysis of the dynamical behavior [4–7]. Various neural networks, such as Hopfield neural networks, cellular neural networks, bidirectional associative memory neural networks, and Cohen-Grossberg neural networks, have been successfully applied. Among them, the Cohen-Grossberg neural network (CGNN) [8] is an important one, which can be described as follows: where , ; corresponds to the number of units in the network; denotes the potential (or voltage) of cell at time ; denotes a nonlinear output function; represents an amplification function; represents an appropriately behaved function; the connection matrix denotes the strengths of connectivity between cells, and if the output from neuron excites (resp., inhibits) neuron , then (resp., ); denotes an external input source.
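For reference, the classical CGNN of [8] is commonly written, in the notation described above (amplification functions a_i, behaved functions b_i, connection weights t_{ij}, output functions s_j, external inputs J_i; sign conventions vary across the literature), as

```latex
\dot{x}_i(t) = -a_i\big(x_i(t)\big)\Big[b_i\big(x_i(t)\big)
  - \sum_{j=1}^{n} t_{ij}\, s_j\big(x_j(t)\big) - J_i\Big],
\qquad i = 1,\dots,n.
```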

Neural networks are inherently nonlinear; in the real world, nonlinear problems are not exceptional but regular phenomena, and nonlinearity is the nature of matter and its development [9, 10]. In many practical cases, time delays are a common phenomenon encountered in the implementation of neural networks, and they may cause undesirable dynamic behaviors such as oscillation, divergence, or other poor performance. Time delays arise from the finite speed of switching and signal transmission in a network and can further lead to instability of the network. For model (1), Ye et al. [11] introduced delays by considering the following differential equation: Then, the dynamics of delayed neural networks have been widely studied; see [1, 11–18] for some recent results concerning mixed delays. CGNN models with discrete delays and distributed delays can be characterized as follows:
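In one commonly used formulation (the symbols here are generic: c_{ij}, d_{ij}, e_{ij} denote the instantaneous, discretely delayed, and distributively delayed connection weights, and τ(t), σ(t) the delays), a CGNN with mixed delays reads

```latex
\dot{x}_i(t) = -a_i\big(x_i(t)\big)\Big[b_i\big(x_i(t)\big)
  - \sum_{j=1}^{n} c_{ij}\, f_j\big(x_j(t)\big)
  - \sum_{j=1}^{n} d_{ij}\, f_j\big(x_j(t-\tau(t))\big)
  - \sum_{j=1}^{n} e_{ij} \int_{t-\sigma(t)}^{t} f_j\big(x_j(s)\big)\,ds
  - J_i\Big].
```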

For convenience, system (3) can be rewritten in the following compact matrix form: where is the neural state vector, , , are functions of appropriate dimensions, , , and is the constant external input.
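With generic matrix notation (A and B stacking the amplification and behaved functions, C, D, E the weight matrices, J the input vector; these symbols are illustrative and may differ from the paper's), the compact form can be sketched as

```latex
\dot{x}(t) = -A\big(x(t)\big)\Big[B\big(x(t)\big) - C f\big(x(t)\big)
  - D f\big(x(t-\tau(t))\big)
  - E \int_{t-\sigma(t)}^{t} f\big(x(s)\big)\,ds - J\Big].
```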

With the rapid development of intelligent control, hybrid systems have been investigated extensively for their theoretical and practical significance. As one of the most important classes of hybrid systems, switched systems have drawn increasing attention in the last decade [19–21]. A typical switched system is composed of a set of subsystems and a logical switching rule that specifies which subsystem is activated at each instant of time and orchestrates the switching among the subsystems [22]. In general, the switching rule is a piecewise constant function dependent on the state or time; the logical rule that orchestrates switching between these subsystems generates switching signals [23]. Recently, many results on the stability of switched systems with time delays and parametric uncertainties have been reported [24, 25]. Switched systems in which all subsystems are stable were studied in [26], and Hu and Michel used the dwell time approach to analyse the local asymptotic stability of nonlinear switched systems in [27]. Hespanha and Morse [28] subsequently extended this concept to develop the average dwell time approach. Furthermore, in [29], the stability results for switched systems were extended to the case where the subsystems are both stable and unstable, which yields less conservative results. Thus, the average dwell time (ADT) approach turns out to be an effective tool for studying the stability of switched systems [28], especially when not all subsystems are stable [29].

Meanwhile, since neural networks are a special kind of complex network, the connection topology of a network may change frequently, often leading to link failure or the creation of new links during hardware implementation. Hence, abrupt changes in the network structure often occur, and switchings between different topologies are inevitable [30]. Thus, switched neural networks were proposed and have been successfully applied in the fields of high-speed signal processing and artificial intelligence [31, 32]; they have also been used to perform gene selection in DNA microarray analysis [33]. It is therefore of great significance to study switched neural networks. Recently, the stability of switched neural networks has been intensively investigated [34–36]. Robust exponential stability and control for switched neutral-type neural networks were discussed in [34].

In [35], delay-dependent stability analysis for switched neural networks with time-varying delay was analyzed. In [36], by employing nonlinear measure and LMI techniques, some new sufficient conditions were obtained to ensure global robust asymptotical stability and global robust exponential stability of the unique equilibrium for a class of switched recurrent neural networks with time-varying delay.

By combining the theories of switched systems and Cohen-Grossberg neural networks, the mathematical model of the switched Cohen-Grossberg neural networks is discussed in detail and can be written as follows:

The function is a piecewise constant function of time, called a switching signal, which specifies which subsystem is activated. denotes the number of subsystems. The switching sequence can be described as , where denotes the initial time and is the th switching instant. Moreover, means that the th subsystem is activated. For any , this means that the matrices can take values in the finite set . Meanwhile, we assume that the state of the switched CGNN does not jump at the switching instants; that is, the trajectory is everywhere continuous.
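A piecewise constant switching signal of this kind is easy to represent numerically; the following minimal sketch (with hypothetical switching instants and mode indices) returns the active subsystem index at any time:

```python
import bisect

# Sketch: a piecewise constant switching signal sigma(t). modes[k] is the
# subsystem active on [switch_times[k], switch_times[k+1]); values hypothetical.
def make_sigma(switch_times, modes):
    def sigma(t):
        k = bisect.bisect_right(switch_times, t) - 1
        return modes[max(k, 0)]
    return sigma

sigma = make_sigma([0.0, 1.5, 3.0], [1, 2, 1])
print(sigma(0.5), sigma(2.0), sigma(3.5))  # -> 1 2 1
```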

However, the available literature mainly considers the stability of switched neural networks. In fact, besides stability, boundedness and the existence of an attractor are also fundamental concepts for dynamical neural networks, which play an important role in the investigation of the uniqueness of equilibrium points (periodic solutions), stability, synchronization, and so on [13, 14]. To the best of the authors' knowledge, few authors have considered the uniform ultimate boundedness and attractors for switched CGNNs with discrete delays and distributed delays.

As is well known, algebraic results are more conservative than results in terms of linear matrix inequalities (LMIs), and criteria expressed as LMIs can be easily checked using the powerful Matlab LMI toolbox. This motivates us to investigate the uniform ultimate boundedness and the existence of an attractor for switched CGNNs in this paper. Illustrative examples are given to demonstrate the validity of the theoretical results.
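Checking an LMI criterion numerically amounts to testing the sign-definiteness of matrices. As a minimal illustration (using a toy Lyapunov inequality A^T P + P A + Q ≺ 0 with hypothetical matrices, not the LMIs of this paper), one can verify a candidate solution through the eigenvalues of its symmetric part:

```python
import numpy as np

# Sketch: an LMI criterion reduces to sign-definiteness tests. Here we check
# a toy Lyapunov inequality A^T P + P A + Q < 0 (hypothetical matrices,
# not this paper's LMIs) via the eigenvalues of the symmetric part.
def is_neg_definite(M, tol=1e-9):
    S = (M + M.T) / 2
    return bool(np.all(np.linalg.eigvalsh(S) < -tol))

A = np.array([[-2.0, 0.5], [0.0, -3.0]])
P = np.eye(2)          # candidate Lyapunov matrix
Q = 0.1 * np.eye(2)
print(is_neg_definite(A.T @ P + P @ A + Q))  # -> True
```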

The paper is organized as follows. In Section 2, preliminaries and the problem formulation are introduced. Section 3 gives sufficient conditions for uniform ultimate boundedness (UUB) and the existence of an attractor for switched CGNNs; this is the main result of the paper. In Section 4, an example is given to illustrate the effectiveness of the proposed approach. The conclusions are summarized in Section 5.

2. Problem Formulation

Throughout this paper, we use the following notations. The superscript “” stands for matrix transposition; denotes the -dimensional Euclidean space; the notation means that is real symmetric and positive definite; and represent the identity matrix and a zero matrix, respectively; stands for a block-diagonal matrix; denotes the minimum eigenvalue of the symmetric matrix ; in symmetric block matrices or long matrix expressions, “” is used to represent a term that is induced by symmetry. Matrices, if their dimensions are not explicitly stated, are assumed to be compatible for algebraic operations.

Consider the following Cohen-Grossberg neural network model with mixed delays (discrete delays and distributed delays): where

The discrete delays and distributed delays are bounded as follows: where , , are scalars.

As usual, the initial conditions associated with system (6) are given in the form where is a differentiable vector-valued function.

Throughout this paper, we make the following assumptions. For any , there exist constants and , such that

Remark 1. The constants and can be positive, negative, or zero. Therefore, the activation functions here are more general than the forms , , . For the continuous bounded function , there exist positive constants , , such that There exist positive constants , such that To obtain the main results of this paper, the following definitions and lemmas are introduced.

Definition 2 (see [15]). System (6) is uniformly ultimately bounded if there is such that, for any constant , there is such that for all , , , where the supremum norm .

Definition 3 (see [37]). The nonempty closed set is called an attractor for the solutions of system (6) if the following formula holds: in which , .

Definition 4 (see [28]). For a switching signal and each , let denote the number of discontinuities of in the interval . If there exist and such that holds, then is called the average dwell time and is the chatter bound.
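In the usual Hespanha-Morse form, the ADT condition requires N_σ(t, T) ≤ N_0 + (T − t)/τ_a for all T ≥ t ≥ 0. The sketch below checks this inequality for a single window, given a candidate switching sequence (all values hypothetical):

```python
# Sketch: check the ADT inequality N_sigma(t0, T) <= N0 + (T - t0) / tau_a
# for a single window [t0, T]; a full ADT check would range over all windows.
# Switching instants and parameters below are hypothetical.
def adt_holds(switch_times, N0, tau_a, t0, T):
    n_switches = sum(1 for s in switch_times if t0 < s <= T)
    return n_switches <= N0 + (T - t0) / tau_a

print(adt_holds([1.0, 2.0, 3.0, 4.0], N0=1, tau_a=1.0, t0=0.0, T=4.0))  # -> True
print(adt_holds([1.0, 2.0, 3.0, 4.0], N0=1, tau_a=2.0, t0=0.0, T=4.0))  # -> False
```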

Remark 5. In Definition 4, it is obvious that there exists a positive number such that a switching signal has the ADT property, which means that the average dwell time between any two consecutive switchings is no less than a specified constant . Hespanha and Morse have proved that if is sufficiently large, then the switched system is exponentially stable. In addition, in [18] one can choose , but in our paper we assume that , which is preferable.

Lemma 6 (see [16]). For any positive definite constant matrix , scalar , and vector function , , the following inequality holds:
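In its standard statement, this Jensen-type integral inequality reads (for M = M^T > 0, a scalar γ > 0, and an integrable vector function ω):

```latex
\gamma \int_{0}^{\gamma} \omega^{T}(s)\, M\, \omega(s)\, ds
\;\ge\;
\Big(\int_{0}^{\gamma} \omega(s)\, ds\Big)^{T} M \Big(\int_{0}^{\gamma} \omega(s)\, ds\Big).
```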

Lemma 7 (see [38]). For any given symmetric positive definite matrix and scalars , , if there exists a vector function such that the following integration is well defined, then

3. Main Results

Theorem 8. For a given constant , if there exist positive definite matrices , , , such that the following condition holds: where the symbol “” within the matrix represents the symmetric terms of the matrix, and then system (6) is uniformly ultimately bounded.

Proof. Let us consider the following Lyapunov-Krasovskii functional: where We proceed to evaluate the time derivative of along the trajectory of system (6), and one can get According to assumption (), we obtain the following inequality: From assumption () and inequality (21), we obtain
Similarly, taking the time derivative of along the trajectory of system (6), we obtain According to Lemma 6, we can conclude that
Computing the derivative of along the trajectory of system (6) turns out to be where .
Denoting that , we obtain
Using Lemma 7, the following inequality is easily obtained:
From assumption , it follows that, for , Then, let So,
Using (20)–(27) and adding (29), we can derive Denote that
By integrating both sides of (31) over the time interval , we obtain which implies that where , and
If one chooses , then for any constant and , there is , such that for all . According to Definition 2, we have for all . That is to say, system (6) is uniformly ultimately bounded. This completes the proof.

From (18), we know that there is a positive constant , such that

Thus, considering (34) and (36), we have the following result: where .

Theorem 9. If all of the conditions of Theorem 8 hold, then there exists an attractor for the solutions of system (6), where .

Proof. If one chooses , Theorem 8 shows that for any , there is , such that for all . Let . Clearly, is closed, bounded, and invariant. Furthermore, . Therefore, is an attractor for the solutions of system (6).

Corollary 10. Suppose that all of the conditions of Theorem 8 hold. If , and , then system (6) has a trivial solution , and the trivial solution of system (6) is globally exponentially stable.

Proof. If , and , then it is obvious that system (6) has a trivial solution . From Theorem 8, one has where .
Therefore, the trivial solution of system (6) is globally exponentially stable. This completes the proof.

We now present conditions for the uniform ultimate boundedness and the existence of an attractor of the switched CGNN by applying the average dwell time approach.

Now, consider the switched Cohen-Grossberg neural network with discrete delays and distributed delays as follows:

Theorem 11. For a given constant , if there exist positive definite matrices , , , such that the following condition holds: where Then system (39) is uniformly ultimately bounded for any switching signal with average dwell time satisfying where , , .
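Under the usual ADT bound τ_a > ln μ / λ (where μ ≥ 1 relates the Lyapunov functionals of different subsystems at switching instants and λ > 0 is the decay rate; the exact symbols in (41) may differ), the minimal admissible average dwell time can be computed as:

```python
import math

# Sketch: minimal average dwell time tau_a* = ln(mu) / lam, following the
# usual ADT bound; mu >= 1 compares Lyapunov functionals across subsystems
# at switching instants, lam > 0 is the decay rate (values hypothetical).
def min_adt(mu, lam):
    if mu < 1.0 or lam <= 0.0:
        raise ValueError("need mu >= 1 and lam > 0")
    return math.log(mu) / lam

print(min_adt(2.0, 0.5))  # -> about 1.386
```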

Proof. Define the Lyapunov functional candidate of the form When , the th subsystem is activated, and from Theorem 8 and (34), we can conclude that there is a positive constant , such that where
The system state is continuous. Therefore, it follows that If one chooses , then for any constant and , there is , such that for all . According to Definition 2, we have for all . That is to say, the switched Cohen-Grossberg neural networks system (39) is uniformly ultimately bounded. This completes the proof.

Theorem 12. If all of the conditions of Theorem 11 hold, then there exists an attractor for the solutions of system (39), where .

Proof. If we choose , Theorem 11 shows that for any , there is , such that for all . Let . Clearly, is closed, bounded, and invariant. Furthermore, .
Therefore, is an attractor for the solutions of system (39).

Corollary 13. Suppose that all of the conditions of Theorem 11 hold. If and , then system (39) has a trivial solution , and the trivial solution of system (39) is globally exponentially stable.

Proof. If and , then it is obvious that the switched system (39) has a trivial solution . From Theorem 11, one has where .
It means that the trivial solution of the switched Cohen-Grossberg neural networks (39) is globally exponentially stable. This completes the proof.

Remark 14. It is noted that the common Lyapunov function method requires all subsystems of the switched system to share a positive definite, radially unbounded common Lyapunov function. Generally speaking, this requirement is difficult to meet. In this paper, we therefore select a novel multiple Lyapunov function to study the uniform ultimate boundedness and the existence of an attractor for switched Cohen-Grossberg neural networks. Furthermore, this type of Lyapunov function enables us to establish less conservative results.

Remark 15. When , we have , , , , , and the switched Cohen-Grossberg neural network (4) degenerates into a general Cohen-Grossberg neural network with time delays [15, 17]. Obviously, our result generalizes the previous results.

Remark 16. It is easy to see that is equivalent to the existence of a common function for all subsystems, which implies that the switching signals can be arbitrary. Hence, the results reported in this paper are less conservative than those for arbitrary switching signals in the previous literature [16].

Remark 17. The constants , in assumption are allowed to be positive, negative, or zero, whereas the constant is restricted to be zero in [1, 15], and the nonlinear output function in [5, 18, 34–37] is required to satisfy . In our paper, this assumption is removed. Therefore, the assumption of this paper is weaker than those given in [1, 5, 15, 18, 34–37].

4. Illustrative Examples

In this section, we present an example to show the effectiveness and advantages of the proposed method, considering a switched neural network with two subsystems.

Example. Consider the following switched Cohen-Grossberg neural network with discrete delays and distributed delays: where the behaved function is described by , and ; let Take the parameters as follows:

From assumptions , , we can obtain , , , , , , , , and and .

Therefore, for and , by solving inequality (41) we get Using (41), we can obtain the average dwell time .
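As a numerical sanity check of ultimate boundedness, one can integrate a switched delayed network with the explicit Euler method. The sketch below uses hypothetical two-neuron parameters (not the example's actual data) and periodic switching between two subsystems:

```python
import numpy as np

# Sketch: Euler integration of a two-neuron switched network with one
# discrete delay and periodic switching between two subsystems. All
# parameters are hypothetical illustrations, not the example's data.
def simulate(T=20.0, h=0.01, tau=0.5, dwell=2.0):
    n_tau = int(tau / h)
    W = [np.array([[-0.1, 0.2], [0.1, -0.3]]),   # subsystem 1 weights
         np.array([[-0.2, 0.1], [0.3, -0.1]])]   # subsystem 2 weights
    D = np.array([[1.0, 0.0], [0.0, 1.5]])       # linear decay b_i(x) = d_i x
    J = np.array([0.5, -0.2])                    # external input
    hist = [np.array([0.3, -0.4])] * (n_tau + 1) # constant initial function
    for k in range(int(T / h)):
        mode = int((k * h) // dwell) % 2         # periodic switching signal
        x, x_tau = hist[-1], hist[-1 - n_tau]
        dx = -D @ x + W[mode] @ np.tanh(x_tau) + J
        hist.append(x + h * dx)
    return np.array(hist)

traj = simulate()
print(float(np.max(np.abs(traj))))  # trajectories remain bounded
```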

5. Conclusion

In this paper, the dynamics of switched Cohen-Grossberg neural networks with average dwell time are investigated. A novel multiple Lyapunov-Krasovskii functional is designed to obtain new sufficient conditions guaranteeing the uniform ultimate boundedness, the existence of an attractor, and the global exponential stability. The derived conditions are expressed in terms of LMIs, which are less conservative than algebraic formulations and can easily be checked in practice with the LMI toolbox in Matlab. Based on the method provided in this paper, stochastic disturbances, impulses, and reaction diffusion for switched systems will be considered in future work.

Acknowledgments

This work was jointly supported by the National Natural Science Foundation of China under Grants no. 11101053 and 11101435, the Key Project of the Chinese Ministry of Education under Grant no. 211118, the Excellent Youth Foundation of the Educational Committee of Hunan Province no. 10B002, the Hunan Provincial NSF no. 11JJ1001, the National Science and Technology Major Projects of China no. 2012ZX10001001-006, and the Scientific Research Fund of the Hunan Provincial Education Department of China no. 12C0028.