Abstract

The stability of switched Cohen-Grossberg neural networks with mixed time delays and -inverse Hölder activation functions is investigated under a switching rule with the average dwell time property. By applying the multiple Lyapunov-Krasovskii functional approach and the linear matrix inequality (LMI) technique, a delay-dependent sufficient criterion in terms of LMIs is obtained that ensures such switched neural networks to be globally exponentially stable, and an explicit exponential decay estimate for the states is also developed. Two illustrative examples are given to demonstrate the validity of the theoretical results.

1. Introduction

In the past few decades, there has been increasing interest in different classes of neural networks, such as Hopfield, cellular, Cohen-Grossberg, and bidirectional associative neural networks, due to their potential applications in many areas such as classification, signal and image processing, parallel computing, associative memories, optimization, and cryptography [1–5]. In the design of practical neural networks, the qualitative analysis of neural network dynamics plays an important role. To solve problems of optimization, neural control, and signal processing, neural networks have to be designed in such a way that, for a given external input, they exhibit only one globally asymptotically/exponentially stable equilibrium point. Hence, much effort has been devoted to the stability analysis of neural networks, and a number of sufficient conditions have been proposed in recent years to guarantee the global asymptotic/exponential stability of neural networks with or without delays; see, for example, [6–26] and the references therein.

Recently, by combining the theories of switched systems and neural networks, several classes of mathematical models of switched neural networks have been established. As a special class of switched systems, switched neural networks, whose individual subsystems are a set of neural networks, have found applications in the fields of high-speed signal processing, artificial intelligence, and gene selection in DNA microarray analysis [27–30].

Stability issues of switched neural networks have received considerable attention from researchers so far [31–39]. In [31], based on the Lyapunov-Krasovskii method and the LMI approach, some sufficient conditions were derived for the global robust exponential stability of a class of switched Hopfield neural networks with time-varying delay under uncertainty. In [32], by combining Cohen-Grossberg neural networks with an arbitrary switching rule, the mathematical model of a class of switched Cohen-Grossberg neural networks with mixed time-varying delays was established, and the robust stability of such switched Cohen-Grossberg neural networks was analyzed. In [33], by employing nonlinear measure and LMI techniques, some new sufficient conditions were obtained to ensure the global robust asymptotic stability and global robust exponential stability of the unique equilibrium for a class of switched recurrent neural networks with time-varying delay. In [34], the authors investigated a large class of switched recurrent neural networks with time-varying structured uncertainties and time-varying delay; some delay-dependent robust periodicity criteria guaranteeing the existence, uniqueness, and global asymptotic stability of the periodic solution for all admissible parametric uncertainties were devised by employing free weighting matrices and LMIs. In [35], a new class of switched interval neural networks with discrete and distributed time-varying delays of neutral type was developed, and a delay-dependent sufficient criterion was obtained in terms of LMIs which guarantees the global robust exponential stability of the proposed switched interval neural networks. It should be pointed out that the results in [31–35] focused on the stability of switched neural networks under an arbitrary switching rule by using the common Lyapunov function method. However, the common Lyapunov function method requires all the subsystems of the switched system to share a positive definite radially unbounded common Lyapunov function.
Generally, this requirement is difficult to achieve.

In the past few years, much attention has been paid to making use of the dwell time approach to deal with the analysis and synthesis of switched neural networks [36–39]. It should be pointed out that the average dwell time method is regarded as an important and attractive method for finding a suitable switching signal that guarantees the stability of a switched system or improves other performance indices, and it has been widely applied to the analysis and synthesis of switched systems with or without time delay; see, for example, [40–44]. Very recently, in [36], based on the multiple Lyapunov functions method and LMI techniques, the authors presented some sufficient conditions in terms of LMIs which guarantee the robust exponential stability of uncertain switched Cohen-Grossberg neural networks with interval time-varying delay and distributed time-varying delay under a switching rule with the average dwell time property. In [37], by using the average dwell time method, delay-dependent sufficient conditions were derived for the robust exponential stability of a class of discrete-time switched Hopfield neural networks with time delay. In [38], by applying a new Lyapunov-Krasovskii functional and the average dwell time method, delay-range-dependent exponential stability criteria and a decay estimate were presented in terms of LMIs for switched Hopfield neural networks.

It should be noted that all the results reported in [31–39] are concerned with switched neural networks with Lipschitz activation functions. To the best of our knowledge, very little attention has been paid to the problem of delay-dependent stability for switched neural networks without Lipschitz activation functions, which motivates the work of this paper.

In this paper, our aim is to study the delay-dependent exponential stability problem for a class of switched Cohen-Grossberg neural networks with mixed time delays and -inverse Hölder activation functions. Here, it should be pointed out that -inverse Hölder activation functions are a class of non-Lipschitz functions. By applying Brouwer degree properties and the LMI technique and by constructing a novel Lyapunov-Krasovskii functional, the existence, uniqueness, and global exponential stability of the equilibrium point are proved for Cohen-Grossberg neural networks with mixed time delays and -inverse Hölder activation functions. By means of the multiple Lyapunov-Krasovskii functional and the average dwell time approach, a delay-dependent sufficient condition in terms of LMIs is presented to ensure that the considered switched neural networks are globally exponentially stable, and an explicit expression for the exponential decay estimate of the states is also obtained. Two illustrative examples are given to demonstrate the validity of the theoretical results.

The rest of this paper is organized as follows. In Section 2, the model formulation and some preliminaries are given. In Section 3, the existence, uniqueness, and global exponential stability of equilibrium point are proved for Cohen-Grossberg neural networks with mixed time delays and -inverse Hölder activation functions. In Section 4, the global exponential stability criterion and state decay estimation are presented for the switched Cohen-Grossberg neural networks with mixed time delays and -inverse Hölder activation functions. In Section 5, two numerical examples are presented to demonstrate the validity of the proposed results. Some conclusions are made in Section 6.

Notations 1. Throughout this paper, R denotes the set of real numbers, R^n denotes the n-dimensional Euclidean space, and R^{n×m} denotes the set of all n×m real matrices. For any matrix A, A^T denotes the transpose of A and A^{-1} denotes the inverse of A. If A is a real symmetric matrix, A > 0 (A < 0) means that A is positive definite (negative definite). For a column vector x = (x_1, ..., x_n)^T, ||x|| denotes its Euclidean norm. x'(t) denotes the derivative of x(t), and * represents the symmetric block in a symmetric matrix.

2. Neural Network Model and Preliminaries

The Cohen-Grossberg neural networks with mixed time delays can be described by the following differential equation system, where is the vector of neuron states at time ; represents the amplification function; is the behaved function; is called the neuron activation function; , are the connection weight matrices; denotes the discrete and distributed time delay; denotes the external input. The initial value associated with (2.1) is assumed to be , where is a continuous function on .
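To make the model concrete, the sketch below simulates a Cohen-Grossberg network of a commonly used form with a discrete delay tau and a distributed delay over [t − tau, t], integrated with a forward Euler scheme. This particular form, and all functions and parameters in the code, are illustrative assumptions rather than the paper's actual system (2.1).

```python
import numpy as np

# Illustrative forward-Euler simulation of a Cohen-Grossberg network of the
# assumed common form
#   x'(t) = -a(x(t)) * ( b(x(t)) - C f(x(t)) - D f(x(t - tau))
#                        - E * integral_{t-tau}^{t} f(x(s)) ds - I )
# All functions and parameters below are placeholders, not the paper's data.

def simulate(C, D, E, I, tau=1.0, h=0.01, T=20.0):
    a = lambda x: 1.0 + 0.1 * np.tanh(x)   # amplification a_i(.) > 0
    b = lambda x: 2.0 * x                  # behaved function b_i(.)
    f = np.tanh                            # activation function f_j(.)
    n = len(I)
    d = int(round(tau / h))                # delay measured in Euler steps
    steps = int(round(T / h))
    hist = np.zeros((steps + d + 1, n))
    hist[: d + 1] = np.ones(n)             # constant initial function on [-tau, 0]
    for k in range(d, d + steps):
        x, xd = hist[k], hist[k - d]
        dist = h * f(hist[k - d: k]).sum(axis=0)   # Riemann sum of the distributed term
        rhs = b(x) - C @ f(x) - D @ f(xd) - E @ dist - I
        hist[k + 1] = x - h * a(x) * rhs
    return hist[d:]

traj = simulate(C=0.1 * np.eye(2), D=0.1 * np.eye(2),
                E=0.05 * np.eye(2), I=np.array([0.5, -0.5]))
print(traj[-1])   # state settles near an equilibrium
```

With a strongly dissipative behaved function and small couplings, the trajectory converges, matching the qualitative behavior the stability theory predicts.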

In the following, some definitions and lemmas, which play important roles in the proof of our theorems below, are introduced.

Definition 2.1. The equilibrium point of the neural networks (2.1) is said to be globally exponentially stable if there exist scalars , , and , such that where is the solution of the system (2.1) with the initial value , and ; is called the exponential convergence rate.

Definition 2.2 (see [23, 24]). A continuous function is said to be an -inverse Hölder function if (i) is a monotonic nondecreasing function; (ii) for any , there exist constants and , which are correlated with , satisfying where is a constant.

The class of -inverse Hölder functions is denoted by . A great number of functions belong to ; for example, , and .

Remark 2.3. If a continuous function satisfies where is a constant, then is said to be a Lipschitz-continuous function. When and is independent of , 1-inverse Hölder functions are called inverse Lipschitz functions. It is easy to see that -inverse Hölder functions are a class of non-Lipschitz functions.

Let function be locally Lipschitz continuous. According to Rademacher's theorem [45], is differentiable almost everywhere. Let denote the set of those points where is differentiable, then, is the Jacobian of at and the set is dense in . The generalized Jacobian of a locally Lipschitz function is defined as follows.

Definition 2.4. For any , the generalized Jacobian of a locally Lipschitz continuous function is a set of matrices defined by where denotes the convex hull of a set.

The generalized Jacobian is a natural generalization of the Jacobian for continuously differentiable functions: at those points where is continuously differentiable, it reduces to a single matrix, namely the Jacobian of .

Definition 2.5. For any switching signal and any finite constants satisfying , let denote the number of switchings on the time interval . If holds for , then is said to be the average dwell time.
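Definition 2.5 can be checked numerically for a concrete switching signal: count the switchings on each subinterval and verify the bound N(t1, t2) ≤ N0 + (t2 − t1)/τa. The switching instants and constants below are illustrative, not taken from the paper.

```python
import numpy as np

# Numerical check of the average dwell time property: a switching signal has
# average dwell time tau_a (with chatter bound N0) if the number of switchings
# N(t1, t2) on every interval satisfies N(t1, t2) <= N0 + (t2 - t1) / tau_a.

def num_switches(switch_times, t1, t2):
    """Number of switching instants falling in the interval (t1, t2]."""
    return sum(t1 < t <= t2 for t in switch_times)

def has_avg_dwell_time(switch_times, tau_a, n0, t_grid):
    """Verify N(t1, t2) <= n0 + (t2 - t1)/tau_a over all grid subintervals."""
    return all(
        num_switches(switch_times, t1, t2) <= n0 + (t2 - t1) / tau_a
        for i, t1 in enumerate(t_grid)
        for t2 in t_grid[i + 1:]
    )

# Periodic switching every 4 seconds, as in the simulation of Example 5.2.
times = [4.0 * k for k in range(1, 10)]
grid = list(np.linspace(0.0, 40.0, 81))
print(has_avg_dwell_time(times, tau_a=4.0, n0=1.0, t_grid=grid))   # True
print(has_avg_dwell_time(times, tau_a=10.0, n0=0.0, t_grid=grid))  # False
```

The second call fails because a signal that actually switches every 4 seconds cannot satisfy an average dwell time of 10 seconds with a zero chatter bound.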

Lemma 2.6 (see [23, 24]). If , then for any , one has

Lemma 2.7 (see [23, 24]). If and , then there exist constants and , such that Moreover,

Lemma 2.8 (see [25]). Let be locally Lipschitz continuous. For any given , there exists an element in the union such that where denotes the segment connecting and .

Let be a nonempty, bounded, and open subset of . The closure and boundary of are denoted by and , respectively.

Lemma 2.9 (see [46]). (1) Let be a continuous mapping. If , then the Brouwer degree is constant . In this case, one has . (2) Let be a continuous mapping. If , then the equation has at least one solution in .

Lemma 2.10. Let and , then

Lemma 2.11 (Schur complement). Given constant symmetric matrices where and , then if and only if
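Lemma 2.11 can be illustrated numerically: for a symmetric block matrix, negative definiteness of the whole matrix is equivalent to negative definiteness of one diagonal block together with its Schur complement. The matrices below are randomly generated examples, not the paper's LMI data.

```python
import numpy as np

# Numerical illustration of the Schur complement lemma: for the symmetric
# block matrix M = [[A, B], [B.T, C]] with C < 0,
#   M < 0 (negative definite)  iff  A - B C^{-1} B.T < 0.

def is_neg_def(M):
    """All eigenvalues of the symmetric matrix M are negative."""
    return np.all(np.linalg.eigvalsh(M) < 0)

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))        # arbitrary off-diagonal block
A = -10.0 * np.eye(3)
C = -10.0 * np.eye(3)
M = np.block([[A, B], [B.T, C]])

lhs = is_neg_def(M)
rhs = is_neg_def(C) and is_neg_def(A - B @ np.linalg.inv(C) @ B.T)
print(lhs, rhs)   # the two tests agree
```

This equivalence is what allows a nonlinear matrix condition to be rewritten as a (larger) linear matrix inequality that standard solvers accept.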

Lemma 2.12 (Jensen's inequality). For any constant matrix and vector function such that the integrations concerned are well defined, one has
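Jensen's integral inequality can likewise be verified numerically on a sample function. The sketch below approximates both sides by Riemann sums for an illustrative vector function phi and positive definite matrix M, both chosen here for demonstration only.

```python
import numpy as np

# Numerical check of Jensen's integral inequality: for a positive definite
# matrix M and a vector function phi on [0, tau],
#   (integral of phi)^T M (integral of phi)  <=  tau * integral of phi^T M phi.

tau, n_pts = 1.5, 3000
ds = tau / n_pts
s = np.arange(n_pts) * ds
phi = np.vstack([np.sin(3 * s), np.cos(s), s ** 2])     # phi(s) in R^3
M = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.2],
              [0.0, 0.2, 3.0]])                          # symmetric, M > 0

v = phi.sum(axis=1) * ds                                 # Riemann sum of the integral of phi
lhs = v @ M @ v
rhs = tau * (np.einsum('is,ij,js->s', phi, M, phi).sum() * ds)
print(lhs <= rhs)   # True
```

The discrete analogue of the inequality holds exactly for these sums (it is a Cauchy-Schwarz argument), so the check is robust to the grid resolution.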

To give our main results in the next sections, we need to present the following assumptions. is continuous, and for all , . is locally Lipschitz continuous, and there exists a constant such that for all at which is differentiable, ., .

3. Exponential Stability of the Cohen-Grossberg Neural Network

In this section, the Cohen-Grossberg neural network (2.1) is considered. The main results on the existence and stability of the equilibrium point of the neural network (2.1) are presented in the following theorem.

Theorem 3.1. Under the assumptions , if there exist two positive definite matrices , two positive definite diagonal matrices , and a scalar such that is satisfied, where , , , , then the Cohen-Grossberg neural network (2.1) has a unique equilibrium point, which is globally exponentially stable.

Proof. We prove this theorem in three steps.
Step 1. In this step, we will prove the existence of the equilibrium point.
Let . By the assumption , is an equilibrium point of the system (2.1) if and only if . Rewrite as where . Obviously, , and . By Lemma 2.8 and the assumption , it follows that , where with .
Let . Define the mapping as where .
From (3.1) and the definition of negative definite matrix, we can obtain By Lemma 2.11, (3.4) is equivalent to
By means of Lemma 2.10, we have
By using (3.5) and (3.6), we have where denotes the th element of vector .
By virtue of Lemma 2.7, there exist constants and , , such that
Let , . Define , . Noting that is a compact subset of and is continuous on , it follows that attains its minimum on .
Let , , , and . Set and , then there exist two index sets and , such that where . Furthermore, there exists an index in such that By (3.8) and (3.10), for any and , Hence, for any and . By applying Lemma 2.9 (1), it follows that , that is, , where is the determinant of . By Lemma 2.9 (2), has at least one solution in , that is, the system (2.1) has at least one equilibrium point.

Step 2. In this step, we prove the uniqueness of the equilibrium point by contradiction.
Assume that and are two different equilibrium points of the system (2.1), then From Lemma 2.8, it can follow that where , with .
By means of (3.5), (3.6), (3.13), and (3.14), it follows that This is a contradiction. Hence, . This shows that the equilibrium point of the system (2.1) is unique.

Step 3. In this step, we will prove that the system (2.1) is globally exponentially stable.
Let , where . Since , are continuous functions, are continuous and locally bounded. Hence, the local solution of the system (2.1) with initial value exists on , where or , and is the maximal right-side existence interval of the local solution.
Let be the unique equilibrium point of the system (2.1). Make a transformation , then system (2.1) is transformed into where, , , , , . Similarly to (3.14), from Lemma 2.8, we have where , with .
Consider the following Lyapunov-Krasovskii functional candidate where Calculating the time derivative of along the trajectories of the system (3.16) on with (3.17), by the assumption and Lemma 2.12, we have Let . By (3.17) and (3.20), we can obtain This implies , . Furthermore, from (3.18), it follows that By (3.22) and Lemma 2.6, it is easy to derive that , are bounded on . By virtue of the continuation theorem [47], .
Since , by Lemma 2.7, there exist constants and , such that Moreover, from (3.22), we can get that Thus there exists a scalar , when , , where . Let , , and . From (3.22) and (3.23), we have That is, when , where Let . By (3.26), , for all . This shows that the equilibrium point of the system (2.1) is globally exponentially stable. This completes the proof of Theorem 3.1.

4. Stability of the Switched Cohen-Grossberg Neural Network

The switched Cohen-Grossberg neural networks with mixed time delays consist of a set of Cohen-Grossberg neural networks with mixed time delays and a switching rule. Each of the Cohen-Grossberg neural networks with mixed time delays is regarded as an individual subsystem. The operation mode of the switched neural networks is determined by the switching rule. In the following, we will develop the switched Cohen-Grossberg neural network model with mixed time delays.

Suppose that is the unique equilibrium point of the system (2.1). Similar to (3.16), make a transformation , then system (2.1) is transformed into

The switched Cohen-Grossberg neural networks with mixed time delays can be described as where is the switching signal, which is a piecewise constant function of time. This means that the matrices are allowed to take values, at an arbitrary time, in the finite set . The initial value associated with (4.3) is assumed to be , where is a continuous function on .

In this paper, it is assumed that the switching rule is not known a priori and its instantaneous value is available in real time. Corresponding to the switching signal , we have the switching sequence , which means that the th subsystem is activated when .
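To make the switching mechanism concrete, the sketch below simulates a trajectory of a switched system under a periodic piecewise-constant switching signal, with the subsystems switched every four seconds as in the later simulation of Example 5.2. The two stable linear subsystems are illustrative stand-ins for the transformed error dynamics, not the paper's actual data.

```python
import numpy as np

# Switched-system trajectory under a periodic switching signal sigma(t).
# The two subsystem matrices are illustrative placeholders (each is stable).

A1 = np.array([[-2.0, 0.5], [0.3, -1.5]])
A2 = np.array([[-1.0, -0.4], [0.6, -2.5]])
subsystems = {1: A1, 2: A2}

def sigma(t, dwell=4.0):
    """Periodic switching signal: subsystem 1 on [0,4), 2 on [4,8), ..."""
    return 1 + (int(t // dwell) % 2)

h, T = 0.01, 24.0
x = np.array([1.0, -1.0])
norms = []
for k in range(int(T / h)):
    t = k * h
    x = x + h * subsystems[sigma(t)] @ x   # forward Euler step of the active subsystem
    norms.append(np.linalg.norm(x))

print(norms[-1] < 1e-3)   # the switched trajectory decays exponentially
```

Here both subsystems happen to share a common quadratic Lyapunov function, so the state norm decays under any switching; the average dwell time machinery of this section is what handles the general case where no common Lyapunov function exists.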

In the following, we will consider the switched Cohen-Grossberg neural networks with mixed time delays in (4.2). The average dwell time approach will be used to derive the exponential stability of the network.

Theorem 4.1. Under the assumptions , if there exist positive definite matrices , positive definite diagonal matrices , and scalars such that where , , then the switched Cohen-Grossberg neural network (4.2) is globally exponentially stable for any switching signal with average dwell time satisfying Moreover, an estimate of the state decay for the system (4.2) is given by

Proof. Consider the multiple Lyapunov-Krasovskii functional candidate where When , the th subsystem is activated. Arguing as in the proof of Theorem 3.1, we can get . Thus, . In the light of (4.4) and (4.7), it follows that . Therefore, when , we have By (4.9), there exist constants , when , we have where , and . Hence, This implies that Due to and , then we can get This implies that the switched Cohen-Grossberg neural network (4.2) is globally exponentially stable. The proof is completed.

Remark 4.2. It is clear that, according to Theorem 4.1, the delay-dependent stability of the considered neural networks is dependent on for given . If , we have from (4.5) that the switching signal can be arbitrary, and (4.4) reduces to , which implies , meaning that a common Lyapunov functional is required for all subsystems. If , we get from (4.5) that there is no switching; that is, the switching signal has an arbitrarily large dwell time on average.
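The two limiting cases discussed in Remark 4.2 can be made concrete under the standard average dwell time bound from the switched-systems literature, tau_a* = ln(mu) / lam, where mu >= 1 relates the Lyapunov functionals of different subsystems (V_i <= mu * V_j) and lam > 0 is the decay rate. This standard form is an assumption here; the paper's exact bound (4.5) is not reproduced in this text.

```python
import math

# Minimum average dwell time under the standard bound tau_a* = ln(mu)/lam
# (assumed standard form; illustrative of the limits discussed in Remark 4.2).

def min_avg_dwell_time(mu, lam):
    if mu < 1 or lam <= 0:
        raise ValueError("need mu >= 1 and lam > 0")
    return math.log(mu) / lam

print(min_avg_dwell_time(1.0, 0.5))   # mu = 1: tau_a* = 0, arbitrary switching allowed
print(min_avg_dwell_time(5.0, 0.5))   # larger mu: a longer average dwell time is required
# As mu -> infinity, tau_a* -> infinity: effectively no switching is permitted.
```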

Remark 4.3. In the existing literature [2, 3, 17, 22, 27–29, 34–38], the activation functions of switched neural networks are required to be Lipschitz continuous. However, in this paper, the activation functions are inverse Hölder functions. It is obvious that the results of this paper are different from those in the aforementioned literature. Hence, the work in this paper extends the scope of the current investigation in this field.

5. Illustrative Examples

In this section, two illustrative examples will be given to check the validity of the results obtained in Theorems 3.1 and 4.1.

Example 5.1. Consider a second-order Cohen-Grossberg neural network with mixed time delays in (2.1) with the following parameters: Set , . The activation functions are taken as , , and .
It is easy to check that assumptions hold. is locally Lipschitz; and is the second-order identity matrix.
Take . Solving the LMI in (3.1) by using an appropriate LMI solver in Matlab, feasible positive definite matrices are obtained as All assumptions of Theorem 3.1 hold. Hence, this neural network has a unique equilibrium point, which is globally exponentially stable.
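Whichever LMI solver is used, it is good practice to verify the returned matrices for positive definiteness numerically, for example via a Cholesky factorization. The matrices below are illustrative placeholders, since the feasible matrices of Example 5.1 are not reproduced in this text.

```python
import numpy as np

# Verify positive definiteness of candidate LMI solutions via Cholesky.
# P and Q are illustrative placeholders, not the matrices of Example 5.1.

def is_pos_def(M, tol=1e-10):
    """True if the symmetric matrix M is positive definite (up to tol)."""
    M = 0.5 * (M + M.T)                       # symmetrize against round-off
    try:
        np.linalg.cholesky(M - tol * np.eye(M.shape[0]))
        return True
    except np.linalg.LinAlgError:
        return False

P = np.array([[2.0, 0.3], [0.3, 1.5]])
Q = np.array([[1.0, -0.9], [-0.9, 1.0]])      # nearly singular but still PD
print(is_pos_def(P), is_pos_def(Q))
```

Cholesky-based checks are preferable to testing determinants, since a positive determinant alone does not imply positive definiteness.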
Figures 1 and 2 display the state trajectories of this neural network with initial values and . It can be seen that these trajectories converge to the unique equilibrium of the network. This is in accordance with the conclusion of Theorem 3.1.

Example 5.2. Consider a third-order switched Cohen-Grossberg neural network with mixed time delays in (4.3) with the switching signal and the following parameters: Set , are the functions in Example 5.1, and the activation functions are taken as , , and .

It is easy to check that assumptions hold; and is the third-order identity matrix.

Take . Solving the LMIs in (4.3) and (4.4) by using an appropriate LMI solver in Matlab, feasible positive definite matrices are obtained as By using (4.5), it follows that the average dwell time . All the assumptions of Theorem 4.1 hold. Hence, this switched neural network is globally exponentially stable.

For numerical simulation, assume that the two subsystems are switched every four seconds. Figures 3, 4, and 5 display the state trajectories of this neural network with initial values , , and . It can be seen that these trajectories converge to the unique equilibrium of the network. This is in accordance with the conclusion of Theorem 4.1.

6. Conclusion

In this paper, the existence, uniqueness, and global stability of the equilibrium point for Cohen-Grossberg neural networks with mixed time delays, -inverse Hölder neuron activation functions, and nonsmooth behaved functions have been discussed. By applying the multiple Lyapunov-Krasovskii functional approach, a delay-dependent global exponential stability criterion has been obtained in terms of LMIs for the switched Cohen-Grossberg neural networks with mixed time delays and -inverse Hölder neuron activation functions under a switching rule with the average dwell time property. The obtained results are easy to check and apply in practical engineering.

When the neuron activation functions are non-Lipschitz, the neural network system may have neither a global solution nor an equilibrium point. This leads to difficulty in solving the stability problem, particularly the exponential stability problem, for switched neural networks with non-Lipschitz activation functions. In the future, the stability problem for switched neural networks with other non-Lipschitz activation functions is expected to be solved.

Acknowledgments

This work was supported by the Natural Science Foundation of Hebei Province of China (A2011203103) and the Hebei Province Education Foundation of China (2009157).