Research Article | Open Access

# Attracting and Quasi-Invariant Sets of Cohen-Grossberg Neural Networks with Time Delay in the Leakage Term under Impulsive Perturbations

**Academic Editor:** Daoyi Xu

#### Abstract

A class of impulsive Cohen-Grossberg neural networks with time delay in the leakage term is investigated. By using the method of M-matrix and the technique of delay differential inequality, the attracting and quasi-invariant sets of the networks are obtained. The results in this paper extend and improve earlier publications. An example is presented to illustrate the effectiveness of our conclusions.

#### 1. Introduction

The Cohen-Grossberg neural network model, initially proposed by Cohen and Grossberg [1] in 1983, has found successful applications in many fields such as pattern recognition, parallel computing, associative memory, signal and image processing, and combinatorial optimization. Hence, there has been increasing interest in studying the stability and asymptotic behavior of this model with delays, impulses, and a unique equilibrium, and many significant results have been obtained (see, e.g., [2–6]). However, an equilibrium point does not always exist in real physical systems, so it is an interesting subject to discuss the attracting and invariant sets of such neural networks [7, 8].

On the other hand, the leakage delay, that is, the time delay in the leakage term, is a factor affecting the stability of a system and has attracted considerable attention (see, e.g., [9–13]). However, to the best of our knowledge, so far there are few results on the attracting and invariant sets of Cohen-Grossberg neural networks with leakage delay. Motivated by the above discussion, in this paper we investigate the attracting and quasi-invariant sets of a class of impulsive Cohen-Grossberg neural networks with leakage delay. By using the method of M-matrix and the technique of delay differential inequality, the attracting and quasi-invariant sets of the addressed networks are obtained. The results in this paper extend and improve earlier publications. An example is presented to illustrate the effectiveness of our conclusion.

#### 2. Model Description and Preliminaries

Let ℝⁿ be the space of n-dimensional real column vectors, and let ℝ^{m×n} denote the set of m×n real matrices. Usually, E denotes a unit matrix. For A, B ∈ ℝ^{m×n} or A, B ∈ ℝⁿ, the notation A ≥ B (A > B, A ≤ B, A < B) means that each pair of corresponding elements of A and B satisfies the inequality “≥” (“>”, “≤”, “<”). Particularly, A is called a nonnegative matrix if A ≥ 0, and z is called a positive vector if z > 0.
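The componentwise partial order defined above can be checked mechanically; the following small NumPy sketch (with illustrative values, not from the paper) shows how the three notions translate into elementwise tests:

```python
import numpy as np

# Componentwise partial order on matrices/vectors (illustrative values):
# "A >= B" means every entry of A is >= the corresponding entry of B.
A = np.array([[2.0, 0.0], [1.0, 3.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])

A_geq_B = bool(np.all(A >= B))    # A >= B holds componentwise
A_nonneg = bool(np.all(A >= 0))   # A is a nonnegative matrix
z = np.array([1.0, 2.0])
z_positive = bool(np.all(z > 0))  # z is a positive vector
print(A_geq_B, A_nonneg, z_positive)  # True True True
```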

Let ; for , we define

C[X, Y] denotes the space of continuous mappings from the topological space X to the topological space Y. Particularly, let C ≜ C[[−τ, 0], ℝⁿ].

PC¹[[−τ, 0], ℝⁿ] ≜ {ψ : ψ is continuous and has a continuous derivative everywhere except at a finite number of points t̄, at which ψ(t̄⁺), ψ(t̄⁻), ψ′(t̄⁺), and ψ′(t̄⁻) exist and ψ(t̄⁺) = ψ(t̄), ψ′(t̄⁺) = ψ′(t̄), where ψ′ denotes the derivative of ψ}. PC is a space of piecewise right-hand continuous functions with the norm ‖ψ‖ = sup_{−τ ≤ s ≤ 0} |ψ(s)|, where |·| is a norm in ℝⁿ.

PC[[t₀, ∞), ℝⁿ] ≜ {ψ : ψ is continuous at t ≠ t_k, ψ(t_k⁺) and ψ(t_k⁻) exist, and ψ(t_k⁺) = ψ(t_k), for k = 1, 2, …}.

In this paper, we consider the following Cohen-Grossberg neural networks with impulses and time delays: where n corresponds to the number of units in the neural network; x_i(t) corresponds to the state of the ith unit at time t; f_j and g_j are the activation functions of the jth unit; τ_{ij}(t) denotes the transmission delay and satisfies 0 ≤ τ_{ij}(t) ≤ τ (τ is a constant); δ is the leakage delay. Consider ; a_i represents the amplification function of the ith neuron; b_i is the behaved function at time t. Consider . The fixed impulsive moments t_k (k = 1, 2, …) satisfy t₁ < t₂ < ⋯ and lim_{k→∞} t_k = ∞.
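The displayed system (2) is not reproduced in this excerpt. For orientation only, impulsive Cohen-Grossberg networks with leakage delay of the kind described above are commonly written in the cited literature (e.g., [11]) in a form such as the following; the generic symbols match those named in the text, but the concrete structure of (2) may differ:

```latex
\begin{aligned}
\dot{x}_i(t) &= a_i\bigl(x_i(t)\bigr)\Bigl[-b_i\bigl(x_i(t-\delta)\bigr)
  + \sum_{j=1}^{n} c_{ij}\, f_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} d_{ij}\, g_j\bigl(x_j(t-\tau_{ij}(t))\bigr) + I_i\Bigr],
  \quad t \ge t_0,\ t \ne t_k,\\
\Delta x_i(t_k) &= x_i(t_k) - x_i(t_k^-) = I_{ik}\bigl(x_i(t_k^-)\bigr),
  \quad i = 1,\dots,n,\ k = 1,2,\dots
\end{aligned}
```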

*Definition 1 (see [14]). *A function is said to be a solution of (2) through , if as , and satisfies (2) with the initial condition

Throughout the paper, we always assume that, for any , system (2) has at least one solution through , denoted by or (simply and if no confusion should occur), where , .

*Definition 2 (see [7]). *The set is called a positive invariant set of (2), if for any initial value we have the solution for .

*Definition 3 (see [7]). *The set is called a quasi-invariant set of (2), if there exist a matrix and a vector such that, for any , there exists a vector such that the solution of (2) satisfies , , as . Obviously, the set is an invariant set of (2) if and .

*Definition 4 (see [7]). *The set S is called a global attracting set of (2), if for any initial value the solution converges to S as t → +∞. That is,
dist(x(t), S) → 0 as t → +∞, where dist(x, S) = inf_{y ∈ S} ‖x − y‖, for x ∈ ℝⁿ.

*Definition 5 (see [8]). *The zero solution of (2) is said to be globally exponentially stable if for any solution x(t, t₀, φ) there exist constants λ > 0 and M ≥ 1 such that ‖x(t, t₀, φ)‖ ≤ M‖φ‖e^{−λ(t−t₀)}, t ≥ t₀.

*Definition 6 (see [15]). *Let the matrix A = (a_{ij})_{n×n} have nonpositive off-diagonal elements (i.e., a_{ij} ≤ 0, i ≠ j); then each of the following conditions is equivalent to the statement that A is a nonsingular M-matrix.
(i) All the leading principal minors of A are positive.
(ii) A = D − C and ρ(D⁻¹C) < 1, where C ≥ 0 and D = diag{a₁₁, …, a_{nn}}.
(iii) The diagonal elements of A are all positive and there exists a positive vector z such that Az > 0 or Aᵀz > 0.
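Checking these equivalent conditions is mechanical. The sketch below verifies conditions (i) and (iii) numerically for sample matrices; the matrices are illustrative only and are not the ones used later in the paper:

```python
import numpy as np

def is_nonsingular_M_matrix(A, tol=1e-10):
    """Test conditions (i) and (iii) of Definition 6 for a square matrix
    with nonpositive off-diagonal entries."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # Off-diagonal entries must be nonpositive.
    if np.any(A - np.diag(np.diag(A)) > tol):
        return False
    # (i) All leading principal minors must be positive.
    if any(np.linalg.det(A[:k, :k]) <= tol for k in range(1, n + 1)):
        return False
    # (iii) Look for a positive vector z with A z > 0; the candidate
    # z = A^{-1} * 1 works, since A^{-1} >= 0 for a nonsingular M-matrix.
    z = np.linalg.solve(A, np.ones(n))
    return bool(np.all(z > tol))

print(is_nonsingular_M_matrix([[2, -1], [-1, 2]]))   # True
print(is_nonsingular_M_matrix([[1, -2], [-2, 1]]))   # False (determinant is -3)
```

For the first matrix, z = A⁻¹·(1, 1)ᵀ = (1, 1)ᵀ is exactly the positive vector required by condition (iii).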

For a nonsingular M-matrix A, we denote Ω_M(A) ≜ {z ∈ ℝⁿ | Az > 0, z > 0}.

Lemma 7 (see [14]). *For a nonsingular M-matrix A, the set Ω_M(A) is nonempty, and for any z₁, z₂ ∈ Ω_M(A) and any scalars k₁, k₂ > 0 we have
k₁z₁ + k₂z₂ ∈ Ω_M(A).
So Ω_M(A) is a cone without conical surface in ℝⁿ. We call it an “M-cone.”*

Lemma 8 (see [16]). *Let and satisfy
**
where , , (), , , , and , . Suppose that is a nonsingular M-matrix.**(1)* If the initial condition satisfies
*
where , , then , for .**(2)* Suppose that there exist a scalar and a vector such that
*
where
**
If the initial condition satisfies
**
then
*

#### 3. Main Results

In this paper, we always suppose the following.

(A1) , where is a constant, .

(A2) is differentiable, and there exist constants such that , , for any .

(A3) and are Lipschitz continuous; that is, there exist constants and such that, for any ,

(A4) is a nonsingular M-matrix, where

(A5) , , for any , where , .

(A6) For , where , , and , satisfy where .

Theorem 9. *Assume that (A1)–(A6) hold. Then, for any , the set is a quasi-invariant set of (2).*

*Proof. *Combining with the mean value theorem, from (2) we can get
where .

*Case 1*. Let us first consider . In this case , . Notice that ; there exist such that , for all .

From (A1)–(A4) and (16), we have
Hence,
*Case 2*. Let us consider . From (16), we get
Then from (A1)–(A4), we have
From (17) and (20), we get

For the initial conditions , , where , we have
By (21) and Lemma 8, we have

Suppose that for all the inequalities
hold, where . Then, from (A5) and (A6),
This, together with , leads to
On the other hand,
It follows from (A4), (26), (27), and Lemma 8 that
By induction, we can conclude that
From (A6),
and we have
This implies that the conclusion holds and the proof is complete.

Theorem 10. *Assume that (A1)–(A6) hold. Then the set is a global attracting set of (2).*

*Proof. *By an argument similar to the proof of Theorem 9, we can get (21) and (31).

By continuity, we can find and a sufficiently small such that
where

For the initial conditions , , where , we have
and so
By induction and Lemma 8, we can conclude that
From (A6), we conclude that
This implies that the conclusion holds and the proof is complete.

Theorem 11. *Assume that (A1)–(A5) with hold. Then is a positive invariant set and also a global attracting set of (2).*

*Proof (straightforward). *Obviously, if , then is a solution of (2).

Corollary 12. *Assume that (A1)–(A5) with , hold. If
**
where satisfy , , then the zero solution of (2) is globally exponentially stable.*

*Remark 13. *If , and , then from Theorems 9 and 10 we can get Theorem 5.1 and Corollary 5.1 in [16].

*Remark 14. *If , and , then from Corollary 12 we can get Corollary 3.1 in [5].

#### 4. Illustrative Example

Consider the Cohen-Grossberg neural networks (2) with the following parameters, activation functions, amplification functions, behaved functions, and delay functions ():

Obviously, , , , and . The parameters of condition (A4) are as follows:

We can easily observe that is a nonsingular M-matrix and

Let and , which satisfy the inequality

Clearly, all conditions of Theorems 9 and 10 are satisfied. So is a quasi-invariant set and also a global attracting set of (2).
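Since the concrete parameter values of the example are not reproduced in this text, the sketch below simulates a generic two-neuron network of the type studied here, with entirely hypothetical parameters (decay slope b, coupling weight w, input J, leakage delay δ, transmission delay τ, and impulses x → 0.9x at integer times), to visualize how trajectories settle into a bounded attracting region:

```python
import numpy as np

# Hypothetical parameters (illustrative only, not the example's values).
delta, tau = 0.1, 0.1          # leakage delay, transmission delay
dt, T = 0.01, 10.0             # Euler step size, time horizon
lag = int(round(max(delta, tau) / dt))
n_steps = int(round(T / dt))

b = 2.0                        # behaved-function slope: b_i(x) = b * x
w = 0.1                        # coupling weight
J = 0.5                        # constant external input

# History buffer: constant initial function x(s) = 1.5 on [-max delay, 0].
hist = np.full((n_steps + lag + 1, 2), 1.5)

for k in range(lag, lag + n_steps):
    x = hist[k]
    x_leak = hist[k - int(round(delta / dt))]   # x(t - delta), leakage term
    x_tau = hist[k - int(round(tau / dt))]      # x(t - tau), transmission term
    # Delayed leakage plus self and cross-neuron delayed activations:
    dx = -b * x_leak + w * np.tanh(x) + w * np.tanh(x_tau[::-1]) + J
    hist[k + 1] = x + dt * dx
    t_next = (k + 1 - lag) * dt
    if abs(t_next - round(t_next)) < dt / 2 and round(t_next) >= 1:
        hist[k + 1] *= 0.9                      # impulse: x -> 0.9 x at t_k = k

print(hist[lag + n_steps])  # both components settle into a small bounded region
```

With these parameters the product of decay rate and leakage delay is 0.2, well inside the stable range, so the trajectory contracts from the initial value 1.5 toward a small attracting neighborhood, as the theorems predict for systems satisfying (A1)–(A6).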

#### Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

#### Acknowledgments

This work was supported by the National Natural Science Foundation of China (11171374) and Natural Science Foundation of Shandong Province (ZR2011AZ001).

#### References

1. M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” *IEEE Transactions on Systems, Man, and Cybernetics*, vol. 13, no. 5, pp. 815–826, 1983.
2. H. Lu, R. Shen, and F.-L. Chung, “Global exponential convergence of Cohen-Grossberg neural networks with time delays,” *IEEE Transactions on Neural Networks*, vol. 16, no. 6, pp. 1694–1696, 2005.
3. S. Guo and L. Huang, “Stability analysis of Cohen-Grossberg neural networks,” *IEEE Transactions on Neural Networks*, vol. 17, no. 1, pp. 106–117, 2006.
4. Y. Chen, “Global asymptotic stability of delayed Cohen-Grossberg neural networks,” *IEEE Transactions on Circuits and Systems I: Regular Papers*, vol. 53, no. 2, pp. 351–357, 2006.
5. Z. Yang and D. Xu, “Impulsive effects on stability of Cohen-Grossberg neural networks with variable delays,” *Applied Mathematics and Computation*, vol. 177, no. 1, pp. 63–78, 2006.
6. O. Faydasicok and S. Arik, “Robust stability analysis of a class of neural networks with discrete time delays,” *Neural Networks*, vol. 29-30, pp. 52–59, 2012.
7. D. Xu and Z. Yang, “Attracting and invariant sets for a class of impulsive functional differential equations,” *Journal of Mathematical Analysis and Applications*, vol. 329, no. 2, pp. 1036–1044, 2007.
8. D. Y. Xu and S. J. Long, “Attracting and quasi-invariant sets of non-autonomous neural networks with delays,” *Neurocomputing*, vol. 77, no. 1, pp. 222–228, 2012.
9. K. Gopalsamy, *Stability and Oscillations in Delay Differential Equations of Population Dynamics*, vol. 74 of *Mathematics and Its Applications*, Kluwer Academic Publishers, Dordrecht, Netherlands, 1992.
10. K. Gopalsamy, “Leakage delays in BAM,” *Journal of Mathematical Analysis and Applications*, vol. 325, no. 2, pp. 1117–1132, 2007.
11. X. Li, X. Fu, P. Balasubramaniam, and R. Rakkiyappan, “Existence, uniqueness and stability analysis of recurrent neural networks with time delay in the leakage term under impulsive perturbations,” *Nonlinear Analysis: Real World Applications*, vol. 11, no. 5, pp. 4092–4108, 2010.
12. M. J. Park, O. M. Kwon, J. H. Park, S. M. Lee, and E. J. Cha, “Synchronization criteria for coupled stochastic neural networks with time-varying delays and leakage delay,” *Journal of the Franklin Institute*, vol. 349, no. 5, pp. 1699–1720, 2012.
13. G. Y. Chen and L. S. Wang, “Global exponential robust stability of static interval neural networks with time delay in the leakage term,” *Journal of Applied Mathematics*, vol. 2014, Article ID 972608, 6 pages, 2014.
14. D. Xu and Z. Yang, “Impulsive delay differential inequality and stability of neural networks,” *Journal of Mathematical Analysis and Applications*, vol. 305, no. 1, pp. 107–120, 2005.
15. A. Berman and R. J. Plemmons, *Nonnegative Matrices in the Mathematical Sciences*, Academic Press, New York, NY, USA, 1979.
16. D. Xu and L. Xu, “New results for studying a certain class of nonlinear delay differential systems,” *IEEE Transactions on Automatic Control*, vol. 55, no. 7, pp. 1641–1645, 2010.

#### Copyright

Copyright © 2015 Guiying Chen and Linshan Wang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.