Journal of Applied Mathematics

Volume 2013, Article ID 935491, 8 pages

http://dx.doi.org/10.1155/2013/935491

## Global Robust Attractive and Invariant Sets of Fuzzy Neural Networks with Delays and Impulses

^{1}Center of Engineering Mathematics and Department of Applied Mathematics, Kunming University of Science and Technology, Kunming, Yunnan 650093, China

^{2}Department of Mathematics, Yuxi Normal University, Yuxi, Yunnan 653100, China

Received 15 October 2012; Accepted 21 January 2013

Academic Editor: Huijun Gao

Copyright © 2013 Kaihong Zhao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

A class of fuzzy neural networks (FNNs) with time-varying delays and impulses is investigated. By removing some restrictions on the amplification functions, a new differential inequality is established, which improves previous criteria. Applying this differential inequality, a series of new and useful criteria are obtained to ensure the existence of global robust attracting and invariant sets for FNNs with time-varying delays and impulses. Our main results allow much broader applications for fuzzy and impulsive neural networks with or without delays. An example is given to illustrate the effectiveness of our results.

#### 1. Introduction

The theoretical and applied study of current neural networks (CNNs) has become a worldwide focus because CNNs are widely applied in signal processing, image processing, pattern recognition, psychophysics, speech, perception, robotics, and so on. Scholars have introduced many classes of CNN models, such as Hopfield-type networks [1], bidirectional associative memory networks [2], cellular neural networks [3], recurrent back-propagation networks [4–6], optimization-type networks [7–9], brain-state-in-a-box-(BSB-) type networks [10, 11], and Cohen-Grossberg recurrent neural networks (CGCNNs) [12]. According to the choice of the variable for CNNs [13], two basic mathematical models of CNNs are commonly adopted: either local field neural network models or static neural network models. The basic local field neural network model is described by
$$\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} w_{ij} f_j(x_j(t)) + I_i, \quad i = 1, 2, \ldots, n,$$
where $f_j(\cdot)$ denotes the activation function of the $j$th neuron; $x_i(t)$ is the state of the $i$th neuron; $d_i > 0$ is the rate with which the $i$th neuron resets its potential; $I_i$ is the external input imposed on the $i$th neuron; $w_{ij}$ denotes the synaptic connectivity value between the $i$th neuron and the $j$th neuron; $n$ is the number of neurons in the network.
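The local field model above can be explored numerically. The following is a minimal forward-Euler simulation, a sketch assuming a `tanh` activation and a hypothetical three-neuron parameter set (none of these values come from the paper); with these small weights the network is contractive, so trajectories from different initial states converge to the same equilibrium.

```python
import numpy as np

# Hypothetical parameters for a 3-neuron local field network:
#   dx_i/dt = -d_i x_i + sum_j w_ij f(x_j) + I_i,  f = tanh.
d = np.array([1.0, 1.2, 0.8])            # self-decay rates d_i
W = np.array([[0.0, 0.2, -0.1],
              [0.1, 0.0,  0.3],
              [-0.2, 0.1, 0.0]])          # synaptic connectivity w_ij
I = np.array([0.5, -0.3, 0.1])            # external inputs I_i

def simulate(x0, dt=0.01, steps=5000):
    """Forward-Euler integration of the local field model."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-d * x + W @ np.tanh(x) + I)
    return x

x_final = simulate([1.0, -1.0, 0.5])
```

Because min(d) = 0.8 exceeds the largest absolute row sum of W (0.4) and tanh is 1-Lipschitz, the map is a contraction and the final state is independent of the initial condition.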

It is well known that local field neural network not only models Hopfield-type networks but also models bidirectional associative memory networks and cellular neural networks. In the past few years, there has been increasing interest in studying dynamical characteristics such as stability, persistence, periodicity, robust stability of equilibrium points, and domains of attraction of local field neural network. Many deep theoretical results have been obtained for local field neural network. We can refer to [14–32] and the references cited therein.

However, in the mathematical modeling of real-world problems, we encounter further inconveniences, for example, complexity and uncertainty or vagueness. Fuzzy theory is considered a more suitable setting for taking vagueness into consideration. Based on traditional cellular neural networks (CNNs), T. Yang and L.-B. Yang proposed the fuzzy CNNs (FCNNs) [33], which integrate fuzzy logic into the structure of traditional CNNs and maintain local connectedness among cells. Unlike previous CNN structures, FCNNs have fuzzy logic between their template input and/or output besides the sum-of-product operation. FCNNs are a very useful paradigm for image processing problems, which are a cornerstone in image processing and pattern recognition. In addition, many evolutionary processes in nature are characterized by the fact that their states are subject to sudden changes at certain moments and can therefore be described by impulsive systems. It is thus necessary to consider both fuzzy logic and delay effects on the dynamical behaviors of neural networks with impulses. Nevertheless, to the best of our knowledge, there are few published papers considering the global robust domain of attraction for fuzzy neural networks (FNNs). Therefore, in this paper, we study the global robust attracting set and invariant set of the following FNNs with time-varying delays and impulses:
$$\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} v_j + I_i + \bigwedge_{j=1}^{n} \alpha_{ij} f_j(x_j(t - \tau_{ij}(t))) + \bigvee_{j=1}^{n} \beta_{ij} f_j(x_j(t - \tau_{ij}(t))) + \bigwedge_{j=1}^{n} T_{ij} v_j + \bigvee_{j=1}^{n} H_{ij} v_j, \quad t \neq t_k,$$
$$x_i(t_k^+) = x_i(t_k) + I_{ik}(x_i(t_k)), \quad k = 1, 2, \ldots, \tag{2}$$
where $b_{ij}$ are elements of the fuzzy feed-forward template, $\alpha_{ij}$ are elements of the fuzzy feedback MIN template, $\beta_{ij}$ are elements of the fuzzy feedback MAX template, and $T_{ij}$ and $H_{ij}$ are elements of the fuzzy feed-forward MIN template and fuzzy feed-forward MAX template, respectively. $a_{ij}$ is the weight of the connection between the $i$th neuron and the $j$th neuron. $x_i$, $v_i$, and $I_i$ stand for the state, input, and bias of the $i$th neuron, respectively. $\tau_{ij}(t)$ is the transmission delay, and $f_j$ is the activation function. $\bigwedge$ and $\bigvee$ denote the fuzzy AND and fuzzy OR operations, respectively.
The second equation describes the impulses at moments $t_k$, where $\{t_k\}$ is a strictly increasing sequence such that $\lim_{k \to \infty} t_k = +\infty$, and $I_{ik}$ is the impulsive function. The initial condition is given by a continuous initial function, the transmission delays are bounded by a constant, and the system depends on a parameter ranging over a given set, with respect to which robustness is understood.
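Under the standard Yang-Yang reading of the fuzzy templates (fuzzy AND as a componentwise minimum over $j$, fuzzy OR as a maximum), the right-hand side of a network of form (2) can be sketched as follows. The function name and the array-based signature are ours, not the paper's; this is a minimal illustration, not a definitive implementation.

```python
import numpy as np

def fcnn_rhs(x, x_delayed, d, a, b, alpha, beta, T, H, u, I, f=np.tanh):
    """Right-hand side of an FCNN of form (2) at one time instant.

    x          : current states x_i(t)
    x_delayed  : delayed states x_j(t - tau_ij(t)) (one common delay assumed here)
    d          : self-decay rates; a, b : feedback / feed-forward templates
    alpha,beta : fuzzy feedback MIN / MAX templates
    T, H       : fuzzy feed-forward MIN / MAX templates
    u, I       : inputs and biases
    """
    fx, fxd = f(x), f(x_delayed)
    feedback  = a @ fx                        # ordinary sum-of-products term
    and_term  = np.min(alpha * fxd, axis=1)   # fuzzy AND (min over j)
    or_term   = np.max(beta * fxd, axis=1)    # fuzzy OR (max over j)
    and_input = np.min(T * u, axis=1)         # fuzzy feed-forward MIN
    or_input  = np.max(H * u, axis=1)         # fuzzy feed-forward MAX
    return -d * x + feedback + b @ u + I + and_term + or_term + and_input + or_input
```

With zero states, zero activations, and zero feed-forward templates, the right-hand side vanishes, as expected from $f_j(0) = 0$ activations.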

The main purpose of this paper is to investigate the global robust attracting and invariant sets of FNNs (2). Differently from [34, 35], we introduce a new nonlinear differential inequality, which is more effective than linear differential inequalities for studying the asymptotic behavior of some nonlinear differential equations. Applying this new nonlinear delay differential inequality, we obtain sufficient conditions for global robust attracting and invariant sets.

The rest of this paper is organized as follows. In Section 2, we give some basic definitions and basic results about the attracting domains of FNNs (2). In Section 3, we prove a useful nonlinear delay differential inequality. In Section 4, our main results are proved by means of this delay differential inequality. Finally, an example is given in Section 5 to illustrate the effectiveness of our results.

#### 2. Preliminaries

As usual, $C(X, Y)$ denotes the space of continuous mappings from the topological space $X$ to the topological space $Y$. In particular, let $C$ denote the set of all real-valued continuous mappings on the initial interval equipped with the supremum norm $\|\phi\| = \sup_{s} |\phi(s)|$. Denote by $x(t; t_0, \phi)$ the solution of FCNNs (2) with initial condition $\phi \in C$.

Let $E$ denote the $n$-dimensional unit matrix. For $A, B \in \mathbb{R}^{m \times n}$, $A \ge B$ (or $A > B$) means that each pair of the corresponding elements of $A$ and $B$ satisfies the inequality "$\ge$" (or "$>$"). For any $x \in \mathbb{R}^n$, we write $[x]^+ = (|x_1|, |x_2|, \ldots, |x_n|)^T$. For a nonsingular $M$-matrix $D$ [36], we denote $\Omega_M(D) = \{z \in \mathbb{R}^n \mid Dz > 0, z > 0\}$. For the sake of simplicity, for a function $g$ bounded in $\mathbb{R}$ we denote $\bar{g} = \sup_{t \in \mathbb{R}} |g(t)|$.
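The nonsingular $M$-matrix condition used throughout can be checked mechanically via one of its standard characterizations: nonpositive off-diagonal entries together with all leading principal minors positive. The following small utility is ours (the function name is not from the paper) and is offered as a sketch.

```python
import numpy as np

def is_nonsingular_M_matrix(D):
    """One standard characterization of a nonsingular M-matrix:
    off-diagonal entries <= 0 and all leading principal minors > 0."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    off = D.copy()
    np.fill_diagonal(off, 0.0)
    if np.any(off > 1e-12):            # an off-diagonal entry is positive
        return False
    # check every leading principal minor
    return all(np.linalg.det(D[:k, :k]) > 0 for k in range(1, n + 1))
```

For example, [[2, -1], [-1, 2]] passes, while [[1, -2], [-2, 1]] fails because its determinant is negative.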

As usual in the theory of impulsive differential equations, at the points of discontinuity $t_k$ we assume that solutions are left continuous, that is, $x(t_k) = x(t_k^-)$ and $\dot{x}(t_k) = \dot{x}(t_k^-)$.

Inspired by [37], we establish an equivalence between the impulsive system (2) and the non-impulsive system (4). We then prove some lemmas which are necessary for the proof of the main results.

Throughout this paper, we always assume the following. For all $i$ and $k$, the relevant series converges absolutely and uniformly; , , , , , , , , and are bounded in . The activation function is second-order differentiable and Lipschitz continuous; that is, there exist positive constants such that for any . The delay functions are nonnegative, bounded, and continuous on . By the transformation inspired by [37], system (2) can be related to the following non-impulsive system (4). We have the following lemma, which shows that systems (2) and (4) are equivalent.

Lemma 1. *Assume that the above assumptions hold; then we have the following.*
(i) *If is a solution of (4), then is a solution of (2).*
(ii) *If is a solution of (2), then is a solution of (4).*

*Proof. *Firstly, let us prove (i). For a given , it is easy to see that is absolutely continuous on the interval and for any , ,
satisfies system (2). In addition, for every ,
Thus, for every ,
The proof is complete.

Next, we prove (ii). Since is absolutely continuous on the interval and, in view of (7), it follows that, for any ,
which implies that is continuous on . It is easy to prove that is absolutely continuous on . Now, one can easily check that
is the solution of (4). The proof is complete.

*Definition 2. *Let be subsets of which are independent of the parameter, and let be a solution of FNNs (2) with .
(i) For any given , if any initial value implies that for all , then is said to be a robust positive invariant set of FNNs (2).
(ii) For any given , if for any initial value the solution converges to as , that is, as , then is said to be a global robust attracting set of FNNs (2), where , and for .

For a class of differential equations with the term of fuzzy AND and fuzzy OR operation, there is the following useful inequality.

Lemma 3 (see [33]). *Let and be two states of (2); then we have
*
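Lemma 3 is the standard fuzzy-operator estimate from [33]: the difference between two fuzzy-aggregated values is bounded by the weighted sum of the componentwise differences. A quick numerical check, a sketch assuming the usual form $|\bigwedge_j a_j f(x_j) - \bigwedge_j a_j f(y_j)| \le \sum_j |a_j|\,|f(x_j) - f(y_j)|$ (and likewise for $\bigvee$):

```python
import numpy as np

rng = np.random.default_rng(0)

def check_fuzzy_inequality(trials=1000, n=5):
    """Randomized check of the fuzzy AND/OR difference bound."""
    ok = True
    for _ in range(trials):
        a = rng.normal(size=n)
        x, y = rng.normal(size=n), rng.normal(size=n)
        fx, fy = np.tanh(x), np.tanh(y)
        lhs_min = abs(np.min(a * fx) - np.min(a * fy))   # fuzzy AND difference
        lhs_max = abs(np.max(a * fx) - np.max(a * fy))   # fuzzy OR difference
        rhs = np.sum(np.abs(a) * np.abs(fx - fy))        # weighted sum bound
        ok &= bool(lhs_min <= rhs + 1e-12) and bool(lhs_max <= rhs + 1e-12)
    return ok
```

The inequality holds because $|\min_j u_j - \min_j v_j| \le \max_j |u_j - v_j| \le \sum_j |u_j - v_j|$ with $u_j = a_j f(x_j)$, $v_j = a_j f(y_j)$.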

#### 3. Nonlinear Delay Differential Inequality

In this section, we establish a new nonlinear delay differential inequality which plays an important role in the proof of our main results.

Lemma 4. *Assume that satisfies
**
where and for , , . If and , then we have the following.*(i)* For any constant , the solution of (11) satisfies
* *provided that .*(ii) * Consider that* *provided that
* *where and the positive constant is determined by the following inequality:
*

*Proof. *Since , we have . Let ( small enough), then . In order to prove (12), we will first prove that
for any given initial function with .

If (16) does not hold, then there exist and such that
It follows from (11) and (17) that
which contradicts the inequality (18). So (16) holds for all . Letting in (16), we have
The proof of part (i) is complete.

Since , we have . Then
From (15), we can get
In the following, we will first prove that for any positive constant ,
We let
If inequality (23) is not true, then is a nonempty set and there must exist some integer such that .

By and the inequality (23), we can get
By applying (11) and (21)–(26), we obtain
which contradicts the inequality in (25). Thus the inequality (23) holds. Therefore, letting , we have (13). The proof is complete.
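The flavor of Lemma 4 can be illustrated on a classical Halanay-type special case, $u'(t) \le -a\,u(t) + b \sup_{s \in [t-\tau, t]} u(s) + c$ with $a > b \ge 0$, whose solutions satisfy $\limsup_{t \to \infty} u(t) \le c/(a-b)$; the paper's inequality (11) generalizes this pattern with nonlinear terms. All parameters below are hypothetical.

```python
# Forward-Euler integration of the equality case of the Halanay-type
# inequality u' = -a*u + b*sup_{[t-tau,t]} u + c (hypothetical values).
def simulate_halanay(a=2.0, b=0.5, c=1.0, tau=1.0, u0=5.0, dt=0.001, T=30.0):
    n_tau = int(tau / dt)
    hist = [u0] * (n_tau + 1)              # constant initial function on [-tau, 0]
    for _ in range(int(T / dt)):
        u = hist[-1]
        sup_u = max(hist[-(n_tau + 1):])   # sup over the delay window
        hist.append(u + dt * (-a * u + b * sup_u + c))
    return hist[-1]

u_end = simulate_halanay()
asymptotic_bound = 1.0 / (2.0 - 0.5)       # c / (a - b) = 2/3
```

Starting from u(0) = 5, the trajectory decays monotonically toward the invariant level c/(a − b), mirroring how Lemma 4 produces an attracting bound rather than convergence to zero.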

By the process of the proof of Lemma 4, we easily derive the following theorem.

Theorem 5. *Under the conditions of Lemma 4, we have
*

#### 4. Main Results

In this section, we state and prove our main results. The following lemma is very useful in proving Theorem 7.

Lemma 6. *Assume that and that the series is absolutely convergent; then the infinite products , and are convergent and .*

*Proof. *In fact, by the assumption , we have
which imply that
On the other hand, since the series is absolutely convergent, we derive that
Equations (30) and (31) give that
According to (32), and considering that the series is absolutely convergent, we get that the series of positive numbers , , and are convergent. Thus the infinite products , , and are convergent. At the same time, combining with (29), we conclude that , . The proof of Lemma 6 is complete.
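The content of Lemma 6 can be illustrated numerically: if $\sum_k |a_k|$ converges and $1 + a_k > 0$ for every $k$, then the partial products of $\prod_k (1 + a_k)$ stabilize at a positive limit. The sequence $a_k = (-1)^k/(k+1)^2$ below is a hypothetical example, not one from the paper.

```python
# Partial products of prod_k (1 + a_k) for the absolutely summable
# sequence a_k = (-1)^k / (k + 1)^2, for which 1 + a_k > 0 always holds.
def partial_product(n):
    p = 1.0
    for k in range(1, n + 1):
        p *= 1.0 + (-1) ** k / (k + 1) ** 2
    return p

p1, p2 = partial_product(10_000), partial_product(20_000)
```

The gap between successive partial products is controlled by the tail $\sum_{k > n} |a_k|$, which is what the lemma exploits.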

Theorem 7. *Assume that the assumptions of Section 2 hold; then
**
is a robust positive invariant and global robust attracting set of FNNs (2), where
*

*Proof. *Calculating the upper right derivative along system (4) and using Lemma 6, we have
From , (35) can be rewritten as follows:
Then from the conclusion (i) of Lemma 4, we can obtain
provided that , where .

According to Lemma 1 and (37), one has
provided that , where . In view of Definition 2, we get that the set denoted by (33) is a robust positive invariant set of FNNs (2).

On the other hand, since , there exists a positive vector such that
By using continuity, we know that there must exist a positive scalar such that
where is a constant such that .

Then by (36), (40), and , all the conditions of Theorem 5 are satisfied, and we have
According to Lemma 1 and Definition 2, we conclude that the set denoted by (33) is also a global robust attracting set of FNNs (2). The proof is complete.

Theorem 8. *In addition to the assumptions of Section 2, further assume that . Then FNNs (2) has a zero solution, the zero solution is globally robustly exponentially stable, and the exponential convergence rate is determined by (40).*

#### 5. Illustrative Example

The following illustrative example demonstrates the effectiveness of our results. Consider the following FNNs with time-varying delays and impulses: where , , , , , , , , , , , , , , , , and , , . By simple calculation, we obtain where . Then is a nonsingular $M$-matrix, and taking , , we get Therefore, by Theorem 7, we obtain that is a robust positive invariant and global robust attracting set of FNNs (42).
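The numerical templates of example (42) were lost in extraction, so as a hypothetical stand-in the computation pattern of Theorem 7 can be sketched: verify that a comparison matrix $D$ is a nonsingular $M$-matrix and solve $DN = \bar{I}$ for the positive vector $N$ that would bound the attracting set $\{x : |x| \le N\}$. The matrix $D$ and vector $\bar{I}$ below are illustrative values only, not the paper's.

```python
import numpy as np

# Hypothetical comparison matrix and input bound (illustrative only).
D = np.array([[3.0, -1.0],
              [-0.5, 2.0]])
I_bar = np.array([1.0, 0.5])

# Nonsingular M-matrix check via leading principal minors
# (off-diagonal entries of D are already nonpositive).
minors_positive = all(np.linalg.det(D[:k, :k]) > 0 for k in (1, 2))

# Candidate bound N = D^{-1} I_bar defining the attracting set {|x| <= N}.
N = np.linalg.solve(D, I_bar)
```

Since $D$ is a nonsingular $M$-matrix, $D^{-1}$ is entrywise nonnegative, so $N$ comes out componentwise positive, as required for a meaningful invariant set.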

#### Acknowledgments

The author would like to thank the anonymous referees for their useful and valuable suggestions. This work is supported by the National Natural Sciences Foundation of China under Grant no. 11161025, Yunnan Province Natural Scientific Research Fund Project (no. 2011FZ058), and Yunnan Province Education Department Scientific Research Fund Project (no. 2011Z001).

#### References

1. J. J. Hopfield, “Neurons with graded response have collective computational properties like those of two-state neurons,” *Proceedings of the National Academy of Sciences of the United States of America*, vol. 81, no. 10, pp. 3088–3092, 1984.
2. B. Kosko, “Bidirectional associative memories,” *IEEE Transactions on Systems, Man, and Cybernetics*, vol. 18, no. 1, pp. 49–60, 1988.
3. L. O. Chua and L. Yang, “Cellular neural networks: theory,” *IEEE Transactions on Circuits and Systems*, vol. 35, no. 10, pp. 1257–1272, 1988.
4. L. B. Almeida, “Backpropagation in perceptrons with feedback,” in *Neural Computers*, pp. 199–208, Springer, New York, NY, USA, 1988.
5. F. J. Pineda, “Generalization of back-propagation to recurrent neural networks,” *Physical Review Letters*, vol. 59, no. 19, pp. 2229–2232, 1987.
6. R. Rohwer and B. Forrest, “Training time-dependence in neural networks,” in *Proceedings of the 1st IEEE International Conference on Neural Networks*, pp. 701–708, San Diego, Calif, USA, 1987.
7. M. Forti and A. Tesi, “New conditions for global stability of neural networks with application to linear and quadratic programming problems,” *IEEE Transactions on Circuits and Systems I*, vol. 42, no. 7, pp. 354–366, 1995.
8. Y. Xia and J. Wang, “A general methodology for designing globally convergent optimization neural networks,” *IEEE Transactions on Neural Networks*, vol. 9, no. 6, pp. 1331–1343, 1998.
9. Y. S. Xia and J. Wang, “On the stability of globally projected dynamical systems,” *Journal of Optimization Theory and Applications*, vol. 106, no. 1, pp. 129–150, 2000.
10. J.-H. Li, A. N. Michel, and W. Porod, “Analysis and synthesis of a class of neural networks: linear systems operating on a closed hypercube,” *IEEE Transactions on Circuits and Systems*, vol. 36, no. 11, pp. 1405–1422, 1989.
11. I. Varga, G. Elek, and S. H. Zak, “On the brain-state-in-a-convex-domain neural models,” *Neural Networks*, vol. 9, no. 7, pp. 1173–1184, 1996.
12. M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” *IEEE Transactions on Systems, Man, and Cybernetics*, vol. 13, no. 5, pp. 815–826, 1983.
13. H. Qiao, J. Peng, Z. B. Xu, and B. Zhang, “A reference model approach to stability analysis of neural networks,” *IEEE Transactions on Systems, Man, and Cybernetics, Part B*, vol. 33, no. 6, pp. 925–936, 2003.
14. X. Yang, X. Liao, Y. Tang, and D. J. Evans, “Guaranteed attractivity of equilibrium points in a class of delayed neural networks,” *International Journal of Bifurcation and Chaos in Applied Sciences and Engineering*, vol. 16, no. 9, pp. 2737–2743, 2006.
15. H. Zhao, “Global asymptotic stability of Hopfield neural network involving distributed delays,” *Neural Networks*, vol. 17, no. 1, pp. 47–53, 2004.
16. J. Cao and J. Wang, “Global asymptotic and robust stability of recurrent neural networks with time delays,” *IEEE Transactions on Circuits and Systems I*, vol. 52, no. 2, pp. 417–426, 2005.
17. Z. Huang, X. Wang, and F. Gao, “The existence and global attractivity of almost periodic sequence solution of discrete-time neural networks,” *Physics Letters A*, vol. 350, no. 3-4, pp. 182–191, 2006.
18. Z. B. Xu, H. Qiao, J. Peng, and B. Zhang, “A comparative study of two modeling approaches in neural networks,” *Neural Networks*, vol. 17, no. 1, pp. 73–85, 2004.
19. M. Wang and L. Wang, “Global asymptotic robust stability of static neural network models with S-type distributed delays,” *Mathematical and Computer Modelling*, vol. 44, no. 1-2, pp. 218–222, 2006.
20. P. Li and J. Cao, “Stability in static delayed neural networks: a nonlinear measure approach,” *Neurocomputing*, vol. 69, no. 13–15, pp. 1776–1781, 2006.
21. Z. Huang and Y. Xia, “Exponential p-stability of second order Cohen-Grossberg neural networks with transmission delays and learning behavior,” *Simulation Modelling Practice and Theory*, vol. 15, no. 6, pp. 622–634, 2007.
22. K. Zhao and Y. Li, “Robust stability analysis of fuzzy neural network with delays,” *Mathematical Problems in Engineering*, vol. 2009, Article ID 826908, 13 pages, 2009.
23. R. Yang, H. Gao, and P. Shi, “Novel robust stability criteria for stochastic Hopfield neural networks with time delays,” *IEEE Transactions on Systems, Man, and Cybernetics, Part B*, vol. 39, no. 2, pp. 467–474, 2009.
24. R. Yang, Z. Zhang, and P. Shi, “Exponential stability on stochastic neural networks with discrete interval and distributed delays,” *IEEE Transactions on Neural Networks*, vol. 21, no. 1, pp. 169–175, 2010.
25. X. Li and J. Shen, “LMI approach for stationary oscillation of interval neural networks with discrete and distributed time-varying delays under impulsive perturbations,” *IEEE Transactions on Neural Networks*, vol. 21, no. 10, pp. 1555–1563, 2010.
26. Y. Zhao, L. Zhang, S. Shen, and H. Gao, “Robust stability criterion for discrete-time uncertain Markovian jumping neural networks with defective statistics of modes transitions,” *IEEE Transactions on Neural Networks*, vol. 22, no. 1, pp. 164–170, 2011.
27. X. W. Li, H. J. Gao, and X. H. Yu, “A unified approach to the stability of generalized static neural networks with linear fractional uncertainties and delays,” *IEEE Transactions on Systems, Man, and Cybernetics, Part B*, vol. 41, no. 5, pp. 1275–1286, 2011.
28. H. Dong, Z. Wang, and H. Gao, “Robust ${H}_{\infty}$ filtering for a class of nonlinear networked systems with multiple stochastic communication delays and packet dropouts,” *IEEE Transactions on Signal Processing*, vol. 58, no. 4, pp. 1957–1966, 2010.
29. W. Zhang, Y. Tang, J.-A. Fang, and X. Wu, “Stability of delayed neural networks with time-varying impulses,” *Neural Networks*, vol. 36, pp. 59–63, 2012.
30. H. Huang, T. Huang, and X. Chen, “Global exponential estimates of delayed stochastic neural networks with Markovian switching,” *Neural Networks*, vol. 36, pp. 136–145, 2012.
31. Q. T. Gan, “Exponential synchronization of stochastic Cohen-Grossberg neural networks with mixed time-varying delays and reaction-diffusion via periodically intermittent control,” *Neural Networks*, vol. 31, pp. 12–21, 2012.
32. Y. K. Li and C. Wang, “Existence and global exponential stability of equilibrium for discrete-time fuzzy BAM neural networks with variable delays and impulses,” *Fuzzy Sets and Systems*, 2012.
33. T. Yang and L.-B. Yang, “The global stability of fuzzy cellular neural network,” *IEEE Transactions on Circuits and Systems I*, vol. 43, no. 10, pp. 880–883, 1996.
34. D. Xu and Z. Yang, “Impulsive delay differential inequality and stability of neural networks,” *Journal of Mathematical Analysis and Applications*, vol. 305, no. 1, pp. 107–120, 2005.
35. Z. Yang and D. Xu, “Impulsive effects on stability of Cohen-Grossberg neural networks with variable delays,” *Applied Mathematics and Computation*, vol. 177, no. 1, pp. 63–78, 2006.
36. R. A. Horn and C. R. Johnson, *Topics in Matrix Analysis*, vol. 2, Cambridge University Press, Cambridge, UK, 1991.
37. Y. Li, “Global exponential stability of BAM neural networks with delays and impulses,” *Chaos, Solitons and Fractals*, vol. 24, no. 1, pp. 279–285, 2005.