Discrete Dynamics in Nature and Society
Volume 2009, Article ID 291594, 14 pages
http://dx.doi.org/10.1155/2009/291594
Research Article

Novel Criteria on Global Robust Exponential Stability to a Class of Reaction-Diffusion Neural Networks with Delays

1College of Applied Mathematics, University of Electronic Science and Technology of China, Chengdu, Sichuan 610054, China
2Department of Applied Mathematics, Sichuan Agricultural University, Yaan, Sichuan 625014, China

Received 13 June 2009; Accepted 31 August 2009

Academic Editor: Manuel De La Sen

Copyright © 2009 Jie Pan and Shouming Zhong. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The global exponential robust stability of a class of reaction-diffusion Cohen-Grossberg neural networks (CGNNs) with constant time delays is investigated. The network contains time-invariant uncertain parameters whose values are unknown but bounded in given compact sets. By employing the Lyapunov functional method, several new sufficient conditions are obtained that ensure the global exponential robust stability of the equilibrium point of the delayed reaction-diffusion CGNNs. These sufficient conditions depend on the reaction-diffusion terms, which is a preeminent feature distinguishing the present research from previous research on delayed neural networks with reaction-diffusion. Two examples are given to show the effectiveness of the obtained results.

1. Introduction

In recent years, considerable attention has been paid to studying the dynamics of artificial neural networks with fixed parameters because of their potential applications in areas such as signal and image processing, pattern recognition, parallel computation, and optimization problems [1–10]. However, during implementation on very large scale integration chips, the stability of a well-designed system may often be destroyed by unavoidable uncertainty due to the existence of modelling errors, external disturbances, and parameter fluctuations. On the other hand, a mathematical description is in general only an approximation of the actual physical system and deals with fixed nominal parameters. Usually, these parameters are not known exactly due to imperfect identification or measurement, aging of components, and/or changes in environmental conditions. Thus, it is almost impossible to get an exact model of the system because of the existence of various parameter uncertainties. So it is essential to introduce robust techniques to design a system with such uncertainty [11, 12]. If the uncertainty of a system is due only to the deviations and perturbations of its parameters, and if those deviations and perturbations are all bounded, then the system is called an interval system [13–23]. Recently, Chen and Rong [23] considered a class of Cohen-Grossberg neural networks (CGNNs) with time-varying delays, and several sufficient conditions were given to ensure global exponential robust stability.

In the real world, strictly speaking, diffusion phenomena cannot be ignored in neural networks and electric circuits once electrons transport in a nonuniform electromagnetic field. Hence, it is essential to consider state variables that vary with both time and space. Neural networks with diffusion terms can commonly be expressed by partial differential equations. Recently, some authors have devoted themselves to the study of reaction-diffusion neural networks; see, for instance, [24–29] and the references therein. In particular, more recently, Liu et al. [27] and Wang et al. [28] considered the global exponential robust stability of a class of reaction-diffusion Hopfield neural networks with distributed delays and with time-varying delay, respectively. Song and Cao [29] obtained criteria guaranteeing the global exponential robust stability of a class of reaction-diffusion CGNNs with time-varying delays and the Neumann boundary condition. In [27–29], unfortunately, owing to the way the divergence theorem was employed, a negative integral term containing the gradient was left out of the deduction. As a result, the global exponential robust stability criteria acquired there do not contain the diffusion terms. In other words, the diffusion terms take no effect in their deductions and sufficient conditions. The same situation also appears in other works [24–26].

Motivated by the above discussion, in this paper we consider a class of reaction-diffusion CGNNs with constant time delays and a boundary condition. We construct an appropriate Lyapunov functional to derive some new criteria ensuring the global exponential robust stability of an equilibrium point of the delayed reaction-diffusion CGNNs with the boundary condition. The present work differs from the papers [27–29] in that (i) the diffusion terms play an important role in the global exponential robust stability criteria obtained here, and (ii) the boundary condition of the CGNN model considered includes both the Neumann type and the Dirichlet type, while the boundary condition of the models in [27, 28] is of the Neumann type only. The work will have a significant impact on the design and applications of globally exponentially robustly stable reaction-diffusion neural networks with delays and is of great interest in many applications.

The rest of this paper is organized as follows. In Section 2, the model description and preliminaries are given. In Section 3, several criteria are derived for the global exponential robust stability of an equilibrium point of reaction-diffusion CGNNs with delays and the boundary condition. Then, we give two examples and a comparison to illustrate our criteria in Section 4. Finally, in Section 5, some conclusions are drawn.

2. Model Description and Preliminaries

To begin with, we introduce some notations.

(i) $\Omega$ is an open bounded domain in $\mathbb{R}^m$ with smooth boundary $\partial\Omega$, and $|\Omega|$ denotes the measure of $\Omega$.
(ii) $L^2(\Omega)$ is the space of real Lebesgue measurable functions on $\Omega$; it is a Banach space with the inner product $\langle u,v\rangle=\int_\Omega u(x)v(x)\,dx$ for any $u,v\in L^2(\Omega)$ and the $L^2$-norm $\|u\|_2=\langle u,u\rangle^{1/2}$.
(iii) $H^1(\Omega)=\{w\in L^2(\Omega):\ \partial w/\partial x_k\in L^2(\Omega),\ k=1,\dots,m\}$, and $H_0^1(\Omega)$ denotes the closure of $C_0^\infty(\Omega)$ in $H^1(\Omega)$.
(iv) Let $\mathcal{C}$ be the Banach space of continuous functions mapping $[-\tau,0]$ into $L^2(\Omega)$, equipped with the supremum norm, where $\tau=\max_{1\le j\le n}\tau_j$.
(v) Let $\mathcal{BC}$ be the Banach space of bounded continuous functions mapping $[-\tau,0]\times\Omega$ into $\mathbb{R}^n$, equipped with the supremum norm.

Consider the following reaction-diffusion CGNNs with interval coefficients and delays on $\Omega$:

for $t\ge 0$ and $i=1,\dots,n$, where $x=(x_1,\dots,x_m)^T\in\Omega$ is the space variable and $u_i(t,x)$ corresponds to the state of the $i$th unit at time $t$ and in space $x$; $D_{ik}\ge 0$ corresponds to the transmission diffusion coefficient along the $i$th neuron, for $k=1,\dots,m$; $a_i(\cdot)$ represents an amplification function; $b_i(\cdot)$ is an appropriate behavior function; $c_{ij}$ and $d_{ij}$ denote the connection strengths of the $j$th neuron on the $i$th neuron, respectively; $f_j$ and $g_j$ denote the activation functions of the $j$th neuron at time $t$ and in space $x$; $\tau_j$ ($0\le\tau_j\le\tau$) corresponds to the transmission delay along the axon of the $i$th unit from the $j$th unit; and $J_i$ is the constant input from outside of the network.
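For orientation, a delayed reaction-diffusion CGNN of this class is commonly written in the following form (a sketch only, using the generic symbols $u_i$, $D_{ik}$, $a_i$, $b_i$, $c_{ij}$, $d_{ij}$, $f_j$, $g_j$, $\tau_j$, and $J_i$ described above):

$$\frac{\partial u_i(t,x)}{\partial t}=\sum_{k=1}^{m}\frac{\partial}{\partial x_k}\Big(D_{ik}\frac{\partial u_i(t,x)}{\partial x_k}\Big)-a_i\big(u_i(t,x)\big)\Big[b_i\big(u_i(t,x)\big)-\sum_{j=1}^{n}c_{ij}f_j\big(u_j(t,x)\big)-\sum_{j=1}^{n}d_{ij}g_j\big(u_j(t-\tau_j,x)\big)-J_i\Big],$$

for $t\ge 0$, $x\in\Omega$, and $i=1,\dots,n$.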

Throughout this paper, we assume the following.

(H1) Each function $a_i(\cdot)$ is positive, continuous, and bounded; that is, there exist constants $\underline{a}_i$ and $\overline{a}_i$ such that $0<\underline{a}_i\le a_i(\cdot)\le\overline{a}_i$ for $i=1,\dots,n$.
(H2) Each function $a_i(\cdot)$ and $b_i(\cdot)$ is locally Lipschitz continuous.
(H3) The activation functions $f_j$ and $g_j$ satisfy a Lipschitz condition; that is, there exist two positive diagonal matrices $F=\mathrm{diag}(F_1,\dots,F_n)$ and $G=\mathrm{diag}(G_1,\dots,G_n)$ such that the elementwise bound written out below holds for all $u,v\in\mathbb{R}$.
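Written out elementwise, the Lipschitz condition in (H3) reads (a sketch, with $F_j$ and $G_j$ denoting the diagonal entries of $F$ and $G$):

$$\big|f_j(u)-f_j(v)\big|\le F_j|u-v|,\qquad \big|g_j(u)-g_j(v)\big|\le G_j|u-v|,\qquad u,v\in\mathbb{R},\ j=1,\dots,n.$$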

Remark 2.1. The activation functions $f_j$ and $g_j$, $j=1,\dots,n$, are typically assumed to be sigmoidal, which implies that they are monotone, bounded, and smooth. However, in this paper we only need the weaker assumptions above.

We assume that the nonlinear delayed systems (2.1) are supplemented with the boundary condition:

where the first condition in (2.3) is called the Dirichlet boundary condition and the second the Neumann boundary condition (both written out below), and $\partial/\partial n$ denotes the outward normal derivative on $\partial\Omega$.
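Written out with the notation above, the two admissible choices in (2.3) are (a sketch):

$$u_i(t,x)=0,\quad x\in\partial\Omega\quad\text{(Dirichlet)},\qquad\qquad \frac{\partial u_i(t,x)}{\partial n}=0,\quad x\in\partial\Omega\quad\text{(Neumann)},\qquad i=1,\dots,n.$$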

Systems (2.1) are equipped with the initial condition:

where $\phi=(\phi_1,\dots,\phi_n)^T$ is the given initial function.
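In the same notation, a standard form of the initial condition (2.4) is (a sketch):

$$u_i(s,x)=\phi_i(s,x),\qquad (s,x)\in[-\tau,0]\times\Omega,\ i=1,\dots,n,$$

where $\tau=\max_{1\le j\le n}\tau_j$.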

For the existence of solutions of system (2.1) subject to boundary condition (2.3) and initial function (2.4), the reader can refer to [18]. We denote the solution by $u(t,x;\phi)$; when there is no risk of confusion, it is sometimes written as $u(t,x)$ or $u(t)$ for short.

Lemma 2.2. Under assumptions (H1)–(H3), system (2.1) has a unique equilibrium point provided that condition (H4) holds for $i=1,\dots,n$.

As for the proof of Lemma 2.2, the reader can refer to [21, 28]. Here, we omit it.

Definition 2.3. An equilibrium point $u^*$ of system (2.1)–(2.4) is said to be globally exponentially stable in the $L^2$-norm if there exist constants $\varepsilon>0$ and $M\ge 1$ such that every solution satisfies the decay estimate written out below.
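The decay estimate intended in Definition 2.3 is of the following standard form (a sketch, with $\|\phi-u^*\|=\sup_{-\tau\le s\le 0}\|\phi(s,\cdot)-u^*\|_2$):

$$\|u(t,\cdot\,;\phi)-u^*\|_2\le M\,\|\phi-u^*\|\,e^{-\varepsilon t},\qquad t\ge 0.$$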

Definition 2.4. An equilibrium point of system (2.1)–(2.4) is said to be globally exponentially robustly stable if it is globally exponentially stable for all admissible values of the uncertain parameters within the prescribed compact intervals, for $i,j=1,\dots,n$.

Lemma 2.5 (Poincaré inequality [30–32]). Let $\Omega$ be a bounded domain of $\mathbb{R}^m$ with a smooth boundary $\partial\Omega$ of class $C^2$, and let $w$ be a real-valued function belonging to $H^1(\Omega)$ and satisfying the boundary condition (2.3). Then inequality (2.6), written out below, holds, where $\lambda_1$ is the lowest positive eigenvalue of the Laplacian on $\Omega$ with that boundary condition.
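The inequality (2.6) referred to here is of the following standard form (a sketch, with $\lambda_1$ the lowest positive eigenvalue of the Laplacian on $\Omega$ under the boundary condition in question):

$$\lambda_1\int_\Omega |w(x)|^2\,dx\le\int_\Omega |\nabla w(x)|^2\,dx.$$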

Regarding the proof of Lemma 2.5, we refer to any textbook on partial differential equations. For example, [30, 31] or [32] are good standard references.

Remark 2.6. (i) Inequality (2.6) holds whenever $\Omega$ is bounded, or at least bounded in one direction; it is not limited to rectangular domains. (ii) The lowest positive eigenvalue $\lambda_1$ of the Laplacian is sometimes known as the first eigenvalue. Determining the lowest eigenvalue is, in general, a very hard task that depends upon the geometry of the domain $\Omega$; certain special cases are tractable, however, such as the Laplacian on an interval or a rectangle, for which $\lambda_1$ can be computed explicitly under either the Dirichlet or the Neumann boundary condition (see the illustration below). (iii) Although the eigenvalue of the Laplacian with the Dirichlet boundary condition on a general bounded domain cannot be determined exactly, a lower bound for it may nevertheless be estimated in terms of the surface area of the unit ball in $\mathbb{R}^m$ and the volume of the domain $\Omega$ [33].
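As an illustration of item (ii) (a standard one-dimensional computation, not specific to this paper): for the Laplacian $-w''=\lambda w$ on the interval $\Omega=(0,l)$,

$$\lambda_1^{\mathrm{Dirichlet}}=\Big(\frac{\pi}{l}\Big)^2\ \ \big(w_1(x)=\sin\tfrac{\pi x}{l}\big),\qquad \lambda_1^{\mathrm{Neumann}}=\Big(\frac{\pi}{l}\Big)^2\ \ \big(w_1(x)=\cos\tfrac{\pi x}{l}\big),$$

where in the Neumann case $\lambda_1$ denotes the lowest nonzero eigenvalue.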

3. Main Results

Theorem 3.1. Let hypotheses (H1)–(H4) hold, and assume further that condition (A1) holds for $i=1,\dots,n$. Then the equilibrium point of system (2.1) with (2.3) and (2.4) is globally exponentially robustly stable for each constant input $J$.

Proof. Let $v(t,x)=u(t,x)-u^*$ denote the deviation of a solution from the equilibrium point, written $v=(v_1,\dots,v_n)^T$ for short. From (2.1), we obtain the error system (3.1) for $t\ge 0$, $x\in\Omega$, and $i=1,\dots,n$.
Taking the inner product of both sides of (3.1) with $v_i(t)$, we get (3.3) for $t\ge 0$ and $i=1,\dots,n$.
From the boundary condition (2.3), the Gauss formula, and Lemma 2.5, we obtain (3.4).
From assumption (H2), we get (3.5). From assumptions (H1) and (H3), we obtain (3.6). In the same way, we have (3.7). Substituting (3.4)–(3.7) into (3.3), we obtain a differential inequality for the norms $\|v_i(t)\|_2^2$, valid for $t\ge 0$.
According to (A1), we can choose a sufficiently small constant $\varepsilon>0$ such that the corresponding strict inequality still holds.
Now consider the Lyapunov functional defined by (3.10). Calculating the upper right Dini derivative of this functional along the solutions of (3.1) and using the choice of $\varepsilon$ above, we arrive at an exponential decay estimate for $t\ge 0$. Consequently, all the solutions of system (2.1)–(2.4) tend to the equilibrium point exponentially as $t\to\infty$ for any admissible values of the coefficients in system (2.1) with (2.3)-(2.4); that is, the system described by (2.1) with (2.3)-(2.4) has a unique equilibrium point which is globally exponentially robustly stable in the $L^2$-norm, and the theorem is proved.

Remark 3.2. In the deduction for Theorem 3.1, by Lemma 2.5 we have bounded the diffusion term from above by a strictly negative quantity (see (3.4) and the sketch below). This is an important step: as a result, the condition of Theorem 3.1 includes the diffusion terms.
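A sketch of this step, assuming the boundary integral vanishes under (2.3) and using the Gauss formula followed by Lemma 2.5 (the symbols follow the model description in Section 2):

$$\int_\Omega v_i\sum_{k=1}^{m}\frac{\partial}{\partial x_k}\Big(D_{ik}\frac{\partial v_i}{\partial x_k}\Big)dx=-\sum_{k=1}^{m}D_{ik}\int_\Omega\Big(\frac{\partial v_i}{\partial x_k}\Big)^2 dx\le-\Big(\min_{1\le k\le m}D_{ik}\Big)\lambda_1\int_\Omega v_i^2\,dx.$$

It is precisely the negative term on the right-hand side that brings the diffusion coefficients $D_{ik}$ and the eigenvalue $\lambda_1$ into condition (A1).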
By slightly modifying the Lyapunov functional (3.10) and arguing in the same way as in the proof of Theorem 3.1, we derive another new criterion.

Theorem 3.3. Under assumptions (H1)–(H4), if, in addition, condition (A2) holds for $i=1,\dots,n$, then the equilibrium point of system (2.1) with (2.3)-(2.4) is globally exponentially robustly stable for each constant input $J$.

For system (2.1), when the neuron interconnection strengths $c_{ij}$ and $d_{ij}$ ($i,j=1,\dots,n$) are fixed constant matrices, the following result is obvious from Theorems 3.1 and 3.3.

Corollary 3.4. Under assumptions (H1)–(H4), if any one of conditions (A3) and (A4) holds for $i=1,\dots,n$, then the equilibrium point of system (2.1) with (2.3)-(2.4) is globally exponentially stable.

Remark 3.5. When the amplification functions are constant ($a_i\equiv 1$) and the behavior functions are linear ($b_i(u_i)=b_iu_i$), system (2.1) reduces to the following reaction-diffusion cellular neural network, denoted (3.18) and sketched below, for $i=1,\dots,n$.
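Under this specialization (a sketch, assuming $a_i\equiv 1$ and $b_i(u_i)=b_iu_i$ with constants $b_i>0$), system (3.18) takes the form

$$\frac{\partial u_i(t,x)}{\partial t}=\sum_{k=1}^{m}\frac{\partial}{\partial x_k}\Big(D_{ik}\frac{\partial u_i(t,x)}{\partial x_k}\Big)-b_iu_i(t,x)+\sum_{j=1}^{n}c_{ij}f_j\big(u_j(t,x)\big)+\sum_{j=1}^{n}d_{ij}g_j\big(u_j(t-\tau_j,x)\big)+J_i,$$

for $t\ge 0$, $x\in\Omega$, and $i=1,\dots,n$.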

From Theorems 3.1 and 3.3, we have the following results.

Corollary 3.6. Under assumptions (H3) and (H4), if, in addition, any one of conditions (A3) and (A4) holds for $i=1,\dots,n$, then the equilibrium point of system (3.18) with (2.3) and (2.4) is globally exponentially robustly stable for each constant input $J$.

Remark 3.7. When the diffusion coefficients $D_{ik}=0$ ($i=1,\dots,n$, $k=1,\dots,m$), system (2.1) reduces to the following delayed system without diffusion terms, denoted (3.19) and sketched below, for $t\ge 0$ and $i=1,\dots,n$.
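A sketch of the resulting delayed CGNN (3.19), written with the same generic symbols as above (assuming all $D_{ik}=0$):

$$\frac{du_i(t)}{dt}=-a_i\big(u_i(t)\big)\Big[b_i\big(u_i(t)\big)-\sum_{j=1}^{n}c_{ij}f_j\big(u_j(t)\big)-\sum_{j=1}^{n}d_{ij}g_j\big(u_j(t-\tau_j)\big)-J_i\Big],\qquad t\ge 0,\ i=1,\dots,n.$$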

From Theorems 3.1 and 3.3, we have the following results.

Corollary 3.8. Under assumptions (H1)–(H4), if, in addition, any one of conditions (A5) and (A6) holds for $i=1,\dots,n$, then the equilibrium point of system (3.19) with (2.4) is globally exponentially robustly stable for each constant input $J$.

Remark 3.9. From Theorems 3.1 and 3.3 and Corollary 3.8, we see that conditions (A5) and (A6) imply (A1) and (A2), respectively; conversely, if conditions (A1) and (A2) hold, (A5) and (A6) do not necessarily hold. This shows that the reaction-diffusion terms play an important role in the global exponential robust stability of a reaction-diffusion neural network.

4. Examples and Comparison

In order to illustrate the feasibility of the criteria established in the preceding sections, we provide two concrete examples. Although the selection of the coefficients and functions in the examples is somewhat artificial, the possible application of our theoretical results is clearly illustrated.

Example 4.1. Consider the following reaction-diffusion CGNNs on : where , , , , , , , , , , , , , , and .
This model satisfies assumptions (H1)–(H4) in this paper with , , , , , , , , , , . It is easily computed that
From Theorem 3.1, we know that model (4.1) has a unique equilibrium point which is globally exponentially robustly stable.

Remark 4.2. It should be noted that, by Corollary 3.8, the corresponding delayed differential system obtained from (4.1) by removing the reaction-diffusion terms is not necessarily robustly stable. As we can see from Example 4.1, the reaction-diffusion terms do contribute to the exponential robust stability of system (4.1).

Example 4.3. For the model in Example 4.1, suppose that the diffusion operator and the boundary condition are replaced, respectively, by a different diffusion operator and the Neumann boundary condition, with the remaining parameters unchanged. According to Remark 2.6, we can still estimate the lowest positive eigenvalue $\lambda_1$. By Theorem 3.1, proceeding in the same way as in Example 4.1, we see that the modified model (4.1) has a unique equilibrium point which is globally exponentially robustly stable.

Remark 4.4. Song and Cao considered reaction-diffusion CGNNs with the Neumann boundary condition and obtained criteria for the global exponential robust stability of the unique equilibrium point of CGNNs, namely [29, Theorem 1]. We notice that [29, Theorem 1] does not involve the reaction-diffusion terms. In principle, [29, Theorem 1] could be applied to analyze the global exponential robust stability of the system in Example 4.3. Unfortunately, [29, Theorem 1] is not applicable there, since (in the notation of this paper) the matrix required by that theorem fails to be an M-matrix.

5. Conclusion

In this paper, we have proposed several sufficient conditions for the global exponential robust stability of the equilibrium point of reaction-diffusion CGNNs with constant time delays. All the criteria are established by constructing suitable Lyapunov functionals, without assuming the monotonicity or differentiability of the activation functions or the symmetry of the connection matrices. The spatial domain on which the CGNN model is defined is fairly general, and the boundary condition includes both the Dirichlet and the Neumann type. In particular, the Poincaré inequality is used, and all the criteria obtained depend on the reaction-diffusion terms; this is a preeminent feature that distinguishes our research from previous research on delayed neural networks with reaction-diffusion. Numerical examples are presented to illustrate the feasibility of this method.

Acknowledgment

The authors would like to thank the editor and the reviewers for their detailed comments and valuable suggestions which have led to a much improved paper.

References

  1. M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 13, no. 5, pp. 815–826, 1983.
  2. X. Liao, C. Li, and K.-W. Wong, “Criteria for exponential stability of Cohen-Grossberg neural networks,” Neural Networks, vol. 17, no. 10, pp. 1401–1414, 2004.
  3. Y. Chen, W. Bi, and Y. Wu, “Delay-dependent exponential stability for discrete-time BAM neural networks with time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2008, Article ID 421614, 14 pages, 2008.
  4. J. Liu, “Global exponential stability of Cohen-Grossberg neural networks with time-varying delays,” Chaos, Solitons & Fractals, vol. 26, no. 3, pp. 935–945, 2005.
  5. Q. Zhang, X. Wei, and J. Xu, “On global exponential stability of discrete-time Hopfield neural networks with variable delays,” Discrete Dynamics in Nature and Society, vol. 2007, Article ID 67675, 9 pages, 2007.
  6. H. Ye, A. N. Michel, and K. Wang, “Qualitative analysis of Cohen-Grossberg neural networks with multiple delays,” Physical Review E, vol. 51, no. 3, pp. 2611–2618, 1995.
  7. L. Wang and X. Zou, “Harmless delays in Cohen-Grossberg neural networks,” Physica D, vol. 170, no. 2, pp. 162–173, 2002.
  8. X. Liao, X. Yang, and W. Zhang, “Delay-dependent asymptotic stability for neural networks with time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2006, Article ID 91725, 25 pages, 2006.
  9. H. Xiang, K.-M. Yan, and B.-Y. Wang, “Existence and global stability of periodic solution for delayed discrete high-order Hopfield-type neural networks,” Discrete Dynamics in Nature and Society, vol. 2005, no. 3, pp. 281–297, 2005.
  10. R. Rakkiyappan and P. Balasubramaniam, “New global exponential stability results for neutral type neural networks with distributed time delays,” Neurocomputing, vol. 71, no. 4–6, pp. 1039–1045, 2008.
  11. M. De la Sen, “Robust adaptive control of linear time-delay systems with point time-varying delays via multiestimation,” Applied Mathematical Modelling, vol. 33, no. 2, pp. 959–977, 2009.
  12. M. De la Sen, “About robust stability of dynamic systems with time delays through fixed point theory,” Fixed Point Theory and Applications, vol. 2008, Article ID 480187, 20 pages, 2008.
  13. M. Gao and B. Cui, “Global robust stability of neural networks with multiple discrete delays and distributed delays,” Chaos, Solitons & Fractals, vol. 40, no. 4, pp. 1823–1834, 2009.
  14. C. Li, X. Liao, and R. Zhang, “A global exponential robust stability criterion for interval delayed neural networks with variable delays,” Neurocomputing, vol. 69, no. 7–9, pp. 803–809, 2006.
  15. N. Ozcan and S. Arik, “A new sufficient condition for global robust stability of bidirectional associative memory neural networks with multiple time delays,” Nonlinear Analysis: Real World Applications, vol. 10, no. 5, pp. 3312–3320, 2009.
  16. L. Sheng and H. Yang, “Novel global robust exponential stability criterion for uncertain BAM neural networks with time-varying delays,” Chaos, Solitons & Fractals, vol. 40, no. 5, pp. 2102–2113, 2009.
  17. V. Singh, “Novel global robust stability criterion for neural networks with delay,” Chaos, Solitons & Fractals, vol. 41, no. 1, pp. 348–353, 2009.
  18. W. Su and Y. Chen, “Global robust stability criteria of stochastic Cohen-Grossberg neural networks with discrete and distributed time-varying delays,” Communications in Nonlinear Science and Numerical Simulation, vol. 14, no. 2, pp. 520–528, 2009.
  19. Z. Wu, H. Su, J. Chu, and W. Zhou, “New results on robust exponential stability for discrete recurrent neural networks with time-varying delays,” Neurocomputing, vol. 72, no. 13–15, pp. 3337–3342, 2009.
  20. E. Yucel and S. Arik, “Novel results for global robust stability of delayed neural networks,” Chaos, Solitons & Fractals, vol. 39, no. 4, pp. 1604–1614, 2009.
  21. R. Zhang and L. Wang, “Global exponential robust stability of interval cellular neural networks with S-type distributed delays,” Mathematical and Computer Modelling, vol. 50, no. 3-4, pp. 380–385, 2009.
  22. W. Zhao, “New results of existence and stability of periodic solution for a delay multispecies logarithmic population model,” Nonlinear Analysis: Real World Applications, vol. 10, no. 1, pp. 544–553, 2009.
  23. T. Chen and L. Rong, “Robust global exponential stability of Cohen-Grossberg neural networks with time delays,” IEEE Transactions on Neural Networks, vol. 15, no. 1, pp. 203–205, 2004.
  24. L. Wang and D. Xu, “Global exponential stability of reaction-diffusion Hopfield neural networks with variable delays,” Science in China Series E, vol. 33, no. 2, pp. 488–495, 2003.
  25. J. Liang and J. Cao, “Global exponential stability of reaction-diffusion recurrent neural networks with time-varying delays,” Physics Letters A, vol. 314, no. 5-6, pp. 434–442, 2003.
  26. J. Qiu, “Exponential stability of impulsive neural networks with time-varying delays and reaction-diffusion terms,” Neurocomputing, vol. 70, no. 4–6, pp. 1102–1108, 2007.
  27. P. Liu, F. Yi, Q. Guo, J. Yang, and W. Wu, “Analysis on global exponential robust stability of reaction-diffusion neural networks with S-type distributed delays,” Physica D, vol. 237, no. 4, pp. 475–485, 2008.
  28. L. Wang, Y. Zhang, Z. Zhang, and Y. Wang, “LMI-based approach for global exponential robust stability for reaction-diffusion uncertain neural networks with time-varying delay,” Chaos, Solitons & Fractals, vol. 41, no. 2, pp. 900–905, 2009.
  29. Q. Song and J. Cao, “Global exponential robust stability of Cohen-Grossberg neural network with time-varying delays and reaction-diffusion terms,” Journal of the Franklin Institute, vol. 343, no. 7, pp. 705–719, 2006.
  30. R. Courant and D. Hilbert, Methods of Mathematical Physics, vol. 1, Wiley, New York, NY, USA, 1953.
  31. R. Courant and D. Hilbert, Methods of Mathematical Physics, vol. 2, Wiley, New York, NY, USA, 1962.
  32. R. Temam, Infinite-Dimensional Dynamical Systems in Mechanics and Physics, Springer, New York, NY, USA, 1998.
  33. P. Niu, J. Qu, and J. Han, “Estimation of the eigenvalue of Laplace operator and generalization,” Journal of Baoji College of Arts and Science (Natural Science), vol. 23, no. 1, pp. 85–87, 2003 (Chinese).