Discrete Dynamics in Nature and Society
Volume 2011 (2011), Article ID 570295, 16 pages
http://dx.doi.org/10.1155/2011/570295
Research Article

Stability of Stochastic Reaction-Diffusion Recurrent Neural Networks with Unbounded Distributed Delays

1College of Mathematics and Computing Science, Changsha University of Science and Technology, Changsha, Hunan 410076, China
2Department of Mathematics, Honghe University, Mengzi, Yunnan 661100, China
3College of Electrical and Information Engineering, Hunan University, Changsha, Hunan 410082, China

Received 5 September 2010; Accepted 11 January 2011

Academic Editor: Binggen Zhang

Copyright © 2011 Chuangxia Huang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Stability of reaction-diffusion recurrent neural networks (RNNs) with continuously distributed delays and stochastic influence is considered. Some new sufficient conditions guaranteeing almost sure exponential stability and mean square exponential stability of an equilibrium solution are obtained. Lyapunov functional methods, M-matrix properties, inequality techniques, and the nonnegative semimartingale convergence theorem are used in our approach. The obtained conclusions improve some published results.

1. Introduction

For decades, studies have been intensively focused on recurrent neural networks (RNNs) because of their successful hardware implementation and their various applications, such as classification, associative memory, parallel computation, optimization, signal processing, and pattern recognition; see, for example, [1–3]. These applications rely crucially on the analysis of the dynamical behavior of neural networks. Recently, it has been realized that axonal signal transmission delays often occur in various neural networks and may cause undesirable dynamic network behaviors such as oscillation and instability. Consequently, stability analysis problems for delayed recurrent neural networks (DRNNs) have drawn considerable attention, and a great deal of results on DRNNs have been reported in the literature; see, for example, [4–9] and the references therein. To a large extent, the existing theoretical literature on DRNNs is predominantly concerned with deterministic differential equations. The literature dealing with the inherent randomness associated with signal transmission is comparatively scarce; such studies are, however, important for understanding the dynamical characteristics of neuron behavior in random environments, for two reasons: (i) in real nervous systems and in implementations of artificial neural networks, synaptic transmission is a noisy process brought on by random fluctuations in neurotransmitter release and other probabilistic causes, so noise is unavoidable and should be taken into consideration in modeling [10]; (ii) it has been realized that a neural network can be stabilized or destabilized by certain stochastic effects [11, 12].

Although systems are often perturbed by various types of environmental “noise”, it turns out that one reasonable interpretation of the “noise” perturbation is so-called white noise $\dot{w}(t)$, where $w(t)$ is the Brownian motion process, also called the Wiener process [12, 13]. More detailed accounts of the mechanism of stochastic effects on the interaction of neurons and on analog circuit implementation can be found in [13, 14]. However, because Brownian motion is nowhere differentiable, the derivative of Brownian motion cannot be defined in the ordinary way, and the stability analysis of stochastic neural networks is therefore difficult. Some initial results have appeared, for example, in [11, 15–23]. In [11, 15], Liao and Mao discussed the exponential stability of stochastic recurrent neural networks (SRNNs); in [16], the authors continued this line of research, discussing almost sure exponential stability for a class of stochastic CNNs with discrete delays by using the nonnegative semimartingale convergence theorem; in [18], exponential stability of SRNNs was investigated via a Razumikhin-type approach; in [17], Wan and Sun investigated mean square exponential stability of stochastic delayed Hopfield neural networks (HNNs); in [19], Zhao and Ding studied almost sure exponential stability of SRNNs; in [20], Sun and Cao investigated $p$th moment exponential stability of stochastic recurrent neural networks with time-varying delays.
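Numerically, white noise enters a simulation only through Brownian increments $\Delta w \sim N(0,\Delta t)$, never through a derivative of $w(t)$. The following minimal Euler-Maruyama sketch in Python (not from the paper; the decay rate and noise intensity are assumed, illustrative values) makes this concrete for a scalar equation $\mathrm{d}u = -a u\,\mathrm{d}t + \sigma u\,\mathrm{d}w$:

```python
import numpy as np

# Minimal Euler-Maruyama sketch (illustrative, not from the paper).
# White noise appears only through Brownian increments dw ~ N(0, dt);
# dw/dt itself is never formed, since Brownian paths are nowhere
# differentiable.
rng = np.random.default_rng(0)

a, sigma = 1.0, 0.3          # assumed decay rate and noise intensity
dt, T = 1e-3, 10.0
u = 1.0                      # initial state

for _ in range(int(T / dt)):
    dw = rng.normal(0.0, np.sqrt(dt))    # Brownian increment over [t, t+dt]
    u += -a * u * dt + sigma * u * dw    # Euler-Maruyama update

# Sample paths decay almost surely: the pathwise exponent is -a - sigma^2/2 < 0.
print(f"u(T) = {u:.6e}")
```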

The delays in all of the above-mentioned papers are largely restricted to being discrete. As is well known, the use of constant fixed delays in models of delayed feedback provides a good approximation in simple circuits consisting of a small number of cells. However, neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. There will thus be a distribution of conduction velocities along these pathways and a distribution of propagation delays. In these circumstances, signal propagation is not instantaneous and cannot be modeled with discrete delays; a more appropriate approach is to incorporate continuously distributed delays. For instance, in [24], Tank and Hopfield designed an analog neural circuit with distributed delays that can solve a general problem of recognizing patterns in a time-dependent signal. For more on the hypothesis of continuously distributed delays, we refer to [5, 25, 26]. In [27], Wang et al. developed a linear matrix inequality (LMI) approach to study the stability of SRNNs with mixed delays. To the best of the authors' knowledge, few authors have investigated the convergence dynamics of SRNNs with unbounded distributed delays. On the other hand, if an RNN depends only on time, or on instantaneous time and time delay, the model is in fact an ordinary differential equation or a functional differential equation. In practice, however, diffusion phenomena cannot be ignored in neural networks and electric circuits once electrons transport in a nonuniform electromagnetic field. Hence, it is essential to consider the state variables as varying with both time and space. Neural networks with diffusion terms can commonly be expressed by partial differential equations [28–30].

Keeping this in mind, in this paper we consider the SRNNs described by the following stochastic reaction-diffusion RNNs:
$$\mathrm{d}u_i(t,x)=\Bigl[\sum_{k=1}^{m}\frac{\partial}{\partial x_k}\Bigl(D_{ik}\frac{\partial u_i(t,x)}{\partial x_k}\Bigr)-a_i u_i(t,x)+\sum_{j=1}^{n}b_{ij}f_j\bigl(u_j(t,x)\bigr)+\sum_{j=1}^{n}c_{ij}\int_{0}^{\infty}K_{ij}(s)\,g_j\bigl(u_j(t-s,x)\bigr)\,\mathrm{d}s\Bigr]\mathrm{d}t+\sum_{j=1}^{n}\sigma_{ij}\bigl(u_j(t,x)\bigr)\,\mathrm{d}w_j(t),\quad i=1,\dots,n. \quad (1.1)$$
In the above model, $n$ is the number of neurons in the network and $x=(x_1,\dots,x_m)^{T}\in X$ is the space variable; $u_i(t,x)$ is the state variable of the $i$th neuron at time $t$ and space point $x$, and $f_j(u_j(t,x))$ and $g_j(u_j(t-s,x))$ denote the outputs of the $j$th unit at time $t$ (respectively $t-s$) and space point $x$. The coefficients $a_i$, $b_{ij}$, $c_{ij}$ are constants: $a_i>0$ represents the rate with which the $i$th unit resets its potential to the resting state in isolation when disconnected from the network and the external stochastic perturbation, and $b_{ij}$ and $c_{ij}$ weight the strength of the $j$th unit on the $i$th unit at time $t$. Moreover, the $w_j(t)$ are independent scalar standard Wiener processes on the complete probability space $(\Omega,\mathcal{F},P)$ with the natural filtration $\{\mathcal{F}_t\}_{t\geq 0}$ generated by the standard Wiener processes, which is independent of the initial data. Furthermore, we assume the following boundary and initial conditions: $\partial u_i/\partial\nu=0$ on $[0,\infty)\times\partial X$ and $u_i(s,x)=\xi_i(s,x)$ on $(-\infty,0]\times X$, where the smooth functions $D_{ik}\geq 0$ define the diffusion operator, $X$ is a compact set with smooth boundary $\partial X$ and measure $\operatorname{mes}X>0$ in $\mathbb{R}^{m}$, and $\partial u_i/\partial\nu$ and $\xi_i$ are the boundary value and initial value, respectively.

For the sake of convenience, some of the standing definitions and assumptions are formulated below:
(H1) $D_{ik}$ is differentiable and $D_{ik}\geq 0$, for $i=1,\dots,n$, $k=1,\dots,m$;
(H2) $f_j$, $g_j$, and $\sigma_{ij}$ are Lipschitz continuous with positive Lipschitz constants $\alpha_j$, $\beta_j$, and $L_{ij}$, respectively, and $f_j(0)=g_j(0)=\sigma_{ij}(0)=0$, for $i,j=1,\dots,n$;
(H3) for each kernel $K_{ij}$, there is a positive constant $\mu_0$ such that $\int_{0}^{\infty}K_{ij}(s)e^{\mu_0 s}\,\mathrm{d}s<\infty$.

Remark 1.1. The authors in [28] and [30] also studied the convergence dynamics of (1.1), but under foundational conditions requiring the kernels $K_{ij}$ to be real-valued nonnegative piecewise continuous functions defined on $[0,\infty)$ that simultaneously satisfy a normalization condition, finite-moment conditions, and exponential-moment conditions. Take as an example the widely applied delay kernels mentioned in [31–33], given by $K(s)=\frac{s^{r}\mu^{r+1}}{r!}e^{-\mu s}$ for $s\in[0,\infty)$, where $\mu>0$ and $r$ is a nonnegative integer. One easily finds that the kernel conditions in [28, 30] are not all satisfied at the same time. Therefore, the applicability of the main results in [28, 30] is somewhat limited because of the obviously restrictive assumptions on the kernels. The main purpose of this paper is to further investigate the convergence of stochastic reaction-diffusion RNNs with more general kernels.
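As a quick numerical sanity check on such gamma-type kernels, the following Python sketch (the parameter values are assumed for illustration) verifies the normalization $\int_{0}^{\infty}K(s)\,\mathrm{d}s=1$ and the exponential moment $\int_{0}^{\infty}K(s)e^{\lambda s}\,\mathrm{d}s=\bigl(\mu/(\mu-\lambda)\bigr)^{r+1}$, which is finite exactly when $\lambda<\mu$; this is the kind of condition assumption (H3) imposes:

```python
import numpy as np
from math import factorial
from scipy.integrate import quad

# Gamma-type delay kernel K(s) = s^r mu^(r+1) e^(-mu s) / r! from Remark 1.1;
# the values of r, mu, lam below are illustrative.
r, mu, lam = 2, 1.5, 0.5

def kernel(s):
    return s**r * mu**(r + 1) * np.exp(-mu * s) / factorial(r)

# Normalization: the kernel integrates to 1 over [0, infinity).
total, _ = quad(kernel, 0.0, np.inf)
print(f"normalization: {total:.6f}")                  # ~1.0

# Exponential moment: finite since lam < mu, matching the closed form.
moment, _ = quad(lambda s: kernel(s) * np.exp(lam * s), 0.0, np.inf)
print(f"exp. moment:   {moment:.6f}")
print(f"closed form:   {(mu / (mu - lam))**(r + 1):.6f}")
```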

2. Preliminaries

Let $L^{2}(X)$ denote the space of scalar-valued Lebesgue measurable functions on $X$, which is a Banach space for the $L^{2}$-norm $\|u_i\|_{2}=\bigl(\int_{X}|u_i(x)|^{2}\,\mathrm{d}x\bigr)^{1/2}$; for $u=(u_1,\dots,u_n)^{T}$ we then define the norm $\|u\|=\bigl(\sum_{i=1}^{n}\|u_i\|_{2}^{2}\bigr)^{1/2}$. The initial value $\xi$ is assumed to be an $\mathcal{F}_0$-measurable random variable taking values in the space of continuous functions from $(-\infty,0]\times X$ to $\mathbb{R}^{n}$, equipped with the supremum norm. Clearly, (1.1) admits the equilibrium (trivial) solution $u\equiv 0$.

Definition 2.1. Equation (1.1) is said to be almost surely exponentially stable if there exists a positive constant $\lambda$ such that, for each pair of $t_0\geq 0$ and initial value $\xi$,
$$\limsup_{t\to\infty}\frac{1}{t}\log\bigl\|u(t;t_0,\xi)\bigr\|\leq-\lambda\quad\text{a.s.}$$

Definition 2.2. Equation (1.1) is said to be exponentially stable in mean square if there exists a pair of positive constants $\lambda$ and $C$ such that
$$E\bigl\|u(t;t_0,\xi)\bigr\|^{2}\leq C e^{-\lambda t},\quad t\geq t_0.$$

Lemma 2.3 (semimartingale convergence theorem [12]). Let $A(t)$ and $U(t)$ be two continuous adapted increasing processes on $t\geq 0$ with $A(0)=U(0)=0$ a.s. Let $M(t)$ be a real-valued continuous local martingale with $M(0)=0$ a.s., and let $\zeta$ be a nonnegative $\mathcal{F}_0$-measurable random variable. Define
$$X(t)=\zeta+A(t)-U(t)+M(t),\quad t\geq 0.$$
If $X(t)$ is nonnegative, then
$$\Bigl\{\lim_{t\to\infty}A(t)<\infty\Bigr\}\subset\Bigl\{\lim_{t\to\infty}X(t)\text{ exists and is finite}\Bigr\}\cap\Bigl\{\lim_{t\to\infty}U(t)<\infty\Bigr\}\quad\text{a.s.}$$
In particular, if $\lim_{t\to\infty}A(t)<\infty$ a.s., then, for almost all sample paths, both $X(t)$ and $U(t)$ converge to finite random variables.

Lemma 2.4 (see [34]). A matrix $\mathcal{A}$ being a nonsingular M-matrix is equivalent to the following property: there exist $\xi_i>0$, $i=1,\dots,n$, such that $(\mathcal{A}\xi)_i>0$ for $\xi=(\xi_1,\dots,\xi_n)^{T}$.

Furthermore, a nonsingular M-matrix is a Z-matrix whose eigenvalues all have positive real parts. In mathematics, the class of Z-matrices consists of those matrices whose off-diagonal entries are less than or equal to zero; that is, a Z-matrix $\mathcal{A}=(a_{ij})$ satisfies $a_{ij}\leq 0$ for $i\neq j$.
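These two ingredients translate directly into a simple computational test. The following Python utility (not from the paper; a minimal sketch of the characterization just stated) checks the Z-matrix sign pattern together with positivity of the real parts of the eigenvalues:

```python
import numpy as np

# Checks the characterization above: a nonsingular M-matrix is a Z-matrix
# (off-diagonal entries <= 0) whose eigenvalues all have positive real part.
def is_nonsingular_m_matrix(A: np.ndarray, tol: float = 1e-12) -> bool:
    A = np.asarray(A, dtype=float)
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):                       # Z-matrix sign pattern
        return False
    return bool(np.all(np.linalg.eigvals(A).real > tol))

# Example: a strictly diagonally dominant Z-matrix is a nonsingular M-matrix.
A = np.array([[3.0, -1.0],
              [-0.5, 2.0]])
print(is_nonsingular_m_matrix(A))                    # True
```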

3. Main Results

Theorem 3.1. Suppose that assumptions (H1)–(H3) hold and that the associated matrix $\mathcal{A}$, built from the system coefficients of (1.1), is a nonsingular M-matrix. Then the trivial solution of system (1.1) is almost surely exponentially stable and is also exponentially stable in mean square.

Proof. From the M-matrix condition and Lemma 2.4, there exist constants $\xi_i>0$, $i=1,\dots,n$, such that $(\mathcal{A}\xi)_i>0$; this can also be expressed componentwise. From assumption (H3), one can then choose a constant $\lambda>0$ such that inequality (3.4) holds. Consider a suitable Lyapunov functional. From the boundary condition, the contribution of the diffusion term is nonpositive. Using the Itô formula, for the Lyapunov functional we obtain (3.8). Substituting inequality (3.9) into (3.8), it is easy to calculate (3.10). On the other hand, from inequality (3.4) the functional is bounded. Integrating both sides of (3.10) with respect to $t$ and rearranging, we arrive at (3.14). It is obvious that the right-hand side of (3.14) is a nonnegative martingale; from Lemma 2.3, it can easily be seen that (3.16) holds. On the other hand, taking expectations on both sides of equality (3.14) yields (3.17). From (3.16) and (3.17), the trivial solution of system (1.1) is almost surely exponentially stable and is also exponentially stable in mean square. This completes the proof.

Removing the reaction-diffusion term from system (1.1), we next investigate the following stochastic recurrent neural networks with unbounded distributed delays:
$$\mathrm{d}u_i(t)=\Bigl[-a_i u_i(t)+\sum_{j=1}^{n}b_{ij}f_j\bigl(u_j(t)\bigr)+\sum_{j=1}^{n}c_{ij}\int_{0}^{\infty}K_{ij}(s)\,g_j\bigl(u_j(t-s)\bigr)\,\mathrm{d}s\Bigr]\mathrm{d}t+\sum_{j=1}^{n}\sigma_{ij}\bigl(u_j(t)\bigr)\,\mathrm{d}w_j(t).\quad (3.18)$$
We have the following Corollary 3.2 for system (3.18). The derived conditions for almost sure exponential stability and exponential stability in mean square are byproducts of Theorem 3.1, so the proof is trivial and we omit it.
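To make the structure of (3.18) concrete, here is an illustrative Euler-Maruyama simulation in Python; all parameter values, the tanh activations, the exponential kernel, and the truncation horizon for the unbounded delay are assumptions for demonstration, not values from the paper:

```python
import numpy as np

# Illustrative Euler-Maruyama simulation of a two-neuron system of the
# form (3.18); the unbounded distributed delay is truncated at a finite
# history horizon, and all parameters below are assumed.
rng = np.random.default_rng(1)

n = 2
a = np.array([3.0, 3.0])                       # decay rates a_i
B = np.array([[0.2, -0.1], [0.1, 0.2]])        # instantaneous weights b_ij
C = np.array([[0.1, 0.1], [-0.1, 0.1]])        # distributed-delay weights c_ij
sig, mu = 0.1, 1.0                             # noise intensity; kernel K(s) = mu e^(-mu s)

dt, T, horizon = 1e-3, 20.0, 10.0
steps, hist = int(T / dt), int(horizon / dt)

s = np.arange(1, hist + 1) * dt
K = mu * np.exp(-mu * s) * dt                  # discretized kernel weights

u = np.zeros((steps + hist, n))
u[:hist] = 0.5                                 # constant initial history
f = np.tanh                                    # Lipschitz activations, f(0) = 0

for t in range(hist, steps + hist):
    past = u[t - hist:t][::-1]                 # u(t - s) on the delay grid
    delayed = K @ f(past)                      # kernel-weighted delayed feedback
    drift = -a * u[t - 1] + B @ f(u[t - 1]) + C @ delayed
    dw = rng.normal(0.0, np.sqrt(dt), size=n)
    u[t] = u[t - 1] + drift * dt + sig * u[t - 1] * dw

print(f"|u(T)| = {np.linalg.norm(u[-1]):.3e}") # decays toward 0 here
```

With these strongly dominant decay rates, sample paths contract toward the trivial solution, which is the qualitative behavior Corollary 3.2 guarantees under its M-matrix condition.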

Corollary 3.2. Suppose that assumptions (H2) and (H3) hold and that the associated matrix is a nonsingular M-matrix. Then the trivial solution of system (3.18) is almost surely exponentially stable and is also exponentially stable in mean square.

Furthermore, if we remove the noise from the system, then system (3.18) reduces to the following deterministic system:
$$\frac{\mathrm{d}u_i(t)}{\mathrm{d}t}=-a_i u_i(t)+\sum_{j=1}^{n}b_{ij}f_j\bigl(u_j(t)\bigr)+\sum_{j=1}^{n}c_{ij}\int_{0}^{\infty}K_{ij}(s)\,g_j\bigl(u_j(t-s)\bigr)\,\mathrm{d}s.\quad (3.20)$$
To investigate the stability of model (3.20), we modify assumption (H2) as follows:
(H2') $f_j$ and $g_j$ are Lipschitz continuous with positive Lipschitz constants $\alpha_j$ and $\beta_j$, respectively, and $f_j(0)=g_j(0)=0$, for $j=1,\dots,n$.

The derived conditions for exponential stability of system (3.20) follow directly from Corollary 3.2.

Corollary 3.3. Suppose that assumptions (H2') and (H3) hold and that the associated matrix is a nonsingular M-matrix. Then the trivial solution of system (3.20) is exponentially stable.

4. An Illustrative Example

In this section, a numerical example is presented to illustrate the correctness of our main result.

Example 4.1. Consider a two-dimensional stochastic reaction-diffusion recurrent neural network with unbounded distributed delays of the form (1.1), where the decay rates, connection weights, diffusion coefficients, activation functions, noise intensities, and delay kernels take the concrete values given in (4.1). According to Theorem 3.1, a simple computation yields the associated matrix. With the help of Matlab, one can quickly obtain the eigenvalues of this matrix, both of which have positive real parts; by Lemma 2.4, it is a nonsingular M-matrix. Therefore, all conditions of Theorem 3.1 hold, that is to say, the trivial solution of system (4.1) is almost surely exponentially stable and is also exponentially stable in mean square. These conclusions are verified by the following numerical simulations (Figures 1–4).
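Since the concrete entries of the example's matrix come from the coefficients in (4.1) and are not reproduced here, the following Python snippet uses placeholder entries; it only mirrors the Matlab computation described above (eigenvalues with positive real parts, plus the Z-matrix sign pattern of Lemma 2.4):

```python
import numpy as np

# Placeholder 2x2 matrix standing in for the matrix of Example 4.1;
# the actual entries are determined by the coefficients in (4.1).
A = np.array([[4.0, -1.0],
              [-1.0, 4.0]])

eig = np.linalg.eigvals(A)
print("eigenvalues:", eig)                         # both real parts positive
is_z = np.all((A - np.diag(np.diag(A))) <= 0)      # off-diagonal entries <= 0
print("nonsingular M-matrix:", bool(is_z and np.all(eig.real > 0)))
```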

Figure 1: Numerical simulation for the first state component.
Figure 2: Numerical simulation for the second state component.
Figure 3: Numerical simulation for the mean square of the first state component.
Figure 4: Numerical simulation for the mean square of the second state component.

5. Conclusions

In this paper, stochastic recurrent neural networks with unbounded distributed delays and reaction-diffusion terms have been investigated, taking the essential features of stochastic systems and reaction-diffusion systems into account. The proposed results generalize and improve some earlier published results. The results obtained in this paper are independent of the magnitude of the delays and of the diffusion effect, which implies that strong self-regulation is dominant in the networks. If we remove the noise and reaction-diffusion terms from the system, the derived conditions for the stability of general deterministic neural networks can be viewed as byproducts of our results.

Acknowledgments

The authors are extremely grateful to Professor Binggen Zhang and the anonymous reviewers for their constructive and valuable comments, which have contributed a lot to the improved presentation of this paper. This work was supported by the Key Project of Chinese Ministry of Education (2011), the Foundation of Chinese Society for Electrical Engineering (2008), the Excellent Youth Foundation of Educational Committee of Hunan Province of China (10B002), the National Natural Science Funds of China for Distinguished Young Scholar (50925727), the National Natural Science Foundation of China (60876022), and the Scientific Research Fund of Yunnan Province of China (2010ZC150).

References

  1. S. Arik, “A note on the stability of dynamical neural networks,” IEEE Transactions on Circuits and Systems I, vol. 49, no. 4, pp. 502–504, 2002.
  2. J. Cao, “A set of stability criteria for delayed cellular neural networks,” IEEE Transactions on Circuits and Systems I, vol. 48, no. 4, pp. 494–498, 2001.
  3. J. Cao and J. Liang, “Boundedness and stability for Cohen-Grossberg neural network with time-varying delays,” Journal of Mathematical Analysis and Applications, vol. 296, no. 2, pp. 665–685, 2004.
  4. T. Chen and S. I. Amari, “New theorems on global convergence of some dynamical systems,” Neural Networks, vol. 14, no. 3, pp. 251–255, 2001.
  5. Y. Chen, “Global asymptotic stability of delayed Cohen-Grossberg neural networks,” IEEE Transactions on Circuits and Systems I, vol. 53, no. 2, pp. 351–357, 2006.
  6. S. Guo and L. Huang, “Stability analysis of Cohen-Grossberg neural networks,” IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 106–117, 2006.
  7. Z. Yuan, L. Huang, D. Hu, and B. Liu, “Convergence of nonautonomous Cohen-Grossberg-type neural networks with variable delays,” IEEE Transactions on Neural Networks, vol. 19, no. 1, pp. 140–147, 2008.
  8. J. H. Park and O. M. Kwon, “Further results on state estimation for neural networks of neutral-type with time-varying delay,” Applied Mathematics and Computation, vol. 208, no. 1, pp. 69–75, 2009.
  9. J. H. Park and O. M. Kwon, “Delay-dependent stability criterion for bidirectional associative memory neural networks with interval time-varying delays,” Modern Physics Letters B, vol. 23, no. 1, pp. 35–46, 2009.
  10. S. Haykin, Neural Networks, Prentice Hall, Upper Saddle River, NJ, USA, 1994.
  11. X. X. Liao and X. Mao, “Exponential stability and instability of stochastic neural networks,” Stochastic Analysis and Applications, vol. 14, no. 2, pp. 165–185, 1996.
  12. X. Mao, Stochastic Differential Equations and Applications, Horwood Publishing Limited, Chichester, UK, 2nd edition, 2008.
  13. C. Turchetti, M. Conti, P. Crippa, and S. Orcioni, “On the approximation of stochastic processes by approximate identity neural networks,” IEEE Transactions on Neural Networks, vol. 9, no. 6, pp. 1069–1085, 1998.
  14. C. Huang, Y. He, L. Huang, and W. Zhu, “pth moment stability analysis of stochastic recurrent neural networks with time-varying delays,” Information Sciences, vol. 178, no. 9, pp. 2194–2203, 2008.
  15. X. X. Liao and X. Mao, “Stability of stochastic neural networks,” Neural, Parallel & Scientific Computations, vol. 4, no. 2, pp. 205–224, 1996.
  16. S. Blythe, X. Mao, and X. Liao, “Stability of stochastic delay neural networks,” Journal of the Franklin Institute, vol. 338, no. 4, pp. 481–495, 2001.
  17. L. Wan and J. Sun, “Mean square exponential stability of stochastic delayed Hopfield neural networks,” Physics Letters A, vol. 343, no. 4, pp. 306–318, 2005.
  18. X. Li and J. Cao, “Exponential stability of stochastic Cohen-Grossberg neural networks with time-varying delays,” in Proceedings of the 2nd International Symposium on Neural Networks: Advances in Neural Networks (ISNN '05), vol. 3496 of Lecture Notes in Computer Science, pp. 162–167, 2005.
  19. H. Zhao and N. Ding, “Dynamic analysis of stochastic Cohen-Grossberg neural networks with time delays,” Applied Mathematics and Computation, vol. 183, no. 1, pp. 464–470, 2006.
  20. Y. Sun and J. Cao, “pth moment exponential stability of stochastic recurrent neural networks with time-varying delays,” Nonlinear Analysis: Real World Applications, vol. 8, no. 4, pp. 1171–1185, 2007.
  21. C. Li and X. Liao, “Robust stability and robust periodicity of delayed recurrent neural networks with noise disturbance,” IEEE Transactions on Circuits and Systems I, vol. 53, no. 10, pp. 2265–2273, 2006.
  22. C. Li, L. Chen, and K. Aihara, “Stochastic stability of genetic networks with disturbance attenuation,” IEEE Transactions on Circuits and Systems II, vol. 54, no. 10, pp. 892–896, 2007.
  23. H. Zhang and Y. Wang, “Stability analysis of Markovian jumping stochastic Cohen-Grossberg neural networks with mixed time delays,” IEEE Transactions on Neural Networks, vol. 19, no. 2, pp. 366–370, 2008.
  24. D. W. Tank and J. J. Hopfield, “Neural computation by concentrating information in time,” Proceedings of the National Academy of Sciences of the United States of America, vol. 84, no. 7, pp. 1896–1900, 1987.
  25. S. Mohamad, “Global exponential stability in DCNNs with distributed delays and unbounded activations,” Journal of Computational and Applied Mathematics, vol. 205, no. 1, pp. 161–173, 2007.
  26. P. Balasubramaniam and R. Rakkiyappan, “Global asymptotic stability of stochastic recurrent neural networks with multiple discrete delays and unbounded distributed delays,” Applied Mathematics and Computation, vol. 204, no. 2, pp. 680–686, 2008.
  27. Z. Wang, Y. Liu, M. Li, and X. Liu, “Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays,” IEEE Transactions on Neural Networks, vol. 17, no. 3, pp. 814–820, 2006.
  28. Y. Lv, W. Lv, and J. Sun, “Convergence dynamics of stochastic reaction-diffusion recurrent neural networks in continuously distributed delays,” Nonlinear Analysis: Real World Applications, vol. 9, no. 4, pp. 1590–1606, 2008.
  29. L. Wan and Q. Zhou, “Exponential stability of stochastic reaction-diffusion Cohen-Grossberg neural networks with delays,” Applied Mathematics and Computation, vol. 206, no. 2, pp. 818–824, 2008.
  30. Z. Liu and J. Peng, “Delay-independent stability of stochastic reaction-diffusion neural networks with Dirichlet boundary conditions,” Neural Computing and Applications, vol. 19, no. 1, pp. 151–158, 2010.
  31. D. W. Tank and J. J. Hopfield, “Neural computation by concentrating information in time,” Proceedings of the National Academy of Sciences of the United States of America, vol. 84, no. 7, pp. 1896–1900, 1987.
  32. K. Gopalsamy and X. Z. He, “Stability in asymmetric Hopfield nets with transmission delays,” Physica D: Nonlinear Phenomena, vol. 76, no. 4, pp. 344–358, 1994.
  33. S. Mohamad and K. Gopalsamy, “Dynamics of a class of discrete-time neural networks and their continuous-time counterparts,” Mathematics and Computers in Simulation, vol. 53, no. 1-2, pp. 1–39, 2000.
  34. J. Cao, “Exponential stability and periodic solutions of delayed cellular neural networks,” Science in China Series E: Technological Sciences, vol. 43, no. 3, pp. 328–336, 2000.