Abstract and Applied Analysis
Volume 2013 (2013), Article ID 746241, 12 pages
http://dx.doi.org/10.1155/2013/746241
Research Article

Global μ-Stability Analysis for Impulsive Stochastic Neural Networks with Unbounded Mixed Delays

1School of Mathematical Sciences, University of Jinan, Jinan 250022, China
2Computer Department, Jinan Vocational College, Jinan 250103, China

Received 19 October 2012; Accepted 14 December 2012

Academic Editor: Chuandong Li

Copyright © 2013 Lizi Yin and Xinchun Wang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We investigate the global μ-stability in the mean square of impulsive stochastic neural networks with unbounded time-varying delays and continuous distributed delays. By choosing an appropriate Lyapunov-Krasovskii functional, a novel robust stability condition, in the form of linear matrix inequalities, is derived. These sufficient conditions can be tested by MATLAB LMI software packages. The results extend and improve earlier publications. Two numerical examples are provided to illustrate the effectiveness of the obtained theoretical results.

1. Introduction

Since 1982, when the American physicist Hopfield of the California Institute of Technology proposed the Hopfield neural network models [1, 2], artificial neural network theory and its applications have attracted increasing attention. Artificial neural networks are applied very widely: they can process many kinds of information in areas such as artificial intelligence, secure communications, network optimization, military information, and pattern recognition. The information-processing capability of a neural network depends greatly on the system's dynamic characteristics, and stability is one of the most important of these.

External perturbations can affect the stability of a real system, so it is necessary to take stochastic effects into account when studying the stability of neural networks. Recently, several results guaranteeing global asymptotic stability or exponential stability for stochastic neural networks have been obtained; see [3–8]. For example, in [4], Blythe et al. investigated the stochastic stability of neural networks with constant delay. In [6], Wang et al. analysed the mean-square stability of stochastic Cohen-Grossberg neural networks with time-varying delays and continuous distributed delays.

As is well known, artificial neural networks are often subject to impulsive perturbations, which can affect a system's stability; see [9–11]. Impulsive perturbations often destabilize otherwise stable systems, so stability must be considered under impulsive effects. Very recently, some results on the stability of stochastic neural networks with impulses have been proposed and studied; see [12, 13].

A neural network may also be disturbed by environmental noises, which cause uncertainty in the connection weights of the neurons and affect the dynamical behavior of the system. Parameter uncertainties should therefore be taken into account in the system model. Several global stability results for neural networks under parameter uncertainties are available [14–20]. In [14], the authors studied robust exponential stability for uncertain stochastic neural networks with discrete interval and distributed time-varying delays by the Lyapunov-Krasovskii functional method and stochastic analysis theory.

Time delays occur frequently in neurotransmission, and time-delay neural networks have attracted considerable research interest. Many sufficient conditions have been proposed to guarantee the stability of neural networks with various types of time delays [21–26]. For example, in [22], Lou and Cui investigated the global asymptotic stability of Hopfield neural networks with bounded time delay by constructing a suitable Lyapunov-Krasovskii functional and employing the LMI method. In [24], Raja et al. obtained further results for stochastic neural networks with mixed time delays by using Lyapunov stability theory and LMI techniques. However, most of these results concern only bounded delays. As is known, in many engineering applications time delays depend heavily on the history and may be unbounded [27, 28]. In that case, the delay conditions of the existing results are too restrictive.

Recently, Chen and Wang [29] and Liu and Chen [30] proposed a new concept of μ-stability, which can be applied to neural networks with unbounded time-varying delay. However, few results have been reported in the literature on the μ-stability of uncertain impulsive stochastic neural networks with unbounded time-varying delays and continuously distributed delays, which motivates the present study.

This paper is concerned with the global stability analysis in the mean square of impulsive stochastic neural networks with unbounded time-varying delays and continuously distributed delays. By choosing an appropriate Lyapunov-Krasovskii functional, a novel robust stability condition, in the form of linear matrix inequalities, is derived. These sufficient conditions can be tested by MATLAB LMI software packages. Our results extend and improve some results of [29, 30].

The paper is organized as follows. In Section 2, the basic definitions and assumptions are given together with the statement of the problem. In Section 3, sufficient conditions for μ-stability in the mean square of impulsive stochastic neural networks with unbounded time-varying delays and continuously distributed delays are obtained. In Section 4, global robust μ-stability criteria in the mean square are derived for the uncertain neural network model. Two numerical examples are then provided to demonstrate the effectiveness of our results in Section 5. Finally, concluding remarks are given in Section 6.

Notations. Throughout this paper, and denote, respectively, the set of real numbers and the -dimensional real space equipped with the Euclidean norm ; denotes the set of positive integers; denotes the set of all . Let () denote that the matrix is symmetric and positive semidefinite (negative semidefinite); denotes the transpose of the matrix ; () denotes the maximum (minimum) eigenvalue of the matrix . is the space of square-integrable vector functions. Moreover, let be a probability space with a filtration satisfying the usual conditions (i.e., it is right continuous and contains all -null sets). denotes the family of all -measurable random variables. stands for the mathematical expectation operator with respect to the given probability measure .

2. Model Description and Some Assumptions

Consider the following neural network model: where is the neuron state vector of the neural network; is the decay rate; , and are connection weight matrices; is the neuron activation function; represents the transmission delay of the neural network; is the delay kernel function; is a constant external input, and is the impulsive function.

Throughout the paper, we make the following assumptions.

Assumption 1. The neuron activation functions , are bounded, continuously differentiable, and satisfy where are some positive constants.

Assumption 2. is a nonnegative, continuously differentiable time-varying delay and satisfies , where is a positive constant.

Assumption 3. The delay kernels , are real nonnegative continuous functions defined on and satisfy

Assumption 4. The impulse times satisfy .

If the functions satisfy Assumption 1, then system (1) has an equilibrium point; see [14]. Assume that is an equilibrium point of system (1), and that the impulsive function of system (1) is characterized by , where , is a real matrix.

Letting , one can shift the equilibrium point of system (1) to the origin, obtaining where , , and . By the definition of and Assumption 1, satisfies the sector condition and , where are some positive constants.

In practical applications, a neural system is affected by external perturbations, so it is necessary to consider the effect of randomness on the network. This leads to the following stochastic impulsive neural network with delays: where is a -dimensional Brownian motion defined on a complete probability space .
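As a concrete illustration of this model class, the following sketch simulates a hypothetical two-neuron impulsive stochastic network with an unbounded time-varying delay by the Euler-Maruyama method. All matrices, the delay law, the diffusion coefficient, and the impulse times below are illustrative assumptions, not the parameters of the examples in Section 5.

```python
import numpy as np

# Hypothetical instance of the model class: between impulses,
#   dx = [-D x + A f(x(t)) + B f(x(t - tau(t)))] dt + sigma(x) dW,
# and at impulse times x jumps to (I + E) x. Every value here is a
# placeholder chosen only so the sketch runs.
rng = np.random.default_rng(0)
D = np.diag([1.0, 1.0])                     # decay rates
A = np.array([[0.2, -0.1], [0.1, 0.2]])     # instantaneous connection weights
B = np.array([[0.1, 0.05], [-0.05, 0.1]])   # delayed connection weights
E = np.array([[-0.5, 0.0], [0.0, -0.5]])    # impulsive gain (halves the state)
f = np.tanh                                  # bounded, Lipschitz activation

dt, T = 1e-3, 10.0
n = int(T / dt)
tau = lambda t: 0.1 * t                      # unbounded time-varying delay
x = np.zeros((n + 1, 2))
x[0] = [0.5, -0.3]
impulse_steps = {int(tk / dt) for tk in (2.0, 4.0, 6.0, 8.0)}

for i in range(n):
    t = i * dt
    j = max(0, i - int(tau(t) / dt))         # index of the delayed state
    drift = -D @ x[i] + A @ f(x[i]) + B @ f(x[j])
    # diffusion sigma(x) = 0.1 * diag(x): satisfies the linear growth condition
    noise = 0.1 * x[i] * rng.normal(size=2) * np.sqrt(dt)
    x[i + 1] = x[i] + drift * dt + noise
    if i + 1 in impulse_steps:
        x[i + 1] = x[i + 1] + E @ x[i + 1]   # impulsive jump

print(np.linalg.norm(x[-1]))                 # sample-path state norm at time T
```

For these (stable) placeholder parameters the sample path decays toward the origin, consistent with the mean-square stability studied below.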

Assumption 5. Assume that is locally Lipschitz continuous and satisfies the linear growth condition, that is, for all . Moreover, there exist matrices , of appropriate dimensions such that

The initial conditions for system (6) are , where denotes the family of all bounded -measurable, -valued random variables satisfying .

For completeness, we first give the following definition and lemmas.

Definition 1 (see [29]). Suppose that is a nonnegative continuous function and satisfies . If there exists a scalar such that then the system (6) is said to be globally stochastically μ-stable in the mean square.

Obviously, global stochastic μ-stability in the mean square includes global stochastic asymptotic stability and exponential stability in the mean square as special cases.
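These special cases correspond to particular choices of the function in Definition 1, which can be checked directly; the rates below are illustrative.

```python
import numpy as np

# Definition 1 requires a nonnegative, nondecreasing function mu(t) tending
# to infinity, with E||x(t)||^2 <= M / mu(t). Two admissible choices
# (rates 0.1 and 0.5 are hypothetical): mu(t) = exp(0.1 t) recovers
# mean-square exponential stability, while mu(t) = t**0.5 recovers
# power-rate (hence asymptotic) stability.
t = np.linspace(1.0, 100.0, 1000)
candidates = {"exponential": np.exp(0.1 * t), "power-rate": t ** 0.5}
for name, mu in candidates.items():
    nondecreasing = bool(np.all(np.diff(mu) >= 0))
    print(name, nondecreasing, mu[0] > 0)   # both admissible: True True
```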

Lemma 2 (see [31]). Let be real matrices of appropriate dimensions, and be a scalar, then one has

Lemma 3 (see [32]). Let and be the real matrices of appropriate dimensions with satisfying , then if and only if there exists a scalar , such that

Lemma 4 (see [32]). For a given matrix where , is equivalent to any one of the following conditions:(1); (2).
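Lemma 4 is the standard Schur complement lemma, and its two equivalent conditions can be verified numerically on a sample matrix. The blocks below are arbitrary illustrative choices, not matrices from this paper.

```python
import numpy as np

# Schur complement check: for symmetric S = [[S11, S12], [S12.T, S22]],
# S < 0 holds iff S22 < 0 and S11 - S12 @ inv(S22) @ S12.T < 0.
# The blocks are hypothetical, chosen to be diagonally dominant.
S11 = np.array([[-2.0, 0.5], [0.5, -3.0]])
S22 = np.array([[-4.0, 1.0], [1.0, -5.0]])
S12 = np.array([[0.3, -0.2], [0.1, 0.4]])
S = np.block([[S11, S12], [S12.T, S22]])

def neg_def(M):
    """Negative definiteness via eigenvalues of a symmetric matrix."""
    return bool(np.all(np.linalg.eigvalsh(M) < 0))

schur = S11 - S12 @ np.linalg.inv(S22) @ S12.T
print(neg_def(S), neg_def(S22) and neg_def(schur))  # the two tests agree: True True
```

This equivalence is what lets the nonlinear matrix conditions of Theorem 5 be recast as the LMIs tested by the solver.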

3. Stochastic Stability Analysis of Neural Networks

Let denote the family of all nonnegative functions on that are continuous, once differentiable in and twice differentiable in . For every such , we define an operator associated with system (6) by where

Theorem 5. Assume that Assumptions 1–5 hold. Then the zero solution of system (6) is globally stochastically μ-stable in the mean square if there exist diagonal matrices , and , some constants , , , , , a nonnegative continuously differentiable function defined on , and a constant such that, for , and the following LMI holds: where
, , and impulsive operator .

Proof. By Lemma 4, (16) is equivalent to the following inequality: Based on system (6), we construct the following Lyapunov-Krasovskii functional: where By Itô's formula, we obtain the following stochastic differential: where Along the trajectories of system (6), we have By Lemma 2, we get From Assumption 5, we obtain Therefore, From (15), we have Using Cauchy's inequality, we get Thus, Substituting (25), (26), and (28) into (21), we get where So we have Taking the mathematical expectation, we get By , we get In addition, Noting , we have It is obvious that , , and therefore By (33) and (36), we know that is monotonically nonincreasing for , which implies From the definition of , we deduce that where . This implies that This completes the proof of Theorem 5.

Remark 6. Theorem 5 provides a global stochastic μ-stability criterion in the mean square for the impulsive stochastic differential system (6). It should be noted that our conditions depend on the upper bound of the derivative of the time-varying delay, the influence of randomness, and the delay kernels of the continuous distributed terms, but not on the magnitude of the time-varying delay. Therefore, our results can be applied to impulsive stochastic neural networks with unbounded time-varying delay and continuous distributed delay.

Remark 7. In [29, 30], different approaches were used to study the μ-stability of neural networks with unbounded time-varying delays. However, impulsive effects and unbounded continuous distributed delays are not considered in their models. Neural networks with unbounded continuous distributed delays under impulsive effects are of great importance in many practical problems. Hence, our results in this paper are more general than those reported in [29, 30].

If the diffusion coefficient , the model (6) reduces to the model (4), and the following result can be obtained.

Corollary 8. Assume that Assumptions 1–4 hold. Then the zero solution of system (4) is globally μ-stable if there exist diagonal matrices , and , some constants , , , , , a nonnegative continuously differentiable function defined on , and a constant such that, for , and the following LMI holds: where , , , and impulsive operator .

If we take in Theorem 5, then we obtain the following testable condition.

Corollary 9. Assume that Assumptions 1–5 hold. Then the zero solution of system (6) is globally stochastically μ-stable in the mean square if there exist some constants , a nonnegative continuously differentiable function defined on , and a constant such that, for , and the following LMI holds: where , , , and impulsive operator .

4. Robust Stochastic Stability Analysis of Neural Networks

Consider the following uncertain stochastic neural networks: where some parameters and variables were introduced in Section 2, and the uncertainties are described as follows: where are perturbed matrices satisfying where are constant matrices of appropriate dimensions, and is an unknown real time-varying matrix function of appropriate dimensions satisfying where is an identity matrix of appropriate dimensions.

Theorem 10. Assume that Assumptions 1–5 hold. Then the zero solution of system (44) is globally robustly stochastically μ-stable in the mean square if there exist diagonal matrices , , and , some constants , , , , , , , , , a nonnegative continuously differentiable function defined on , and a constant such that, for , and the following LMI holds: where , , , , , , and impulsive operator .

Proof. Let the Lyapunov-Krasovskii functional be as defined in Theorem 5; then by Itô's formula, we have
By Lemma 3, we get .
This completes the proof of Theorem 10.

Remark 11. A neural network can be disturbed by environmental noises, which cause uncertainty in the system parameters and can lead to instability. To the best of our knowledge, no results have been published on the robust μ-stability of uncertain neural networks with unbounded continuous distributed delays under impulsive perturbations.

If only parameter uncertainties exist and there are no stochastic perturbations, the uncertain neural network can be described as

Corollary 12. Assume that Assumptions 1–5 hold. Then the zero solution of system (52) is globally robustly μ-stable if there exist diagonal matrices , , and , some constants , , , , , , , a nonnegative continuously differentiable function defined on , and a constant such that, for , and the following LMI holds: where , , , , , , and impulsive operator .

5. Illustrative Examples

In this section, we will give two examples to show the validity of the results obtained.

Example 13. Consider the stochastic impulsive neural network with two neurons: where the activation function is described by , , , . It is obvious that is an equilibrium point of system (55). Let and choose , , , , and the parameter matrices are, respectively, given by

In this case, we get . Let , , and . We can obtain via the MATLAB LMI toolbox. By Theorem 5, the equilibrium point of model (55) with unbounded time-varying delay and continuously distributed delay is globally stochastically μ-stable in the mean square. The numerical simulation is shown in Figure 1.
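The paper's feasibility tests were run with the MATLAB LMI toolbox. As a small self-contained analogue of such a certificate check (with a hypothetical Hurwitz matrix, not the example's parameters), the sketch below verifies a Lyapunov-type LMI, A'P + PA < 0 with P > 0, by solving the associated Lyapunov equation through Kronecker vectorization in plain NumPy.

```python
import numpy as np

# Solve A.T @ P + P @ A = -Q for P with Q > 0, using the vectorized
# linear system (I (x) A.T + A.T (x) I) vec(P) = vec(-Q). If A is
# Hurwitz, the resulting P is positive definite and certifies the LMI
# A.T @ P + P @ A < 0. A and Q are illustrative choices.
A = np.array([[-1.0, 0.2], [0.1, -1.5]])    # a Hurwitz test matrix
Q = np.eye(2)                                # any positive definite Q
n = A.shape[0]
K = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
P = np.linalg.solve(K, (-Q).reshape(-1)).reshape(n, n)
P = 0.5 * (P + P.T)                          # symmetrize roundoff

print(np.linalg.eigvalsh(P))                 # both eigenvalues positive: P > 0
print(np.linalg.eigvalsh(A.T @ P + P @ A))   # both equal -1: LMI holds
```

A full reproduction of the paper's conditions would instead pose the LMIs of Theorem 5 to a semidefinite programming solver, but the certificate logic is the same: exhibit a positive definite matrix making the stability LMI negative definite.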

Figure 1: (a) Time-series of the of model (55) without impulsive effects for . (b) Time-series of the of model (55) without impulsive effects for . (c) Time-series of the of model (55) with impulsive effects for . (d) Time-series of the of model (55) with impulsive effects for .

Example 14. Consider the uncertain stochastic impulsive neural network with two neurons: where the activation function is described by , , . It is obvious that is an equilibrium point of system (57). Let and choose , , , , and the parameter matrices are, respectively, given by In this case, we get . Let , ; we can obtain the following feasible solution via the MATLAB LMI toolbox: and obtain . By Theorem 10, the equilibrium point of model (57) is globally robustly stochastically μ-stable in the mean square.

6. Concluding Remarks

This paper first investigated the global stochastic μ-stability in the mean square of a class of impulsive stochastic neural networks with unbounded mixed delays. We obtained sufficient conditions by using the Lyapunov-Krasovskii functional method, stochastic analysis theory, and the linear matrix inequality (LMI) technique. We then studied the global robust μ-stability in the mean square of a class of uncertain impulsive neural networks with unbounded mixed delays. Our results improve and generalize some earlier works reported in the literature. Finally, two examples and their numerical simulations were given to illustrate the effectiveness of the obtained results.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61174217), the Natural Science Foundation of Shandong Province of China (ZR2010AL016 and ZR2011AL007), and the Doctoral Foundation of University of Jinan (XBS1244).

References

  1. J. J. Hopfield and D. W. Tank, “‘Neural’ computation of decisions in optimization problems,” Biological Cybernetics, vol. 52, no. 3, pp. 141–152, 1985.
  2. J. J. Hopfield and D. W. Tank, “Computing with neural circuits: a model,” Science, vol. 233, no. 4764, pp. 625–633, 1986.
  3. Z. Wang, Y. Liu, K. Fraser, and X. Liu, “Stochastic stability of uncertain Hopfield neural networks with discrete and distributed delays,” Physics Letters A, vol. 354, no. 4, pp. 288–297, 2006.
  4. S. Blythe, X. Mao, and X. Liao, “Stability of stochastic delay neural networks,” Journal of the Franklin Institute, vol. 338, no. 4, pp. 481–495, 2001.
  5. C. Huang, Y. He, and H. Wang, “Mean square exponential stability of stochastic recurrent neural networks with time-varying delays,” Computers & Mathematics with Applications, vol. 56, no. 7, pp. 1773–1778, 2008.
  6. Z. Wang, Y. Liu, M. Li, and X. Liu, “Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays,” IEEE Transactions on Neural Networks, vol. 17, no. 3, pp. 814–820, 2006.
  7. W. Su and Y. Chen, “Global robust stability criteria of stochastic Cohen-Grossberg neural networks with discrete and distributed time-varying delays,” Communications in Nonlinear Science and Numerical Simulation, vol. 14, no. 2, pp. 520–528, 2009.
  8. L. Wan and Q. Zhou, “Exponential stability of stochastic reaction-diffusion Cohen-Grossberg neural networks with delays,” Applied Mathematics and Computation, vol. 206, no. 2, pp. 818–824, 2008.
  9. S. Mohamad, K. Gopalsamy, and H. Akça, “Exponential stability of artificial neural networks with distributed delays and large impulses,” Nonlinear Analysis: Real World Applications, vol. 9, no. 3, pp. 872–888, 2008.
  10. J. Zhou, L. Xiang, and Z. Liu, “Synchronization in complex delayed dynamical networks with impulsive effects,” Physica A, vol. 384, no. 2, pp. 684–692, 2007.
  11. H. Gu, H. Jiang, and Z. Teng, “Existence and globally exponential stability of periodic solution of BAM neural networks with impulses and recent-history distributed delays,” Neurocomputing, vol. 71, no. 4–6, pp. 813–822, 2008.
  12. H. Zhang, M. Dong, Y. Wang, and N. Sun, “Stochastic stability analysis of neutral-type impulsive neural networks with mixed time-varying delays and Markovian jumping,” Neurocomputing, vol. 73, no. 13–15, pp. 2689–2695, 2010.
  13. L. Chen, R. Wu, and D. Pan, “Mean square exponential stability of impulsive stochastic fuzzy cellular neural networks with distributed delays,” Expert Systems with Applications, vol. 38, no. 5, pp. 6294–6299, 2011.
  14. R. Rakkiyappan, P. Balasubramaniam, and S. Lakshmanan, “Robust stability results for uncertain stochastic neural networks with discrete interval and distributed time-varying delays,” Physics Letters A, vol. 372, no. 32, pp. 5290–5298, 2008.
  15. W. Zhou and M. Li, “Mixed time-delays dependent exponential stability for uncertain stochastic high-order neural networks,” Applied Mathematics and Computation, vol. 215, no. 2, pp. 503–513, 2009.
  16. H. Li, C. Wang, P. Shi, and H. Gao, “New passivity results for uncertain discrete-time stochastic neural networks with mixed time delays,” Neurocomputing, vol. 73, no. 16-18, pp. 3291–3299, 2010.
  17. O. M. Kwon, S. M. Lee, and J. H. Park, “Improved delay-dependent exponential stability for uncertain stochastic neural networks with time-varying delays,” Physics Letters A, vol. 374, no. 10, pp. 1232–1241, 2010.
  18. Z. Wang, Y. Liu, X. Liu, and Y. Shi, “Robust state estimation for discrete-time stochastic neural networks with probabilistic measurement delays,” Neurocomputing, vol. 74, no. 1-3, pp. 256–264, 2010.
  19. C. Li, J. Shi, and J. Sun, “Stability of impulsive stochastic differential delay systems and its application to impulsive stochastic neural networks,” Nonlinear Analysis: Theory, Methods and Applications, vol. 74, no. 10, pp. 3099–3111, 2011.
  20. P. Balasubramaniam and M. S. Ali, “Robust exponential stability of uncertain fuzzy Cohen-Grossberg neural networks with time-varying delays,” Fuzzy Sets and Systems, vol. 161, no. 4, pp. 608–618, 2010.
  21. X. Liao and C. Li, “An LMI approach to asymptotical stability of multi-delayed neural networks,” Physica D, vol. 200, no. 1-2, pp. 139–155, 2005.
  22. X. Y. Lou and B. Cui, “New LMI conditions for delay-dependent asymptotic stability of delayed Hopfield neural networks,” Neurocomputing, vol. 69, no. 16-18, pp. 2374–2378, 2006.
  23. Q. Zhang and X. W. J. Xu, “Delay-dependent global stability results for delayed Hopfield neural networks,” Chaos, Solitons and Fractals, vol. 34, no. 2, pp. 662–668, 2007.
  24. R. Raja, R. Sakthivel, and S. M. Anthoni, “Stability analysis for discrete-time stochastic neural networks with mixed time delays and impulsive effects,” Canadian Journal of Physics, vol. 88, no. 12, pp. 885–898, 2010.
  25. R. Sakthivel, R. Samidurai, and S. M. Anthoni, “New exponential stability criteria for stochastic BAM neural networks with impulses,” Physica Scripta, vol. 82, no. 4, Article ID 045802, 2010.
  26. R. Sakthivel, R. Samidurai, and S. M. Anthoni, “Asymptotic stability of stochastic delayed recurrent neural networks with impulsive effects,” Journal of Optimization Theory and Applications, vol. 147, no. 3, pp. 583–596, 2010.
  27. S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, vol. 269 of Lecture Notes in Control and Information Sciences, Springer, London, UK, 2001.
  28. V. B. Kolmanovskiĭ and V. R. Nosov, Stability of Functional-Differential Equations, vol. 180 of Mathematics in Science and Engineering, Academic Press, London, UK, 1986.
  29. T. Chen and L. Wang, “Global μ-stability of delayed neural networks with unbounded time-varying delays,” IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1836–1840, 2007.
  30. X. Liu and T. Chen, “Robust μ-stability for uncertain stochastic neural networks with unbounded time-varying delays,” Physica A, vol. 387, no. 12, pp. 2952–2962, 2008.
  31. J. Cao, K. Yuan, and H. X. Li, “Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays,” IEEE Transactions on Neural Networks, vol. 17, no. 6, pp. 1646–1651, 2006.
  32. S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory, vol. 15 of SIAM Studies in Applied Mathematics, SIAM, Philadelphia, Pa, USA, 1994.