Journal of Applied Mathematics
Volume 2014 (2014), Article ID 145061, 11 pages
http://dx.doi.org/10.1155/2014/145061
Research Article

Existence and Exponential Stability of Solutions for Stochastic Cellular Neural Networks with Piecewise Constant Argument

Xiaoai Li

School of Mathematics and Statistics, Central South University, Changsha, Hunan 410083, China

Received 27 October 2013; Accepted 6 January 2014; Published 30 March 2014

Academic Editor: Sabri Arik

Copyright © 2014 Xiaoai Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

By using the concept of differential equations with piecewise constant argument of generalized type, a model of stochastic cellular neural networks with piecewise constant argument is developed. Sufficient conditions are obtained for the existence and uniqueness of the equilibrium point of the addressed neural networks. The pth moment exponential stability is investigated by means of a Lyapunov functional, stochastic analysis, and inequality techniques. The results in this paper improve and generalize some previous ones. An example with numerical simulations is given to illustrate the results.

1. Introduction

Since Chua and Yang introduced cellular neural networks (CNNs) [1, 2] in 1988 and Chua and Roska proposed delayed cellular neural networks (DCNNs) [3] in 1990, the dynamics of CNNs and DCNNs have received great attention due to their wide applications in pattern classification, associative memories, optimization problems, and so forth. As is well known, such applications depend on the existence of an equilibrium point and its stability. Up to now, many results on the stability of delayed neural networks have been developed [3–25].

In real nervous systems, there are many stochastic perturbations that affect the stability of neural networks. The results in [26] suggest that a neural network can be stabilized or destabilized by certain stochastic inputs. This implies that the stability analysis of stochastic neural networks is of primary significance in the design and applications of neural networks; see, for example, [13–26]. However, only a few works address pth moment exponential stability for stochastic cellular neural networks [22–25].

The theory of differential equations with piecewise constant argument was initiated by Cooke and Wiener [27] and Shah and Wiener [28]. These equations are a hybrid of continuous and discrete dynamical systems and combine properties of both differential and difference equations. The reduction of differential equations with piecewise constant argument to discrete equations has been the main, and possibly the only, way of analyzing the stability of these equations [27–32]. In particular, stability cannot be investigated completely by this reduction, since only initial moments from a countable set can be considered. By introducing arbitrary piecewise constant functions as arguments, the concept of differential equations with piecewise constant argument has been generalized in [12, 33–38], where an integral representation formula was proposed as another approach to meet the challenges discussed above.

To the best of our knowledge, equations with piecewise constant arguments have not been considered as models of neural networks, except possibly in [12, 36–38]. In [12], the authors assume that CNNs may “memorize” values of the phase variable at certain moments of time and utilize these values until the next moment, thus arriving at differential equations with a piecewise constant argument. Obviously, the distances between the “memorized” moments may vary considerably, so the concept of piecewise constant argument of generalized type is fruitful. However, these systems are deterministic; the dynamical behavior of stochastic neural networks with piecewise constant arguments has not been tackled.

Motivated by the discussion above, this paper attempts to fill the gap by considering stochastic cellular neural networks with piecewise constant arguments. Criteria for the pth moment exponential stability of the equilibrium point are derived by constructing a suitable Lyapunov functional; the results obtained generalize and improve some of the existing results in [12, 38].

The remainder of this paper is organized as follows. In Section 2, we introduce some notations and assumptions. In Section 3, a sufficient condition for the existence and uniqueness of the solution is obtained. In Section 4, we establish our main results on pth moment exponential stability. A numerical example is given in Section 5 to demonstrate the theoretical results. Finally, some conclusions are given in Section 6.

2. Preliminaries

In this paper, let , be the family of -valued -adapted processes such that, for every , , and let , be the family of -valued -adapted processes such that, for every , . denotes the -dimensional real space, . We fix a real-valued sequence {θ_k} such that θ_k < θ_{k+1} for all k, with θ_k → ∞ as k → ∞.

We study stochastic cellular neural networks with piecewise constant arguments described by the differential equations

dx_i(t) = [ -a_i x_i(t) + Σ_{j=1}^{m} b_{ij} f_j(x_j(t)) + Σ_{j=1}^{m} c_{ij} g_j(x_j(β(t))) + d_i ] dt + σ_i(x(t), x(β(t))) dw(t), i = 1, …, m,   (1)

where β(t) = θ_k if t ∈ [θ_k, θ_{k+1}). Here m corresponds to the number of units in the neural network, x_i(t) stands for the potential (or voltage) of cell i at time t, f_j and g_j are activation functions, a_i denotes the rate with which cell i resets its potential to the resting state when isolated from the other cells and inputs, and b_{ij} and c_{ij} denote the strengths of connectivity between cells i and j at times t and β(t), respectively. d_i denotes the external bias on the ith unit. Moreover, w(t) is a Brownian motion defined on a complete probability space with a natural filtration generated by w, where we associate with the canonical space generated by w, and denote by the associated σ-algebra generated by w with the probability measure . Let be the family of -valued random variables with , and let σ_i be the ith row vector of σ.

Throughout this paper, the following standard hypotheses are needed.

(H1) Functions , are Lipschitz-continuous on with Lipschitz constants , , respectively; that is, for all , .
(H2) There exists a positive number such that , .
(H3) Assume that , , .
(H4) There exist nonnegative constants , such that for all , .
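Under hypotheses of this kind, a discretized simulation of system (1) is well behaved, which the following minimal Euler–Maruyama sketch illustrates. All concrete choices here are assumptions not fixed by the paper: tanh activations, diagonal noise σ(x, y) = 0.05x, equally spaced switching moments θ_k, and hypothetical coefficient values.

```python
import numpy as np

def beta(t, theta):
    """Piecewise constant argument: beta(t) = theta_k for t in [theta_k, theta_{k+1})."""
    k = np.searchsorted(theta, t, side="right") - 1
    return theta[k]

def simulate(a, b, c, d, sigma, f, g, theta, x0, T=5.0, dt=1e-3, seed=0):
    """Euler-Maruyama for dx_i = [-a_i x_i + sum_j b_ij f_j(x_j(t))
    + sum_j c_ij g_j(x_j(beta(t))) + d_i] dt + sigma_i(x, x(beta(t))) dw_i.
    The 'memorized' state x(beta(t)) is refreshed at each switching moment theta_k."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    n = x.size
    x_mem = x.copy()                      # x(beta(t))
    steps = int(round(T / dt))
    path = np.empty((steps + 1, n))
    path[0] = x
    next_k = 1                            # index of the next switching moment
    for s in range(steps):
        t = s * dt
        if next_k < len(theta) and t >= theta[next_k] - 1e-12:
            x_mem = x.copy()              # memorize the state at theta_k
            next_k += 1
        drift = -a * x + b @ f(x) + c @ g(x_mem) + d
        dw = rng.normal(0.0, np.sqrt(dt), n)
        x = x + drift * dt + sigma(x, x_mem) * dw
        path[s + 1] = x
    return path
```

With a = (2, 2), small couplings, and zero bias, sample paths decay toward the zero equilibrium, as expected from a stable configuration.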

For further study, we give the following definitions and lemmas.

denotes a vector norm defined by

Definition 1 (see [26]). The equilibrium point of system (1) is said to be pth moment exponentially stable if there exist and such that , where is a solution of system (1) with initial value .
In such a case, the right-hand side of (6) is commonly known as the pth moment Lyapunov exponent of this solution.
When p = 2, it is usually said to be exponentially stable in mean square.
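Definition 1 can be made concrete on the scalar linear test equation dx(t) = -λx(t) dt + σx(t) dw(t) (a standard example, not system (1) itself), for which E|x(t)|^p = |x0|^p exp(p(-λ + (p-1)σ²/2)t) is known in closed form; the exponent p(-λ + (p-1)σ²/2) is then the pth moment Lyapunov exponent. A Monte Carlo sketch with hypothetical parameters:

```python
import numpy as np

def pth_moment_mc(lam, sigma, x0, p, t, n_paths=100_000, seed=1):
    """Monte Carlo estimate of E|x(t)|^p for dx = -lam*x dt + sigma*x dw,
    using the exact pathwise solution x(t) = x0*exp((-lam - sigma^2/2)*t + sigma*w(t))."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, np.sqrt(t), size=n_paths)   # w(t) ~ N(0, t)
    x = x0 * np.exp((-lam - 0.5 * sigma ** 2) * t + sigma * w)
    return float(np.mean(np.abs(x) ** p))

def pth_moment_exact(lam, sigma, x0, p, t):
    """Closed form: E|x(t)|^p = |x0|^p * exp(p*(-lam + (p - 1)*sigma^2/2)*t)."""
    return abs(x0) ** p * float(np.exp(p * (-lam + (p - 1) * sigma ** 2 / 2) * t))
```

For λ = 1, σ = 0.3, and p = 2 the exponent is negative, so the second moment decays exponentially; the Monte Carlo estimate tracks the closed form closely.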

Lemma 2 (see [26]; Burkholder–Davis–Gundy inequality). Let . Define, for , . Then, for every , there exist universal positive constants (depending only on ) such that for all .
In particular, one may take if , ; if , ; if , .

Lemma 3 (see [39]). If () denote nonnegative real numbers, then , where denotes an integer.
A particular form of (9) is
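In its standard form (see [39]), the inequality of Lemma 3 reads (Σ_{k=1}^{m} a_k)^p ≤ m^{p-1} Σ_{k=1}^{m} a_k^p for nonnegative a_k and integer p ≥ 1, with equality when all a_k coincide. A quick numerical sanity check, for illustration only:

```python
import numpy as np

def power_mean_bound(a, p):
    """Return (lhs, rhs) of the bound (sum a_k)^p <= m^(p-1) * sum a_k^p
    for nonnegative a_k and p >= 1 (a consequence of Jensen's inequality)."""
    a = np.asarray(a, dtype=float)
    m = len(a)
    lhs = a.sum() ** p
    rhs = m ** (p - 1) * (a ** p).sum()
    return lhs, rhs
```

Equality holds when the a_k are all equal, e.g. four entries of 1 with p = 2 give 16 on both sides.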

Lemma 4 (see [39]). Assume that there exists a constant , . If , then the following inequality holds:

3. Existence and Uniqueness of Solutions

In this section, we will study the existence and uniqueness of the equilibrium point of neural networks (1).

Theorem 5. Assume that (H1)–(H4) are fulfilled. Then, for every , there exists a unique solution , of (1) such that and ; is a solution of the following integral equation:

Proof. Existence. Fix ; then there exists , such that . Without loss of generality, we assume that . We will prove that, for every , there exists a solution of (1) such that , .
For each , set and define, by the Picard iterations, Obviously, . Moreover, for , it is easy to see by induction that . For simplicity, we let Then where For any , where .
The Gronwall inequality implies Since is arbitrary, for , we must have Therefore .
Now we claim that, for all , where and will be defined below.
Firstly, we compute where Lemma 4 is used in the first inequality, and Lemma 2 and the Hölder inequality are used in the second inequality. So (20) holds for .
Next, assume (20) holds for ; then where .
That is, (20) holds for . Hence, by induction, (20) holds for all .
One can see from (20) that, for every , is a Cauchy sequence in . Hence we have as in . Letting in (19) gives Therefore, .
It remains to show that is a solution of system (1) satisfying . Note that Hence, we can let in (13), and (12) is derived. Again, using the same argument, we can continue from to . Hence, the mathematical induction completes the proof.
Uniqueness. Let and be two solutions, . By noting we can easily show that The Gronwall inequality then yields that This implies that for . The uniqueness has thus been proved on . For every , we can prove the uniqueness by mathematical induction on . Since the assumptions of the existence-and-uniqueness theorem hold on every finite subinterval of , (1) has a unique solution on the entire interval (see [26]). Hence, the theorem is proved.
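The existence half of the proof is driven by Picard iteration together with the Gronwall inequality. The sketch below runs the same fixed-point scheme, x_{k+1}(t) = x_0 + ∫_0^t f(x_k(s)) ds, on a deterministic toy problem (x' = -x, x(0) = 1, whose solution is e^{-t}) with trapezoidal quadrature; it illustrates the iteration scheme only, not system (1) itself.

```python
import numpy as np

def picard_iterate(f, x0, t, n_iter):
    """Picard iteration x_{k+1}(t) = x0 + int_0^t f(x_k(s)) ds on the grid t,
    with the integral approximated by the cumulative trapezoidal rule."""
    x = np.full_like(t, x0, dtype=float)          # x_0(t) = x0 (constant start)
    for _ in range(n_iter):
        integrand = f(x)
        integral = np.concatenate(([0.0], np.cumsum(
            0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))))
        x = x0 + integral                         # next Picard iterate
    return x
```

For f(x) = -x on [0, 1], thirty iterations converge (the contraction factor behaves like t^n/n!) to a grid function within trapezoidal accuracy of e^{-t}.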

From the proof of Theorem 5, we easily obtain the following theorem.

Theorem 6. Assume that (H1)–(H3) are fulfilled. Then, for every , there exists a unique solution of the following system such that , :

Remark 7. When stochastic perturbations are neglected, system (1) becomes system (28), which is studied in [12, 38]. To obtain existence and uniqueness there, more restrictions on the coefficients are needed in [12, 38], which might greatly restrict the application domain of the neural networks. Clearly, our results contain those given in [12, 38], and our conditions are less restrictive.

4. pth Moment Exponential Stability

In this section, we establish some sufficient conditions ensuring the pth moment exponential stability of the equilibrium point of system (1).

Let denote the family of all nonnegative functions on which are once continuously differentiable in and twice continuously differentiable in . If , define an operator associated with (1) as where

Let ; then (1) can be written as where

It is clear that the stability of the zero solution of (31) is equivalent to that of the equilibrium of (1). Therefore, we restrict our discussion to the stability of the zero solution of (31).

For simplicity of notation, we denote

In order to obtain our results, the following assumption and lemmas are needed.
(H5) .

Lemma 8. Let be a solution of (31) and let (H1)–(H5) be satisfied. Then the following inequality holds for all , where

Proof. Fix ; there exists , such that . Then from Lemma 4, it follows that Now we compute and : The Hölder inequality yields Substituting (38)–(40) into (37) yields that where , .
On the other hand, Lemma 2 and (H3) yield Substituting (41)-(42) into (36), we have where , .
By the Gronwall inequality,
Furthermore, for , we have Thus, it follows from condition (H5) that Therefore, (34) holds for ; the extension of (34) to all is obvious. This proves the lemma.

Lemma 9 (Mao [26]). Assume that there is a function and positive constants , , and , such that for all . Then for all .

For convenience, we adopt the following notation:

Theorem 10. Assume that (H1)–(H5) hold and, furthermore, that the following inequality is satisfied: Then system (31) is pth moment exponentially stable.

Proof. We define a Lyapunov function Obviously, (47) is satisfied with . Now we prove that (48) holds.
For , , the operator with respect to (12) is given by By using Lemma 8, we obtain This, together with (51), shows that (48) is true. Now, for convenience, define as follows: By Lemma 9, the equilibrium of (31) is pth moment exponentially stable, and the pth moment Lyapunov exponent is not greater than .

Theorem 11 (see [12]). Suppose that (H1)–(H3) and (H5) hold. Assume, furthermore, that the following inequality is satisfied: where Then system (28) is globally exponentially stable.

Proof. Let , ; the conclusion is straightforward.

Remark 12. Theorem 10 generalizes the work of Akhmet et al. [12] and the conditions in the theorem are easy to verify.

5. Illustrative Example

In the following, we will give an example to illustrate our results.

Example 1. Consider the following model: Suppose the activation function is described by , .

Let , when , . , , , , , . , . , .

It is easy to check that , , , , .

When , , we compute , , , , , , and .

It is obvious that . Thus, all conditions of Theorem 10 are satisfied, and the equilibrium solution of (58) is exponentially stable in mean square.
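The mean-square stability conclusion can also be illustrated by direct simulation. The sketch below uses a hypothetical scalar analogue of (58), dx = [-ax + b tanh(x(β(t)))] dt + σx dw with β(t) = τ⌊t/τ⌋; all parameter values are illustrative assumptions, not the coefficients of (58).

```python
import numpy as np

def mean_square_decay(a=2.0, b=0.3, sigma=0.1, tau=0.5, x0=1.0,
                      T=4.0, dt=1e-3, n_paths=2000, seed=2):
    """Monte Carlo estimate of E|x(t)|^2 along Euler-Maruyama paths of the
    scalar test equation dx = [-a*x + b*tanh(x(beta(t)))] dt + sigma*x dw,
    where beta(t) is constant on each interval [k*tau, (k+1)*tau)."""
    rng = np.random.default_rng(seed)
    steps = int(round(T / dt))
    per = int(round(tau / dt))            # steps per constancy interval
    x = np.full(n_paths, float(x0))
    x_mem = x.copy()
    ms = np.empty(steps + 1)
    ms[0] = np.mean(x ** 2)
    for s in range(steps):
        if s % per == 0:                  # t = s*dt is a switching moment
            x_mem = x.copy()
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        x = x + (-a * x + b * np.tanh(x_mem)) * dt + sigma * x * dw
        ms[s + 1] = np.mean(x ** 2)
    return ms
```

With decay rate a dominating the coupling b and noise intensity σ, the estimated second moment drops by several orders of magnitude over [0, T], consistent with exponential stability in mean square.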

Remark 13. When , let , ; then system (58) can be viewed as system (2) in [22]. We can compute , where was defined in [22], and the condition is not satisfied. Hence the results in [22] cannot be used to judge the exponential stability of system (58).

6. Conclusion

This is the first time that stochastic cellular neural networks with piecewise constant argument of generalized type have been considered. Sufficient conditions for the existence, uniqueness, and pth moment exponential stability of the equilibrium point are derived by applying stochastic analysis techniques and some known inequalities. The obtained results could be useful in the design and applications of cellular neural networks. Furthermore, the method given in this paper may be extended to more complex systems, such as stochastic neural networks with piecewise constant argument and impulsive perturbations.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The author would like to thank Dr. Yanchun Zhang, Dr. Yeqing Ren, the editor, and the anonymous referees for their detailed comments and valuable suggestions which considerably improved the presentation of this paper. This work was supported by the Project of Humanities and Social Sciences of Ministry of Education of China under Grant no. 13YJC630232.

References

  1. L. O. Chua and L. Yang, “Cellular neural networks: theory,” IEEE Transactions on Circuits and Systems, vol. 35, no. 10, pp. 1257–1272, 1988.
  2. L. O. Chua and L. Yang, “Cellular neural networks: applications,” IEEE Transactions on Circuits and Systems, vol. 35, no. 10, pp. 1273–1290, 1988.
  3. L. O. Chua and T. Roska, “Cellular neural networks with nonlinear and delay-type template elements,” in Proceedings of the IEEE International Workshop on Cellular Neural Networks and their Applications, p. 1225, 1990.
  4. S. Arik, “A new condition for robust stability of uncertain neural networks with time delays,” Neurocomputing, 2013.
  5. S. Arik, “New criteria for global robust stability of delayed neural networks with norm bounded uncertainties,” IEEE Transactions on Neural Networks and Learning Systems, no. 99, 2013.
  6. Z. Orman and S. Arik, “An analysis of stability of a class of neutral-type neural networks with discrete time delays,” Abstract and Applied Analysis, vol. 2013, Article ID 143585, 9 pages, 2013.
  7. O. Faydasicok and S. Arik, “An analysis of stability of uncertain neural networks with multiple time delays,” Journal of the Franklin Institute, vol. 350, no. 7, pp. 1808–1826, 2013.
  8. S. Arik, J. Park, T. Huang, and J. Oliveira, “Analysis of nonlinear dynamics of neural networks,” Abstract and Applied Analysis, vol. 2013, Article ID 756437, 1 page, 2013.
  9. B. Wu, Y. Liu, and J. Lu, “New results on global exponential stability for impulsive cellular neural networks with any bounded time-varying delays,” Mathematical and Computer Modelling, vol. 55, no. 3-4, pp. 837–843, 2012.
  10. S. Ahmad and I. M. Stamova, “Global exponential stability for impulsive cellular neural networks with time-varying delays,” Nonlinear Analysis: Theory, Methods & Applications, vol. 69, no. 3, pp. 786–795, 2008.
  11. H. Zhao and J. Cao, “New conditions for global exponential stability of cellular neural networks with delays,” Neural Networks, vol. 18, no. 10, pp. 1332–1340, 2005.
  12. M. U. Akhmet, D. Aruğaslan, and E. Yılmaz, “Stability in cellular neural networks with a piecewise constant argument,” Journal of Computational and Applied Mathematics, vol. 233, no. 9, pp. 2365–2373, 2010.
  13. S. Wu, C. Li, X. Liao, and S. Duan, “Exponential stability of impulsive discrete systems with time delay and applications in stochastic neural networks: a Razumikhin approach,” Neurocomputing, vol. 82, pp. 29–36, 2012.
  14. L. Chen, R. Wu, and D. Pan, “Mean square exponential stability of impulsive stochastic fuzzy cellular neural networks with distributed delays,” Expert Systems with Applications, vol. 38, no. 5, pp. 6294–6299, 2011.
  15. X. Li, “Existence and global exponential stability of periodic solution for delayed neural networks with impulsive and stochastic effects,” Neurocomputing, vol. 73, no. 4-6, pp. 749–758, 2010.
  16. Y.-G. Kao, J.-F. Guo, C.-H. Wang, and X.-Q. Sun, “Delay-dependent robust exponential stability of Markovian jumping reaction-diffusion Cohen-Grossberg neural networks with mixed delays,” Journal of the Franklin Institute, vol. 349, no. 6, pp. 1972–1988, 2012.
  17. Y. Kao and C. Wang, “Global stability analysis for stochastic coupled reaction-diffusion systems on networks,” Nonlinear Analysis: Real World Applications, vol. 14, no. 3, pp. 1457–1465, 2013.
  18. G. Yang, Y. Kao, W. Li, and S. Xiqian, “Exponential stability of impulsive stochastic fuzzy cellular neural networks with mixed delays and reaction-diffusion terms,” Neural Computing and Applications, vol. 23, no. 3-4, pp. 1109–1121, 2013.
  19. G. Yang, Y. Kao, and C. Wang, “Exponential stability and periodicity of fuzzy delayed reaction-diffusion cellular neural networks with impulsive effect,” Abstract and Applied Analysis, vol. 2013, Article ID 645262, 9 pages, 2013.
  20. C. Wang, Y. Kao, and G. Yang, “Exponential stability of impulsive stochastic fuzzy reaction-diffusion Cohen-Grossberg neural networks with mixed delays,” Neurocomputing, vol. 89, pp. 55–63, 2012.
  21. X. Li and X. Fu, “Synchronization of chaotic delayed neural networks with impulsive and stochastic perturbations,” Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 2, pp. 885–894, 2011.
  22. Y. Sun and J. Cao, “pth moment exponential stability of stochastic recurrent neural networks with time-varying delays,” Nonlinear Analysis: Real World Applications, vol. 8, no. 4, pp. 1171–1185, 2007.
  23. C. Huang, Y. He, L. Huang, and W. Zhu, “pth moment stability analysis of stochastic recurrent neural networks with time-varying delays,” Information Sciences, vol. 178, no. 9, pp. 2194–2203, 2008.
  24. X. Li, J. Zou, and E. Zhu, “pth moment exponential stability of impulsive stochastic neural networks with mixed delays,” Mathematical Problems in Engineering, vol. 2012, Article ID 175934, 20 pages, 2012.
  25. X. Li, J. Zou, and E. Zhu, “The pth moment exponential stability of stochastic cellular neural networks with impulses,” Advances in Difference Equations, vol. 2013, article 6, 2013.
  26. X. Mao, Stochastic Differential Equations and Their Applications, Horwood Publishing, Chichester, UK, 1997.
  27. K. L. Cooke and J. Wiener, “Retarded differential equations with piecewise constant delays,” Journal of Mathematical Analysis and Applications, vol. 99, no. 1, pp. 265–297, 1984.
  28. S. M. Shah and J. Wiener, “Advanced differential equations with piecewise constant argument deviations,” International Journal of Mathematics and Mathematical Sciences, vol. 6, no. 4, pp. 671–703, 1983.
  29. A. Cabada, J. B. Ferreiro, and J. J. Nieto, “Green's function and comparison principles for first order periodic differential equations with piecewise constant arguments,” Journal of Mathematical Analysis and Applications, vol. 291, no. 2, pp. 690–697, 2004.
  30. N. V. Minh and T. T. Dat, “On the almost automorphy of bounded solutions of differential equations with piecewise constant argument,” Journal of Mathematical Analysis and Applications, vol. 326, no. 1, pp. 165–178, 2007.
  31. Y. Muroya, “Persistence, contractivity and global stability in logistic equations with piecewise constant delays,” Journal of Mathematical Analysis and Applications, vol. 270, no. 2, pp. 602–635, 2002.
  32. P. Yang, Y. Liu, and W. Ge, “Green's function for second order differential equations with piecewise constant arguments,” Nonlinear Analysis: Theory, Methods & Applications, vol. 64, no. 8, pp. 1812–1830, 2006.
  33. M. U. Akhmet, “Integral manifolds of differential equations with piecewise constant argument of generalized type,” Nonlinear Analysis: Theory, Methods & Applications, vol. 66, no. 2, pp. 367–383, 2007.
  34. M. U. Akhmet, “Stability of differential equations with piecewise constant arguments of generalized type,” Nonlinear Analysis: Theory, Methods & Applications, vol. 68, no. 4, pp. 794–803, 2008.
  35. M. U. Akhmet and C. Büyükadalı, “Differential equations with state-dependent piecewise constant argument,” Nonlinear Analysis: Theory, Methods & Applications, vol. 72, no. 11, pp. 4200–4210, 2010.
  36. M. U. Akhmet and E. Yılmaz, “Impulsive Hopfield-type neural network system with piecewise constant argument,” Nonlinear Analysis: Real World Applications, vol. 11, no. 4, pp. 2584–2593, 2010.
  37. M. U. Akhmet and E. Yılmaz, “Global exponential stability of neural networks with non-smooth and impact activations,” Neural Networks, vol. 34, pp. 18–27, 2012.
  38. M. U. Akhmet, D. Aruğaslan, and E. Yılmaz, “Stability analysis of recurrent neural networks with piecewise constant argument of generalized type,” Neural Networks, vol. 23, no. 7, pp. 805–811, 2010.
  39. G. H. Hardy, J. E. Littlewood, and G. Pólya, Inequalities, Cambridge University Press, Cambridge, UK, 2nd edition, 1952.