Journal of Applied Mathematics
Volume 2014 (2014), Article ID 145061, 11 pages
Existence and Exponential Stability of Solutions for Stochastic Cellular Neural Networks with Piecewise Constant Argument
School of Mathematics and Statistics, Central South University, Changsha, Hunan 410083, China
Received 27 October 2013; Accepted 6 January 2014; Published 30 March 2014
Academic Editor: Sabri Arik
Copyright © 2014 Xiaoai Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
By using the concept of differential equations with piecewise constant argument of generalized type, a model of stochastic cellular neural networks with piecewise constant argument is developed. Sufficient conditions are obtained for the existence and uniqueness of the equilibrium point of the addressed neural networks. The pth moment exponential stability is investigated by means of a Lyapunov functional, stochastic analysis, and inequality techniques. The results in this paper improve and generalize some previous ones. An example with numerical simulations is given to illustrate the results.
1. Introduction
Since Chua and Yang introduced cellular neural networks (CNNs) [1, 2] in 1988 and Chua and Roska [3] proposed delayed cellular neural networks (DCNNs) in 1990, the dynamics of CNNs and DCNNs have received great attention due to their wide applications in pattern classification, associative memories, optimization problems, and so forth. As is well known, such applications depend on the existence of an equilibrium point and its stability. Up to now, many results on the stability of delayed neural networks have been developed [3–25].
In real nervous systems, many stochastic perturbations affect the stability of neural networks. The results in  suggest that a neural network can be stabilized or destabilized by certain stochastic inputs. This implies that the stability analysis of stochastic neural networks is of primary significance in the design and application of neural networks (see, e.g., [13–26]); however, only a few works have addressed pth moment exponential stability for stochastic cellular neural networks [22–25].
The theory of differential equations with piecewise constant argument was initiated by Cooke and Wiener  and Shah and Wiener . These equations represent a hybrid of continuous and discrete dynamical systems and combine the properties of both differential and difference equations. It is well known that the reduction of differential equations with piecewise constant argument to discrete equations has been the main, and possibly the only, way of carrying out stability analysis for these equations [27–32]. In particular, one cannot investigate the problem of stability completely, as only elements of a countable set are allowed as initial moments. By introducing arbitrary piecewise constant functions as arguments, the concept of differential equations with a piecewise constant argument has been generalized in [12, 33–38], where an integral representation formula was proposed as another approach to meet the challenges discussed above.
To the best of our knowledge, equations with piecewise constant arguments have not been considered as models of neural networks, except possibly in [12, 36–38]. In , the authors assume that CNNs may "memorize" values of the phase variable at certain moments of time and use these values during the intermediate process until the next moment. Thus, they arrive at differential equations with a piecewise constant argument. Obviously, the distances between the "memorized" moments may vary considerably. Consequently, the concept of a piecewise constant argument of generalized type is fruitful. But these systems are deterministic; the dynamical behavior of stochastic neural networks with piecewise constant arguments has never been tackled.
Motivated by the discussion above, this paper attempts to fill the gap by considering stochastic cellular neural networks with piecewise constant arguments. Criteria for the pth moment exponential stability of the equilibrium point are derived by constructing a suitable Lyapunov functional; the results obtained generalize and improve some of the existing results in [12, 38].
The remainder of this paper is organized as follows. In Section 2, we introduce some notations and assumptions. In Section 3, a sufficient condition for the existence and uniqueness of the solution is obtained. In Section 4, we establish our main results on pth moment exponential stability. A numerical example is given in Section 5 to demonstrate the theoretical results of this paper. Finally, some conclusions are given in Section 6.
2. Preliminaries
In this paper, let  be the family of -valued -adapted processes such that, for every , , and let  be the family of -valued -adapted processes such that, for every , .  denotes the -dimensional real space, . We fix a real-valued sequence  such that  with  as .
We study stochastic cellular neural networks with piecewise constant arguments described by the differential equations: If , , .  corresponds to the number of units in the network;  stands for the potential (or voltage) of cell  at time ;  and  are activation functions;  denotes the rate with which cell  resets its potential to the resting state when isolated from the other cells and inputs; and  and  denote the strengths of connectivity between cells  and  at times  and , respectively.  denotes the external bias on the th unit. Moreover,  is an -dimensional Brownian motion defined on a complete probability space  with a natural filtration  generated by , where we associate  with the canonical space generated by  and denote by  the associated -algebra generated by  with the probability measure . Let  be the family of -valued random variables with , and let  be the th row vector of .
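The deterministic argument function described above can be sketched numerically. Below is a minimal illustration; the switching moments in the array `theta` are hypothetical values chosen only for the example, and for simplicity the argument is identified with the left endpoint of each interval:

```python
import numpy as np

# Hypothetical switching moments theta_0 < theta_1 < ... (assumed values)
theta = np.array([0.0, 0.5, 1.0, 1.5, 2.0])

def beta(t):
    """Piecewise constant argument: beta(t) = theta_k for t in [theta_k, theta_{k+1})."""
    k = np.searchsorted(theta, t, side="right") - 1
    return theta[max(k, 0)]

print(beta(0.7))  # 0.5
print(beta(1.0))  # 1.0
```

Between consecutive switching moments the argument stays frozen, which is what makes the system a hybrid of continuous and discrete dynamics.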
Throughout this paper, the following standard hypotheses are needed.
(H1) The functions ,  are Lipschitz-continuous on  with Lipschitz constants , , respectively; that is, for all , .
(H2) There exists a positive number  such that , .
(H3) Assume that , , .
(H4) There exist nonnegative constants ,  such that, for all , .
For further study, we give the following definitions and lemmas.
denotes a vector norm defined by
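The display defining the norm was lost in extraction; presumably it is the standard vector p-norm (a reconstruction, stated here under that assumption):

```latex
\|x\|_p = \left( \sum_{i=1}^{n} |x_i|^p \right)^{1/p}, \qquad p \ge 1 .
```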
Definition 1 (see ). The equilibrium point  of system (1) is said to be pth moment exponentially stable if there exist  and  such that ,
where is a solution of system (1) with initial value .
In such a case, the right-hand side of (6) is commonly known as the pth moment Lyapunov exponent of this solution.
When p = 2, it is usually said to be exponentially stable in mean square.
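With the stripped formulas restored in their usual form (a reconstruction following the standard textbook convention, e.g. Mao's, so the symbols below are assumptions), Definition 1 reads: there exist $\lambda > 0$ and $M \ge 1$ such that, for all $t \ge t_0$,

```latex
\mathbb{E}\,|x(t; t_0, x_0) - x^*|^p \;\le\; M\,\mathbb{E}\,|x_0 - x^*|^p\, e^{-\lambda (t - t_0)},
```

and the pth moment Lyapunov exponent of the solution is then bounded as

```latex
\limsup_{t \to \infty} \frac{1}{t} \log \mathbb{E}\,|x(t) - x^*|^p \;\le\; -\lambda .
```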
Lemma 2 (see , (Burkholder-Davis-Gundy inequality)). Let $g \in L^2(\mathbb{R}_+; \mathbb{R}^{n \times m})$. Define, for $t \ge 0$,
$$x(t) = \int_0^t g(s)\,dB(s), \qquad A(t) = \int_0^t |g(s)|^2\,ds.$$
Then, for every $p > 0$, there exist universal positive constants $c_p$, $C_p$ (depending only on $p$) such that
$$c_p\,\mathbb{E}|A(t)|^{p/2} \;\le\; \mathbb{E}\Big(\sup_{0 \le s \le t} |x(s)|^p\Big) \;\le\; C_p\,\mathbb{E}|A(t)|^{p/2}$$
for all $t \ge 0$.
In particular, one may take $c_p = (p/2)^p$, $C_p = (32/p)^{p/2}$ if $0 < p < 2$; $c_p = 1$, $C_p = 4$ if $p = 2$; $c_p = (2p)^{-p/2}$, $C_p = \big[p^{p+1}/\big(2(p-1)^{p-1}\big)\big]^{p/2}$ if $p > 2$.
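As a quick sanity check of the $p = 2$ case, where one may take $c_2 = 1$ and $C_2 = 4$ (the upper constant being Doob's $L^2$ maximal inequality), the following Monte Carlo sketch takes $g \equiv 1$, so that $x(t) = B(t)$ and $A(t) = t$:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 20000, 1000, 1.0
dt = T / n_steps

# Simulate Brownian paths B(t) on [0, T]
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)

# E[ sup_{s<=T} |B(s)|^2 ] should lie between c_2 * T = 1 and C_2 * T = 4
est = np.mean(np.max(np.abs(B), axis=1) ** 2)
print(1.0 * T <= est <= 4.0 * T)  # True
```

The estimate lands strictly inside the interval, consistent with both bounds being non-sharp for this particular integrand.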
Lemma 4 (see ). Assume that there exist constants , . If , then the following inequality holds:
3. Existence and Uniqueness of Solutions
In this section, we study the existence and uniqueness of solutions of neural networks (1).
Theorem 5. Assume that (H1)–(H4) are fulfilled. Then, for every , there exists a unique solution , of (1) such that and ; is a solution of the following integral equation:
Proof. Existence. Fix ; then there exists , such that . Without loss of generality, we assume that . We will prove that, for every , there exists a solution of (1) such that , .
For each , set  and define, by the Picard iterations, Obviously, . Moreover, for , it is easy to see by induction that . For simplicity, we let Then where For any , where .
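The Picard scheme used in the proof can be illustrated on a scalar deterministic analogue (a sketch only: the coefficient, the initial value, and the grid below are assumed illustrative values, not the stochastic system itself). Iterating $x_{n+1}(t) = x_0 + \int_0^t f(x_n(s))\,ds$ with $f(x) = -a x$ drives the iterates to the exact solution $x_0 e^{-a t}$:

```python
import numpy as np

a, x0, T, n = 1.0, 1.0, 1.0, 1000   # assumed illustrative values
t = np.linspace(0.0, T, n + 1)
dt = T / n

x = np.full(n + 1, x0)              # zeroth Picard iterate: constant x0
for _ in range(30):                 # Picard iterations
    f = -a * x
    # cumulative trapezoid rule for the integral term
    x = x0 + np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * dt)))

err = np.max(np.abs(x - x0 * np.exp(-a * t)))
print(err < 1e-3)  # True
```

The contraction argument behind the proof is visible here: each sweep shrinks the distance to the fixed point by a factor controlled by the Lipschitz constant and the interval length.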
The Gronwall inequality implies Since is arbitrary, for , we must have Therefore .
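For reference, the integral form of the Gronwall inequality invoked here, in its standard statement (with generic constants $C, K \ge 0$ standing in for the stripped symbols): if $u$ is nonnegative and

```latex
u(t) \;\le\; C + K \int_{t_0}^{t} u(s)\,ds \quad (t \ge t_0)
\qquad \Longrightarrow \qquad
u(t) \;\le\; C\, e^{K (t - t_0)} .
```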
Now we claim that, for all , where and will be defined below.
First, we compute where Lemma 4 is used in the first inequality, and Lemma 2 and the Hölder inequality are used in the second. So (20) holds for .
Next, assume (20) holds for ; then where .
That is, (20) holds for . Hence, by induction, (20) holds for all .
One can see from (20) that, for every , is a Cauchy sequence in . Hence we have as in . Letting in (19) gives Therefore, .
It remains to show that  is a solution of system (1) satisfying . Note that Hence, we can let  in (13), and (12) is derived. Using the same argument, we can continue from  to . Hence, mathematical induction completes the proof.
Uniqueness. Let  and  be two solutions, . By noting we can easily show that The Gronwall inequality then yields that This implies that  for . Uniqueness has thus been proved on . For every , we can prove uniqueness by mathematical induction on . Since the assumptions of the existence-and-uniqueness theorem hold on every finite subinterval of , (1) has a unique solution on the entire interval (see ). Hence, the theorem is proved.
From the proof of Theorem 5, we easily obtain the following theorem.
Theorem 6. Assume that (H1)–(H3) are fulfilled. Then, for every , there exists a unique solution of the following system such that , :
Remark 7. When system (1) neglects stochastic perturbations, it reduces to system (28), which is studied in [12, 38]. To obtain existence and uniqueness, more restrictions on the coefficients are needed in [12, 38], which may greatly restrict the application domain of the neural networks. Clearly, our results contain those of [12, 38], and our conditions are less restrictive.
4. pth Moment Exponential Stability
In this section, we establish some sufficient conditions ensuring the pth moment exponential stability of the equilibrium point of system (1).
Let  denote the family of all nonnegative functions  on  that are once continuously differentiable in  and twice continuously differentiable in . If , define an operator  associated with (1) as where
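Although the display was lost, for an Itô system $dx = f\,dt + \sigma\,dB$ the operator in question presumably has the standard generator form (a reconstruction under that assumption; for system (1), $f$ would also depend on the piecewise-constant-argument state $x(\beta(t))$):

```latex
\mathcal{L}V(t,x) \;=\; V_t(t,x) + V_x(t,x)\, f \;+\; \tfrac{1}{2}\,\operatorname{trace}\!\left[\sigma^{\mathsf T}\, V_{xx}(t,x)\,\sigma\right].
```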
Let ; then (1) can be written as where
For simplicity of notation, we denote
To obtain our results, the following assumption and lemmas are needed. (H5) .
Lemma 8. Let be a solution of (31) and let (H1)–(H5) be satisfied. Then the following inequality holds for all , where
Proof. Fix ; there exists , such that . Then from Lemma 4, it follows that
Now we compute and :
The Hölder inequality yields
Substituting (38)–(40) into (37) yields that
where , .
On the other hand, Lemma 2 and (H3) yield Substituting (41)-(42) into (36), we have where , .
By Gronwall inequality
Furthermore, for , we have Thus, it follows from condition (H5) that Therefore, (34) holds for ; the extension of (34) to all  is obvious. This completes the proof.
Lemma 9 (Mao ). Assume that there is a function and positive constants , , and , such that for all . Then for all .
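A standard form of this criterion (reconstructed from Mao's textbook treatment; the constants $c_1$, $c_2$, $\lambda$ below are assumed to correspond to the stripped symbols): if there exist $V \in C^{1,2}$ and positive constants $c_1$, $c_2$, $\lambda$ with

```latex
c_1 |x|^p \;\le\; V(t,x) \;\le\; c_2 |x|^p,
\qquad
\mathcal{L}V(t,x) \;\le\; -\lambda\, V(t,x)
```

for all admissible $(t,x)$, then

```latex
\mathbb{E}\,|x(t)|^p \;\le\; \frac{c_2}{c_1}\,\mathbb{E}\,|x_0|^p\, e^{-\lambda t}, \qquad t \ge 0 .
```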
For convenience, we adopt the following notation:
Theorem 10. Assume that (H1)–(H5) hold and, furthermore, that the following inequality is satisfied: Then system (31) is pth moment exponentially stable.
Proof. We define a Lyapunov function
Obviously, (47) is satisfied with . Now we prove that (48) holds.
For , , the operator  with respect to (12) is given by By using Lemma 8, we obtain This, together with (51), shows that (48) is true. Now, define  for convenience as follows: By Lemma 9, the equilibrium of (31) is pth moment exponentially stable, and the pth moment Lyapunov exponent is not greater than .
Proof. Let , ; the conclusion is straightforward.
5. Illustrative Example
In the following, we will give an example to illustrate our results.
Example 1. Consider the following model: Suppose the activation function is described by , .
Let , when , . , , , , , . , . , .
It is easy to check that , , , , .
When , , we compute , , , , , , and .
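Since the numerical values above were lost in extraction, a simulation can only be sketched with assumed parameters. The following Euler-Maruyama scheme for a scalar instance of a system of the form (1), with hypothetical coefficients a, b, c, sigma, a tanh activation, and assumed switching moments θ_k = k/2, illustrates the mean-square decay of the kind predicted by Theorem 10:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, c, sigma = 2.0, 0.3, 0.3, 0.2     # hypothetical coefficients
T, n_steps, n_paths = 5.0, 5000, 2000
dt = T / n_steps
theta_gap = 0.5                          # assumed switching moments theta_k = k/2
refresh = int(round(theta_gap / dt))     # steps between switching moments

def f(u):
    """Bounded Lipschitz activation (illustrative choice)."""
    return np.tanh(u)

x = np.ones(n_paths)                     # x(0) = 1 for every sample path
x_beta = x.copy()                        # memorized value x(beta(t))
ms0 = np.mean(x ** 2)
for i in range(n_steps):
    if i > 0 and i % refresh == 0:
        x_beta = x.copy()                # refresh at each switching moment
    dB = rng.normal(0.0, np.sqrt(dt), n_paths)
    x = x + (-a * x + b * f(x) + c * f(x_beta)) * dt + sigma * x * dB
msT = np.mean(x ** 2)

print(msT < ms0)  # mean square decays toward the zero equilibrium
```

With these assumed values the drift dominates the connection and noise terms, so the sample mean square contracts toward the equilibrium at zero, as the stability criterion requires.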
Remark 13. When , let , ; then system (58) can be viewed as system (2) in . We can compute , where  was defined in , and the condition  is not satisfied. Hence, the results in  cannot be used to judge the exponential stability of system (58).
6. Conclusions
This is the first time that stochastic cellular neural networks with piecewise constant argument of generalized type have been considered. Sufficient conditions for the existence, uniqueness, and pth moment exponential stability of the equilibrium point are derived by applying stochastic analysis techniques and some known inequalities. The obtained results could be useful in the design and application of cellular neural networks. Furthermore, the method given in this paper may be extended to study more complex systems, such as stochastic neural networks with piecewise constant argument and impulsive perturbations.
Conflict of Interests
The author declares that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The author would like to thank Dr. Yanchun Zhang, Dr. Yeqing Ren, the editor, and the anonymous referees for their detailed comments and valuable suggestions, which considerably improved the presentation of this paper. This work was supported by the Project of Humanities and Social Sciences of the Ministry of Education of China under Grant no. 13YJC630232.
References
- L. O. Chua and L. Yang, “Cellular neural networks: theory,” IEEE Transactions on Circuits and Systems, vol. 35, no. 10, pp. 1257–1272, 1988.
- L. O. Chua and L. Yang, “Cellular neural networks: applications,” IEEE Transactions on Circuits and Systems, vol. 35, no. 10, pp. 1273–1290, 1988.
- L. O. Chua and T. Roska, “Cellular neural networks with nonlinear and delay-type template elements,” in Proceedings of the IEEE International Workshop on Cellular Neural Networks and their Applications, p. 1225, 1990.
- S. Arik, “A new condition for robust stability of uncertain neural networks with time delays,” Neurocomputing, 2013.
- S. Arik, “New criteria for global robust stability of delayed neural networks with norm bounded uncertainties,” IEEE Transactions on Neural Networks and Learning Systems, no. 99, 2013.
- Z. Orman and S. Arik, “An analysis of stability of a class of neutral-type neural networks with discrete time delays,” Abstract and Applied Analysis, vol. 2013, Article ID 143585, 9 pages, 2013.
- O. Faydasicok and S. Arik, “An analysis of stability of uncertain neural networks with multiple time delays,” Journal of the Franklin Institute, vol. 350, no. 7, pp. 1808–1826, 2013.
- S. Arik, J. Park, T. Huang, and J. Oliveira, “Analysis of nonlinear dynamics of neural networks,” Abstract and Applied Analysis, vol. 2013, Article ID 756437, 1 page, 2013.
- B. Wu, Y. Liu, and J. Lu, “New results on global exponential stability for impulsive cellular neural networks with any bounded time-varying delays,” Mathematical and Computer Modelling, vol. 55, no. 3-4, pp. 837–843, 2012.
- S. Ahmad and I. M. Stamova, “Global exponential stability for impulsive cellular neural networks with time-varying delays,” Nonlinear Analysis: Theory, Methods & Applications, vol. 69, no. 3, pp. 786–795, 2008.
- H. Zhao and J. Cao, “New conditions for global exponential stability of cellular neural networks with delays,” Neural Networks, vol. 18, no. 10, pp. 1332–1340, 2005.
- M. U. Akhmet, D. Aruğaslan, and E. Yılmaz, “Stability in cellular neural networks with a piecewise constant argument,” Journal of Computational and Applied Mathematics, vol. 233, no. 9, pp. 2365–2373, 2010.
- S. Wu, C. Li, X. Liao, and S. Duan, “Exponential stability of impulsive discrete systems with time delay and applications in stochastic neural networks: A Razumikhin approach,” Neurocomputing, vol. 82, pp. 29–36, 2012.
- L. Chen, R. Wu, and D. Pan, “Mean square exponential stability of impulsive stochastic fuzzy cellular neural networks with distributed delays,” Expert Systems with Applications, vol. 38, no. 5, pp. 6294–6299, 2011.
- X. Li, “Existence and global exponential stability of periodic solution for delayed neural networks with impulsive and stochastic effects,” Neurocomputing, vol. 73, no. 4-6, pp. 749–758, 2010.
- Y.-G. Kao, J.-F. Guo, C.-H. Wang, and X.-Q. Sun, “Delay-dependent robust exponential stability of Markovian jumping reaction-diffusion Cohen-Grossberg neural networks with mixed delays,” Journal of the Franklin Institute, vol. 349, no. 6, pp. 1972–1988, 2012.
- Y. Kao and C. Wang, “Global stability analysis for stochastic coupled reaction-diffusion systems on networks,” Nonlinear Analysis: Real World Applications, vol. 14, no. 3, pp. 1457–1465, 2013.
- G. Yang, Y. Kao, W. Li, and S. Xiqian, “Exponential stability of impulsive stochastic fuzzy cellular neural networks with mixed delays and reaction-diffusion terms,” Neural Computing and Applications, vol. 23, no. 3-4, pp. 1109–1121, 2013.
- G. Yang, Y. Kao, and C. Wang, “Exponential stability and periodicity of fuzzy delayed reaction-diffusion cellular neural networks with impulsive effect,” Abstract and Applied Analysis, vol. 2013, Article ID 645262, 9 pages, 2013.
- C. Wang, Y. Kao, and G. Yang, “Exponential stability of impulsive stochastic fuzzy reaction-diffusion Cohen-Grossberg neural networks with mixed delays,” Neurocomputing, vol. 89, pp. 55–63, 2012.
- X. Li and X. Fu, “Synchronization of chaotic delayed neural networks with impulsive and stochastic perturbations,” Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 2, pp. 885–894, 2011.
- Y. Sun and J. Cao, “pth moment exponential stability of stochastic recurrent neural networks with time-varying delays,” Nonlinear Analysis: Real World Applications, vol. 8, no. 4, pp. 1171–1185, 2007.
- C. Huang, Y. He, L. Huang, and W. Zhu, “pth moment stability analysis of stochastic recurrent neural networks with time-varying delays,” Information Sciences, vol. 178, no. 9, pp. 2194–2203, 2008.