Mean-Square Exponential Stability Analysis of Stochastic Neural Networks with Time-Varying Delays via Fixed Point Method
This work addresses the stability analysis of stochastic cellular neural networks with time-varying delays. By means of the fixed point theory, we obtain new and concise sufficient conditions ensuring the existence and uniqueness of the solution as well as its mean-square global exponential stability. The resulting algebraic stability criteria are easily checked and do not require the differentiability of the delays. An example is given to show the effectiveness of the obtained results.
Cellular neural networks (CNNs), first proposed by Chua and Yang in 1988 [1, 2], have become a research focus owing to their numerous successful applications in various fields such as optimization, linear and nonlinear programming, associative memory, pattern recognition, and computer vision. Taking into account the finite switching speed of amplifiers in the implementation of neural networks, time delays are inevitable, which motivates an important extension of the model, namely delayed cellular neural networks (DCNNs).
On the other hand, it is noteworthy that, besides delay effects, stochastic, impulsive, and diffusion effects are also likely to be present in neural networks. Up to now, a large body of work [3–12] has addressed the dynamic behaviors of complex CNNs such as impulsive delayed reaction-diffusion CNNs and stochastic delayed reaction-diffusion CNNs.
In the current literature on complex CNNs, Lyapunov theory is the primary method for stability analysis. However, applying Lyapunov-based results to specific problems involves well-known difficulties, such as finding a suitable Lyapunov function or functional. New methods are therefore needed to overcome these difficulties.
Encouragingly, the fixed point theory has been successfully applied by Burton and other authors to the stability of deterministic systems, yielding a number of valid conclusions; see, for example, Burton's monograph and the papers [14–25]. This idea has further been developed for the stability of stochastic (delayed) differential equations and has proved effective for dynamical systems with delays and stochastic effects; see [26–32]. Specifically, in [27–29], Luo used the fixed point theory to study the exponential stability of mild solutions to stochastic partial differential equations with bounded delays and with infinite delays. In [30, 31, 33–35], Sakthivel et al. used the fixed point theory to investigate the pth moment asymptotic stability of mild solutions to nonlinear impulsive stochastic partial differential equations with bounded delays and with infinite delays. Luo also used the fixed point theory to study the exponential stability of stochastic Volterra-Levin equations.
The motivation of this paper is to examine the feasibility of using the fixed point theory for the stability analysis of complex CNNs, thereby enlarging the applications of the fixed point theory as well as enriching the stability theory of complex CNNs. Concretely, via the Banach contraction mapping principle, we study the mean-square global exponential stability of stochastic delayed CNNs. The Banach contraction mapping principle differs fundamentally from the Lyapunov method. By establishing a new inequality, we first construct a suitable Banach space and thereby investigate, in the mean-square sense, the existence and uniqueness of the solution and its global exponential stability simultaneously. The obtained results show that, for the stability analysis of complex CNNs, the fixed point theory does work and has its own advantage, namely that no Lyapunov function is needed. The algebraic stability criteria finally presented are easily checked and require neither the differentiability nor the monotone decreasing behavior of the delays.
Let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a complete probability space equipped with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ satisfying the usual conditions; that is, the filtration is right continuous and $\mathcal{F}_0$ contains all $P$-null sets. Let $w(t) = (w_1(t), \ldots, w_n(t))^{T}$ denote a standard Brownian motion defined on this space. $\mathbb{R}^n$ stands for the $n$-dimensional Euclidean space and $|\cdot|$ represents the Euclidean norm. $C(X; Y)$ corresponds to the space of continuous mappings from the topological space $X$ to the topological space $Y$.
Consider the following stochastic cellular neural network with time-varying delays:
$$dx_i(t) = \Big[-c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} g_j\big(x_j(t-\tau_j(t))\big)\Big]\,dt + \sum_{j=1}^{n} \sigma_{ij}\big(x_j(t), x_j(t-\tau_j(t))\big)\,dw_j(t), \quad t \ge 0, \tag{1}$$
where $i = 1, \ldots, n$ and $n$ is the number of the neurons in the neural network. $x_i(t)$ corresponds to the state of the $i$th neuron at time $t$. $f_j(x_j(t))$ is the activation function of the $j$th neuron at time $t$ and $g_j(x_j(t-\tau_j(t)))$ is the activation function of the $j$th neuron at time $t-\tau_j(t)$. The constant $a_{ij}$ represents the connection weight of the $j$th neuron on the $i$th neuron at time $t$, and the constant $c_i > 0$ represents the rate with which the $i$th neuron will reset its potential to the resting state when disconnected from the network and external inputs. The constant $b_{ij}$ represents the connection strength of the $j$th neuron on the $i$th neuron at time $t-\tau_j(t)$, where $\tau_j(t)$ corresponds to the transmission delay along the axon of the $j$th neuron and satisfies $0 \le \tau_j(t) \le \tau$. $\sigma_{ij}$ denotes the diffusion coefficient. The initial condition is
$$x_i(s) = \xi_i(s), \quad s \in [-\tau, 0],\ i = 1, \ldots, n, \tag{2}$$
where $\xi_i \in C([-\tau, 0]; \mathbb{R})$.
Definition 1. Equation (1) is said to be globally exponentially stable in mean square if, for any initial condition $\xi$, there exists a pair of positive constants $M$ and $\lambda$ such that
$$E|x(t)|^{2} \le M \sup_{s \in [-\tau, 0]} E|\xi(s)|^{2}\, e^{-\lambda t}, \quad t \ge 0.$$
Lemma 2. Assume that , where denotes the space consisting of functions satisfying ; then holds for and .
Proof. As we derive . Hence, it is easy to see
Lemma 3 (see ). Assume that , where denotes the space consisting of functions satisfying , ; then
Remark 4. In Lemma 3, it is derived from letting that
The consideration of this paper is based on the following fixed point theorem.
Theorem 5 (see ). Let $P$ be a contraction operator on a complete metric space $\mathcal{B}$; then there exists a unique point $x^{*} \in \mathcal{B}$ for which $Px^{*} = x^{*}$.
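As a concrete numerical illustration of Theorem 5 (not part of the paper's argument), consider the map $x \mapsto \cos x$. It maps $[0, 1]$ into itself and $\sup_{x \in [0,1]} |\sin x| = \sin 1 < 1$, so it is a contraction there, and Picard iteration converges to its unique fixed point:

```python
import math

def picard_iterate(g, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{k+1} = g(x_k) until successive iterates differ by < tol."""
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# cos maps [0, 1] into [cos 1, 1] and |cos'(x)| = |sin x| <= sin 1 < 1 there,
# so the Banach contraction principle guarantees a unique fixed point.
fp = picard_iterate(math.cos, 0.5)
print(fp)  # the unique solution of x = cos x, approximately 0.739085
```

The convergence rate is governed by the contraction constant $\sin 1 \approx 0.84$, so the iteration settles to twelve digits in well under a thousand steps.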
3. Main Results
In this section, we discuss, by means of the contraction mapping principle stated in Theorem 5, the existence and uniqueness as well as the mean-square global exponential stability of the solution to (1)-(2). Before proceeding, we introduce the following assumptions.
(A1) There exist nonnegative constants $l_j$ such that, for any $u, v \in \mathbb{R}$, $|f_j(u) - f_j(v)| \le l_j |u - v|$, $j = 1, \ldots, n$.
(A2) There exist nonnegative constants $k_j$ such that, for any $u, v \in \mathbb{R}$, $|g_j(u) - g_j(v)| \le k_j |u - v|$, $j = 1, \ldots, n$.
(A3) There exist nonnegative constants $\mu_{ij}$ and $\nu_{ij}$ such that, for any $u_1, u_2, v_1, v_2 \in \mathbb{R}$, $|\sigma_{ij}(u_1, v_1) - \sigma_{ij}(u_2, v_2)|^{2} \le \mu_{ij} |u_1 - u_2|^{2} + \nu_{ij} |v_1 - v_2|^{2}$, $i, j = 1, \ldots, n$.
Let $\mathcal{B}$ denote the space consisting of $\mathcal{F}_t$-adapted processes $x(t) = (x_1(t), \ldots, x_n(t))^{T}$ which satisfy the following:
(1) $x(t)$ is continuous in mean square on $[0, \infty)$;
(2) $x(s) = \xi(s)$ on $[-\tau, 0]$;
(3) $e^{\alpha t} E|x(t)|^{2} \to 0$ as $t \to \infty$, where $\alpha$ is a suitably chosen positive constant; here $\xi$ is defined as shown in (2).
From Lemma 2, we equip $\mathcal{B}$ with a norm under which $\mathcal{B}$ is also a complete metric space.
Theorem 6. Assume that conditions (A1)–(A3) hold. If there exist constants , such that , where then (1) is globally exponentially stable in mean square.
Proof. By the Itô formula, we compute the differential of $e^{c_i t} x_i(t)$ along the solution of (1)-(2):
$$d\big(e^{c_i t} x_i(t)\big) = e^{c_i t}\Big[\sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} g_j\big(x_j(t-\tau_j(t))\big)\Big]dt + e^{c_i t} \sum_{j=1}^{n} \sigma_{ij}\big(x_j(t), x_j(t-\tau_j(t))\big)\,dw_j(t),$$
which yields, after integrating from $0$ to $t$,
$$x_i(t) = \xi_i(0) e^{-c_i t} + \int_0^t e^{-c_i (t-s)} \Big[\sum_{j=1}^{n} a_{ij} f_j(x_j(s)) + \sum_{j=1}^{n} b_{ij} g_j\big(x_j(s-\tau_j(s))\big)\Big]\,ds + \int_0^t e^{-c_i (t-s)} \sum_{j=1}^{n} \sigma_{ij}\big(x_j(s), x_j(s-\tau_j(s))\big)\,dw_j(s). \tag{14}$$
Note $x_i(0) = \xi_i(0)$ in (14); hence we define the following operator $P$ acting on $\mathcal{B}$: for $x \in \mathcal{B}$, $Px$ obeys the following rules: $(Px)(t) = \xi(t)$ on $[-\tau, 0]$ and, for $t \ge 0$, $(Px)_i(t)$ is given by the right-hand side of (14), $i = 1, \ldots, n$.
Now we will, by applying the contraction mapping principle, prove the existence and uniqueness as well as the global exponential stability of the solution to (1)-(2) in the mean-square sense. The proof is divided into two steps.
Step 1. We need to prove $P(\mathcal{B}) \subseteq \mathcal{B}$. To this end, it is necessary to show that $Px$ is continuous in mean square on $[0, \infty)$ and that condition (3) holds for $Px$ whenever $x \in \mathcal{B}$. First, in light of the expression of $(Px)(t)$, we estimate $E|(Px)(t_1 + r) - (Px)(t_1)|^{2}$ for a fixed time $t_1 \ge 0$ and small $r > 0$. Each of the terms arising from the decomposition of this quantity tends to zero as $r \to 0$; in particular, the stochastic-integral term is handled by Lemma 3. Thus $Px$ is continuous in mean square on $[0, \infty)$, and, owing to the continuity of $\xi$, we conclude that $Px$ is continuous in mean square on $[-\tau, \infty)$ for $x \in \mathcal{B}$.
Next, we will prove that $e^{\alpha t} E|(Px)(t)|^{2} \to 0$ as $t \to \infty$ for $x \in \mathcal{B}$. From (A1), the integral term involving the $f_j$ is bounded via the Lipschitz constants $l_j$ and tends to zero. In addition, from (A2), the delayed integral term involving the $g_j$ is bounded via the constants $k_j$ and likewise tends to zero. Moreover, from (A3) and Lemma 3, the stochastic-integral term also tends to zero. It then follows from (21)–(25) that $e^{\alpha t} E|(Px)(t)|^{2} \to 0$ as $t \to \infty$ for $x \in \mathcal{B}$. We therefore conclude $P(\mathcal{B}) \subseteq \mathcal{B}$.
Step 2. We need to prove that $P$ is contractive. For $x, y \in \mathcal{B}$, we estimate $E|(Px)(t) - (Py)(t)|^{2}$ for $t \ge 0$. By (A1)–(A3), each of the three difference terms (the term involving the $f_j$, the delayed term involving the $g_j$, and the stochastic integral) is bounded by a constant multiple of $\sup_{t \ge 0} E|x(t) - y(t)|^{2}$. Summing these bounds results in
$$\sup_{t \ge 0} E|(Px)(t) - (Py)(t)|^{2} \le k \sup_{t \ge 0} E|x(t) - y(t)|^{2},$$
where $k < 1$ under the hypotheses of Theorem 6. Hence $P$ is a contraction mapping, and there exists a unique fixed point $x^{*}$ of $P$ in $\mathcal{B}$, which means $x^{*}$ is the solution of (1)-(2) and $e^{\alpha t} E|x^{*}(t)|^{2} \to 0$ as $t \to \infty$. This completes the proof.
Remark 7. The main idea of this proof is based on the fixed point theory rather than the Lyapunov method. By using the Banach contraction mapping principle, with no need for Lyapunov functions, we simultaneously establish the existence and uniqueness as well as the global exponential stability of the solution to (1)-(2) in the mean-square sense, which the Lyapunov method cannot do in one step.
Lemma 8. Assume that conditions (A1)–(A3) hold. If , where , then (1) is globally exponentially stable in mean square.
Remark 9. The obtained algebraic stability criteria are easily checked and do not even require the differentiability of the delays, let alone the monotone decreasing behavior of the delays, which is necessary in some related works.
Consider the following two-dimensional stochastic cellular neural network with time-varying delays: with the initial conditions , on , where , , , , , , , . It is easy to see that and . We compute, for , which yields . From Lemma 8, we conclude this two-dimensional stochastic cellular neural network with time-varying delays is mean-square globally exponentially stable.
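The concrete coefficient values of the example above did not survive conversion, so the following Euler–Maruyama simulation uses hypothetical parameters (strong self-feedback rates $c_i = 6$, small `tanh`-type connection weights, a constant delay as a special case of $\tau_j(t)$, and a Lipschitz diffusion vanishing at the origin). These are illustrative choices under which a dominance condition of the kind in Lemma 8 plausibly holds; the script is a sketch of how mean-square exponential decay can be observed numerically, not a reproduction of the paper's example:

```python
import numpy as np

# Hypothetical two-dimensional stochastic delayed CNN; all values below are
# illustrative assumptions, not the paper's (lost) parameters.
rng = np.random.default_rng(0)

c = np.array([6.0, 6.0])                  # self-feedback (reset) rates c_i
A = np.array([[0.5, -0.3], [0.2, 0.4]])   # instantaneous weights a_ij
B = np.array([[0.3, 0.1], [-0.2, 0.3]])   # delayed weights b_ij
sigma = 0.2                               # noise intensity scale
tau = 0.5                                 # constant delay (special case of tau_j(t))
f = np.tanh                               # activations f_j = g_j, Lipschitz with l_j = k_j = 1

T, dt = 10.0, 1e-3
n_steps = int(T / dt)
d = int(tau / dt)                         # delay expressed in time steps
n_paths = 200                             # Monte Carlo sample paths

# Constant initial function xi = (1, -1) on [-tau, 0].
x = np.tile(np.array([1.0, -1.0]), (n_paths, 1))
hist = [x.copy() for _ in range(d + 1)]   # rolling buffer of past states

ms = []                                   # estimates of E|x(t)|^2 along the way
for _ in range(n_steps):
    x_del = hist[0]                       # x(t - tau)
    drift = -x * c + f(x) @ A.T + f(x_del) @ B.T
    dW = rng.normal(0.0, np.sqrt(dt), size=x.shape)
    diff_coeff = sigma * (x + x_del)      # diffusion satisfying (A3), zero at the origin
    x = x + drift * dt + diff_coeff * dW  # Euler-Maruyama step
    hist.append(x.copy())
    hist.pop(0)
    ms.append(np.mean(np.sum(x**2, axis=1)))

print(f"E|x|^2 near t=0: {ms[0]:.3f}, at t=T: {ms[-1]:.3e}")
```

With the self-feedback dominating the connection weights, the sample mean-square norm collapses by many orders of magnitude over the horizon, consistent with mean-square exponential stability.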
The main contribution of this work is confirming the feasibility of utilizing the fixed point theory for the stability analysis of complex CNNs, thereby enlarging the applications of the fixed point theory as well as enriching the stability theory of complex CNNs. Specifically, by the Banach contraction mapping principle, with no need for Lyapunov functions, we prove the existence and uniqueness as well as the global exponential stability of the solution to stochastic delayed neural networks simultaneously, which the Lyapunov method cannot do. The derived algebraic stability criteria are novel, easily checked, and do not require the differentiability of the delays. As is well known, the fixed point theory has various forms, for example, Krasnoselskii's fixed point theorem. Considering that many mathematical models can be decomposed into a linear part and nonlinear parts, our future work will explore the application of Krasnoselskii's fixed point theorem to the stability analysis of complex CNNs.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work is supported by the National Natural Science Foundation of China under Grant nos. 71171116 and 60904028, Humanities and Social Sciences Foundation of Ministry of Education of China under Grant no. 09YJC630129, Project Funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions, and “China’s Manufacturing Industry Development Academy”—Key Philosophy and Social Science Research Center of University in Jiangsu Province.
T. A. Burton, Stability by Fixed Point Theory for Functional Differential Equations, Dover, New York, NY, USA, 2006.
K. Mathiyalagan, R. Sakthivel, and S. M. Anthoni, “New robust exponential stability results for discrete-time switched fuzzy neural networks with time delays,” Computers & Mathematics with Applications, vol. 64, no. 9, pp. 2926–2938, 2012.
S. Hu, C. Huang, and F. Wu, Stochastic Differential Equations, Science Press, Beijing, China, 2008.
D. R. Smart, Fixed Point Theorems, Cambridge University Press, Cambridge, UK, 1980.