Abstract

Exponential stability in mean square of stochastic delay recurrent neural networks is investigated in detail. By using Itô's formula and inequality techniques, sufficient conditions guaranteeing the exponential stability in mean square of an equilibrium are given. Under the conditions which guarantee the stability of the analytical solution, the Euler-Maruyama scheme and the split-step backward Euler scheme are proved to be mean-square stable. Finally, an example is given to demonstrate our results.

1. Introduction

It is well known that neural networks have a wide range of applications in many fields, such as signal processing, pattern recognition, associative memory, and optimization problems. Stability is one of the main properties of neural networks and a precondition for their design and application. Time delays are unavoidable in neural network systems and are frequently an important source of poor performance or instability. Thus, the stability analysis of neural networks with various delays has been extensively investigated; see [1–10].

In real nervous systems, synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes [11]. Hence, noise should be taken into consideration in modeling. Recently, some sufficient conditions for the exponential stability of stochastic delay neural networks have been presented in [12–18]. As with stochastic delay differential equations, most stochastic delay neural networks do not have explicit solutions. Most existing research on the stability analysis of the equilibrium point has focused on finding an appropriate Lyapunov function or functional. However, there is no generally effective method for constructing such a Lyapunov function or functional. It is therefore very useful to establish numerical methods for studying the properties of stochastic delay neural networks. There are many papers concerned with the stability of numerical solutions for stochastic delay differential equations ([19–29] and the references therein), but there has been little work on the exponential stability of numerical methods for stochastic delay neural networks. To the best of the authors' knowledge, only [30–32] studied the exponential stability of numerical methods for stochastic delay Hopfield neural networks. The stability of numerical methods for stochastic delay recurrent neural networks remains open, which motivates this paper. The main aim of the paper is to investigate the mean-square stability (MS stability) of the Euler-Maruyama (EM) method and the split-step backward Euler (SSBE) method for stochastic delay recurrent neural networks.

The remainder of the paper is organized as follows. Some notation and the conditions for the stability of the analytical solution are given in Section 2. The MS stability of the EM method and the SSBE method is proved in Sections 3 and 4, respectively. In Section 5, an example is provided to illustrate the effectiveness of our theory.

2. Model Description and Analysis of Analytical Solution

Throughout the paper, unless otherwise specified, we employ the following notation. Let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a complete probability space with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ satisfying the usual conditions (i.e., it is increasing and right continuous, while $\mathcal{F}_0$ contains all $P$-null sets), and let $\mathbb{E}$ denote the expectation operator with respect to the probability measure. Let $|\cdot|$ denote the Euclidean norm of a vector or the spectral norm of a matrix. Let $\tau > 0$, and let $C([-\tau, 0]; \mathbb{R}^n)$ denote the family of continuous functions $\varphi$ from $[-\tau, 0]$ to $\mathbb{R}^n$ with the norm $\|\varphi\| = \sup_{-\tau \le \theta \le 0} |\varphi(\theta)|$. Denote by $C^b_{\mathcal{F}_0}([-\tau, 0]; \mathbb{R}^n)$ the family of all bounded, $\mathcal{F}_0$-measurable, $C([-\tau, 0]; \mathbb{R}^n)$-valued random variables. We assume $w(t)$ to be a standard Brownian motion defined on the probability space.

Consider the stochastic delay recurrent neural network of the form
$$dx_i(t) = \Big[ -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(x_j(t - \tau)) \Big]\,dt + \sum_{j=1}^{n} \sigma_{ij}\big(x_j(t), x_j(t - \tau)\big)\,dw_j(t), \quad i = 1, 2, \ldots, n. \tag{1}$$

Model (1) can be rewritten in the following matrix-vector form:
$$dx(t) = \big[ -C x(t) + A f(x(t)) + B g(x(t - \tau)) \big]\,dt + \sigma\big(x(t), x(t - \tau)\big)\,dw(t), \tag{2}$$
where $x(t) = (x_1(t), \ldots, x_n(t))^T$ is the state vector associated with the $n$ neurons; $C = \mathrm{diag}(c_1, \ldots, c_n)$ with $c_i > 0$ represents the rate with which neuron $i$ will reset its potential to the resting state in isolation when disconnected from the network and the external stochastic perturbation; $A = (a_{ij})_{n \times n}$ and $B = (b_{ij})_{n \times n}$ denote the connection weight matrix and the delayed connection weight matrix, respectively; $f(x(t)) = (f_1(x_1(t)), \ldots, f_n(x_n(t)))^T$ and $g(x(t - \tau)) = (g_1(x_1(t - \tau)), \ldots, g_n(x_n(t - \tau)))^T$ are activation functions, where $\tau > 0$ is the transmission delay; the initial data is $x(t) = \xi(t)$ on $t \in [-\tau, 0]$ with $\xi \in C^b_{\mathcal{F}_0}([-\tau, 0]; \mathbb{R}^n)$. Moreover, $w(t) = (w_1(t), \ldots, w_n(t))^T$ is an $n$-dimensional Brownian motion defined on the complete probability space $(\Omega, \mathcal{F}, P)$, and $\sigma = (\sigma_{ij})_{n \times n}$, $\sigma : \mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}^{n \times n}$, is the diffusion coefficient matrix.
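For readers who want to experiment numerically later on, the following is a minimal Python sketch of the drift and diffusion of (2). All concrete values here (the matrices `C`, `A`, `B`, the delay `tau`, and the diagonal diffusion) are hypothetical choices of ours, used only for illustration; the paper keeps these data general, subject to the hypotheses (H1)–(H3) below.

```python
import numpy as np

# Hypothetical two-neuron instance of model (2), for illustration only;
# the paper keeps C, A, B, and sigma general, subject to (H1)-(H3).
C = np.diag([2.0, 2.0])                  # decay rates c_i > 0
A = np.array([[0.1, -0.2], [0.3, 0.1]])  # connection weight matrix
B = np.array([[0.2, 0.1], [-0.1, 0.2]])  # delayed connection weight matrix
tau = 1.0                                # transmission delay

f = np.tanh  # activation f: f(0) = 0, Lipschitz with alpha_j = 1
g = np.tanh  # delayed activation g: g(0) = 0, Lipschitz with beta_j = 1

def drift(x, x_tau):
    """Drift of (2): -C x(t) + A f(x(t)) + B g(x(t - tau))."""
    return -C @ x + A @ f(x) + B @ g(x_tau)

def diffusion(x, x_tau):
    """A diagonal diffusion sigma(x(t), x(t - tau)) with sigma(0, 0) = 0;
    here trace(sigma^T sigma) <= sum_i (0.02 x_i^2 + 0.02 x_tau_i^2),
    so a bound of the type required in (H3) holds."""
    return np.diag(0.1 * x + 0.1 * x_tau)
```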

To obtain our results, we impose the following standing hypotheses.

(H1) $f(0) = 0$, $g(0) = 0$, and $\sigma(0, 0) = 0$.

(H2) Both $f$ and $g$ satisfy the Lipschitz condition; that is, for each $j = 1, 2, \ldots, n$, there exist constants $\alpha_j > 0$ and $\beta_j > 0$ such that
$$|f_j(u) - f_j(v)| \le \alpha_j |u - v|, \qquad |g_j(u) - g_j(v)| \le \beta_j |u - v| \quad \text{for all } u, v \in \mathbb{R}.$$

(H3) $\sigma$ satisfies the Lipschitz condition, and there are nonnegative constants $\mu_i$ and $\nu_i$ such that
$$\mathrm{trace}\big[\sigma^T(x, y)\,\sigma(x, y)\big] \le \sum_{i=1}^{n} \big(\mu_i x_i^2 + \nu_i y_i^2\big) \quad \text{for all } x, y \in \mathbb{R}^n.$$

It follows from [33] that under the assumptions (H1)–(H3), system (1) or (2) has a unique strong solution $x(t; \xi)$, which is a measurable, sample-continuous, and $\mathcal{F}_t$-adapted process. Clearly, (2) admits the trivial solution $x(t) \equiv 0$.

Definition 1. The trivial solution of system (1) or system (2) is said to be exponentially stable in mean square if there exists a pair of positive constants $\lambda$ and $M$ such that
$$\mathbb{E}|x(t; \xi)|^2 \le M \mathbb{E}\|\xi\|^2 e^{-\lambda t}, \quad t \ge 0,$$
holds for any $\xi \in C^b_{\mathcal{F}_0}([-\tau, 0]; \mathbb{R}^n)$. In this case
$$\limsup_{t \to \infty} \frac{1}{t} \log\big(\mathbb{E}|x(t; \xi)|^2\big) \le -\lambda.$$

Using Itô's formula and the nonnegative semimartingale convergence theorem, [12, 14] discussed the exponential stability of stochastic delayed neural networks. Employing the variation-of-parameters method and inequality techniques, several sufficient conditions ensuring $p$th moment exponential stability of stochastic delayed recurrent neural networks were derived in [17]. With the help of a Lyapunov function and a Halanay-type inequality, a set of novel sufficient conditions for the mean-square exponential stability of stochastic recurrent neural networks with time-varying delays was established in [18]. In this paper, we give a new sufficient condition guaranteeing the exponential stability in mean square of the stochastic delayed recurrent neural network (1) by using Itô's formula and inequality techniques.

Theorem 2. Suppose that (1) satisfies (H1)–(H3) and that the following condition holds.

(H4) For $i = 1, 2, \ldots, n$,
Then (1) is exponentially stable in mean square.

Proof. By (H4), there exists a sufficiently small positive constant $\varepsilon$ such that
Set ; applying Itô’s formula to along (2), we obtain where
Notice that
Therefore, we have
Notice that , so we can obtain from the previous inequality which implies

Corollary 3. Suppose that (1) satisfies (H1)–(H3) and that the following condition holds.

(H5) For $i = 1, 2, \ldots, n$,
Then (1) is exponentially stable in mean square.

Proof. The constants involved are nonnegative, so condition (H5) implies condition (H4). Corollary 3 therefore follows directly from Theorem 2.

3. Stability of EM Numerical Solution

Let $\Delta t$ and $\Delta w_k = w(t_{k+1}) - w(t_k)$ denote the increments of the time and of the Brownian motion, respectively. For system (1), the discrete EM approximate solution is defined by
$$X_{k+1} = X_k + \big[ -C X_k + A f(X_k) + B g(X_{k-m}) \big] \Delta t + \sigma(X_k, X_{k-m})\, \Delta w_k, \tag{16}$$
where $\Delta t > 0$ is a stepsize which satisfies $\tau = m \Delta t$ for a positive integer $m$, and $X_k$ is an approximation to $x(t_k)$ at $t_k = k \Delta t$; if $t_k \le 0$, we have $X_k = \xi(t_k)$. We assume that $X_k$ is $\mathcal{F}_{t_k}$-measurable at the mesh points $t_k$.
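As an illustration, a compact sketch of the EM recursion (16) is given below; the function `em_path`, its history-buffer layout, and the argument names are our own conventions for this sketch, not part of the paper.

```python
import numpy as np

def em_path(drift, diffusion, xi, tau, m, n_steps, rng):
    """Simulate the EM scheme (16):
    X_{k+1} = X_k + drift(X_k, X_{k-m}) dt + diffusion(X_k, X_{k-m}) dW_k,
    with dt = tau / m and X_k = xi(t_k) for mesh points t_k <= 0."""
    dt = tau / m
    # history buffer X_{-m}, ..., X_0 filled from the initial segment xi
    X = [np.asarray(xi(-tau + j * dt), dtype=float) for j in range(m + 1)]
    for _ in range(n_steps):
        x, x_tau = X[-1], X[-(m + 1)]        # X_k and the delayed X_{k-m}
        dW = rng.normal(0.0, np.sqrt(dt), size=x.shape)
        X.append(x + drift(x, x_tau) * dt + diffusion(x, x_tau) @ dW)
    return np.array(X)

# e.g., with the hypothetical drift/diffusion sketched in Section 2:
# path = em_path(drift, diffusion, lambda t: np.array([0.5, -0.5]),
#                tau=1.0, m=10, n_steps=200, rng=np.random.default_rng(0))
```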

Suppose that the following condition is satisfied.

(H6)

Definition 4. A numerical method is said to be mean-square stable (MS stable) if there exists a $\Delta t_0 > 0$ such that any application of the method to (1) generates numerical approximations $X_k$ which satisfy $\lim_{k \to \infty} \mathbb{E}|X_k|^2 = 0$ for all $\Delta t \in (0, \Delta t_0)$ with $\tau = m \Delta t$.
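Definition 4 also suggests a simple empirical check: estimate $\mathbb{E}|X_k|^2$ by averaging $|X_k|^2$ over many independent simulated paths and observe whether the estimate decays toward zero. A minimal sketch, assuming a path simulator such as the hypothetical `em_path` above:

```python
import numpy as np

def ms_estimate(simulate_path, n_paths, seed=0):
    """Monte Carlo estimate of E|X_k|^2 along the mesh: average |X_k|^2
    over n_paths independent paths; decay toward 0 as k grows is the
    numerical signature of MS stability in the sense of Definition 4."""
    rng = np.random.default_rng(seed)
    acc = None
    for _ in range(n_paths):
        path = simulate_path(rng)           # array of shape (K + 1, n)
        sq = np.sum(path ** 2, axis=1)      # |X_k|^2 at each mesh point
        acc = sq if acc is None else acc + sq
    return acc / n_paths

# e.g., ms_estimate(lambda rng: em_path(drift, diffusion,
#         lambda t: np.array([0.5, -0.5]), 1.0, 10, 200, rng), n_paths=1000)
```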

Now we analyze the stability of the EM numerical solution.

Theorem 5. Under conditions (H1)–(H3), (H5), and (H6), the EM method applied to (1) is MS stable with and , where

Proof. From (16), we have
Squaring both sides of the previous equality, we obtain
Noting that $\mathbb{E}[\Delta w_k] = 0$ and $\mathbb{E}[\Delta w_k \Delta w_k^T] = \Delta t\, I$, that $\Delta w_k$ is independent of $\mathcal{F}_{t_k}$, and that $X_k$ and $X_{k-m}$ are $\mathcal{F}_{t_k}$-measurable, we hence have
Let . Applying the inequalities and conditions (H2) and (H3), we obtain from (21) and (20)
Thus where
Then
By the recursion we conclude that if which is equivalent to
If , (27) reduces to
By conditions (H5) and (H6), we know that . Thus, (28) holds for . Let ; then . That is, the EM method applied to (1) is MS stable. The proof is complete.
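The recursion step at the heart of this proof can be seen in a toy computation: a scalar sequence dominated by $y_{k+1} = P y_k + Q \max_{k-m \le i \le k} y_i$ decays to zero whenever $P + Q < 1$, which mirrors the decay of $\mathbb{E}|X_k|^2$. The constants `P`, `Q`, and `m` below are hypothetical stand-ins for the quantities defined in the proof, chosen only to make the mechanism visible.

```python
# Toy illustration of the recursion used in the proof: with hypothetical
# constants satisfying P + Q < 1, the delayed recursion is forced to 0.
P, Q, m = 0.6, 0.3, 10
y = [1.0] * (m + 1)                  # worst-case constant initial segment
for _ in range(300):
    y.append(P * y[-1] + Q * max(y[-(m + 1):]))
print(y[-1])                         # small; decays since P + Q = 0.9 < 1
```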

Theorem 6. For , and , there exists a positive constant such that where depends on and so on, but not upon ; is defined in (16).

The proof is similar to that of Theorem 7 in [20].

4. Stability of SSBE Numerical Solution

In this section, we construct the SSBE scheme for (1) and analyze the stability of the numerical solution. The adaptation of the SSBE method to (1) leads to a numerical process of the following type:
$$Y_k^* = Y_k + \big[ -C Y_k^* + A f(Y_k^*) + B g(Y_{k-m}) \big] \Delta t,$$
$$Y_{k+1} = Y_k^* + \sigma(Y_k^*, Y_{k-m})\, \Delta w_k. \tag{30}$$

The notation is the same as in the definition of (16); if $t_k \le 0$, we set $Y_k = \xi(t_k)$. We now present the other main results of this paper.
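Because the first stage of (30) is implicit in $Y_k^*$, an implementation must solve a nonlinear equation at each step; when $\Delta t$ times the drift's Lipschitz constant is below one, a plain fixed-point iteration suffices. The following sketch is ours (the function name, tolerance, and iteration cap are illustrative, not from the paper):

```python
import numpy as np

def ssbe_step(drift, diffusion, y, y_tau, dt, rng, tol=1e-12, max_iter=50):
    """One step of the SSBE scheme (30).
    Stage 1 (implicit): y_star = y + drift(y_star, y_tau) * dt, solved by
    fixed-point iteration (contractive once dt * Lipschitz constant < 1).
    Stage 2 (explicit): y_next = y_star + diffusion(y_star, y_tau) @ dW."""
    y_star = np.asarray(y, dtype=float)
    for _ in range(max_iter):
        y_new = y + drift(y_star, y_tau) * dt
        if np.linalg.norm(y_new - y_star) < tol:
            y_star = y_new
            break
        y_star = y_new
    dW = rng.normal(0.0, np.sqrt(dt), size=y_star.shape)
    return y_star + diffusion(y_star, y_tau) @ dW
```

For larger stepsizes the plain iteration may diverge; rearranging the stage as $(I + C \Delta t) Y_k^* = Y_k + [A f(Y_k^*) + B g(Y_{k-m})] \Delta t$ and inverting the linear part restores the contraction, as in the experiment script of Section 5.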

Theorem 7. Assume that (H1)–(H3), (H5), and (H6) hold. Define
Then the SSBE method applied to (1) is MS stable with and , where .

Proof. From (30), we have
Squaring both sides of (32), we obtain
It follows from inequality and (H2) that
Letting , we have
On the other hand, from (30), we obtain
Noting that $\mathbb{E}[\Delta w_k] = 0$ and that $Y_k^*$ and $Y_{k-m}$ are $\mathcal{F}_{t_k}$-measurable, from (36) and (H3) we have
Substituting (35) into (37), we obtain where
Then
By the recursion we conclude that if which is equivalent to , where
Since , by (H3), (H5), and (H6), we have . This implies that
Thus, (41) holds for . Let ; then . That is, the SSBE method applied to (1) is MS stable. The proof of the theorem is complete.

Theorem 8. For , and , there exists a positive constant such that where depends on and so on, but not upon ; is defined in (30).

The proof is similar to that of Theorem 3.2 in [26].

5. Example

In this section, we discuss an example to illustrate our theory and compare the restrictions on the stepsize of the stable SSBE method with those of the EM method.

Example 1. Let $w(t)$ be a two-dimensional Brownian motion. Consider the following stochastic delay recurrent neural network:
Let , ,
It is obvious that , , , and , so (H1)–(H3) are satisfied. By computation, conditions (H4)–(H6) also hold. By Theorem 2, system (45) is exponentially stable in mean square. The EM scheme and the SSBE scheme for (45) are then MS stable by Theorems 5 and 7.
From Figure 1, we can conclude that the EM method and the SSBE method applied to (45) are MS stable with ; this verifies the validity of Theorems 5 and 7. From Figure 2, the EM method is not stable while the SSBE method is MS stable with , which shows that the stability of the SSBE method is superior to that of the EM method. Figure 3 illustrates that the SSBE method is unstable with .
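Since the figures are not reproduced here, an experiment in the same spirit can be assembled from the sketches above. The script below uses hypothetical coefficients of ours (not the data of (45)) with a strong decay rate, and contrasts the EM and SSBE estimates of $\mathbb{E}|X_k|^2$ at a small and a large stepsize; for these particular coefficients EM loses stability at the large stepsize while SSBE does not.

```python
import numpy as np

# Hypothetical stable system (not the coefficients of (45)):
tau = 1.0
C = np.diag([6.0, 6.0])
A = np.array([[0.2, -0.1], [0.1, 0.2]])
B = np.array([[0.1, 0.1], [-0.1, 0.1]])
f = g = np.tanh
drift = lambda x, xt: -C @ x + A @ f(x) + B @ g(xt)
diff_ = lambda x, xt: np.diag(0.1 * x + 0.1 * xt)

def em_step(x, xt, dt, rng):
    dW = rng.normal(0.0, np.sqrt(dt), size=x.shape)
    return x + drift(x, xt) * dt + diff_(x, xt) @ dW

def ssbe_step(x, xt, dt, rng):
    # Stage 1 of (30): the stiff linear part is inverted exactly, the
    # bounded nonlinearity is handled by fixed-point iteration.
    Minv = np.linalg.inv(np.eye(2) + C * dt)
    y = x
    for _ in range(50):
        y = Minv @ (x + (A @ f(y) + B @ g(xt)) * dt)
    dW = rng.normal(0.0, np.sqrt(dt), size=x.shape)
    return y + diff_(y, xt) @ dW

def final_ms(step, m, n_steps=200, n_paths=500):
    dt, acc = tau / m, 0.0
    for p in range(n_paths):
        rng = np.random.default_rng(p)
        hist = [np.array([1.0, -1.0])] * (m + 1)  # constant initial segment
        for _ in range(n_steps):
            hist.append(step(hist[-1], hist[-(m + 1)], dt, rng))
        acc += hist[-1] @ hist[-1]
    return acc / n_paths                          # estimate of E|X_K|^2

for m in (8, 2):                                  # dt = 1/8 (small), 1/2 (large)
    print(f"dt = {tau/m}: EM {final_ms(em_step, m):.3e}, "
          f"SSBE {final_ms(ssbe_step, m):.3e}")
```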

6. Conclusions

The model of a stochastic neural network can be viewed as a special kind of stochastic differential equation whose solution is hard to express explicitly. It not only has the characteristics of general stochastic differential equations but also has its own features: its stability is connected with the activation functions and the connection weight matrices. It is therefore necessary to discuss the stability of stochastic neural networks. Different from previous works on the exponential stability of stochastic neural networks, both the Lyapunov function method and two numerical methods are used here to study the stability of stochastic delay recurrent neural networks. Under the conditions which guarantee the stability of the analytical solution, the EM method and the SSBE method are proved to be MS stable provided the stepsize meets a certain limit. Other numerical methods for different types of stochastic delay neural networks can be analyzed in future work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (nos. 60904032, 61273126), the Natural Science Foundation of Guangdong Province (no. 10251064101000008), and the Fundamental Research Funds for the Central Universities (no. 2012ZM0059).