Abstract

The stability issue is investigated for a class of stochastic neural networks with time delays in the leakage terms. Different from the previous literature, we are concerned with almost sure stability. By using the LaSalle invariant principle of stochastic delay differential equations, Itô’s formula, and stochastic analysis theory, some novel sufficient conditions are derived to guarantee the almost sure stability of the equilibrium point. In particular, the weak infinitesimal operator of the Lyapunov functions in this paper is not required to be negative, a condition that is necessary in the traditional study of moment stability. Finally, two numerical examples and their simulations are provided to show the effectiveness of the theoretical results and demonstrate that time delays in the leakage terms do contribute to the stability of stochastic neural networks.

1. Introduction

During the past decades, a great deal of attention has been paid to investigating the dynamical behaviors of neural networks, such as stability, periodic oscillatory behavior, almost periodic oscillatory behavior, chaos, and bifurcation. In particular, the stability of neural networks is one of the most important topics, since many applications depend heavily on the stability of the equilibrium point. Therefore, a large number of works have appeared on the stability of the equilibrium point of various neural networks, such as Hopfield neural networks, cellular neural networks, recurrent neural networks, Cohen-Grossberg neural networks, and bidirectional associative memory (BAM) neural networks [1–9].

As is well known, time delay is one of the most significant phenomena occurring in many different fields such as biology, chemistry, economics, and communication networks. Moreover, it is inevitably encountered in both neural processing and signal transmission due to the limited bandwidth of neurons and amplifiers. However, the existence of time delays may cause oscillation, divergence, chaos, instability, or other poor performance in neural networks, which is usually harmful to their applications. Therefore, the stability analysis of neural networks with time delays has attracted much attention in the literature. The existing works on the stability of neural networks with time delays can be broadly classified into four categories: constant delays, time-varying delays, distributed delays, and mixed time delays.

It should be mentioned that a new class of delays, called leakage delays (also named time delays in the “forgetting” or leakage terms), was initially introduced by Gopalsamy [10] in the study of neural networks. In [10], Gopalsamy pointed out that leakage delays often have a tendency to destabilize neural networks and are very difficult to handle. Hence, investigating the stability of neural networks with leakage delays has been an interesting and challenging topic. It is inspiring that many interesting results on the stability of neural networks with leakage delays have been reported in the literature [11–17]. For example, Liu in [11] investigated the existence of a unique equilibrium and global exponential stability for a class of BAM neural networks with time-varying delays in the leakage terms by using the fixed point theorem and Lyapunov functional theory. By using a Lyapunov-Krasovskii functional having triple integral terms and a model transformation technique, Zhu et al. in [12] obtained some novel sufficient delay-dependent conditions to ensure global exponential stability in the mean square of impulsive BAM neural networks with both Markovian jump parameters and leakage delays. In [13], Wang et al. discussed the stability of recurrent neural networks with time delays in the leakage terms under impulsive perturbations. By applying a new stability lemma, Itô’s formula, a Lyapunov-Krasovskii functional, stochastic analysis theory, and matrix inequality techniques, Xie et al. in [15] studied exponential stability in the mean square for a class of stochastic neural networks with leakage delays and expectations in the coefficients.

On the other hand, noise disturbance is a major source of instability and poor performance in neural networks. Usually, many real nervous systems are affected by external perturbations which in many cases are of great uncertainty and hence may be treated as random. Just as Haykin pointed out, synaptic transmission can be regarded as a noisy process introduced by random fluctuations from the release of neurotransmitters and other probabilistic causes. Therefore, we should consider the effect of noise disturbances when studying the stability of neural networks. Generally speaking, neural networks with noise disturbances are called stochastic neural networks. Recently, there have appeared a large number of results on the stability of stochastic neural networks (see, e.g., [3–5, 7, 9, 13, 14]). Unfortunately, the criteria presented in [3–5, 7, 9, 13, 14] require the strict condition that the derivative of the considered Lyapunov-Krasovskii functional be negative; that is, $\mathcal{L}V(t, x_t) < 0$ for any $x_t \neq 0$, where $\mathcal{L}$ is a weak infinitesimal operator and $V$ is a positive Lyapunov-Krasovskii functional. However, $\mathcal{L}V(t, x_t)$ may not be negative in many real cases, and the criteria obtained in [3–5, 7, 9, 13, 14] then fail.

Motivated by the above discussion, in this paper we study the stability problem for a class of stochastic neural networks with time delays in the leakage terms. Different from the previous literature, we aim to remove the restriction that $\mathcal{L}V(t, x_t) < 0$ for any $x_t \neq 0$. By using the LaSalle invariant principle of stochastic delay differential equations, Itô’s formula, and stochastic analysis theory, some novel sufficient conditions are derived to guarantee the almost sure stability of the equilibrium point. Moreover, two numerical examples and their simulations are provided to show the effectiveness of the theoretical results and demonstrate that time delays in the leakage terms do contribute to the stability of stochastic neural networks.

The remainder of this paper is organized as follows. In Section 2, we introduce the model of a class of stochastic neural networks with time delays in the leakage terms and present the definition of almost sure stability as well as some necessary assumptions. By means of the LaSalle invariant principle of stochastic delay differential equations, Itô’s formula, and stochastic analysis theory, our main results are established in Section 3. In Section 4, two numerical examples are given to show the effectiveness of the obtained results. Finally, in Section 5, the paper is concluded with some general remarks.

Notation 1. The notations used in this paper are quite standard. $\mathbb{R}^n$ and $\mathbb{R}^{n \times m}$ denote the $n$-dimensional Euclidean space and the set of all $n \times m$ real matrices, respectively. The superscript “$T$” denotes the transpose of a matrix or vector, and the symbol “$*$” denotes the symmetric term of a matrix. $\operatorname{trace}(\cdot)$ denotes the trace of the corresponding matrix, and $I$ denotes the identity matrix with compatible dimensions. For any matrix $A$, $\lambda_{\max}(A)$ (resp., $\lambda_{\min}(A)$) denotes the largest (resp., smallest) eigenvalue of $A$. For square matrices $A$ and $B$, the notation $A > B$ (resp., $A \ge B$, $A < B$, $A \le B$) means that $A - B$ is a positive definite (resp., positive semidefinite, negative definite, negative semidefinite) matrix. Let $w(t)$ be an $m$-dimensional Brownian motion defined on a complete probability space $(\Omega, \mathcal{F}, \mathbb{P})$ with a natural filtration $\{\mathcal{F}_t\}_{t \ge 0}$. Also, let $\tau > 0$ and let $C([-\tau, 0]; \mathbb{R}^n)$ denote the family of continuous functions $\varphi$ from $[-\tau, 0]$ to $\mathbb{R}^n$ with the uniform norm $\|\varphi\| = \sup_{-\tau \le s \le 0} |\varphi(s)|$. Denote by $C^b_{\mathcal{F}_0}([-\tau, 0]; \mathbb{R}^n)$ the family of all $\mathcal{F}_0$-measurable, $C([-\tau, 0]; \mathbb{R}^n)$-valued stochastic variables $\xi$ such that $\mathbb{E}\|\xi\|^2 < \infty$, where $\mathbb{E}$ stands for the expectation operator with respect to the given probability measure $\mathbb{P}$.

2. Model Description and Problem Formulation

In this paper, we consider a class of neural networks with mixed time delays, described by the following integrodifferential equations: where is the state vector associated with the neurons, and the diagonal matrix has positive entries. The matrices , , , and are the connection weight matrix, the constant delay connection weight matrix, the time-varying delay connection weight matrix, and the distributed delay connection weight matrix, respectively. , , and are the neuron activation functions. The noise perturbation is a Borel measurable function, and denotes the leakage delay; and are constant delays.
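The display equation defining system (1) did not survive extraction. A hedged reconstruction of the typical form of such a model is given below; the matrix symbols $A, B, C, D, E$, the activations $f, g, h$, the leakage delay $\delta$, and the delays $\tau_1, \tau(t), \tau_2$ are our illustrative labels for the quantities described above, not necessarily the paper's notation:

```latex
\mathrm{d}x(t) = \Big[-A\,x(t-\delta) + B f\big(x(t)\big) + C g\big(x(t-\tau_1)\big)
    + D g\big(x(t-\tau(t))\big) + E \int_{t-\tau_2}^{t} h\big(x(s)\big)\,\mathrm{d}s\Big]\mathrm{d}t
    + \sigma\big(t,\, x(t),\, x(t-\delta),\, x(t-\tau(t))\big)\,\mathrm{d}w(t)
```

Note that the leakage delay $\delta$ enters the stabilizing (leakage) term $-A\,x(t-\delta)$, which is precisely what makes leakage delays difficult to handle.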

Throughout this paper, the following assumptions are assumed to hold.

Assumption H1. There exist diagonal matrices and , , satisfying for all , and .
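The inequality in Assumption H1 was lost in extraction. The standard sector condition imposed on activation functions in this literature, which matches the pairs of diagonal matrices the assumption names, reads (our reconstruction, with our symbols):

```latex
l_j^- \le \frac{f_j(u) - f_j(v)}{u - v} \le l_j^+ \quad \forall\, u, v \in \mathbb{R},\ u \neq v,\ j = 1, \dots, n,
```

where $L^- = \operatorname{diag}(l_1^-, \dots, l_n^-)$ and $L^+ = \operatorname{diag}(l_1^+, \dots, l_n^+)$, with analogous bounds for the other activation functions.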

Assumption H2. There exist positive definite matrices , , and such that for all .
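The growth condition in Assumption H2 was also lost. The standard form, matching the three positive definite matrices the assumption names (our symbols $\Sigma_1, \Sigma_2, \Sigma_3$), is:

```latex
\operatorname{trace}\!\big[\sigma^{T}(t, x, y, z)\,\sigma(t, x, y, z)\big]
    \le x^{T}\Sigma_1 x + y^{T}\Sigma_2 y + z^{T}\Sigma_3 z
```

for all $t \ge 0$ and $x, y, z \in \mathbb{R}^n$.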

Assumption H3. Consider
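The content of Assumption H3 is missing; since the next paragraph states that system (1) admits a trivial solution under Assumptions H1–H3, H3 is presumably the usual zero-at-the-origin condition (our reconstruction):

```latex
f(0) = g(0) = h(0) = 0, \qquad \sigma(t, 0, 0, 0) \equiv 0 \quad \forall\, t \ge 0.
```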

Let denote the state trajectory from the initial data on in . Clearly, under Assumptions H1–H3, system (1) admits a trivial solution corresponding to the initial data . For simplicity, we write .

Now we give the concept of almost sure stability for system (1).

Definition 1. The equilibrium point of (1) is said to be almost surely stable if the solution tends to the equilibrium point with probability one for every initial condition, where “a.s.” denotes “almost surely.”
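The displayed condition of Definition 1 was lost in extraction. The standard statement of almost sure stability in this setting, written in our notation $x(t; \xi)$ for the solution starting from initial datum $\xi$, is:

```latex
\mathbb{P}\Big(\lim_{t \to \infty} x(t; \xi) = 0\Big) = 1 \quad \text{for every admissible initial datum } \xi.
```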

The following lemma is needed to prove our main results.

Lemma 2 (see [18]). For any positive definite matrix , a scalar , and a function such that the integrations concerned are well defined, the following inequality holds:
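The inequality of Lemma 2 was lost in extraction. It is very likely the standard Jensen-type integral inequality, stated here with our symbols ($M$ for the positive definite matrix, $\gamma$ for the scalar, $\omega$ for the function):

```latex
\left(\int_{0}^{\gamma} \omega(s)\,\mathrm{d}s\right)^{T} M \left(\int_{0}^{\gamma} \omega(s)\,\mathrm{d}s\right)
    \le \gamma \int_{0}^{\gamma} \omega^{T}(s)\, M\, \omega(s)\,\mathrm{d}s.
```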

3. Main Results and Proofs

In this section, the almost sure stability of the equilibrium point of system (1) is investigated under Assumptions H1–H3.

Theorem 3. Under Assumptions H1–H3, the equilibrium point of (1) is almost surely stable if there exist a positive scalar , positive diagonal matrices , and , and positive definite matrices , and such that the following linear matrix inequalities (LMIs) hold: where

Proof. Fixing arbitrarily and writing , we first define an infinitesimal generator of the Markov process acting on as follows: Let denote the family of all nonnegative functions on which are continuously twice differentiable in and differentiable in . If , then along the trajectory of system (1) we define an operator from to by where Now, let us consider the following Lyapunov-Krasovskii functional: Then, it follows from (1) and (11) that where . On the other hand, by Assumption H2 and condition (7), we obtain which together with (14) gives By employing Lemma 2, we have On the other hand, it follows from Assumptions H1 and H3 that Hence, by (14) and (16)–(20), we get where By conditions (7) and (8), we see that . Let . Then we claim that . This fact together with (20) yields Let and . It is obvious that for any . Therefore, by Definition 1 and the LaSalle invariant principle of stochastic differential delay equations (e.g., see Corollary 1 in [19]), we see that the zero solution of system (1) is almost surely stable. This completes the proof of Theorem 3.

Remark 4. If we ignore the effect of time delays in the leakage terms, then system (1) is reduced to the following:

Correspondingly, we revise Assumptions H2 and H3 as Assumptions H′2 and H′3, as follows.

Assumption H′2. There exist positive definite matrices , , and such that for all .

Assumption H′3. Consider and .

Under Assumptions H1, H′2, and H′3, we have the following result.

Theorem 5. Under Assumptions H1, H′2, and H′3, the equilibrium point of (24) is almost surely stable if there exist a positive scalar , positive diagonal matrices , and , and positive definite matrices , and such that the following linear matrix inequalities (LMIs) hold: where

Proof. Consider the following Lyapunov-Krasovskii functional: Similar to the proof of Theorem 3, we can obtain the desired result by a direct computation. The proof of Theorem 5 is completed.

Remark 6. Theorems 3 and 5 present some novel sufficient conditions ensuring the almost sure stability of the equilibrium point for a class of stochastic neural networks with or without time delays in the leakage terms, obtained by constructing different Lyapunov-Krasovskii functionals. These conditions are easy to verify and can be applied in practice, as they can be checked by recently developed algorithms for solving LMIs. It is worth pointing out that Theorem 3 depends on all the delay constants, including the leakage delay, whereas Theorem 5 depends only on the remaining delay constants. Therefore, Theorem 3 is less conservative than Theorem 5.
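The LMIs of Theorems 3 and 5 did not survive extraction, but the verification workflow described in Remark 6 can be sketched. The snippet below is our illustration, not the paper's conditions: it checks feasibility of the simplest Lyapunov LMI $A^{T}P + PA < 0$, $P > 0$, using SciPy in place of the Matlab LMI toolbox, by solving the corresponding Lyapunov equation and testing positive definiteness.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def lyapunov_lmi_feasible(A, tol=1e-9):
    """Check feasibility of the LMI A^T P + P A < 0, P > 0 by solving
    the Lyapunov equation A^T P + P A = -I and testing P > 0."""
    n = A.shape[0]
    # scipy solves a X + X a^H = q; with a = A^T this is A^T P + P A = -I
    P = solve_continuous_lyapunov(A.T, -np.eye(n))
    P = (P + P.T) / 2  # symmetrize against round-off
    return bool(np.all(np.linalg.eigvalsh(P) > tol)), P

# Hypothetical stable self-feedback matrix (not the paper's data)
A = np.array([[-3.0, 0.5],
              [0.2, -2.5]])
feasible, P = lyapunov_lmi_feasible(A)
```

For the richer LMIs of Theorems 3 and 5, with several unknown matrices and diagonal multipliers, a semidefinite-programming solver would play the role that `feasp` in the Matlab LMI toolbox plays in the paper; the principle, reducing stability to a feasibility test, is the same.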

Remark 7. It is worth pointing out that $\mathcal{L}V(t, x_t)$ in Theorems 3 and 5 may not be negative. However, the stability criteria obtained in the earlier literature [3–5, 7, 9, 13, 14] require $\mathcal{L}V(t, x_t) < 0$ for any $x_t \neq 0$. Hence, the LMI criteria in the previous literature (e.g., see [3–5, 7, 9, 13, 14]) are not applicable in our setting.

4. Illustrative Examples

In this section, two numerical examples are given to illustrate the effectiveness of the obtained results.

Example 1. Consider a two-dimensional stochastic neural network with time delays in the leakage terms: where and is a two-dimensional Brownian motion. Let and . Then system (31) satisfies Assumption H1 with and . Take ; then system (31) satisfies Assumptions H2 and H3 with , , , and .
Other parameters of network (31) are given as follows: By using the Matlab LMI toolbox, we can obtain the following feasible solution for LMIs (7) and (8): Therefore, it follows from Theorem 3 that network (31) is almost surely stable.
By using the Euler-Maruyama numerical scheme, the simulation results are as follows: and step size . Figure 1 shows the state response of network (31) with the initial condition , for .
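The simulation parameters in this paragraph were lost in extraction. As an illustration of the Euler-Maruyama scheme the authors mention, the sketch below simulates a simple two-neuron stochastic network with a leakage delay; the matrices, delays, noise intensity, and initial data are our hypothetical choices, not those of network (31).

```python
import numpy as np

def euler_maruyama_leakage(A, B, delta, tau, sigma, x0, T=10.0, h=1e-3, seed=0):
    """Euler-Maruyama for the illustrative leakage-delay model
        dx(t) = [-A x(t - delta) + B tanh(x(t - tau))] dt + sigma * x(t) dW(t),
    with constant initial history x(s) = x0 for s <= 0."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    steps = int(round(T / h))
    dd, dt_ = int(round(delta / h)), int(round(tau / h))  # delays in steps
    hist = max(dd, dt_)                                   # history buffer length
    x = np.tile(np.asarray(x0, float), (steps + hist + 1, 1))
    for k in range(hist, steps + hist):
        drift = -A @ x[k - dd] + B @ np.tanh(x[k - dt_])  # delayed leakage + coupling
        dW = rng.normal(0.0, np.sqrt(h), size=n)          # Brownian increments
        x[k + 1] = x[k] + drift * h + sigma * x[k] * dW   # multiplicative noise
    return x[hist:]

# Hypothetical two-neuron data (not the paper's network (31))
A = np.diag([2.0, 2.0])
B = np.array([[0.1, -0.1], [0.1, 0.1]])
traj = euler_maruyama_leakage(A, B, delta=0.05, tau=0.2, sigma=0.1, x0=[0.5, -0.4])
```

With these strongly stable hypothetical parameters the simulated state decays toward zero, mirroring the almost sure stability behavior reported in Figure 1.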

Example 2. Consider a two-dimensional stochastic neural network without time delays in the leakage terms: where . All other parameters of network (36) are the same as in Example 1. It is easy to check that system (36) satisfies Assumptions H1, H′2, and H′3.
By using the Matlab LMI toolbox, we can obtain the following feasible solution for LMIs (26): Therefore, it follows from Theorem 5 that network (36) is almost surely stable.

By using the Euler-Maruyama numerical scheme, the simulation results are as follows: and step size . Figure 2 shows the state response of network (36) with the initial condition , for .

Remark 8. Examples 1 and 2 show that the two-dimensional stochastic neural networks with and without time delays in the leakage terms are both almost surely stable. However, we know from Figures 1 and 2 that the convergence speed of the stochastic neural network with time delays in the leakage terms is clearly faster than that of the network without them. This fact reveals that time delays in the leakage terms do contribute to the stability of stochastic neural networks.

5. Concluding Remarks

In this paper, we have investigated the almost sure stability problem for a class of stochastic neural networks with time delays in the leakage terms. Some novel delay-dependent conditions are obtained to ensure that the considered system is almost surely stable, which is quite different from moment stability. Our method is mainly based on the LaSalle invariant principle of stochastic delay differential equations, Itô’s formula, and stochastic analysis theory. Moreover, the stability criteria given in this paper are expressed in terms of LMIs, which can be solved easily by recently developed algorithms. In addition, we use two examples to show that time delays in the leakage terms do contribute to the stability of stochastic neural networks. Finally, we point out that it is possible to generalize our results to more complex stochastic neural networks with time delays in the leakage terms (e.g., by considering the effect of a fractional-order factor [20]). Research on this topic is in progress.

Competing Interests

The authors declare that there are no competing interests regarding the publication of this paper.

Acknowledgments

This work was jointly supported by the Alexander von Humboldt Foundation of Germany (Fellowship CHN/1163390), the National Natural Science Foundation of China (61374080), Qing Lan Project of Jiangsu, the Priority Academic Program Development of Jiangsu Higher Education Institutions, the Key University Science Research Project of Anhui Province (KJ2016A705), the Key Projects of Anhui Province University Outstanding Youth Talent Support Program (gxyqZD2016317), the Natural Science Foundation of Jiangsu Province (BK20140089), and the Plan of Nature Science Fundamental Research in Nanjing Xiao Zhuang University (2012NXY12).