#### Abstract

This paper is concerned with the $p$th moment exponential stability of stochastic reaction-diffusion Cohen-Grossberg neural networks with time-varying delays. With the help of the Lyapunov method, stochastic analysis, and inequality techniques, a set of new sufficient conditions on $p$th moment exponential stability for the considered system is presented. The proposed results generalize and improve some earlier publications.

#### 1. Introduction

Since the seminal work on Cohen-Grossberg neural networks by Cohen and Grossberg [1], theoretical understanding of neural network dynamics has advanced greatly. The model can be described by a system of ordinary differential equations

$$\dot{x}_i(t) = -a_i(x_i(t))\left[b_i(x_i(t)) - \sum_{j=1}^{n} t_{ij} f_j(x_j(t))\right], \quad i = 1, 2, \ldots, n, \tag{1.1}$$

where $n$ corresponds to the number of units in a neural network; $x_i(t)$ denotes the potential (or voltage) of cell $i$ at time $t$; $f_j(\cdot)$ denotes a nonlinear output function between cell $i$ and $j$; $a_i(\cdot)$ represents an amplification function; $b_i(\cdot)$ represents an appropriately behaved function; the connection matrix $T = (t_{ij})_{n \times n}$ denotes the strengths of connectivity between cells, and if the output from neuron $j$ excites (resp., inhibits) neuron $i$, then $t_{ij} \geq 0$ (resp., $t_{ij} \leq 0$).

During hardware implementation, time delays do exist due to the finite switching speed of the amplifiers and communication time; thus, delays should be incorporated into the model equations of the network. For model (1.1), Ye et al. [2] introduced delays by considering the following delay differential equations:

$$\dot{x}_i(t) = -a_i(x_i(t))\left[b_i(x_i(t)) - \sum_{j=1}^{n} t_{ij} f_j(x_j(t - \tau_j))\right], \quad i = 1, 2, \ldots, n. \tag{1.2}$$

Some other more detailed justifications for introducing delays into model equations of neural networks can be found in [3–13] and references therein. It is seen that (1.2) is quite general and it includes several well-known neural networks models as its special cases such as Hopfield neural networks, cellular neural networks, and bidirectional association memory neural networks (see, e.g., [14–18]).
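For instance, assuming the standard form of the delayed Cohen-Grossberg model (1.2), the reduction to the delayed Hopfield network is immediate: taking the amplification functions $a_i(u) \equiv 1$ and the behaved functions $b_i(u) = c_i u - I_i$ gives

```latex
% Delayed Hopfield network as a special case of (1.2),
% obtained with a_i(u) = 1 and b_i(u) = c_i u - I_i:
\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} t_{ij} f_j\bigl(x_j(t-\tau_j)\bigr) + I_i,
\qquad i = 1, \dots, n.
```

Analogous choices of $a_i$, $b_i$, and the connection pattern recover the cellular and bidirectional associative memory networks mentioned above.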

In addition to delay effects, stochastic effects constitute another source of disturbances or uncertainties in real systems [19]. Many dynamical systems have variable structures subject to stochastic abrupt changes, which may result from phenomena such as stochastic failures and repairs of components, changes in the interconnections of subsystems, and sudden environmental changes. In recent years, the stability of stochastic neural networks has attracted many investigators, and a large number of stability criteria for these systems have been reported [20–30]. The stochastic model can be described by a system of stochastic differential equations

$$dx_i(t) = -a_i(x_i(t))\left[b_i(x_i(t)) - \sum_{j=1}^{n} t_{ij} f_j(x_j(t - \tau_j(t)))\right]dt + \sum_{j=1}^{n} \sigma_{ij}(x_j(t))\, dw_j(t), \quad i = 1, 2, \ldots, n. \tag{1.3}$$

However, besides delay and stochastic effects, diffusion effects cannot be avoided in neural networks when electrons are moving in asymmetric electromagnetic fields [31], so we must consider that the activations vary in space as well as in time. In [32–36], the authors have considered the stability of reaction-diffusion neural networks with constant or time-varying delays, which are expressed by partial differential equations. To the best of our knowledge, few authors have considered the problem of $p$th moment stability for stochastic Cohen-Grossberg neural networks with both time-varying delays and reaction-diffusion terms. Motivated by the above discussion, in this paper we consider the stochastic reaction-diffusion Cohen-Grossberg neural networks with time-varying delays described by the following stochastic partial differential equations:

$$du_i(t,x) = \left[\Delta_i u_i(t,x) - a_i(u_i(t,x))\left(b_i(u_i(t,x)) - \sum_{j=1}^{n} c_{ij}(t) f_j(u_j(t,x)) - \sum_{j=1}^{n} d_{ij}(t) g_j(u_j(t-\tau(t),x))\right)\right]dt + \sum_{j=1}^{n} \sigma_{ij}\bigl(u_j(t,x), u_j(t-\tau(t),x)\bigr)\, dw_j(t), \tag{1.4}$$

where

$$\Delta_i u_i(t,x) = \sum_{k=1}^{m} \frac{\partial}{\partial x_k}\left(D_{ik} \frac{\partial u_i(t,x)}{\partial x_k}\right), \quad i = 1, 2, \ldots, n.$$

In the above model, $n$ corresponds to the number of units in the neural network; $x \in \Omega$ is the space variable, and $u_i(t,x)$ denotes the state variable of cell $i$ at time $t$ in space variable $x$; the smooth function $\Delta_i u_i(t,x)$ is a diffusion operator; $\Omega \subset \mathbb{R}^m$ is a compact set with smooth boundary $\partial\Omega$ and the measure $\operatorname{mes}\Omega > 0$ in $\mathbb{R}^m$; the boundary value and the initial value of the system are prescribed; $c_{ij}(t)$ and $d_{ij}(t)$ denote the strengths of connectivity between cells $i$ and $j$ at time $t$, respectively; $\tau(t)$ is the time delay and satisfies $0 \leq \tau(t) \leq \tau$; $D = (D_{ik})_{n \times m}$ is the diffusion coefficient matrix; and $w(t) = (w_1(t), \ldots, w_n(t))^T$ is an $n$-dimensional Brownian motion defined on a complete probability space with a natural filtration generated by the standard Brownian motion. As a standing hypothesis, we assume that $f$, $g$, and $\sigma$ satisfy the Lipschitz condition and the linear growth condition and that (1.4) has a solution on $t \geq 0$ for the given initial conditions.

The remainder of this paper is organized as follows. In Section 2, the basic notations and assumptions are introduced. In Section 3, criteria are proposed to determine the $p$th moment exponential stability for stochastic Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms. An example is given to illustrate the effectiveness of the obtained results in Section 4. We conclude this paper in Section 5.

#### 2. Preliminaries

For any , we define

As usual, we will also assume that the following conditions are satisfied.

($H_1$) There exist positive constants $\underline{\alpha}_i$ and $\overline{\alpha}_i$ such that $\underline{\alpha}_i \leq a_i(u) \leq \overline{\alpha}_i$ for all $u \in \mathbb{R}$.

($H_2$) For each $i$, there exists a positive constant $\gamma_i$ such that $\frac{b_i(u) - b_i(v)}{u - v} \geq \gamma_i$ for all $u \neq v$.

($H_3$) There exist positive functions $F_j$ and $G_j$ such that $|f_j(u) - f_j(v)| \leq F_j |u - v|$ and $|g_j(u) - g_j(v)| \leq G_j |u - v|$ for all $u, v \in \mathbb{R}$.

($H_4$) There exists a positive constant $\tau$ such that $0 \leq \tau(t) \leq \tau$.

($H_5$) There are nonnegative functions $\mu_{ij}(t)$ and $\nu_{ij}(t)$, for all $i, j$, such that $\sigma_{ij}^2(u, v) \leq \mu_{ij}(t)\, u^2 + \nu_{ij}(t)\, v^2$ for all $u, v \in \mathbb{R}$.

($H_6$) There are nonnegative constants $c_{ij}^*$ and $d_{ij}^*$, for all $i, j$, such that $|c_{ij}(t)| \leq c_{ij}^*$ and $|d_{ij}(t)| \leq d_{ij}^*$.

($H_7$) The inequality stated in Theorem 3.1 below holds.

*Definition 2.1. *The trivial solution of (1.4) is said to be $p$th moment exponentially stable if there is a pair of positive constants $\lambda$ and $M$ such that

$$E\|u(t)\|^p \leq M \|\phi\|^p e^{-\lambda (t - t_0)}, \quad t \geq t_0,$$

for any initial value $\phi$, where $\lambda$ is also called the convergence rate. When $p = 2$, it is usually said to be exponentially stable in mean square.

*Definition 2.2. *Let $V(t)$ be a continuous function on $t \geq t_0$; the upper right Dini derivative of $V(t)$ is defined as

$$D^{+}V(t) = \limsup_{h \to 0^{+}} \frac{V(t+h) - V(t)}{h}.$$
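As a simple illustration of Definition 2.2 with the usual upper right Dini derivative, for the non-differentiable function $V(t) = |t|$ one finds at $t = 0$

```latex
D^{+}V(0) \;=\; \limsup_{h \to 0^{+}} \frac{|0+h| - |0|}{h}
          \;=\; \limsup_{h \to 0^{+}} \frac{h}{h} \;=\; 1,
```

so the Dini derivative exists even at points where the ordinary derivative does not.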

The following lemmas are important in our approach.

Lemma 2.3 (Hardy inequality [4]). *Assume there exist constants $a_k \geq 0$ and $q_k > 0$, $k = 1, \ldots, n$; then the following inequality holds:

$$\prod_{k=1}^{n} a_k^{q_k/r} \leq \frac{1}{r} \sum_{k=1}^{n} q_k a_k, \tag{2.11}$$

where $r = \sum_{k=1}^{n} q_k$. In (2.11), if one lets $n = 2$, $q_1 = p - 1$, $q_2 = 1$ (so that $r = p$), one will get

$$a^{(p-1)/p}\, b^{1/p} \leq \frac{p-1}{p}\, a + \frac{1}{p}\, b;$$

if one lets $q_1 = q_2 = 1$, one will get

$$\sqrt{ab} \leq \frac{a + b}{2}.$$*
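A typical use of inequalities of this Hardy/Young type in $p$th moment estimates is to split a cross term; for example, Young's inequality (a two-term special case of the weighted arithmetic–geometric mean inequality) gives

```latex
a^{p-1}\, b \;\le\; \frac{p-1}{p}\, a^{p} + \frac{1}{p}\, b^{p},
\qquad a, b \ge 0,\; p > 1.
```

For $p = 4$, $a = 2$, $b = 1$ this reads $8 \leq \frac{3}{4}\cdot 16 + \frac{1}{4} = 12.25$, showing how a mixed term $a^{p-1}b$ is absorbed into pure $p$th powers.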

Lemma 2.4 (generalized Halanay inequality [37]). *For two positive-valued functions $a(t)$ and $b(t)$ defined on $[t_0, \infty)$, assume there exists a constant number $\sigma > 0$ satisfying $a(t) - b(t) \geq \sigma$ for all $t \geq t_0$; $y(t)$ is a nonnegative continuous function on $[t_0 - \tau, t_0]$ and satisfies the following inequality for $t \geq t_0$:

$$D^{+}y(t) \leq -a(t)\, y(t) + b(t)\, \bar{y}(t),$$

where $\bar{y}(t) = \sup_{t - \tau \leq s \leq t} y(s)$; $\tau \geq 0$ is a constant. Then one has

$$y(t) \leq \bar{y}(t_0)\, e^{-\mu (t - t_0)}, \quad t \geq t_0,$$

where $\mu > 0$ is defined as

$$\mu = \inf_{t \geq t_0} \bigl\{ \mu(t) : \mu(t) - a(t) + b(t)\, e^{\mu(t)\tau} = 0 \bigr\}.*$$
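As a scalar illustration of Lemma 2.4, consider the classical constant-coefficient special case with $a(t) \equiv 3$, $b(t) \equiv 1$, and delay bound $\tau = 1$, so that $a(t) - b(t) \geq \sigma = 2 > 0$. Then any nonnegative continuous $y$ with

```latex
D^{+}y(t) \;\le\; -3\, y(t) + \sup_{t-1 \le s \le t} y(s)
```

decays as $y(t) \leq \bigl(\sup_{t_0 - 1 \leq s \leq t_0} y(s)\bigr) e^{-\mu (t - t_0)}$, where $\mu > 0$ is the unique positive root of $\mu = 3 - e^{\mu}$ (the root lies between $0$ and $1$, since the right-hand side decreases from $2$ to $3 - e < 1$).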

#### 3. Main Results

Theorem 3.1. *Under assumptions $(H_1)$–$(H_7)$, if there exist a positive diagonal matrix and positive constants such that the corresponding stability inequality holds, then for all $t \geq t_0$, the trivial solution of system (1.4) is $p$th moment exponentially stable.*

*Proof. *Consider the following Lyapunov function:
Applying the Itô formula to $V(t)$, for $t \geq t_0$, we can get
From the boundary condition, we get
in which $\nabla$ is the gradient operator, and
On the other hand, from Lemma 2.3, we have
#### 4. An Illustrative Example

By Corollary 3.2, system (4.1) is 4th moment exponentially stable.

*Remark 4.2. *One can find that the models considered in [20, 22–29] are special cases of model (1.4). To the best of our knowledge, few authors have considered the $p$th moment exponential stability for stochastic reaction-diffusion neural networks with time-varying connection matrices and delays. It is assumed in [22, 23, 25, 26] that the delays are constants; the delay functions appearing in [29] are differentiable and their derivatives are simultaneously required to be not greater than 1; the activation functions appearing in [22, 26] are bounded. Obviously, we have dropped these restrictive assumptions in this paper.

It is obvious that the results in [20–30] and the references therein are not applicable to system (4.1), even if we remove the reaction-diffusion terms from the system, because the connection matrices and delays considered in this example are time-varying. This implies that the results of this paper are essentially new. These conclusions can be verified by the numerical simulations shown in Figure 1.
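Although the paper's example system (4.1) is not reproduced here, the kind of numerical check behind Figure 1 can be sketched with an Euler–Maruyama scheme for a single stochastic delayed Cohen-Grossberg-type cell. All coefficients below are hypothetical choices satisfying Lipschitz-type bounds (amplification $a(u) = 1 + 0.1\sin u$, behaved function $b(u) = 5u$, time-varying gain $c(t) = \cos t$, bounded delay $\tau(t) = 0.5 + 0.25\sin t$, noise intensity $0.2u$), not the paper's data.

```python
import math
import random

def simulate(T=10.0, dt=0.001, u0=1.0, seed=42):
    """Euler-Maruyama for a scalar stochastic delayed Cohen-Grossberg-type cell:
    du = -a(u) * [b(u) - c(t) * f(u(t - tau(t)))] dt + sigma(u) dW."""
    rng = random.Random(seed)
    n = int(T / dt)
    tau_max = 0.75                       # upper bound of tau(t)
    hist = int(tau_max / dt)             # history buffer length
    u = [u0] * (hist + 1)                # constant initial history
    for k in range(n):
        t = k * dt
        tau = 0.5 + 0.25 * math.sin(t)   # bounded time-varying delay
        lag = int(tau / dt)
        u_del = u[-1 - lag]              # delayed state u(t - tau(t))
        amp = 1.0 + 0.1 * math.sin(u[-1])  # amplification a(u), in [0.9, 1.1]
        drift = -amp * (5.0 * u[-1] - math.cos(t) * math.tanh(u_del))
        noise = 0.2 * u[-1] * rng.gauss(0.0, math.sqrt(dt))
        u.append(u[-1] + drift * dt + noise)
    return u

traj = simulate()
print(abs(traj[-1]))  # the strong decay term -5u dominates, so the state shrinks
```

With the strong self-regulation term $b(u) = 5u$ dominating the bounded delayed feedback and the noise, sample paths from different seeds stay small, which is the qualitative behavior the stability criteria guarantee.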

#### 5. Conclusions

In this paper, stochastic Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms have been investigated. All features of stochastic systems and reaction-diffusion systems (especially connection matrices and delays that are time-varying) have been taken into account in the neural networks. Without requiring the differentiability and monotonicity of the activation functions or the symmetry of the connection matrices, a set of new sufficient conditions for checking the $p$th moment exponential stability of the trivial solution of the considered system is presented by using the Lyapunov function method, stochastic analysis techniques, and the generalized Halanay inequality. The proposed results generalize and improve some of the earlier published results greatly. The results obtained in this paper are independent of the magnitude of the delays and of the diffusion effect, which implies that strong self-regulation is dominant in the networks. In addition, the methods used in this paper are also applicable to other neural networks, such as stochastic Hopfield neural networks with time-varying delays and reaction-diffusion terms and stochastic bidirectional associative memory (BAM) neural networks with time-varying delays and reaction-diffusion terms. If we remove the noise and reaction-diffusion terms from the system, the derived stability conditions for general deterministic neural networks can be viewed as byproducts of our results.

#### Acknowledgments

The authors are extremely grateful to the two anonymous reviewers, and particularly to Professor Yong Zhou, for their valuable comments and suggestions, which have contributed a lot to the improved presentation of this paper. This work was supported in part by the National Natural Science Foundation of China under Grants no. 50925727, no. 10971240, and no. 60876022, the Foundation of the Chinese Society for Electrical Engineering, the Foundation of the Yunnan Provincial Education Department under Grant no. 07Y10085, and the Scientific Research Fund of Yunnan Province under Grant no. 2008CD186.