Research Article | Open Access
Pth Moment Exponential Stability of Impulsive Stochastic Neural Networks with Mixed Delays
This paper investigates the pth moment exponential stability of a class of stochastic neural networks with time-varying delays and distributed delays under nonlinear impulsive perturbations. By means of Lyapunov functionals, stochastic analysis, and a differential inequality technique, criteria for the pth moment exponential stability of this model are derived. The results of this paper are new and complement and improve some previously known results (Stamova and Ilarionov (2010), Zhang et al. (2005), Li (2010), Ahmed and Stamova (2008), Huang et al. (2008), Huang et al. (2008), and Stamova (2009)). An example is given to illustrate the feasibility of our results.
The dynamics of neural networks have drawn considerable attention in recent years due to their extensive applications in many fields, such as image processing, associative memories, pattern classification, and optimization. Since integration and communication delays are unavoidably encountered in biological and artificial neural systems, they may result in oscillation and instability. The stability analysis of delayed neural networks has been extensively investigated by many researchers; see, for instance, [1–30].
In real nervous systems, there are many stochastic perturbations that affect the stability of neural networks. The result in Mao  suggests that a neural network can be stabilized or destabilized by certain stochastic inputs. This implies that the stability analysis of stochastic neural networks is of primary significance in the design and application of neural networks; see, for example, [7, 12–16, 18, 20, 22–24, 26, 27, 30].
On the other hand, it is noteworthy that the state of an electronic network is often subject to instantaneous perturbations such as sudden noise. On that account, electronic networks experience abrupt changes at certain instants, which in turn affect the dynamical behavior of the systems [5, 6, 17–23, 28, 29]. Therefore, it is necessary to take both stochastic effects and impulsive perturbations into account when studying the dynamical behavior of delayed neural networks [18, 20, 22, 23].
Very recently, Li et al.  employed the properties of an M-cone and an inequality technique to investigate the mean square exponential stability of impulsive stochastic neural networks with bounded delays. Wu et al.  studied the exponential stability of the equilibrium point of bounded discrete-time delayed dynamic systems with linear impulsive effects by using Razumikhin theorems. To the best of the authors' knowledge, however, few authors have considered the pth moment exponential stability of impulsive stochastic neural networks with mixed delays.
Motivated by the discussions above, our objective in this paper is to present sufficient conditions ensuring pth moment exponential stability for a class of stochastic neural networks with time-varying delays and distributed delays under nonlinear impulsive perturbations, by virtue of the Lyapunov method, inequality techniques, and the Itô formula. The results obtained in this paper generalize and improve some existing results [5, 8, 18, 19, 26–28]. The effectiveness and feasibility of the developed results are shown by a numerical example.
2. Model Description and Preliminaries
Let ℝ denote the set of real numbers, ℝⁿ the n-dimensional real space equipped with the Euclidean norm |·|, and ℤ₊ the set of nonnegative integers. E stands for the mathematical expectation operator, and ℒ denotes the well-known operator given by the Itô formula. w(t) is an m-dimensional Brownian motion defined on a complete probability space (Ω, ℱ, P) with a natural filtration {ℱ_t}_{t≥0} generated by {w(s) : 0 ≤ s ≤ t}, where we associate Ω with the canonical space generated by w(t) and denote by ℱ the associated σ-algebra generated by w(t) with the probability measure P. Let σ = (σ_{ij})_{n×m}, and let σ_i denote the ith row vector of σ.
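For reference, the operator just mentioned can be written in its classical form. For a stochastic functional differential equation dx(t) = f(t, x_t) dt + σ(t, x_t) dw(t) and a function V ∈ C^{1,2}, the Itô formula gives (a standard identity, stated here in generic notation rather than in the paper's own symbols):

```latex
\mathcal{L}V(t,x) = V_t(t,x) + V_x(t,x)\, f(t,x_t)
  + \tfrac{1}{2}\operatorname{trace}\!\bigl[\sigma^{\mathsf T}(t,x_t)\, V_{xx}(t,x)\, \sigma(t,x_t)\bigr],
```

where V_t, V_x, and V_{xx} denote the partial derivative in t, the gradient, and the Hessian in x, respectively.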
In [5, 6], the researchers investigated the following impulsive neural networks with time-varying delays: The authors in [7, 26, 27] studied the stochastic recurrent neural networks with time-varying delays:
In this paper, we study the generalized stochastically perturbed neural network model with time-varying delays and distributed delays under nonlinear impulses, defined by the state equations: where the impulse time sequence satisfies , ; corresponds to the state of the ith unit at time t; , , and denote the constant connection weights; is the time-varying transmission delay and satisfies , for ; , , denote the activation functions of the ith neuron; the delay kernels are real-valued nonnegative piecewise continuous functions defined on ; n corresponds to the number of units in the network; denotes the external bias on the ith unit; and represents the abrupt change of the state at the impulsive moment .
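To make the model concrete, the following sketch simulates a two-unit network of this type by the Euler–Maruyama method, with a constant delay as a special case of the time-varying delay, an exponential delay kernel realized as a first-order filter, and a simple linear impulse map. All parameter values below (C, A, B, D, sigma, tau, the impulse times, and the impulse gain) are hypothetical and chosen only for illustration; they are not taken from the paper.

```python
import numpy as np

# Hypothetical parameters for illustration only.
rng = np.random.default_rng(0)
n, dt, T = 2, 0.001, 2.0
steps = int(T / dt)
C = np.array([5.0, 5.0])                    # self-feedback rates c_i
A = np.array([[0.3, -0.2], [0.1, 0.2]])     # instantaneous connection weights
B = np.array([[0.2, 0.1], [-0.1, 0.2]])     # discrete-delay connection weights
D = np.array([[0.1, 0.0], [0.0, 0.1]])      # distributed-delay connection weights
J = np.array([0.0, 0.0])                    # external bias
tau = 0.1                                    # constant delay (special case of tau(t))
lag = int(tau / dt)
sigma = 0.1                                  # diffusion intensity (scalar for simplicity)
mu = 5.0                                     # kernel rate: k(s) = mu * exp(-mu * s)

f = g = h = np.tanh                          # activation functions

x = np.zeros((steps + 1, n))
x[: lag + 1] = 0.5                           # constant initial history
z = np.zeros(n)                              # z ~ integral of k(s) h(x(t - s)) ds
impulse_times = {int(0.5 / dt), int(1.0 / dt), int(1.5 / dt)}

for k in range(lag, steps):
    drift = -C * x[k] + A @ f(x[k]) + B @ g(x[k - lag]) + D @ z + J
    noise = sigma * x[k] * rng.normal(size=n) * np.sqrt(dt)
    x[k + 1] = x[k] + drift * dt + noise
    z += mu * (h(x[k]) - z) * dt             # dz/dt = mu (h(x) - z) realizes the kernel
    if k + 1 in impulse_times:
        x[k + 1] = 0.8 * x[k + 1]            # simple linear impulse, gain 0.8

print(float(np.abs(x[-1]).max()))
```

With the strong self-feedback chosen here the trajectory decays toward the origin, which is the qualitative behavior the stability criteria of this paper guarantee under their hypotheses.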
System (2.3) is supplemented with the initial condition given by where . Denote by the family of all bounded -measurable, -valued random variables , satisfying , where is continuous everywhere except at a finite number of points , at which and exist and .
The norms are defined, respectively, by:
Throughout this paper, the following standard hypotheses are needed.
(H1) Functions are continuous and monotone increasing, that is, there exist real numbers such that for all , , .
(H2) Functions , are Lipschitz-continuous on with Lipschitz constants , and , respectively. That is, for all , .
(H3) The delay kernels satisfy where is continuous and integrable, and the constant denotes some positive number.
(H4) There exist nonnegative constants , such that for all , .
(H5) There exist nonnegative matrices such that for any , where is an integer.
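As a concrete instance of (H1)-(H2), the hyperbolic tangent is continuous, monotone increasing, and globally Lipschitz with constant 1. The snippet below (illustrative only, since the paper does not fix particular activation functions) estimates that Lipschitz constant numerically from difference quotients:

```python
import numpy as np

# Numerically estimate the Lipschitz constant of tanh, a common activation
# satisfying (H1)-(H2): continuous, monotone increasing, globally Lipschitz.
u = np.linspace(-10, 10, 20001)
ratios = np.abs(np.diff(np.tanh(u))) / np.abs(np.diff(u))
L_est = ratios.max()
print(round(float(L_est), 4))   # close to 1; the supremum is attained near u = 0
```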
We end this section by introducing three definitions.
Definition 2.1. A constant vector is said to be an equilibrium point of system (2.3) if satisfies the algebraic system where it is assumed that the impulse functions satisfy for all and .
Definition 2.3 (Forti and Tesi, 1995 ). A map is a homeomorphism of ℝⁿ onto itself if it is continuous and one-to-one and its inverse map is also continuous.
3. Main Result
For convenience, we denote where are positive constants; , , , , , and are real numbers satisfying
Lemma 3.1. If denote nonnegative real numbers, then
where denotes an integer.
A particular form of (3.3) will also be used in the sequel.
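A standard inequality of the type stated in Lemma 3.1 (nonnegative reals combined by positive weights summing to one) is the weighted arithmetic–geometric mean inequality; we recall it here for orientation, noting that it need not coincide with the exact statement of (3.3):

```latex
\prod_{k=1}^{m} a_k^{\,q_k} \;\le\; \sum_{k=1}^{m} q_k\, a_k,
\qquad a_k \ge 0,\quad q_k > 0,\quad \sum_{k=1}^{m} q_k = 1,
```

where m is a positive integer.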
Lemma 3.2. If is a continuous function satisfying the following conditions: (1) is injective on ℝⁿ, that is, for all ; (2) as .
Then is a homeomorphism of ℝⁿ.
Theorem 3.3. Under assumptions (H1)–(H3), system (2.3) has a unique equilibrium if the following condition (H6) is also satisfied:
Proof. Define a map , where
By Lemma 3.2, the map is a homeomorphism on ℝⁿ if it is injective on ℝⁿ and satisfies as .
In the following, we will prove that is a homeomorphism.
Firstly, we claim that is an injective map on ℝⁿ. Otherwise, there exist , and such that ; then It follows from (H1)–(H3) that Therefore, By (H6), this leads to a contradiction with our assumption. Hence is an injective map on ℝⁿ.
To demonstrate the property as , we compute where . Then we have Using the Hölder inequality, we obtain which leads to From (3.12), we see that as . Thus, the map is a homeomorphism on ℝⁿ under the sufficient condition (H6), and hence it has a unique fixed point . This fixed point is the unique equilibrium point of system (2.3). The proof is now complete.
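The Hölder inequality invoked in the proof above is, in its discrete form (a standard fact used throughout such estimates):

```latex
\sum_{i=1}^{n} |a_i b_i| \;\le\;
\Bigl(\sum_{i=1}^{n} |a_i|^{p}\Bigr)^{1/p}
\Bigl(\sum_{i=1}^{n} |b_i|^{q}\Bigr)^{1/q},
\qquad \frac{1}{p} + \frac{1}{q} = 1,\quad p, q > 1 .
```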
To establish sufficient conditions ensuring the pth moment exponential stability of the equilibrium point of system (2.3), we shift to the origin by the transformation for . Then system (2.3) can be rewritten in the following form: where In order to obtain our results, the following assumptions are needed.
(H7) When , for any .
When , for any .
(H8) There exist constants and such that
Theorem 3.4. Assume that (H1)–(H5) and (H7)-(H8) hold; then system (2.3) has a unique equilibrium , and the equilibrium point is pth moment exponentially stable.
Proof. From (H7), if , then
for any .
Then, in either case, (H6) and are satisfied; therefore, system (2.3) has a unique equilibrium point .
Now, we define For , , we obtain where Lemma 3.1 is used in the second inequality. Let for , where . Applying the Itô formula to (3.21), we get From (H8), we have On the other hand, for , , by (3.23), (3.25), and (H8), we have where .
On the other hand, we observe that It follows that, for , where which means that Therefore, the equilibrium point of system (2.3) is pth moment exponentially stable. The proof is complete.
Corollary 3.5. If , then under assumptions (H1)–(H5), system (2.3) has a unique equilibrium point , and is pth moment exponentially stable provided the following two conditions are satisfied:
(H10) There exist constants and such that
Proof. In Theorem 3.4, let , for all . The result is obtained directly.
Corollary 3.7. Under the assumptions and , system (2.1) has a unique equilibrium point , and is globally exponentially stable if and the following condition is also satisfied:
Proof. In Theorem 3.4, when , choose for and for all ; then the result is obtained.
Remark 3.8. If , , it follows from (3.35) that
this is less conservative than the following inequality:
whereas (3.37) was required in Theorem 3.1 in  and in the only theorem in .
When , , , (3.37) is equal to while (3.38) was required in Theorem 3.2 in .
Corollary 3.9. Under the assumptions (H2), (H4), and (H8), system (2.2) has a unique equilibrium point , and is pth moment exponentially stable if and the following condition is satisfied: In particular, if , then the equilibrium is exponentially stable in mean square provided the following condition is also satisfied:
4. Illustrative Example
In this section, we give an example to show that the conditions obtained in the previous sections are weaker, that is, less restrictive, than those given in some of the earlier literature, such as .
Consider the following stochastic neural networks with mixed time delays: where , , , , and .
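A condition of the general shape imposed by the corollaries can be checked numerically. The sketch below tests a diagonal-dominance-type inequality, c_i > Σ_j (|a_ij| + |b_ij| + |d_ij|) L_j, for hypothetical parameter values; these are not the paper's example data, and this inequality is only representative of, not identical to, the paper's exact conditions.

```python
import numpy as np

# Illustrative check of a diagonal-dominance-type sufficient condition
# c_i > sum_j (|a_ij| + |b_ij| + |d_ij|) * L_j with Lipschitz constants L_j = 1.
# All values are hypothetical, for demonstration only.
C = np.array([5.0, 5.0])                    # self-feedback rates c_i
A = np.array([[0.3, -0.2], [0.1, 0.2]])     # instantaneous weights
B = np.array([[0.2, 0.1], [-0.1, 0.2]])     # discrete-delay weights
D = np.array([[0.1, 0.0], [0.0, 0.1]])      # distributed-delay weights
L = np.ones(2)                              # Lipschitz constants of the activations

load = (np.abs(A) + np.abs(B) + np.abs(D)) @ L
print(bool((C > load).all()))               # True: the condition holds here
```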