Special Issue: Modeling, Analysis, and Applications of Complex Systems
Research Article | Open Access
Global μ-Stability Analysis for Impulsive Stochastic Neural Networks with Unbounded Mixed Delays
We investigate the global μ-stability in the mean square of impulsive stochastic neural networks with unbounded time-varying delays and continuously distributed delays. By choosing an appropriate Lyapunov-Krasovskii functional, a novel robust stability condition, in the form of linear matrix inequalities (LMIs), is derived. These sufficient conditions can be tested with MATLAB's LMI software packages. The results extend and improve earlier publications. Two numerical examples are provided to illustrate the effectiveness of the theoretical results.
Since 1982, when the California Institute of Technology physicist Hopfield proposed the Hopfield neural network model [1, 2], the theory and applications of artificial neural networks have attracted increasing attention. Artificial neural networks are applied extensively to process many kinds of information, in areas such as artificial intelligence, secure communications, network optimization, military information, and pattern recognition. The information-processing capability of a neural network depends greatly on the system's dynamic characteristics, and stability is one of the most important of these characteristics.
External perturbations can affect the stability of a real system, so it is necessary to consider stochastic effects when studying the stability of neural networks. Recently, several results guaranteeing global asymptotic stability or exponential stability for stochastic neural networks have been obtained; see [3–8]. For example, Blythe et al. investigated the stochastic stability of neural networks with constant delay, and Wang et al. analyzed mean-square stability for stochastic Cohen-Grossberg neural networks with time-varying delays and continuously distributed delays.
As is well known, artificial neural networks are often subject to impulsive perturbations, which can affect a system's stability; see [9–11]. Impulsive perturbations often destabilize otherwise stable systems, so stability must be considered under impulsive effects. Very recently, some results on the stability of stochastic neural networks with impulses have been proposed and studied; see [12, 13].
A neural network may also be disturbed by environmental noise, which causes uncertainty in the connection weights of the neurons and affects the dynamical behavior of the system. Therefore, parameter uncertainties should be taken into account in the system model. Several global stability results for neural networks under parameter uncertainties are available [14–20]. Among them, robust exponential stability for uncertain stochastic neural networks with interval discrete and distributed time-varying delays has been studied via the Lyapunov-Krasovskii functional method and stochastic analysis theory.
Time delays occur frequently in neurotransmission, and delayed neural networks have attracted considerable research interest. Many sufficient conditions have been proposed to guarantee the stability of neural networks with various types of time delays [21–26]. For example, Lou and Cui investigated the global asymptotic stability of Hopfield neural networks with bounded time delay by constructing a suitable Lyapunov-Krasovskii functional and employing the LMI method, and Raja et al. further obtained results for stochastic neural networks with mixed time delays by using Lyapunov stability theory and LMI techniques. However, most existing results focus only on bounded delays. In many engineering applications, time delays depend heavily on the history of the system and may be unbounded [27, 28]; in this case, the delay conditions of those existing results are too restrictive.
Recently, Chen and Wang and Liu and Chen proposed the concept of μ-stability, which can be applied to neural networks with unbounded time-varying delays. However, few results have been reported in the literature on the μ-stability of impulsive stochastic neural networks with parameter uncertainties, unbounded time-varying delays, and continuously distributed delays, which motivates the present work.
This paper is concerned with global stability analysis in the mean square for impulsive stochastic neural networks with unbounded time-varying delays and continuously distributed delays. By choosing an appropriate Lyapunov-Krasovskii functional, a novel robust stability condition, in the form of linear matrix inequalities, is derived. These sufficient conditions can be tested with MATLAB's LMI software packages. Our results extend and improve some results of [29, 30].
The paper is organized as follows. In Section 2, the basic definitions and assumptions are given, together with the statement of the problem. In Section 3, sufficient conditions for μ-stability in the mean square of impulsive stochastic neural networks with unbounded time-varying delays and continuously distributed delays are obtained. In Section 4, global robust μ-stability criteria in the mean square are derived for the uncertain neural network model. Two numerical examples are then provided to demonstrate the effectiveness of our results in Section 5. Finally, concluding remarks are given in Section 6.
Notations. Throughout this paper, R and R^n denote, respectively, the set of real numbers and the n-dimensional real space equipped with the Euclidean norm |·|; N denotes the set of positive integers. A ≥ 0 (A ≤ 0) denotes that the matrix A is symmetric and positive semidefinite (negative semidefinite); A^T denotes the transpose of the matrix A; λ_max(A) (λ_min(A)) denotes the maximum (minimum) eigenvalue of the matrix A. L² denotes the space of square-integrable vector functions. Moreover, let (Ω, F, {F_t}_{t≥0}, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions (it is right continuous, and F_0 contains all P-null sets). E[·] stands for the mathematical expectation operator with respect to the given probability measure P.
2. Model Description and Some Assumptions
Consider the following neural network model:

dx(t)/dt = -Cx(t) + Af(x(t)) + Bf(x(t - τ(t))) + D ∫_{-∞}^{t} K(t - s)f(x(s)) ds + J,  t ≥ 0, t ≠ t_k,
Δx(t_k) = x(t_k^+) - x(t_k^-) = I_k(x(t_k^-)),  k ∈ N,    (1)

where x(t) = (x_1(t), ..., x_n(t))^T is the neuron state vector of the neural network; C = diag(c_1, ..., c_n), c_i > 0, is the decay rate matrix; A, B, and D are connection weight matrices; f(x(·)) = (f_1(x_1(·)), ..., f_n(x_n(·)))^T is the neuron activation function; τ(t) represents the transmission delay of the neural network; K(·) = diag(k_1(·), ..., k_n(·)) is the delay kernel function; J is a constant external input, and I_k is the impulsive function.
Throughout the paper, we make the following assumptions.
Assumption 1. The neuron activation functions f_j, j = 1, 2, ..., n, are bounded, continuously differentiable, and satisfy

0 ≤ (f_j(u) - f_j(v))/(u - v) ≤ l_j,  u, v ∈ R, u ≠ v,

where l_j are some positive constants.
Assumption 2. τ(t) is a nonnegative, continuously differentiable time-varying delay and satisfies τ̇(t) ≤ ρ < 1, where ρ is a positive constant.
Assumption 3. The delay kernels k_j, j = 1, 2, ..., n, are real nonnegative continuous functions defined on [0, ∞) and satisfy ∫_0^∞ k_j(s) ds = 1.
Assumption 4. The impulse times t_k satisfy 0 = t_0 < t_1 < ⋯ < t_k < ⋯ and lim_{k→∞} t_k = ∞.
If the activation functions satisfy Assumption 1, then system (1) has an equilibrium point. Assume that x* = (x_1*, x_2*, ..., x_n*)^T is an equilibrium point of system (1), and that the impulsive function of system (1) is characterized by I_k(x(t_k)) = -D_k(x(t_k) - x*), where D_k, k ∈ N, is a real matrix.
Letting y(t) = x(t) - x*, one can shift the equilibrium point of system (1) to the origin and obtain

dy(t)/dt = -Cy(t) + Ag(y(t)) + Bg(y(t - τ(t))) + D ∫_{-∞}^{t} K(t - s)g(y(s)) ds,  t ≠ t_k,
Δy(t_k) = -D_k y(t_k^-),  k ∈ N,    (4)

where y(t) = (y_1(t), ..., y_n(t))^T and g(y(·)) = f(y(·) + x*) - f(x*). By the definition of g and Assumption 1, g_j satisfies the sector condition 0 ≤ g_j(u)/u ≤ l_j for u ≠ 0 and g_j(0) = 0, where l_j are some positive constants.
In practical applications, a neural system is affected by external perturbations, so it is necessary to consider the effect of randomness on neural networks. Adding a noise term to system (4), we obtain the following stochastic impulsive neural network with delays:

dy(t) = [-Cy(t) + Ag(y(t)) + Bg(y(t - τ(t))) + D ∫_{-∞}^{t} K(t - s)g(y(s)) ds] dt + σ(t, y(t), y(t - τ(t))) dw(t),  t ≠ t_k,
Δy(t_k) = -D_k y(t_k^-),  k ∈ N,    (6)

where w(t) is an m-dimensional Brownian motion defined on the complete probability space (Ω, F, P).
Assumption 5. Assume that σ: R_+ × R^n × R^n → R^{n×m} is locally Lipschitz continuous and satisfies the linear growth condition, with σ(t, 0, 0) ≡ 0 for all t ≥ 0. Moreover, there exist n × n matrices Σ_1 ≥ 0 and Σ_2 ≥ 0 such that

trace[σ^T(t, u, v)σ(t, u, v)] ≤ u^T Σ_1 u + v^T Σ_2 v  for all (t, u, v) ∈ R_+ × R^n × R^n.
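To make the trace bound in Assumption 5 concrete, the following sketch checks it numerically for a simple diagonal noise intensity; the particular σ and the matrices Σ_1 = 0.09·I and Σ_2 = 0.04·I are our own illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative noise intensity sigma(t, u, v): an n x 2n matrix whose
# first n columns are driven by u and last n columns by v.
def sigma(t, u, v):
    return np.hstack([0.3 * np.diag(u), 0.2 * np.diag(v)])

rng = np.random.default_rng(0)
n = 4
Sigma1, Sigma2 = 0.09 * np.eye(n), 0.04 * np.eye(n)  # assumed bound matrices

for _ in range(100):
    u, v = rng.normal(size=n), rng.normal(size=n)
    s = sigma(0.0, u, v)
    lhs = np.trace(s.T @ s)                # trace[sigma^T sigma]
    rhs = u @ Sigma1 @ u + v @ Sigma2 @ v  # u^T Sigma1 u + v^T Sigma2 v
    assert lhs <= rhs + 1e-12              # the bound holds (with equality here)
```

For this diagonal σ, trace[σ^T σ] equals 0.09·|u|² + 0.04·|v|² exactly, so the bound is tight.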
The initial condition for system (6) is y(s) = φ(s), s ∈ (-∞, 0], where φ belongs to the family of all bounded F_0-measurable, C((-∞, 0]; R^n)-valued random variables satisfying sup_{s≤0} E|φ(s)|² < ∞.
For completeness, we first give the following definition and lemmas.
Definition 1. Suppose that μ(t) is a nonnegative continuous function that satisfies μ(t) → ∞ as t → ∞. If there exists a scalar M > 0 such that

E|y(t)|² ≤ M/μ(t),  t ≥ 0,

then system (6) is said to be globally stochastically μ-stable in the mean square.
Obviously, global stochastic μ-stability in the mean square includes global stochastic asymptotic stability and exponential stability in the mean square as special cases; for instance, taking μ(t) = e^{λt} with λ > 0 recovers mean-square exponential stability.
Lemma 2. Let X and Y be real matrices of appropriate dimensions, and let ε > 0 be a scalar; then

X^T Y + Y^T X ≤ εX^T X + ε^{-1}Y^T Y.
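Lemma 2 follows from expanding (√ε X - ε^{-1/2} Y)^T(√ε X - ε^{-1/2} Y) ≥ 0. A minimal numeric sanity check of the resulting matrix inequality, with arbitrary seeded test matrices of our own choosing:

```python
import numpy as np

# Check Lemma 2: eps*X'X + (1/eps)*Y'Y - (X'Y + Y'X) is positive
# semidefinite for any real X, Y of matching shape and any eps > 0.
rng = np.random.default_rng(42)
X = rng.normal(size=(5, 3))
Y = rng.normal(size=(5, 3))
eps = 0.7

gap = eps * X.T @ X + (1.0 / eps) * Y.T @ Y - (X.T @ Y + Y.T @ X)
min_eig = np.linalg.eigvalsh(gap).min()  # gap is symmetric, so eigvalsh applies
assert min_eig >= -1e-9                  # PSD up to floating-point round-off
```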
Lemma 3. Let H, D, and E be real matrices of appropriate dimensions, with H symmetric and F satisfying F^T F ≤ I. Then H + DFE + E^T F^T D^T < 0 for all admissible F if and only if there exists a scalar ε > 0 such that H + εDD^T + ε^{-1}E^T E < 0.
Lemma 4 (Schur complement). For a given symmetric matrix

S = [ S_11  S_12 ; S_12^T  S_22 ],

where S_11 = S_11^T and S_22 = S_22^T, the condition S < 0 is equivalent to any one of the following conditions:
(1) S_22 < 0 and S_11 - S_12 S_22^{-1} S_12^T < 0;
(2) S_11 < 0 and S_22 - S_12^T S_11^{-1} S_12 < 0.
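The Schur complement equivalence in Lemma 4 can be illustrated numerically: build a random negative definite block matrix (our own seeded construction) and confirm that condition (1) holds alongside S < 0.

```python
import numpy as np

# Illustrate Lemma 4: for a symmetric negative definite S partitioned
# into blocks, S22 < 0 and the Schur complement S11 - S12 S22^{-1} S12'
# are also negative definite.
rng = np.random.default_rng(7)
n1, n2 = 3, 2
R = rng.normal(size=(n1 + n2, n1 + n2))
S = -(R @ R.T + np.eye(n1 + n2))           # symmetric negative definite

S11, S12, S22 = S[:n1, :n1], S[:n1, n1:], S[n1:, n1:]

assert np.linalg.eigvalsh(S).max() < 0      # S < 0
assert np.linalg.eigvalsh(S22).max() < 0    # condition (1), first part
schur = S11 - S12 @ np.linalg.solve(S22, S12.T)
assert np.linalg.eigvalsh(schur).max() < 0  # condition (1), second part
```

This is the mechanism behind the proof of Theorem 5, where the LMI (16) is converted into an equivalent scalar-free inequality by taking Schur complements.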
3. Stochastic Stability Analysis of Neural Networks
Let C^{2,1}(R × R^n; R_+) denote the family of all nonnegative functions V(t, y) on R × R^n which are once continuously differentiable in t and twice continuously differentiable in y. For every such V, we define an operator LV associated with system (6) by

LV(t, y) = V_t(t, y) + V_y(t, y)[-Cy + Ag(y) + Bg(y(t - τ(t))) + D ∫_{-∞}^{t} K(t - s)g(y(s)) ds] + (1/2) trace[σ^T V_{yy}(t, y) σ],

where V_t, V_y, and V_{yy} denote the corresponding partial derivatives of V.
Theorem 5. Assume that Assumptions 1–5 hold. Then, the zero solution of system (6) is globally stochastically μ-stable in the mean square if there exist diagonal matrices , and , some constants , , , , , a nonnegative continuously differentiable function μ(t) defined on [0, ∞), and a constant such that, for ,
and the following LMI hold:
, , and impulsive operator .
Proof. By Lemma 4, (16) is equivalent to the following inequality: Based on system (6), we construct the following Lyapunov-Krasovskii functional: where By Itô's formula, we obtain the following stochastic differential: where Along the trajectories of system (6), we have By Lemma 2, we get From Assumption 5, we obtain Therefore, From (15), we have By Cauchy's inequality, Thus, Substituting (25), (26), and (28) into (21), we get where So we have Taking the mathematical expectation, we get By , we get In addition, Noting , we have It is obvious that , , and therefore By (33) and (36), we know that is monotonically nonincreasing for , which implies that From the definition of , we can deduce that where . This implies that The proof of Theorem 5 is complete.
Remark 6. Theorem 5 provides a global stochastic μ-stability criterion in the mean square for the impulsive stochastic differential system (6). It should be noted that our conditions depend on the upper bound of the derivative of the time-varying delay, the influence of randomness, and the continuously distributed delay kernels, but not on the range of the time-varying delay. Therefore, our results can be applied to impulsive stochastic neural networks with unbounded time-varying delays and continuously distributed delays.
Remark 7. In [29, 30], different approaches were used to study the μ-stability of neural networks with unbounded time-varying delays. However, impulsive effects and unbounded continuously distributed delays are not considered in those models. Neural networks with unbounded continuously distributed delays under impulsive effects are of great importance in many practical problems. Hence, our results are more general than those reported in [29, 30].
Corollary 8. Assume that Assumptions 1–4 hold. Then, the zero solution of system (4) is globally μ-stable if there exist diagonal matrices , and , some constants , , , , , a nonnegative continuously differentiable function μ(t) defined on [0, ∞), and a constant such that, for , and the following LMI hold: where , , , and impulsive operator .
If we take in Theorem 5, then we obtain the following testable condition.
Corollary 9. Assume that Assumptions 1–5 hold. Then, the zero solution of system (6) is globally stochastically μ-stable in the mean square if there exist some constants , a nonnegative continuously differentiable function μ(t) defined on [0, ∞), and a constant such that, for , and the following LMI hold: where , , , and impulsive operator .
4. Robust Stochastic Stability Analysis of Neural Networks
Consider the following uncertain stochastic neural network: where some parameters and variables were introduced in Section 2 and the uncertainties are described as follows: where the perturbed matrices satisfy where are constant matrices of appropriate dimensions and is an unknown real time-varying matrix function of appropriate dimensions satisfying where is the identity matrix of appropriate dimensions.
Theorem 10. Assume that Assumptions 1–5 hold. Then, the zero solution of system (44) is globally robustly stochastically μ-stable in the mean square if there exist diagonal matrices , , and , some constants , , , , , , , , , a nonnegative continuously differentiable function μ(t) defined on [0, ∞), and a constant such that, for , and the following LMI hold: where , , , , , , and impulsive operator .
Remark 11. A neural network can be disturbed by environmental noise, which causes uncertainty in the system parameters and can lead to instability. To the best of our knowledge, no published results address the robust μ-stability of uncertain neural networks with unbounded continuously distributed delays under impulsive perturbations.
When there are only parameter uncertainties and no stochastic perturbations, the uncertain neural network can be described as
Corollary 12. Assume that Assumptions 1–5 hold. Then, the zero solution of system (52) is globally robustly μ-stable if there exist diagonal matrices , , and , some constants , , , , , , , a nonnegative continuously differentiable function μ(t) defined on [0, ∞), and a constant such that, for , and the following LMI hold: where , , , , , , and impulsive operator .
5. Illustrative Examples
In this section, we give two examples to demonstrate the validity of the obtained results.
Example 13. Consider the stochastic impulsive neural network with two neurons: where the activation function is described by , , , . It is obvious that is an equilibrium point of system (55). Let and choose , , , , and the parameter matrices are, respectively, given by
In this case, we get . Let , , and . Solving the LMI with the MATLAB LMI toolbox yields a feasible solution. By Theorem 5, the equilibrium point of model (55) with unbounded time-varying delay and continuously distributed delay is globally stochastically μ-stable in the mean square. The numerical simulation is shown in Figure 1.
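A simulation like the one in Figure 1 can be sketched with an Euler-Maruyama scheme. Since the example's parameter values are elided above, every value below (decay matrix C, weights A and B, delay τ, noise level, impulse times, and impulse gain) is an illustrative stand-in of our own, chosen so that the contraction clearly dominates; it is not the paper's data.

```python
import numpy as np

# Euler-Maruyama sketch of a two-neuron impulsive stochastic delayed
# network of the general form of (55), with illustrative parameters.
rng = np.random.default_rng(1)

C = np.diag([5.0, 5.0])                  # assumed decay rates
A = np.array([[0.2, -0.1], [0.1, 0.3]])  # assumed instantaneous weights
B = np.array([[0.1, 0.2], [-0.2, 0.1]])  # assumed delayed weights
tau, dt, T = 0.5, 1e-3, 10.0
noise = 0.1                              # multiplicative noise level
impulse_times = np.arange(1.0, T, 1.0)   # impulses at t_k = 1, 2, ...
Dk = 0.2                                 # impulse map y -> (1 - Dk) * y

steps = int(T / dt)
lag = int(tau / dt)
y = np.zeros((steps + 1, 2))
y[0] = [1.0, -1.0]
next_imp = 0

for i in range(steps):
    y_del = y[max(i - lag, 0)]           # constant pre-history approximation
    drift = -C @ y[i] + A @ np.tanh(y[i]) + B @ np.tanh(y_del)
    dw = rng.normal(scale=np.sqrt(dt), size=2)
    y[i + 1] = y[i] + drift * dt + noise * y[i] * dw
    if next_imp < len(impulse_times) and (i + 1) * dt >= impulse_times[next_imp]:
        y[i + 1] *= (1 - Dk)             # impulsive jump at t_k
        next_imp += 1

assert np.isfinite(y).all()
assert np.linalg.norm(y[-1]) < 0.1 * np.linalg.norm(y[0])  # trajectory decays
```

With the strong diagonal decay assumed here, the state converges toward the origin despite the delay, the multiplicative noise, and the impulses, which is the qualitative behavior the example illustrates.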
Example 14. Consider the uncertain stochastic impulsive neural network with two neurons: