Abstract

This paper is concerned with the mean-square exponential input-to-state stability problem for a class of stochastic Cohen-Grossberg neural networks. In contrast to prior works, our system incorporates neutral terms and mixed delays. By employing the Lyapunov-Krasovskii functional method, the Itô formula, the Dynkin formula, and stochastic analysis theory, we obtain novel sufficient conditions ensuring that the addressed system is mean-square exponentially input-to-state stable. Moreover, two numerical examples and their simulations are given to illustrate the correctness of the theoretical results.

1. Introduction

Over the last few years, Cohen-Grossberg neural networks have received increasing attention because of their potential applications in various fields such as signal processing, image processing, pattern recognition, associative memory, programming problems, and combinatorial optimization (see [1–5]). The Cohen-Grossberg neural network model was first introduced by Cohen and Grossberg in 1983 and has since become one of the most important neural network models (see [6, 7]). Compared with recurrent neural networks, Hopfield neural networks, and cellular neural networks, Cohen-Grossberg neural networks are more challenging and interesting to analyze. In particular, recurrent neural networks, Hopfield neural networks, and cellular neural networks can all be regarded as special cases of Cohen-Grossberg neural networks. Many researchers have further studied Cohen-Grossberg neural network models and have obtained reliable conclusions (see [8–10]). It is therefore of great significance to investigate Cohen-Grossberg neural networks.

As is well known, the stability of Cohen-Grossberg neural networks plays an essential role in practical applications, since it is a prerequisite for a real system to work properly. Moreover, the existence of time delays cannot be ignored in real systems for various reasons, such as the finite switching speed of amplifiers and congestion in the signal processing and transmission between neurons. Time delays may cause instability, oscillation, and divergence in stochastic neural network systems. Given this situation, a great number of stability results on stochastic delayed neural networks have been presented in the literature (see [11–17]). In addition, a neutral-type neural network is a particular time-delay system whose main feature is that the derivative of the system state depends on delayed states. In the physical sense, the state trend of a neutral delay system is related to both the past state and the current state, and many physical systems in the real world can be described by neutral-type models. In [18, 19], the stability of neutral-type stochastic neural networks has been studied. Meanwhile, the neutral-type delay is quite different from distributed delays and constant delays: a distributed delay generally enters the system through an integral term, whereas a constant delay is fixed and discrete, meaning that the effect of the state on the system is postponed by a fixed amount. It can be observed that most works in the literature focus mainly on three simple cases: constant delays, distributed delays, and time-varying delays (see [20–22]). In fact, mixed delays are more effective in modeling neural network systems, because these simple delays are often inadequate when neural network systems become more complex. To sum up, it is of great significance to study the stability of stochastic Cohen-Grossberg neural networks with mixed delays.
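To make the distinction among these delay types concrete, the following is a sketch of how each typically enters a network model (using generic weights $d_{ij}$, $p_{ij}$, activation functions $g_j$, $h_j$, and delays $\tau(t)$, $\delta_j(t)$ in the spirit of the notation of Section 2; this is a generic illustration rather than the specific system studied below). Discrete and distributed delayed couplings take the forms
\[
\underbrace{\sum_{j=1}^{n} d_{ij}\, g_j\big(x_j(t-\tau(t))\big)}_{\text{discrete (time-varying) delay}}
\qquad\text{and}\qquad
\underbrace{\sum_{j=1}^{n} p_{ij} \int_{t-\delta_j(t)}^{t} h_j\big(x_j(s)\big)\,ds}_{\text{distributed delay}},
\]
with a constant delay corresponding to $\tau(t)\equiv\tau$. A mixed-delay model combines several of these terms in a single system.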

Fortunately, there are already a number of stability results on stochastic Cohen-Grossberg neural networks with mixed delays (see [23–26]). For example, by applying the LaSalle invariant principle for stochastic differential delay equations, stochastic analysis theory, and the adaptive feedback technique, Zhang and Deng [23] presented several sufficient conditions ensuring the adaptive synchronization of Cohen-Grossberg neural networks with mixed time-varying delays and stochastic perturbation. Based on a novel Lyapunov-Krasovskii functional and some new approaches and techniques, Zhu and Cao [26] investigated the exponential stability problem for a class of Markovian jump impulsive stochastic Cohen-Grossberg neural networks with mixed time delays and known or unknown parameters.

Furthermore, it is worth noting that the control input has a great influence on stochastic Cohen-Grossberg neural networks. Sontag first proposed input-to-state stability as a tool for checking robust stability. As discussed in [27–29], mean-square exponential input-to-state stability is more general than traditional exponential stability; traditional stability notions include exponential stability, global asymptotic stability, and almost sure stability. As far as we know, traditional stability means that the states of stochastic neural networks converge to the equilibrium point as time goes to infinity. However, this does not always happen in real life. For instance, some systems, including the pendulum, finance markets, and stock markets, only remain bounded and do not necessarily converge to an equilibrium point. To date, few researchers have discussed the mean-square exponential input-to-state stability problem of stochastic Cohen-Grossberg neural networks because of its mathematical difficulty. This situation motivates our current research.
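For reference, Sontag's original (deterministic) input-to-state stability notion can be stated in comparison-function language as follows (a standard formulation, not specific to this paper): the system is input-to-state stable if
\[
|x(t)| \le \beta\big(|x(0)|,\, t\big) + \gamma\big(\|u\|_{\infty}\big), \qquad t\ge 0,
\]
for some class-$\mathcal{KL}$ function $\beta$ and class-$\mathcal{K}$ function $\gamma$. The mean-square exponential version studied here replaces $|x(t)|$ by $\mathbb{E}|x(t)|^{2}$ and fixes an exponential decay rate; see Definition 8 below.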

Inspired by the above discussion, in this paper we investigate the mean-square exponential input-to-state stability problem for a class of stochastic neutral-type Cohen-Grossberg neural networks with mixed delays. In contrast to previous stability works, the mean-square exponential input-to-state stability of stochastic neutral-type Cohen-Grossberg neural networks has not been investigated before. By using the Lyapunov-Krasovskii functional method, stochastic analysis theory, the Itô formula, and the Dynkin formula, sufficient conditions for mean-square exponential input-to-state stability are obtained. Moreover, several examples and their simulations are given to illustrate the correctness and effectiveness of the obtained results.

The remainder of this paper is structured as follows. In Section 2, we introduce the mathematical model of neutral stochastic Cohen-Grossberg neural networks, the assumptions, and the definition of mean-square exponential input-to-state stability. The main theoretical result and its proof are derived in Section 3. In Section 4, two numerical examples and their simulations are given to verify the correctness of the proposed results. Finally, the conclusion and a further discussion are presented in Section 5.

2. Model Description and Preliminaries

In this paper, we mainly focus on a class of stochastic neutral-type Cohen-Grossberg neural networks with mixed delays, referred to as system (1), with initial conditions $x_i(s)=\xi_i(s)$, $s\in[-\rho,0]$, $i=1,2,\dots,n$. Here $x_i(t)$ is the state variable of the $i$th neuron at time $t$; $a_i(\cdot)$ and $b_i(\cdot)$ denote the amplification function and the behaved function of the $i$th unit, respectively. The constants $c_{ij}$, $d_{ij}$, $p_{ij}$, and $e_{ij}$ are the connection weight coefficients of the $j$th unit on the $i$th unit. $f_j(\cdot)$, $g_j(\cdot)$, and $h_j(\cdot)$ are the neuron activation functions of the $j$th unit. $u_i(t)$ is the external input of the $i$th neuron at time $t$, and $u=(u_1,\dots,u_n)^{T}$ belongs to $L_{\infty}$, where $L_{\infty}$ represents the class of essentially bounded functions $u$ from $[0,\infty)$ to $\mathbb{R}^n$ with $\|u\|_{\infty}=\operatorname{ess\,sup}_{t\ge 0}|u(t)|<\infty$. The noise perturbation $\sigma$ is a Borel measurable function, and $w(t)$ is an $n$-dimensional standard Brownian motion defined on the complete probability space $(\Omega,\mathcal{F},P)$ with a natural filtration $\{\mathcal{F}_t\}_{t\ge 0}$. $\tau$ represents the constant delay, $\tau(t)$ denotes the time-varying delay, and $\delta_j(t)$ are the distributed delays. The delays $\tau(t)$ and $\delta_j(t)$ are supposed to be differentiable and to satisfy $\dot{\tau}(t)\le\mu<1$ and $0\le\delta_j(t)\le\delta$. Let $\rho=\max\{\tau,\tau^{*},\delta\}$, where $\tau^{*}=\sup_{t\ge 0}\tau(t)$ and $\rho$ is a positive constant. Let $C([-\rho,0];\mathbb{R}^n)$ denote the family of continuous functions from $[-\rho,0]$ to $\mathbb{R}^n$ with the norm $\|\varphi\|=\sup_{-\rho\le s\le 0}|\varphi(s)|$, where $|\cdot|$ is the Euclidean norm in $\mathbb{R}^n$. $L^{2}_{\mathcal{F}_0}([-\rho,0];\mathbb{R}^n)$ denotes the family of all $\mathcal{F}_0$-measurable, $C([-\rho,0];\mathbb{R}^n)$-valued random variables $\xi$ such that $\sup_{-\rho\le s\le 0}\mathbb{E}|\xi(s)|^{2}<\infty$, where $\mathbb{E}$ stands for the expectation operator with respect to the given probability measure $P$.
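In the notation just introduced, a representative form of such a neutral-type stochastic Cohen-Grossberg network with mixed delays (a sketch consistent with the description above, rather than a verbatim restatement of system (1)) is
\[
\begin{aligned}
d\Big[x_i(t)-\sum_{j=1}^{n}e_{ij}x_j(t-\tau)\Big]
={}& -a_i\big(x_i(t)\big)\Big[b_i\big(x_i(t)\big)
-\sum_{j=1}^{n}c_{ij}f_j\big(x_j(t)\big)
-\sum_{j=1}^{n}d_{ij}g_j\big(x_j(t-\tau(t))\big)\\
&\quad -\sum_{j=1}^{n}p_{ij}\int_{t-\delta_j(t)}^{t}h_j\big(x_j(s)\big)\,ds
-u_i(t)\Big]\,dt
+\sigma_i\big(t,x_i(t),x_i(t-\tau),x_i(t-\tau(t))\big)\,dw_i(t),
\end{aligned}
\]
for $i=1,2,\dots,n$; the precise arguments of the noise intensity $\sigma_i$ are governed by Assumption 3 below.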

Remark 1. In fact, if the neutral terms and the distributed-delay terms are removed and the delays in system (1) are replaced by a single time-varying delay, our model reduces to the model in [15]. In other words, the model discussed in [15] is a special case of system (1). Since the delays in [15] are limited to time-varying delays, our model is more general and more useful in real life.

In order to obtain our main results, we need the following assumptions and definition.

Assumption 2. The functions $b_i(\cdot)$, $f_j(\cdot)$, $g_j(\cdot)$, $h_j(\cdot)$, and $\sigma(\cdot)$ satisfy the Lipschitz condition; i.e., there exist positive constants $B_i$, $F_j$, $G_j$, $H_j$, and $L_\sigma$ such that the corresponding Lipschitz inequalities hold for all $x,y\in\mathbb{R}$.
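For concreteness, the Lipschitz requirement on the activation functions typically reads as follows (a standard form; the constant names $F_j$, $G_j$, $H_j$ are those introduced above):
\[
|f_j(x)-f_j(y)|\le F_j|x-y|,\qquad
|g_j(x)-g_j(y)|\le G_j|x-y|,\qquad
|h_j(x)-h_j(y)|\le H_j|x-y|,
\]
for all $x,y\in\mathbb{R}$ and $j=1,2,\dots,n$, with the analogous inequality holding for $b_i$.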

Assumption 3. There exist nonnegative constants $\mu_{i1}$, $\mu_{i2}$, $\mu_{i3}$, and $\mu_{i4}$, $i=1,2,\dots,n$, such that the noise intensity $\sigma$ satisfies a linear growth condition in its state arguments for all $t\ge 0$.
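A typical instance of such a growth condition (a sketch using the constant names above and generic state arguments; the precise arguments of $\sigma_i$ in system (1) may differ) is
\[
\operatorname{trace}\!\big[\sigma_i^{T}\sigma_i\big](t,x,y,z,w)\ \le\ \mu_{i1}x^{2}+\mu_{i2}y^{2}+\mu_{i3}z^{2}+\mu_{i4}w^{2},
\]
where $x$, $y$, $z$, and $w$ stand for the current, constant-delayed, time-varying-delayed, and distributed-delayed state values, respectively. Together with Assumption 4, such a bound vanishes at the origin, which is what allows the trivial solution to exist.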

Assumption 4. $b_i(0)=f_j(0)=g_j(0)=h_j(0)=0$ and $\sigma(t,0,\dots,0)\equiv 0$, so that system (1) admits the trivial solution.

Assumption 5 (see [1]). The amplification functions $a_i(\cdot)$ are bounded, and there exist positive constants $\underline{a}_i$ and $\overline{a}_i$ such that $\underline{a}_i\le a_i(x)\le\overline{a}_i$ for all $x\in\mathbb{R}$, $i=1,2,\dots,n$.

Assumption 6 (see [26]). There exist positive constants $\gamma_i$, $i=1,2,\dots,n$, such that $\big(b_i(x)-b_i(y)\big)/(x-y)\ge\gamma_i$ for all $x\ne y$.

Assumption 7. $E=(e_{ij})_{n\times n}$ is a matrix whose matrix norm $\|E\|$ satisfies $\|E\|<1$.

Under Assumptions 2–4, it is not difficult to check that the Lipschitz and linear growth conditions hold, and thus system (1) has a unique solution. Let $x(t;\xi)$ denote the solution with initial data $\xi\in L^{2}_{\mathcal{F}_0}([-\rho,0];\mathbb{R}^n)$. It is clear that, under Assumption 4, system (1) admits a trivial solution $x(t;0)\equiv 0$ corresponding to the initial data $\xi=0$.

Definition 8 (Zhu and Cao [11]). The trivial solution of system (1) is said to be mean-square exponentially input-to-state stable if for every $\xi\in L^{2}_{\mathcal{F}_0}([-\rho,0];\mathbb{R}^n)$ and every input $u\in L_{\infty}$, there exist scalars $\alpha>0$, $\beta>0$, and $\gamma>0$ such that the following inequality holds:
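In the standard formulation of this notion (see, e.g., [11]; the constant names are those introduced in Definition 8), the inequality reads
\[
\mathbb{E}\,|x(t;\xi)|^{2}\ \le\ \alpha\, e^{-\beta t}\sup_{-\rho\le s\le 0}\mathbb{E}\,|\xi(s)|^{2}\ +\ \gamma\,\|u\|_{\infty}^{2},
\qquad t\ge 0 .
\]
When $u\equiv 0$, the bound reduces to the usual mean-square exponential stability estimate.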

3. Main Results

In this section, the mean-square exponential input-to-state stability of the trivial solution of system (1) is discussed under Assumptions 2–7.

Theorem 9. Under Assumptions 2–7, the trivial solution of system (1) is mean-square exponentially input-to-state stable if there exist positive constants such that the following conditions hold:

Proof. To obtain the mean-square exponential input-to-state stability, we construct a Lyapunov-Krasovskii functional $V(t,x_t)$ built from a quadratic term in the neutral-type state $x(t)-Ex(t-\tau)$ together with compensating integral terms over the delay intervals (a sketch of a typical such functional is given after the proof). By employing the Itô formula, we obtain the stochastic differential of $V$, in which $\mathcal{L}$ denotes the weak infinitesimal operator. Under Assumptions 2 and 3, we can estimate $\mathcal{L}V$. By using the elementary inequality $2ab\le a^{2}+b^{2}$ together with Assumptions 5 and 6, the cross terms are bounded. Applying the Itô formula again and invoking condition (6), there exists a sufficiently small constant $\varepsilon>0$ such that the resulting estimate holds. Now, we define a suitable stopping time. By means of the Dynkin formula, we obtain an expectation bound up to the stopping time; letting the stopping time tend to infinity, the monotone convergence theorem extends the bound to all $t\ge 0$. Noting the relation between $x(t)$ and the neutral-type state $x(t)-Ex(t-\tau)$, and using the inequalities $(a+b)^{2}\le 2a^{2}+2b^{2}$ and $\|E\|<1$, we can always find a large enough constant $\alpha$ and a sufficiently small constant $\beta$ such that the exponential estimate holds with some constant $\gamma>0$. According to Definition 8, the trivial solution of system (1) is mean-square exponentially input-to-state stable. This completes the proof.
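While the exact functional used above is tailored to system (1) and its conditions, a typical Lyapunov-Krasovskii functional for neutral-type stochastic systems with mixed delays has the following form (a sketch under the notation of Section 2, not the authors' exact choice):
\[
V(t,x_t)= e^{\varepsilon t}\,\big|x(t)-Ex(t-\tau)\big|^{2}
+\lambda_1\!\int_{t-\tau}^{t}\! e^{\varepsilon (s+\tau)}|x(s)|^{2}\,ds
+\lambda_2\!\int_{t-\tau(t)}^{t}\! e^{\varepsilon (s+\tau^{*})}|x(s)|^{2}\,ds
+\lambda_3\!\int_{-\delta}^{0}\!\int_{t+\theta}^{t}\! e^{\varepsilon (s-\theta)}|x(s)|^{2}\,ds\,d\theta,
\]
where $\varepsilon,\lambda_1,\lambda_2,\lambda_3>0$ are tuning constants: the first term controls the neutral-type state, and each integral term compensates for one of the constant, time-varying, and distributed delays.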

Remark 10. The mean-square exponential input-to-state stability problem of stochastic Cohen-Grossberg neural networks was investigated in [1]. However, the delay there is limited to a time-varying delay $\tau(t)$. In this paper, the stochastic neutral-type Cohen-Grossberg neural networks under discussion are generalized to systems with mixed delays, which are more general than the time-varying delay in [1]. Also, the mean-square exponential input-to-state stability problem for a class of stochastic recurrent neural networks with mixed delays was discussed in [11, 16], but neutral terms and Cohen-Grossberg dynamics were not taken into account there. Meanwhile, Cohen-Grossberg neural networks include recurrent neural networks, Hopfield neural networks, and cellular neural networks. In other words, the systems considered in [11, 16] are special cases of system (1) in this paper.

Remark 11. Obviously, when $u(t)\equiv 0$, Theorem 9 reduces to the mean-square exponential stability of the neutral stochastic Cohen-Grossberg neural network, and we obtain the following corollary.

Corollary 12. Under Assumptions 2–7, if all the conditions of Theorem 9 hold, the trivial solution of system (1) with $u(t)\equiv 0$ is mean-square exponentially stable.

Moreover, when the constant delay $\tau$ and the time-varying delay $\tau(t)$ in system (1) are replaced by a common delay, system (1) reduces to the following stochastic delayed Cohen-Grossberg neural network, denoted system (24). Accordingly, we use the following assumption instead of Assumption 3.

Assumption 13. There exist nonnegative constants such that the noise intensity of system (24) satisfies the corresponding linear growth condition in its state arguments.

The next theorem can then be proved in a way similar to the proof of Theorem 9. Thus, we get the following.

Theorem 14. Under Assumptions 2, 4, 5, 6, and 13, the trivial solution of system (24) is mean-square exponentially input-to-state stable if there exist positive constants such that the following conditions hold:

Corollary 15. Assume that all the conditions of Theorem 14 hold. Then, the trivial solution of system (24) with $u(t)\equiv 0$ is mean-square exponentially stable.

In particular, when the distributed delays are removed, system (1) reduces to the following neutral stochastic Cohen-Grossberg system with time-varying delay, denoted system (27). Accordingly, we replace Assumption 13 with the next assumption.

Assumption 16. There exist nonnegative constants such that the noise intensity of system (27) satisfies the corresponding linear growth condition in its state arguments.

Corollary 17. Under Assumptions 2, 4, 5, 6, and 16, the trivial solution of system (27) is mean-square exponentially input-to-state stable if there exist positive constants such that the following conditions hold:

Corollary 18. Assume that all the conditions of Corollary 17 hold. Then, the trivial solution of system (27) with $u(t)\equiv 0$ is mean-square exponentially stable.

Remark 19. The systems are no longer of stochastic neutral type when $e_{ij}=0$ for all $i,j$; of course, the mean-square exponential input-to-state stability results of Theorems 9 and 14 still apply in this case.

4. Examples and Simulation

In this section, two numerical examples are given to illustrate the effectiveness of our theoretical conclusions.

Example 20. Consider the two-dimensional system (1), where $w(t)$ is a two-dimensional Brownian motion and the amplification functions, behaved functions, activation functions, connection weights, delays, and noise intensities are chosen so that Assumptions 2–7 are satisfied. It is then straightforward to check the conditions of Theorem 9. Therefore, all the conditions of Theorem 9 are satisfied, and the neutral stochastic Cohen-Grossberg system (1) is mean-square exponentially input-to-state stable. Figure 1(a) further shows that network (1) is mean-square exponentially input-to-state stable. When $u(t)\equiv 0$, network (1) is mean-square exponentially stable; see Figure 1(b).
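A minimal sketch of how trajectories such as those in Figures 1 and 2 might be generated is given below: an Euler-Maruyama discretization of a two-dimensional neutral-type network. All functions and parameter values here are hypothetical placeholders chosen only to satisfy the standing assumptions (bounded amplification, Lipschitz activations, $\|E\|<1$), not the paper's actual example data, and the time-varying and distributed delay terms are omitted for brevity.

```python
import numpy as np

# Hypothetical parameters -- placeholders, not the paper's example data.
n, dt, T = 2, 1e-3, 10.0
steps = int(T / dt)
tau = 0.1                        # constant (neutral) delay
lag = int(tau / dt)              # delay measured in steps

C = np.array([[0.2, -0.1], [0.1, 0.2]])     # c_ij
D = np.array([[0.1, 0.05], [-0.05, 0.1]])   # d_ij
E = 0.1 * np.eye(n)                         # neutral coefficients, ||E|| < 1

a = lambda x: 1.0 + 0.1 * np.tanh(x)        # bounded amplification a_i
b = lambda x: 1.5 * x                       # behaved function b_i
f = np.tanh                                 # Lipschitz activation
u = lambda t: 0.5 * np.sin(t) * np.ones(n)  # essentially bounded input

rng = np.random.default_rng(0)
x = np.zeros((steps + lag, n))
x[:lag + 1] = 0.5                           # constant initial data on [-tau, 0]

for k in range(lag, steps + lag - 1):
    t = (k - lag) * dt
    xd = x[k - lag]                         # x(t - tau)
    # drift of the neutral state y(t) = x(t) - E x(t - tau)
    drift = -a(x[k]) * (b(x[k]) - C @ f(x[k]) - D @ f(xd) - u(t))
    diffusion = 0.1 * x[k] * rng.normal(size=n) * np.sqrt(dt)
    y_next = (x[k] - E @ xd) + drift * dt + diffusion
    x[k + 1] = y_next + E @ x[k + 1 - lag]  # recover x(t + dt)

# x[lag:] holds one sample path on [0, T]; averaging |x(t)|^2 over
# repeated runs gives a Monte Carlo estimate of the mean-square trajectory.
```

Note that the scheme integrates the neutral state $y(t)=x(t)-Ex(t-\tau)$ and then recovers $x(t+\Delta t)$, since $x(t+\Delta t-\tau)$ is already known from the history buffer.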

Example 21. Consider the three-dimensional system (24), where $w(t)$ is a three-dimensional Brownian motion and the system data are chosen so that Assumptions 2, 4, 5, 6, and 13 are satisfied. Then, a simple computation verifies the conditions of Theorem 14. Therefore, all the conditions of Theorem 14 are satisfied, and, by Theorem 14, the neutral stochastic Cohen-Grossberg system (24) is mean-square exponentially input-to-state stable. Figure 2(a) further shows that network (24) is mean-square exponentially input-to-state stable. It is evident that network (24) is mean-square exponentially stable when $u(t)\equiv 0$, as can also be seen from Figure 2(b).

Remark 22. According to Example 20, we can see that the state response of the neural network (1) only remains mean-square bounded and does not need to converge to the equilibrium point. In particular, when $u(t)\equiv 0$, the state response of the neural network (1) converges to the equilibrium point. A similar conclusion can be drawn from the figures in Example 21.

5. Conclusions

In this paper, a class of stochastic Cohen-Grossberg neural networks with neutral terms and mixed delays (a constant delay, a time-varying delay, and multiple distributed time-varying delays) has been studied. By constructing a Lyapunov-Krasovskii functional and applying stochastic analysis theory, the Itô formula, and the Dynkin formula, we have obtained some new sufficient conditions guaranteeing the mean-square exponential input-to-state stability of the given models. The effectiveness of our conclusions has been illustrated through two numerical examples. Input-to-state stability remains an active topic; weakening the conditions imposed on system (1) while retaining stability results is one direction for future work. Furthermore, we may continue to study the input-to-state stability of neutral stochastic Cohen-Grossberg neural networks with Markovian switching and Lévy noise.

Data Availability

We declare that the materials described in the manuscript, including all relevant raw data, will be freely available to any scientist wishing to use them for noncommercial purposes, without breaching participant confidentiality.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors thank the National Natural Science Foundation of China (Grant No. 61563033) for its financial support.