Abstract

The exponential stability issue for a class of stochastic neural networks (SNNs) with Markovian jump parameters, mixed time delays, and α-inverse Hölder activation functions is investigated. The jumping parameters are modeled as a continuous-time finite-state Markov chain. Firstly, based on Brouwer degree properties, the existence and uniqueness of the equilibrium point for SNNs without noise perturbations are proved. Secondly, by applying the Lyapunov-Krasovskii functional approach, stochastic analysis theory, and the linear matrix inequality (LMI) technique, new delay-dependent sufficient criteria are derived in terms of LMIs to ensure that the SNNs with noise perturbations are globally exponentially stable in the mean square. Finally, two simulation examples are provided to demonstrate the validity of the theoretical results.

1. Introduction

In the past few decades, there has been increasing interest in different classes of neural networks, such as Hopfield, cellular, Cohen-Grossberg, and bidirectional associative neural networks, due to their potential applications in many areas such as classification, signal and image processing, parallel computing, associative memories, optimization, and cryptography [1–6]. In the design of practical neural networks, the qualitative analysis of neural network dynamics plays an important role. To solve problems of optimization, neural control, signal processing, and so forth, neural networks have to be designed in such a way that, for a given external input, they exhibit only one globally asymptotically/exponentially stable equilibrium point. Hence, much effort has been devoted to the stability of neural networks, and a number of sufficient conditions have been proposed to guarantee the global asymptotic/exponential stability of neural networks with or without delays; see, for example, [7–19] and the references therein.

As is well known, a real system is usually affected by external perturbations, which in many cases are of great uncertainty and hence may be treated as random. As pointed out in [20], in real nervous systems and in the implementation of artificial neural networks, synaptic transmission is a noisy process brought on by random fluctuations from the release of neurotransmitters and other probabilistic causes; hence, noise is unavoidable and should be taken into consideration in modeling. Moreover, in [21, 22], it has been shown that a neural network can be stabilized or destabilized by certain stochastic inputs. Therefore, the stochastic stability of various neural networks with or without delays under noise disturbance has received extensive attention in recent years, and some results related to this issue have been reported in the literature; see [23–31].

When a neural network incorporates abrupt changes in its structure, the Markovian jumping (switching) nonlinear system is very appropriate to describe its dynamics. In the past few years, based on Markovian switching system theory, the dynamics of various neural networks with Markovian jump parameters have been widely explored in the existing literature. In [32], Markovian jumping BAM neural networks with time-varying delays were investigated, and some sufficient conditions were derived for the global exponential stability in the mean square by using the stochastic Lyapunov-Krasovskii functional approach. In [33], uncertain Markovian jumping Cohen-Grossberg neural networks with mixed time-varying delays were discussed, and robust stability results were obtained in terms of LMIs. In [34], the authors considered the robust stabilization of stochastic Markovian jumping dynamical networks with mode-dependent mixed delays. In [35, 36], Markovian jumping recurrent neural networks with discrete and distributed delays and with interval time-varying delays were investigated, respectively, and some criteria were established to guarantee the existence of the state estimators. In [37], Markovian coupled neural networks with nonidentical node-delays and random coupling strengths were introduced, and several delay-dependent sufficient synchronization criteria were derived and formulated by LMIs. Very recently, considerable effort has been devoted to investigating Markovian jumping SNNs, and various stability conditions have been presented in the existing literature. In [38], the global stability issue for Markovian jumping stochastic Cohen-Grossberg neural networks with mixed time delays was studied, and exponential stability results were proposed by using the stochastic Lyapunov-Krasovskii functional approach. In [39], a class of SNNs with both Markovian jump parameters and mixed time delays was investigated, and some novel sufficient conditions which guarantee the exponential stability of the equilibrium point in the mean square were derived in terms of LMIs. Furthermore, the robust exponential stability was discussed for a class of Markovian jump impulsive stochastic Cohen-Grossberg neural networks with mixed time delays by using inequality techniques and the Lyapunov method in [40]. In [41], some sufficient conditions were presented in terms of LMIs to guarantee the global exponential stability of stochastic jumping BAM neural networks with time-varying and distributed delays. In [42], delay-interval-dependent robust stability results were addressed in terms of LMIs for uncertain stochastic systems with Markovian jumping parameters. In [42], the stochastic global exponential stability problem was also considered, and some delay-dependent exponential stability criteria and decay estimates were presented in terms of LMIs for neutral-type impulsive neural networks with mixed time delays and Markovian jumping parameters.

It should be noted that all the results reported in the literature above are concerned with Markovian jumping SNNs with Lipschitz neuron activation functions. To the best of our knowledge, up to now, very little attention has been paid to the problem of the global exponential stability of Markovian jumping SNNs with non-Lipschitz activation functions, which often appear in realistic neural networks. This situation motivates our present investigation.

In this paper, our aim is to study the delay-dependent exponential stability problem for a class of Markovian jumping neural networks with mixed time delays and α-inverse Hölder activation functions under stochastic noise perturbation. Here, it should be pointed out that α-inverse Hölder activation functions are a class of non-Lipschitz functions. By utilizing Brouwer degree properties, Lyapunov stability theory, stochastic analysis theory, and the LMI technique, some novel delay-dependent conditions are obtained, which guarantee the exponential stability of the equilibrium point. The results obtained in this paper improve and generalize those presented in [14, 15, 17–19], since our model and conditions are more general and weaker than those presented there; moreover, the criteria obtained in [14, 15] were not expressed in terms of LMIs, and noise disturbance was not considered in [14, 15, 17–19].

The rest of this paper is organized as follows. In Section 2, the model of SNNs with both mixed time delays and α-inverse Hölder activation functions is introduced, together with some definitions and lemmas. By means of topological degree theory and the Lyapunov-Krasovskii functional approach, our main results are established in Section 3. In Section 4, two numerical examples are presented to show the effectiveness of the obtained results. Finally, some conclusions are given in Section 5.

Notations. Throughout this paper, ℝ denotes the set of real numbers, ℝⁿ denotes the n-dimensional Euclidean space, and ℝ^{n×m} denotes the set of all n × m real matrices. For any matrix A, Aᵀ denotes the transpose of A and A⁻¹ denotes the inverse of A. If A is a real symmetric matrix, A > 0 (A < 0) means that A is positive definite (negative definite). λ_min(A) and λ_max(A) denote the minimum and maximum eigenvalues of a real symmetric matrix A, respectively. I is the identity matrix. C^{2,1} denotes the family of all nonnegative functions V(y, t) on ℝⁿ × ℝ⁺ which are continuously twice differentiable in y and once differentiable in t. (Ω, ℱ, P) is a complete probability space, where Ω is the sample space, ℱ is the σ-algebra of subsets of the sample space, and P is the probability measure on ℱ. L²_{ℱ₀}([−τ*, 0]; ℝⁿ) denotes the family of all ℱ₀-measurable C([−τ*, 0]; ℝⁿ)-valued random variables ξ such that sup_{−τ*≤s≤0} E|ξ(s)|² < ∞, where E{·} stands for the mathematical expectation operator with respect to the given probability measure P. Given the column vectors x = (x₁, …, xₙ)ᵀ and y = (y₁, …, yₙ)ᵀ, xᵀy = Σ_{j=1}^{n} xⱼyⱼ and |x| = (|x₁|, …, |xₙ|)ᵀ. ẋ(t) denotes the derivative of x(t), and ∗ represents the symmetric terms in a symmetric matrix.

2. Model Description and Preliminaries

In this paper, the stochastic neural network with mixed time delays is described by the following integrodifferential equation system:

dx(t) = [−Dx(t) + Af(x(t)) + Bg(x(t − τ(t))) + C ∫_{t−r}^{t} h(x(s)) ds + J] dt + σ(t, x(t), x(t − τ(t))) dω(t),   (1)

where x(t) = (x₁(t), …, xₙ(t))ᵀ denotes the state at time t; f, g, and h denote the neuron activations, with f(x(·)) = (f₁(x₁(·)), …, fₙ(xₙ(·)))ᵀ, g(x(·)) = (g₁(x₁(·)), …, gₙ(xₙ(·)))ᵀ, and h(x(·)) = (h₁(x₁(·)), …, hₙ(xₙ(·)))ᵀ; D = diag(d₁, …, dₙ) is a positive diagonal matrix, and d₁, …, dₙ are the neural self-inhibitions; A, B, and C denote the connection weight matrix, the discretely delayed connection weight matrix, and the distributively delayed connection weight matrix, respectively; J is the external input; τ(t) is the discrete time-varying delay, which is bounded with 0 ≤ τ(t) ≤ τ and τ̇(t) ≤ μ < 1; r is a constant delay; σ(t, x(t), x(t − τ(t))) denotes the stochastic disturbance; ω(t) = (ω₁(t), …, ωₙ(t))ᵀ is an n-dimensional Brownian motion defined on a complete probability space (Ω, ℱ, P) with a filtration {ℱₜ}_{t≥0} satisfying the usual conditions (i.e., it is right continuous and ℱ₀ contains all P-null sets).

Suppose that the initial condition of the neural network (1) has the form x(s) = φ(s) for s ∈ [−τ*, 0], where φ(s) = (φ₁(s), …, φₙ(s))ᵀ and φ₁, …, φₙ are continuous functions; τ* = max(τ, r).
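Although the analysis below is purely theoretical, it may help intuition to see how a trajectory of a system of the form (1) can be sampled numerically. The following Euler–Maruyama sketch keeps a history buffer for the discrete and distributed delays; all matrices, activations, the noise map, and the step sizes are illustrative placeholders of ours, not data from the paper.

```python
import numpy as np

# Hypothetical 2-neuron instance of system (1); all numerical values are
# illustrative placeholders, not data from the paper.
n = 2
D = np.diag([1.0, 1.2])                    # positive diagonal self-inhibitions
A = np.array([[0.2, -0.1], [0.1, 0.3]])    # connection weights
B = np.array([[0.1, 0.05], [-0.05, 0.1]])  # discretely delayed weights
C = np.array([[0.05, 0.0], [0.0, 0.05]])   # distributively delayed weights
J = np.zeros(n)                            # external input
tau, r, dt, T = 0.5, 0.3, 0.001, 10.0      # delays, step size, horizon

f = g = h = np.tanh                        # illustrative activations
sigma = lambda x, xd: 0.1 * np.column_stack([x, xd])  # illustrative noise intensity

steps, d_tau, d_r = int(T / dt), int(tau / dt), int(r / dt)
hist = max(d_tau, d_r)                     # history length max(tau, r) / dt
x = np.zeros((steps + hist, n))
x[:hist + 1] = 0.5                         # constant initial history on [-max(tau, r), 0]

rng = np.random.default_rng(0)
for k in range(hist, steps + hist - 1):
    xt, xd = x[k], x[k - d_tau]
    # distributed-delay term: left Riemann sum for the integral of h(x(s))
    dist = dt * h(x[k - d_r:k]).sum(axis=0)
    drift = -D @ xt + A @ f(xt) + B @ g(xd) + C @ dist + J
    dW = rng.normal(scale=np.sqrt(dt), size=n)        # Brownian increments
    x[k + 1] = xt + drift * dt + sigma(xt, xd) @ dW
```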

Throughout this paper, we assume that the activation functions fⱼ, gⱼ, and hⱼ, j = 1, 2, …, n, satisfy the following.

(H1) We have the following:
(1) fⱼ is a monotonically increasing continuous function;
(2) for any r̄ > 0, there exist scalars qⱼ > 0, which are correlated with r̄, and αⱼ ≥ 1 such that |fⱼ(u) − fⱼ(v)| ≥ qⱼ|u − v|^{αⱼ} for all u, v ∈ [−r̄, r̄].

(H2) gⱼ and hⱼ are continuous, and |gⱼ(u) − gⱼ(v)| ≤ Gⱼ|u − v| and |hⱼ(u) − hⱼ(v)| ≤ Hⱼ|u − v| for all u, v ∈ ℝ, j = 1, 2, …, n. Denote G = diag(G₁, …, Gₙ) and H = diag(H₁, …, Hₙ).

Remark 1. In [15], a function satisfying assumption (H1) is said to be an α-inverse Hölder function, and such functions were first used there as neuron activation functions in the study of the stability of neural networks. It is easy to check that α-inverse Hölder functions are a class of non-Lipschitz functions, and a great number of α-inverse Hölder functions occur in engineering mathematics; for example, f(s) = s and f(s) = arctan s are 1-inverse Hölder functions, and f(s) = s³ is a 3-inverse Hölder function.
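As a quick numerical illustration of Remark 1 (the grid, the interval radius, and the constant below are our own choices), one can check the defining inequality |f(u) − f(v)| ≥ q|u − v|³ of a 3-inverse Hölder function for f(s) = s³ on a compact interval:

```python
import numpy as np

# Check |f(u) - f(v)| >= q * |u - v|**3 for f(s) = s**3 on [-rbar, rbar].
# Interval radius rbar and constant q are illustrative; (H1) lets q depend on rbar.
f = lambda s: s**3
rbar, q = 2.0, 0.25
u, v = np.meshgrid(np.linspace(-rbar, rbar, 401), np.linspace(-rbar, rbar, 401))
lhs = np.abs(f(u) - f(v))
rhs = q * np.abs(u - v) ** 3
print(bool(np.all(lhs >= rhs)))  # True on the whole grid
```

In fact, q = 1/4 works here for all u and v, since |u³ − v³| − (1/4)|u − v|³ = |u − v|(u² + uv + v² − (1/4)(u − v)²) = (3/4)|u − v|(u + v)² ≥ 0.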

Remark 2. From (H2), we can see that Gⱼ and Hⱼ are positive scalars, so G and H are both positive definite diagonal matrices.

Remark 3. The relations among the different activation functions f (which are α-inverse Hölder activation functions), g, and h are implicitly established in Theorem 14. Such relations, however, have not been provided in other reported literature.
For the deterministic neural network system

ẋ(t) = −Dx(t) + Af(x(t)) + Bg(x(t − τ(t))) + C ∫_{t−r}^{t} h(x(s)) ds + J,   (4)

we have the following result.

Theorem 4. Under assumptions (H1) and (H2), if there exist a positive definite diagonal matrix and two positive definite matrices such that the following condition (5) is satisfied, then the neural network system (4) has a unique equilibrium point.

Proof. See the appendix.

In order to guarantee that the system (1) has an equilibrium point, we assume that, as the system approaches its equilibrium, the stochastic noise contribution vanishes; that is,

(H3) σ(t, x*, x*) ≡ 0 for all t ≥ 0, where x* denotes the equilibrium point of the system (4).

Thus, the system (1) admits the equilibrium point x* under (H3). Let y(t) = x(t) − x*; then the system (1) can be rewritten in the following form:

dy(t) = [−Dy(t) + Af̃(y(t)) + Bg̃(y(t − τ(t))) + C ∫_{t−r}^{t} h̃(y(s)) ds] dt + σ̃(t, y(t), y(t − τ(t))) dω(t),   (6)

where f̃(y(·)) = f(y(·) + x*) − f(x*), g̃(y(·)) = g(y(·) + x*) − g(x*), h̃(y(·)) = h(y(·) + x*) − h(x*), and σ̃(t, y(t), y(t − τ(t))) = σ(t, y(t) + x*, y(t − τ(t)) + x*).

Apparently, each f̃ⱼ is also an αⱼ-inverse Hölder function, and f̃ⱼ(0) = 0, g̃ⱼ(0) = h̃ⱼ(0) = 0.

Let {η(t), t ≥ 0} be a right-continuous Markov chain on the complete probability space (Ω, ℱ, P) taking values in a finite state space S = {1, 2, …, N} with generator Γ = (γ_pq)_{N×N} given by

P{η(t + Δ) = q | η(t) = p} = γ_pq Δ + o(Δ) if p ≠ q, and 1 + γ_pp Δ + o(Δ) if p = q,

where Δ > 0 and lim_{Δ→0} o(Δ)/Δ = 0. Here, γ_pq ≥ 0 is the transition probability rate from p to q if p ≠ q, while γ_pp = −Σ_{q≠p} γ_pq.
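For intuition, a path of such a chain can be sampled by the classical embedded-chain construction: hold in state p for an exponentially distributed time with rate −γ_pp, then jump to q ≠ p with probability γ_pq/(−γ_pp). A minimal sketch follows (the two-state generator at the bottom is an illustrative placeholder, not one of the paper's):

```python
import numpy as np

def simulate_ctmc(gamma, p0, T, rng):
    """Sample a right-continuous Markov chain path on [0, T] with generator
    `gamma` (rows summing to zero), starting from state p0."""
    times, states = [0.0], [p0]
    t, p = 0.0, p0
    while True:
        rate = -gamma[p, p]                    # total jump rate out of state p
        if rate <= 0:                          # absorbing state: no more jumps
            break
        t += rng.exponential(1.0 / rate)       # holding time ~ Exp(rate)
        if t > T:
            break
        probs = np.clip(gamma[p], 0.0, None)   # off-diagonal jump rates
        probs[p] = 0.0
        p = int(rng.choice(len(gamma), p=probs / probs.sum()))
        times.append(t)
        states.append(p)
    return np.array(times), np.array(states)

# Illustrative two-state generator (placeholder, not the paper's):
gamma = np.array([[-3.0, 3.0],
                  [ 2.0, -2.0]])
times, states = simulate_ctmc(gamma, p0=0, T=10.0, rng=np.random.default_rng(1))
```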

In this paper, we consider the following neural network with stochastic noise disturbance, mixed time delays, and Markovian jump parameters, which is actually a modification of the system (6):

dy(t) = [−D(η(t))y(t) + A(η(t))f̃(y(t)) + B(η(t))g̃(y(t − τ(t))) + C(η(t)) ∫_{t−r}^{t} h̃(y(s)) ds] dt + σ̃(t, y(t), y(t − τ(t))) dω(t),   (9)

where y(t), f̃, g̃, h̃, τ(t), and r have the same meanings as those in (6), σ̃(·) is the noise intensity function vector, and, for a fixed system mode, D(η(t)), A(η(t)), B(η(t)), and C(η(t)) are known constant matrices with appropriate dimensions.

For convenience, each possible value of η(t) is denoted by p, p ∈ S, in the sequel. Then we have Dₚ = D(η(t)), Aₚ = A(η(t)), Bₚ = B(η(t)), and Cₚ = C(η(t)), where Dₚ, Aₚ, Bₚ, and Cₚ for any p ∈ S are known constant matrices of appropriate dimensions.

Assume that σ̃ is locally Lipschitz continuous and satisfies the following:

(H4) trace[σ̃ᵀ(t, u, v) σ̃(t, u, v)] ≤ uᵀΣ₁u + vᵀΣ₂v for all (t, u, v) ∈ ℝ⁺ × ℝⁿ × ℝⁿ, where Σ₁ and Σ₂ are known positive definite matrices with appropriate dimensions.

Let y(t; ξ) denote the state trajectory of the system (9) from the initial data y(s) = ξ(s) on −τ* ≤ s ≤ 0 in L²_{ℱ₀}([−τ*, 0]; ℝⁿ). Clearly, the system (9) admits a trivial solution y(t; 0) ≡ 0 corresponding to the initial data ξ = 0. For simplicity, we write y(t; ξ) = y(t).

Before ending this section, we introduce some definitions and lemmas, which will play important roles in the proof of our theorems below.

Definition 5. The equilibrium point of the neural network (9) is said to be globally exponentially stable in the mean square if, for any initial condition ξ, there exist positive constants λ and M, correlated with ξ, such that, for all t ≥ 0, the following inequality holds:

E|y(t; ξ)|² ≤ M e^{−λt}.

Definition 6. Given the stochastic Lyapunov-Krasovskii functional V(y(t), t, p) ∈ C^{2,1} of the system (9), the weak infinitesimal generator ℒ of the random process {y(t), η(t), t ≥ 0} from ℝⁿ × ℝ⁺ × S to ℝ is defined by

ℒV(y(t), t, p) = V_t(y(t), t, p) + V_y(y(t), t, p) μ(y(t), t, p) + (1/2) trace[σ̃ᵀ(t, y(t), y(t − τ(t))) V_{yy}(y(t), t, p) σ̃(t, y(t), y(t − τ(t)))] + Σ_{q=1}^{N} γ_{pq} V(y(t), t, q),

where μ(y(t), t, p) denotes the drift term of the system (9).
Let Ω₀ be a nonempty, bounded, and open subset of ℝⁿ. The closure of Ω₀ is denoted by Ω̄₀, and the boundary of Ω₀ is denoted by ∂Ω₀.

Definition 7. Let H : Ω̄₀ → ℝⁿ be a continuously differentiable function with p ∉ H(∂Ω₀), and let S_H = {x ∈ Ω₀ : det J_H(x) = 0} denote the set of critical points of H, where p ∉ H(S_H). Then the topological degree of H about p on Ω₀ is defined as

deg(H, Ω₀, p) = Σ_{x ∈ H⁻¹(p)} sgn J_H(x),

where J_H(x) is the Jacobi determinant of H at x.

Now let H be merely continuous on Ω̄₀ with p ∉ H(∂Ω₀). Take a function H̃ which is continuously differentiable on Ω₀ such that

sup_{x ∈ Ω̄₀} |H(x) − H̃(x)| < dist(p, H(∂Ω₀)).

This implies that p ∉ H̃(∂Ω₀). Set deg(H, Ω₀, p) = deg(H̃, Ω₀, p), where deg(H̃, Ω₀, p) is the topological degree of H̃. Then deg(H, Ω₀, p) is said to be the Brouwer degree of the continuous function H about p on Ω₀.

Lemma 8 (see [43]). Let H : [0, 1] × Ω̄₀ → ℝⁿ be a continuous mapping. If p ∉ H(λ, ∂Ω₀) for all λ ∈ [0, 1], then the Brouwer degree deg(H(λ, ·), Ω₀, p) is constant for all λ ∈ [0, 1]; in this case, deg(H(0, ·), Ω₀, p) = deg(H(1, ·), Ω₀, p).
Let H : Ω̄₀ → ℝⁿ be a continuous mapping. If deg(H, Ω₀, p) ≠ 0, then the equation H(x) = p has at least one solution in Ω₀.
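A toy illustration of how Lemma 8 is used (our own example, not taken from the paper): let H(x) = x³ − x on Ω₀ = (−2, 2) and p = 0, so that H(±2) = ±6 and hence 0 ∉ H(∂Ω₀):

```latex
% Zeros of H in \Omega_0: x \in \{-1, 0, 1\}; Jacobi determinant J_H(x) = 3x^2 - 1.
\deg(H, \Omega_0, 0) = \sum_{x \in H^{-1}(0)} \operatorname{sgn} J_H(x)
  = \operatorname{sgn}(2) + \operatorname{sgn}(-1) + \operatorname{sgn}(2)
  = 1 - 1 + 1 = 1 \neq 0,
```

so, by Lemma 8, the equation H(x) = 0 indeed has a solution in Ω₀. The appendix applies the same degree argument to the equilibrium equation of the system (4).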

Lemma 9 (see [15]). If f is an α-inverse Hölder function, then, for any u, v ∈ ℝ, one has

Lemma 10 (see [15]). If f is an α-inverse Hölder function and f(0) = 0, then there exist constants such that
Moreover,

Lemma 11 (see [15]). Let x, y ∈ ℝⁿ, and let P be a positive definite matrix; then 2xᵀy ≤ xᵀP⁻¹x + yᵀPy.

Lemma 12 (Schur complement [38]). Given constant symmetric matrices S₁, S₂, and S₃ with appropriate dimensions, where S₁ = S₁ᵀ and 0 < S₂ = S₂ᵀ, one has S₁ + S₃ᵀS₂⁻¹S₃ < 0 if and only if

[ S₁   S₃ᵀ ]
[ S₃  −S₂ ]  < 0.

Lemma 13 (Jensen's inequality [38]). For any constant matrix M ∈ ℝⁿˣⁿ with M = Mᵀ > 0, scalar r > 0, and vector function x : [0, r] → ℝⁿ such that the integrations concerned are well defined, one has

r ∫₀ʳ xᵀ(s) M x(s) ds ≥ (∫₀ʳ x(s) ds)ᵀ M (∫₀ʳ x(s) ds).
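A numerical sanity check of this inequality (the matrix M and the vector function below are illustrative choices of ours), discretizing both integrals on a uniform grid:

```python
import numpy as np

# Check r * \int_0^r x(s)^T M x(s) ds >= (\int_0^r x ds)^T M (\int_0^r x ds).
M = np.array([[2.0, 0.5],
              [0.5, 1.0]])                       # illustrative positive definite M
r, N = 1.5, 2000
s = np.linspace(0.0, r, N)
x = np.stack([np.sin(3 * s), np.cos(s) ** 2])    # illustrative x(s), shape (2, N)
ds = s[1] - s[0]
lhs = r * np.einsum('in,ij,jn->', x, M, x) * ds  # r * \int x^T M x ds
ix = x.sum(axis=1) * ds                          # \int x ds
rhs = ix @ M @ ix
print(lhs >= rhs)                                # True up to discretization error
```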

3. Main Results

In this section, stochastic Lyapunov stability theory and a unified LMI approach, which is different from the commonly used matrix-norm-based methods (such as the M-matrix method), will be developed to establish sufficient conditions for the neural network system (9) to be globally exponentially stable in the mean square.

Theorem 14. Under assumptions (H1), (H2), and (H4), the neural network system (9) is globally exponentially stable in the mean square if, for given , there exist positive definite matrices , , and (), a positive definite diagonal matrix , and positive scalars , such that the following LMIs (22) and (23) are satisfied: where
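The feasibility of LMIs of this kind can be tested with any semidefinite-programming tool; the paper itself uses the MATLAB LMI Control Toolbox. The sketch below only shows the general pattern, in Python with CVXPY, on a small stand-in constraint of the same flavor; the block matrix here is a placeholder of ours, not the actual LMIs (22) and (23).

```python
import cvxpy as cp
import numpy as np

n = 2
D = np.diag([1.0, 1.2])                  # illustrative mode matrices (placeholders)
A = np.array([[0.2, -0.1], [0.1, 0.3]])

P = cp.Variable((n, n), symmetric=True)  # Lyapunov-type matrix variables
Q = cp.Variable((n, n), symmetric=True)

# Placeholder block LMI in the spirit of (22)-(23):
#   [ -PD - DP + Q   PA ]
#   [     A^T P      -Q ]  < 0,   with P > 0 and Q > 0.
lmi = cp.bmat([[-P @ D - D @ P + Q, P @ A],
               [A.T @ P, -Q]])
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve(solver=cp.SCS)
print(problem.status)                    # 'optimal' means the test LMIs are feasible
```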

Remark 15. From condition (23), it easily follows that condition (5) holds. Hence, under the assumptions of Theorem 14, the system (9) admits a unique equilibrium point.

Proof. Define a positive definite Lyapunov-Krasovskii functional as follows: where
By assumption (H4) and condition (22), we obtain
Hence, using Lemmas 11 and 13, it follows from (9) and Definition 6 that
By combining (28)–(32), we can obtain that where .

Let . From condition (23), it is easy to see that . This fact, together with (33), yields

Hence, it follows from the definition of , generalized Itô’s formula, and (34) that
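For completeness, the generalized Itô (Dynkin) formula invoked at this step reads, in the notation of Definition 6 (a standard statement, under the usual integrability conditions):

```latex
\mathbb{E}\,V\bigl(y(t),\, t,\, \eta(t)\bigr)
  = \mathbb{E}\,V\bigl(y(0),\, 0,\, \eta(0)\bigr)
  + \mathbb{E}\!\int_{0}^{t} \mathcal{L}V\bigl(y(s),\, s,\, \eta(s)\bigr)\, ds .
```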

By (35), we can get that . Furthermore,

For , by Lemma 10, there exist constants such that

By (36), there exists a scalar such that, when , , , where . Hence, when , we have where , . By (38), we get

Hence, where

Let

It follows from (41) and (42) that

Therefore, by Definition 5 and (43), we see that the equilibrium point of the neural networks (9) is globally exponentially stable in the mean square. This completes the proof.

Remark 16. To the best of our knowledge, global exponential stability criteria employing α-inverse Hölder activation functions for stochastic neural networks with Markovian jump parameters and mixed time delays have not been discussed in the existing literature. This paper reports a new idea and some sufficient exponential stability conditions for neural networks with stochastic noise disturbance, mixed time delays, α-inverse Hölder activation functions, and Markovian jump parameters, which generalize and improve the results in [14–19].

Remark 17. The criterion given in Theorem 14 is delay-dependent. It is well known that delay-dependent criteria are less conservative than delay-independent ones, particularly when the delay is small.

Based on Theorem 14, the following results can be obtained easily.

Case 1. If we do not take the Markovian jumping into account, then the neural network system (9) is simplified to

Corollary 18. Under assumptions (H1), (H2), and (H4), the neural network system (44) is globally exponentially stable in the mean square if, for given , there exist positive definite matrices , , and , a positive definite diagonal matrix , and a positive scalar , such that the following LMIs are satisfied: where

Case 2. If there are no stochastic disturbances in the system (9), then the neural network is simplified to

Corollary 19. Under assumptions (H1) and (H2), the neural network system (47) is globally exponentially stable in the mean square if, for given , there exist positive definite matrices , , and (), a positive definite diagonal matrix , and positive scalars , such that the following LMIs are satisfied: where

4. Illustrative Examples

In this section, we provide two numerical examples to demonstrate the effectiveness of the theoretical results above.

Example 1. Consider the second-order stochastic neural network (9) with ; ω(t) is a two-dimensional Brownian motion, and η(t) is a right-continuous Markov chain taking values in S = {1, 2} with generator
For the two operating conditions (modes), the associated data are
Then assumption (H4) is satisfied with , . Set , , and , . It is easy to see that, for any with , there exists a scalar such that
So , are α-inverse Hölder functions. In addition, for any , it is easy to verify that

That is, the activation functions , , and () satisfy assumptions (H1) and (H2) with . For the model with , . We choose , . Using Theorem 14 and the MATLAB LMI Control Toolbox, we find that the neural network (9) is globally exponentially stable in the mean square, and feasible solutions of the LMIs (22) and (23) are given as follows:

Figure 1 shows the Markov chain generated by the probability transition matrix corresponding to the generator (50) with , . Figure 2 shows the state response of the neural network with the initial condition given in this example. The simulation results imply that the neural network in this example is globally exponentially stable in the mean square.
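For readers who wish to reproduce such figures, a switching simulation can be assembled from the sketches given in Section 2: approximate the Markov chain on the time grid and drive the Euler–Maruyama recursion with the matrices of the current mode. Every matrix and the generator below are placeholders of ours, to be replaced by the two-mode data of this example.

```python
import numpy as np

# Schematic mode-switching Euler-Maruyama run for Example 1.  Every matrix and
# the generator are placeholders; substitute the example's two-mode data.
n, tau, r, dt, T = 2, 0.5, 0.3, 0.001, 10.0
Dm = [np.diag([1.0, 1.2]), np.diag([1.1, 0.9])]   # mode-dependent D_p
Am = [0.2 * np.eye(n), 0.1 * np.eye(n)]           # mode-dependent A_p
Bm = [0.1 * np.eye(n), 0.05 * np.eye(n)]          # mode-dependent B_p
Cm = [0.05 * np.eye(n), 0.05 * np.eye(n)]         # mode-dependent C_p
gamma = np.array([[-3.0, 3.0], [2.0, -2.0]])      # placeholder generator
f = g = h = np.tanh
sigma = lambda y, yd: 0.1 * np.column_stack([y, yd])

rng = np.random.default_rng(2)
steps, d_tau, d_r = int(T / dt), int(tau / dt), int(r / dt)
hist = max(d_tau, d_r)
y = np.zeros((steps + hist, n))
y[:hist + 1] = np.array([1.0, -1.0])              # initial history
p = 0                                             # initial mode eta(0)
for k in range(hist, steps + hist - 1):
    # first-order chain approximation: leave state p with prob. -gamma[p, p]*dt
    if rng.random() < -gamma[p, p] * dt:
        others = [q for q in range(len(gamma)) if q != p]
        w = np.array([gamma[p, q] for q in others])
        p = others[int(rng.choice(len(others), p=w / w.sum()))]
    yt, yd = y[k], y[k - d_tau]
    dist = dt * h(y[k - d_r:k]).sum(axis=0)       # distributed-delay Riemann sum
    drift = -Dm[p] @ yt + Am[p] @ f(yt) + Bm[p] @ g(yd) + Cm[p] @ dist
    y[k + 1] = yt + drift * dt + sigma(yt, yd) @ rng.normal(scale=np.sqrt(dt), size=n)
```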

Example 2. Consider the third-order stochastic neural network (9) with ; ω(t) is a three-dimensional Brownian motion, and η(t) is a right-continuous Markov chain taking values in S = {1, 2, 3} with generator
For the three operating conditions (modes), the associated data are
Then assumption (H4) is satisfied with , .

Set ; and , , are the functions given in Example 1. That is, the activation functions , , and () satisfy assumptions (H1) and (H2) with . For the model with , . We choose , , and . Using Theorem 14 and the MATLAB LMI Control Toolbox, we find that the neural network (9) is globally exponentially stable in the mean square, and feasible solutions of the LMIs (22) and (23) are given as follows:

Figure 3 shows the Markov chain generated by the probability transition matrix corresponding to the generator given above with = 0.01 and η(0) = 2. Figure 4 shows the state response of the neural network with the initial condition given in this example. The simulation results imply that the neural network in this example is globally exponentially stable in the mean square.

5. Conclusion

In this paper, we have dealt with the global exponential stability issue for a class of stochastic neural networks with α-inverse Hölder activation functions, Markovian jump parameters, and mixed time delays. Delay-dependent sufficient conditions have been obtained in terms of LMIs to ensure that the considered neural network with noise perturbations is globally exponentially stable in the mean square. The criteria obtained can be tested easily using the MATLAB LMI toolbox and applied in practical engineering. Two numerical simulation examples have been given to verify the usefulness of the results presented in this paper.

When the neuron activation functions are non-Lipschitz, the neural network system may have no global solution or equilibrium point. This leads to difficulty in solving the stability problem for various stochastic neural networks with non-Lipschitz activation functions. In the future, we expect the stability problem for Markovian jumping stochastic neural networks with other classes of non-Lipschitz activation functions to be solved.

Appendix

The Proof of Theorem 4

Proof. Let . is an equilibrium point of the system (4) if and only if . Rewrite as where , , , and . By assumption (H2), we can see that the following inequalities hold: , . According to assumption (H1), is also an α-inverse Hölder function, and . For the scalar , set . Define the mapping as where . By means of Lemma 11, we have
By Lemma 12, condition (5) is equivalent to
From (A.3) and (A.4), it follows that where denotes the th element of vector .
By virtue of Lemma 10, there exist constants and such that
Let , , , for all . Define
Noting that is a compact subset of and that is continuous on , attains its minimum on .
Let , , , and . Set and ; then there exist two index sets and such that where . Furthermore, there exists an index in such that
By (A.5) and (A.7), for any ; and ,
This implies that for any and . By applying Lemma 8,
That is, , where is the determinant of . By applying Lemma 8, has at least one solution in ; that is, the system (4) has at least one equilibrium point.

Let and be two different equilibrium points of the system (4); then

By means of (A.3) and (A.4), we can get

This is a contradiction, which shows that the equilibrium point of the system (4) is unique. This completes the proof.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Science and Technology Major Project of China (no. 2011ZX05020-006) and Natural Science Foundation of Hebei Province of China (A2011203103).