Abstract

This paper is concerned with the robust dissipativity problem for interval recurrent neural networks (IRNNs) with general activation functions, continuous time-varying delay, and infinite distributed time delay. By employing a new differential inequality, constructing two different kinds of Lyapunov functions, and abandoning the requirement that the activation functions be bounded, monotonic, and differentiable, several sufficient conditions are established to guarantee the global robust exponential dissipativity of the addressed IRNNs in terms of linear matrix inequalities (LMIs), which can be easily checked by the LMI Control Toolbox in MATLAB. Furthermore, a specific estimation of the positive invariant and globally exponentially attractive sets of the addressed system is also derived. Compared with the previous literature, the results obtained in this paper improve and extend earlier global dissipativity conclusions. Finally, two numerical examples are provided to demonstrate the effectiveness of the proposed results.

1. Introduction

Neural networks have been a subject of intense research activity over the past few decades due to their wide applications in many areas such as signal processing, pattern recognition, associative memories, parallel computation, and optimization. Therefore, increasing attention has been paid to the problem of stability analysis of neural networks with time-varying delays, and recently a great deal of research has been reported for delayed neural networks and systems (see [1–17] and the references therein).

It is well known that the stability problem is central to the analysis of a dynamical system near an equilibrium point. However, from a practical point of view, it is not always the case that the neural network trajectories will approach a single equilibrium point; that is, the equilibrium point may be unstable. It is also possible that no equilibrium point exists in some situations, especially for interval recurrent neural networks with infinite distributed delays. What we can know, however, is that the orbits of the neural networks will always enter a bounded region and stay there from then on. Therefore, the concept of dissipativity (also called Lagrange stability) was introduced in [18]. Actually, the concept of dissipativity in dynamical systems is a generalization of Lyapunov stability. In particular, global Lyapunov stability can be viewed as a special case of global dissipativity by regarding an equilibrium point as an attractive set [19–21]. Generally speaking, the goal of studying global dissipativity for neural networks is to determine globally attractive sets. Accordingly, many findings on the global dissipativity [18, 22–30] or Lagrange stability [31–36] analysis of neural networks have been reported. At present, global dissipativity theory has been shown to be an appealing and efficient approach for dealing with problems such as stability theory, chaos and synchronization theory, system norm estimation, and robust control of neural networks without uncertainties [18, 22, 23, 29, 30] or with uncertainties [24–28, 37, 38].

As is well known, the use of constant fixed delays in models of delayed feedback provides a good approximation in simple circuits consisting of a small number of cells. However, neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. Thus there will be a distribution of conduction velocities along these pathways and a distribution of propagation delays. In these circumstances, the signal propagation is not instantaneous and cannot be modeled with discrete delays. A more appropriate way is to incorporate continuously distributed delays [11, 16, 29, 34, 35, 37, 39–41]. However, these distributed delays are usually unbounded, and this fact motivates our work.

Recently, much attention has also been paid to robustness questions for interval neural networks [7–12, 39, 40, 42, 43]. In [7], global robust stability for stochastic interval neural networks with continuously distributed delays of neutral type was considered. Bao et al. investigated the robust stability problem of interval fuzzy Cohen-Grossberg neural networks with piecewise constant argument of generalized type [9]. The global robust passivity analysis for stochastic fuzzy interval neural networks with time-varying delays was studied in [42]. Balasubramaniam et al. studied the robust stability of Markovian jumping interval neural networks with discrete and distributed time-varying delays [11]. Xu et al. studied the stochastic exponential robust stability problem of interval neural networks with reaction-diffusion terms and mixed delays [12]. The stationary oscillation problem of interval neural networks with discrete and distributed time-varying delays under impulsive perturbations was studied using the LMI approach in [40]. Moreover, there are some works on global dissipativity for interval neural networks with time delays, such as time-varying delays [24, 25, 27, 38] and mixed time-varying delays [26, 37]. Despite the many results reported in the literature, there is still a need for more in-depth and comprehensive investigations. For example, in almost all the existing results, the activation functions of the neural networks are limited to sigmoid functions or piecewise linear monotone nondecreasing functions with bounded ranges. Moreover, in these recent publications, the time-varying delay terms [44] are required to be continuously differentiable, with derivatives bounded and smaller than one.

Although a rich literature has been published on the dissipativity problem for neural networks, to the best of our knowledge, few authors have paid attention to the dissipativity problem for interval neural networks with both discrete and infinite distributed delays. In particular, the problem of global robust exponential dissipativity of IRNNs with mixed time-varying delays and general activation functions, especially as treated by means of LMIs [2, 6, 27, 36, 38, 40], remains open. This is the motivation of our present investigation. It is worth pointing out that the proposed results are nontrivial because they involve (1) establishing a generalized differential inequality aimed at dealing with the infinite distributed delay appearing in IRNNs; (2) proposing a new Lyapunov-Krasovskii functional that skillfully accommodates the general activation functions; (3) proving a lemma to handle the appropriate matrix transformations so as to make use of linear matrix inequalities.

In this paper, we focus on the problem of global robust exponential dissipativity for a class of interval recurrent neural networks with general activation functions and mixed delays, which consist of time-varying delays and infinite distributed delays. For the sake of comparison, a Lyapunov function and a Lyapunov-Krasovskii functional are constructed, respectively, which can skillfully handle the interval uncertainty terms via the LMI approach. Moreover, when the resulting LMIs on the uncertain parameters are feasible, several sufficient conditions are established to guarantee the global robust exponential dissipativity of the addressed IRNNs, and a specific estimation of the positive invariant and globally exponentially attractive sets is also put forward. The purpose of this paper is threefold. First, we establish two important lemmas that play a vital role in the later theorems. Second, we tackle the problem of global robust exponential dissipativity for IRNNs with both time-varying and infinite distributed delays under general activation functions. Third, the results are formulated as LMIs, which can be efficiently solved by the MATLAB LMI Toolbox [45] and compared with those presented in [26, 36, 37]. The rest of this paper is organized as follows. In the next section, some preliminaries, including definitions, assumptions, and significant lemmas, are described. Section 3 states the main results. Section 4 presents two illustrative examples to verify the main results, and finally a summary is given in Section 5.

Notations. Throughout this paper, represents the unit matrix; in , the symbols and stand, respectively, for the -dimensional Euclidean space and the set of all real matrices. and denote the matrix transpose and matrix inverse. or denotes that the matrix is a symmetric and positive definite or negative definite matrix. Meanwhile, indicates and is the Euclidean vector norm. When is a variable, . denotes the floor function and . Moreover, in symmetric block matrices, we use an asterisk “*” to represent a term that is induced by symmetry and stands for a block-diagonal matrix.

2. Preliminaries

The interval recurrent neural networks with infinite distributed delays are described by the following equation group: where is the neuron state vector of the neural network; is an external input; is the transmission delay of the neural networks, which is time varying and satisfies , where is a positive constant; represents the neuron activation function, and represents the delay kernel function. The matrices , , , and denote the unknown diagonal matrix, the connection weight matrix, the delayed connection weight matrix, and the distributively delayed connection weight matrix, respectively, satisfying where ,  with , , , .
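For orientation, models of this type are commonly written in the following form (a representative sketch under standard notation; the symbols are illustrative and may differ in detail from the precise system (1)):

\[
\dot{x}(t) = -A x(t) + B f\bigl(x(t)\bigr) + C f\bigl(x(t-\tau(t))\bigr) + D \int_{-\infty}^{t} K(t-s)\, f\bigl(x(s)\bigr)\, \mathrm{d}s + u,
\]

where $x(t) \in \mathbb{R}^n$ is the state, $A$ is the positive diagonal self-feedback matrix, $B$, $C$, and $D$ are the connection, delayed, and distributively delayed weight matrices, $\tau(t)$ is the bounded time-varying delay, $K(\cdot)$ is the delay kernel, and $u$ is the external input, with each system matrix ranging over a known interval.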

In addition, let where denotes the column vector with th element to be 1 and others to be 0.

By some simple calculations, one can transform system (1) into the following form: Or, equivalently, where

In this paper, system (1) is supplemented with the initial condition given by , and , where , denotes the set of real-valued continuous functions defined on . Here, it is assumed that, for any initial condition , there exists at least one solution of model (1). As usual, we also assume that for all throughout this paper.

For further discussion, the following assumptions and lemmas are needed.

(A1) The activation function satisfies , and for all , where and , are some real constants.
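In the usual sector-bounded formulation that such an assumption denotes (an assumed reading, offered for readability), (A1) takes the form

\[
l_i^{-} \le \frac{f_i(u) - f_i(v)}{u - v} \le l_i^{+}, \qquad \forall\, u, v \in \mathbb{R},\ u \ne v,\ i = 1, \dots, n,
\]

where the constants $l_i^{-}$ and $l_i^{+}$ may be positive, negative, or zero; in particular, $f_i$ need not be bounded, monotonic, or differentiable.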

(A2) The delay kernels are some real-valued nonnegative continuous functions defined on and satisfy , in which corresponds to some nonnegative function defined on ; the constants , and are some positive numbers.
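A typical instance of such a kernel condition (an illustrative sketch, not necessarily the exact assumption used here) is

\[
k_{ij} : [0, \infty) \to [0, \infty) \ \text{continuous}, \qquad \int_{0}^{\infty} k_{ij}(s)\, e^{\delta s}\, \mathrm{d}s = p_{ij}(\delta) < \infty \ \text{for some } \delta > 0,
\]

which is satisfied, for example, by the exponential kernel $k_{ij}(s) = e^{-s}$.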

Next, we first introduce the definitions of global robust exponential dissipativity for the interval recurrent neural networks (1) or (5) and then state the notion of the upper right Dini derivative and some preliminary lemmas, which are needed to prove our main results.

Definition 1 (see [19]). If there exists a compact set such that, for all , for all , , then is said to be a globally attractive set of (1) or (5), where is the complement set of . A set is called a positive invariant set of (1) or (5) if, for all , for all implies for .

Definition 2 (see [19]). The neural network defined by (1) or (5) is called a globally exponentially dissipative system, if there exists a radially unbounded and positive definite Lyapunov function , which satisfies , where is a constant, and constants , such that for , the inequality always holds. And is said to be a globally exponentially attractive set of (1) or (5), where and is a constant.
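In the usual symbols (an assumed paraphrase of the definition above), exponential dissipativity asks for a radially unbounded, positive definite $V$, a level $c > 0$, and constants $\alpha, \beta > 0$ such that

\[
V\bigl(x(t)\bigr) - c \le \alpha \Bigl( \sup_{s \le 0} V\bigl(\phi(s)\bigr) - c \Bigr) e^{-\beta t} \qquad \text{whenever } V\bigl(x(t)\bigr) > c,
\]

so that the level set $\{x : V(x) \le c\}$ serves as the globally exponentially attractive set.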

Definition 3. The neural network defined by (1) or (5) is a globally robustly exponentially dissipative system if the system is globally exponentially dissipative for all , and .

Definition 4. For any function , we define its right-hand derivative as

Lemma 5 (see [40]). For any vectors , the inequality holds, in which is any matrix with .
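Lemma 5 is presumably the standard completion-of-squares bound, which in its familiar form (stated here under that assumption) reads

\[
2 a^{\mathsf{T}} b \le a^{\mathsf{T}} X a + b^{\mathsf{T}} X^{-1} b, \qquad \forall\, a, b \in \mathbb{R}^{n},
\]

for any symmetric matrix $X > 0$.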

Lemma 6 (Schur complement [13]). For a given matrix , with , , the following conditions are equivalent: (1), (2), (3).
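For completeness, the Schur complement equivalences in their usual form read: for a symmetric block matrix

\[
M = \begin{pmatrix} Q & S \\ S^{\mathsf{T}} & R \end{pmatrix}, \qquad Q = Q^{\mathsf{T}},\ R = R^{\mathsf{T}},
\]

one has $M < 0 \iff R < 0 \ \text{and}\ Q - S R^{-1} S^{\mathsf{T}} < 0 \iff Q < 0 \ \text{and}\ R - S^{\mathsf{T}} Q^{-1} S < 0$.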

The following two lemmas will be used for deriving our main results.

Lemma 7. Let , and denote nonnegative constants, and let the function satisfy the scalar differential inequality where , satisfies for some positive constant in the case when . Moreover, when , the interval is understood to be replaced by . Assume that Then, for all , where , and satisfies the inequality

Proof. We first note that condition (11) implies that there exists a scalar such that inequality (12) holds.
Consider the following equation: Because and , it follows that is a strictly monotonically decreasing function. Meanwhile, we also notice that there always exists a positive constant such that . Therefore, in view of the mean value theorem, there is a constant such that . Correspondingly, there exists a constant such that , namely, .
Next, we show that for all . To do this, let Now we only need to show that . It is clear that for by the definition of . Next, we prove that for . Suppose, on the contrary, that there exists some such that . Let ; then Suppose that . Calculating the upper right Dini derivative along the solution of (5), by (12) and (13), we get which contradicts (13). So we have proven that for all ; that is, for all , where satisfies (12). This completes the proof.

Remark 8. It should be noted that it is only required that the function satisfies the assumption that is integrable if .
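For intuition, Lemma 7 generalizes the classical Halanay inequality, whose scalar form (a well-known special case, with no distributed-delay term) states: if

\[
D^{+} y(t) \le -a\, y(t) + b \sup_{t - \tau \le s \le t} y(s), \qquad t \ge t_0,
\]

with $a > b \ge 0$, then $y(t) \le \bigl( \sup_{t_0 - \tau \le s \le t_0} y(s) \bigr) e^{-\lambda (t - t_0)}$, where $\lambda > 0$ is the unique positive root of $\lambda = a - b\, e^{\lambda \tau}$.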

Lemma 9. Given constant matrices , , , , , , , and and appropriately dimensioned invertible matrices , and , let Then .

Proof. Firstly, we discuss , and where ; ; .
Calculating , we obtain Comparing the above equations, we see that . The proof is finished.

Lemma 10 (see [46]). The following inequality is true:

3. Main Results

In this section, we derive some sufficient conditions for the global robust exponential dissipativity of the IRNNs (1) or (5) with general activation functions, continuous time-varying delay, and infinite distributed time delay by repeatedly applying the lemmas of Section 2.

3.1. Results Using Lyapunov Functions

In this part, sufficient conditions for the global robust exponential dissipativity of (1) or (5) are obtained by using Lyapunov functions and inequality techniques.

Theorem 11. Assume that Assumptions (A1)-(A2) hold. If there exist three constants , and 3, seven positive diagonal matrices , and , and , and two positive definite matrices such that the following inequalities hold: where , , then the neural network defined by (1) or (5) is a globally robustly exponentially dissipative system, and the set is a positive invariant and globally exponentially attractive set.

Proof. Now, we consider the following Lyapunov function:
Calculating the derivative of along the trajectories of (5), we can obtain
From Assumption (A1) and Lemma 5, we know that there exist three positive diagonal matrices , and and a positive definite matrix such that the following inequalities hold: and by the well-known Cauchy-Schwarz inequality and Assumption (A2), we get which implies that In view of the definition of , we have the following inequality: Considering Lemma 5 and (31), we derive
Now, adding the terms on the right of (26)–(30) and (32) to (22), considering conditions (17), and making use of Lemma 6, we can obtain that Transforming (33) into the following inequality, we get where .
From formula (34), we know that it satisfies (11) of Lemma 7. Meanwhile, noticing Assumption (A2), it can be deduced that . So (12) of Lemma 7 is also satisfied. From this, when , and , according to Lemma 7, we are able to derive where satisfies Simultaneously, judging by [1], it is easy to prove that there exists a constant such that . In terms of Definitions 1, 2, and 3, we know that the neural network defined by (1) or (5) is a globally robustly exponentially dissipative system, and is a positive invariant and globally exponentially attractive set of system (1) or (5). Hence, the proof of Theorem 11 is completed.

Remark 12. It should be noted that the exponential convergence rate of the IRNNs (1) or (5) is also derived in (35). Moreover, one may find that the condition implies that there exists a constant such that (1) or (5) holds for any given .
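From a computational standpoint, feasibility conditions of the type in Theorem 11 are checked by a semidefinite feasibility solve. The following is a minimal sketch of that workflow; the matrices and the block structure are hypothetical placeholders rather than the paper's actual LMIs, and CVXPY is used here as an open-source alternative to the MATLAB LMI Control Toolbox.

```python
# Sketch: checking feasibility of a generic Lyapunov-type LMI of the kind
# appearing in Theorem 11. All numerical data and the block structure are
# HYPOTHETICAL placeholders for illustration only.
import cvxpy as cp
import numpy as np

n = 2
A = np.diag([6.0, 5.0])                    # placeholder self-feedback matrix
B = np.array([[0.5, -0.3], [0.2, 0.4]])    # placeholder connection weights
L = np.eye(n)                              # placeholder sector bound from (A1)

P = cp.Variable((n, n), symmetric=True)    # Lyapunov matrix
d = cp.Variable(n, nonneg=True)            # diagonal multiplier entries
D1 = cp.diag(d)

# Generic block LMI:  [ -A P - P A + L D1 L    P B ]
#                     [        B^T P           -D1 ]  < 0
M = cp.bmat([[-A @ P - P @ A + L @ D1 @ L, P @ B],
             [B.T @ P, -D1]])
Msym = 0.5 * (M + M.T)                     # symmetrize for the PSD constraint

eps = 1e-6
constraints = [P >> eps * np.eye(n),
               d >= eps,
               Msym << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status)   # "optimal" indicates the sketched LMI is feasible
```

When the solve reports feasibility, the recovered values of P and D1 play the role of the feasible solutions reported later in the examples.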

When , and are some known constant matrices, we have the following simple result.

Corollary 13. Assume that Assumptions (A1)-(A2) hold; then the neural network defined by (1) or (5) is a globally exponentially dissipative system if there exist three constants , and 3, three positive diagonal matrices , and , and two positive definite matrices such that the following inequalities hold: where , , , and the set is a positive invariant and globally exponentially attractive set.

When the infinite distributed delay term is removed from the IRNNs system (1), we obtain Corollary 14 based on Theorem 11.

Corollary 14. Assume that Assumption (A1) holds. If there exist three constants , five positive diagonal matrices , and , and two positive definite matrices such that the following inequalities hold: where , , , then the neural network defined by (1) or (5) is a globally robustly exponentially dissipative system, and the set is a positive invariant and globally exponentially attractive set.

Proof. The first part of the proof is almost parallel to that of Theorem 11, except for inequality (27) in Theorem 11, and is therefore omitted here. In the end, we can also obtain where .
It is noticed that ; hence, according to the famous Halanay inequality [47], when and , we are able to derive where is the unique positive root of . Similarly, it is obtained that is a positive invariant and globally exponentially attractive set of system (1). Hence, the proof is completed.
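Since the convergence rate appears only implicitly, it must be computed numerically. Below is a small sketch for extracting the unique positive root of the classical Halanay rate equation $\lambda = a - b\,e^{\lambda \tau}$; the values of $a$, $b$, and $\tau$ are illustrative placeholders.

```python
# Sketch: computing the Halanay-type exponential convergence rate, i.e. the
# unique positive root of g(lam) = lam - a + b*exp(lam*tau) for a > b > 0.
import numpy as np
from scipy.optimize import brentq

def halanay_rate(a: float, b: float, tau: float) -> float:
    g = lambda lam: lam - a + b * np.exp(lam * tau)
    # g(0) = b - a < 0, g(a) = b*exp(a*tau) > 0, and g is strictly increasing,
    # so the unique positive root lies in the bracket (0, a).
    return brentq(g, 0.0, a)

print(halanay_rate(a=3.0, b=1.0, tau=0.5))  # ~1.19 for these placeholder values
```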

3.2. Results Using Lyapunov-Krasovskii Functionals

In this part, sufficient conditions for the global robust exponential dissipativity of (1) or (5) are obtained by using a Lyapunov-Krasovskii functional and inequality techniques.

Theorem 15. Assume that Assumptions (A1)-(A2) hold. If there exist three constants , and 3, eight positive diagonal matrices , and , , and , and two positive definite matrices such that the following inequalities hold: where , , , , , , , , for all , , ,, then the neural network defined by (1) or (5) is a globally robustly exponentially dissipative system, and the set is a positive invariant and globally exponentially attractive set of system (1).

Proof. Now, we consider another Lyapunov functional
Calculating the derivative of along the trajectories of (5) and using Lemma 10, we can obtain From Assumption (A1), for a given positive diagonal matrix we have where .
By using Assumption (A1) and Lemma 5, we know that there exist six positive diagonal matrices , and and a positive definite matrix such that the following inequalities hold: Similarly, by the well-known Cauchy-Schwarz inequality and Assumption (A2), we get which implies that Here, the term is processed similarly to the technique of Theorem 11.
Substituting (47)–(52) in (46) and using Lemma 9, we also have where
Following from , there exists such that where . In light of Lemma 6, one gets
Meanwhile, it is noticed that
Therefore, it can be deduced that Combining the inequalities , and formulas (53) and (54), we can derive
From Assumption (A1) and formula (43), one also gets
Noticing and according to (60) and (61), we obtain Transforming (62) into the following inequality, we get where .
From formula (63), we know that it satisfies (11) of Lemma 7. Meanwhile, noticing Assumption (A2), it can be deduced that . So (12) of Lemma 7 is also satisfied. From this, according to Lemma 7, when , , and , we are able to derive where , and satisfies Simultaneously, judging by [1], it is easy to prove that there exists a constant such that . In terms of Definitions 1, 2, and 3, we know that the neural network defined by (1) or (5) is a globally robustly exponentially dissipative system. Noticing , it follows that is a positive invariant and globally exponentially attractive set of system (1) or (5). Hence, the proof of Theorem 15 is completed.

Remark 16. For the dissipativity or Lagrange stability conditions given in [22, 25, 26, 28, 31–35, 37], the time delays are constant delays or time-varying delays that are differentiable with derivatives not greater than one or bounded. Note that in this paper we do not impose those restrictions on our time-varying delays, which means that our results have a wider application range.
In the special case that and ,  , and are some known constant matrices, the model (1) can be rewritten as follows: The other conditions are similar to those of model (1). From Theorem 15, the following corollary can be obtained.

Corollary 17. Assume that Assumption (A1) holds. If there exist three constants , three positive diagonal matrices , and , and two positive definite matrices such that the following inequalities hold: where , , , , , , for all , , , then the neural network (66) is a globally exponentially dissipative system, and the set is a positive invariant and globally exponentially attractive set of system (66).

Proof. The course of the proof is almost parallel to those of Corollary 14 and Theorem 15.

Remark 18. When in Corollary 17, its result reduces immediately to that of [36].

Remark 19. To the best of our knowledge, few authors have discussed the dissipativity analysis of interval neural networks with general activation functions and infinite distributed delay, and there are few results obtained via LMIs [2, 6, 27, 36, 38, 40]. Therefore, the results of this paper are novel and meaningful. Meanwhile, unlike [26, 36, 37], more general results are established in this paper.

Remark 20. It is also not difficult to see that Lemmas 7 and 9 play vital roles in the whole paper, especially Lemma 7. The full-text results are obtained based on the case of in Lemma 7. In addition, it should be pointed out that Lemma 7 is also suitable for the RNNs (1) with finite distributed delays . Consequently, the results of this paper also apply to the case of finite distributed delay. Hence, the conclusions of this paper are more general and valuable than those of [26, 37].

Remark 21. In this paper, the main results are obtained mainly by constructing two different kinds of , whose difference depends on the form of the general activation function . Generally speaking, Theorem 11 is suitable for the case where is an unbounded continuous function of Lurie type, while Theorem 15 is adapted to the case where is a bounded function of Lipschitz type. A detailed illustration of this point is given in Example 1.

4. Illustrative Examples

In this section, two numerical examples are presented to demonstrate the usefulness of the developed methods on global robust dissipativity by comparing with the previous results of [19, 26, 28].

Example 1. Consider the interval neural networks model (1) with the following parameters: The delay kernel is selected as for , and 3.
In this case, by simple calculation, it can be obtained that Clearly, , and we choose . In addition, let ; the activation function satisfies Assumption (A1) with , . In this case, we choose , and . Obviously, the condition is satisfied. Noting that and solving the LMIs in Theorem 15 using the MATLAB LMI Control Toolbox, we obtain the feasible solutions as follows: Hence, the above results show that all the conditions stated in Theorem 15 are satisfied, and the network (70) is a globally robustly exponentially dissipative system. Moreover, by calculating the eigenvalues of , we find that the set is a positive invariant and globally exponentially attractive set of (1).
On the other hand, we can also conclude that system (70) is globally robustly exponentially dissipative by Theorem 11.
All other things being equal, if the activation function is replaced by , then it is easy to check that the LMIs in Theorem 15 have no feasible solution in MATLAB. Therefore, Theorem 15 is ineffective in this case. However, it can be deduced that the LMIs in Theorem 11 are feasible, and Thereby, by Theorem 11, we obtain that system (70) with is globally robustly exponentially dissipative, which implies that Theorem 11 can be applied to cases not covered by Theorem 15.

Remark 22. By virtue of Theorem 3.4 in [28], we have , which does not satisfy the condition of Theorem 3.4, so the conclusion of [28] is not applicable in this case. This implies that the proposed results in this paper improve and generalize [28].

Remark 23. It should be noted that the exponential dissipativity rate in Example 1 is also obtained, and it satisfies and .

Remark 24. In order to simulate the dynamic behavior of system (1), we choose some parameters in (70) randomly as follows: Figures 1 and 2 depict the state trajectories of system (1) with parameters (70), respectively. Figure 3 depicts the phase plots of system (1) with parameters (70) in the case of . These numerical results show that system (1) with parameters (70) is globally robustly exponentially dissipative.

Example 2. Consider the following two-neuron RNNs with time-varying delay: where , Clearly, . In addition, if the activation function is chosen as , it is obvious that the activation function satisfies Assumption (A1) with , .
In this case, we choose . Obviously, the condition is satisfied. Then, by using the MATLAB LMI Control Toolbox, the solutions are derived as follows: Calculating the eigenvalues of , we find that its eigenvalues are 5.2457 and 9.0631. Therefore, it follows from Corollary 17 that the set is a positive invariant and globally exponentially attractive set of system (75). However, according to Theorem 4.2 of [19], we obtain By calculation, we get that the four eigenvalues of are −14.2006, −2.6268, 0.0778, and 2.4997, from which we know that is not negative definite. So the conclusion in [19] cannot be applied to determine the positive invariant and globally exponentially attractive sets of (75) or to further ensure the dissipativity of system (75).
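The definiteness checks in this example reduce to eigenvalue computations. A short sketch of that routine follows; the matrix entries are illustrative placeholders, not the data of system (75).

```python
# Sketch: definiteness tests via eigenvalues, as used in Example 2. A symmetric
# matrix is positive definite iff all eigenvalues are positive, and fails to be
# negative definite as soon as one eigenvalue is nonnegative. Placeholder data.
import numpy as np

def is_positive_definite(M: np.ndarray) -> bool:
    return bool(np.all(np.linalg.eigvalsh(M) > 0))   # eigvalsh: for symmetric M

def is_negative_definite(M: np.ndarray) -> bool:
    return bool(np.all(np.linalg.eigvalsh(M) < 0))

Q = np.array([[7.0, 1.5], [1.5, 7.3]])               # placeholder symmetric matrix
print(np.linalg.eigvalsh(Q), is_positive_definite(Q), is_negative_definite(Q))
```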
Meanwhile, noting that the conditions of Example 2 also satisfy those of Corollary 4 in [26], we verify the effectiveness of its conclusion by solving the LMI in equation (34) of [26]. It is easy to check that the linear matrix inequality (34) in [26] has no feasible solution. Hence, for this example, our results are less conservative than those in [19, 26].

5. Conclusion

In this paper, we have studied the global robust exponential dissipativity of interval neural networks with general activation functions and infinite distributed delays. To the authors' best knowledge, few scholars have investigated the global robust exponential dissipativity of interval neural networks even without infinite distributed delays [26, 37]. By employing inequality techniques, including a novel delay differential inequality and some LMIs, we have established several sufficient conditions ensuring the global robust exponential dissipativity of interval neural networks with both time-varying delays and infinite distributed delays. In addition, a series of positive invariant and globally exponentially attractive sets of system (1) or (5) have also been obtained. The obtained results improve and complement some recent works (e.g., [19, 25, 26, 28, 36, 37]). These criteria are stated as LMIs, so that their verification and application are straightforward and convenient. The results obtained in this paper have also been validated by numerical examples using computer simulations.

As for further work, we intend to generalize the techniques introduced in this paper to memristor-based stochastic neural networks [17, 48, 49], impulsive stochastic reaction-diffusion neural networks [12, 14], and fuzzy neural networks [4, 15, 16, 50].

Acknowledgments

The authors would like to thank the editor and the anonymous reviewers for their helpful comments and suggestions, which have greatly improved the presentation of the paper.